Evaluation of the Electronic Libraries Programme
Synthesis of Annual Reports



1. INTRODUCTION

1.1 Rationale for Annual Reporting

This report provides an overview of thirty-six eLib projects [see Appendix 2 for the list of projects], based on information presented in their first annual reports [the remaining projects were not required to report this year because of their late starting dates].

As part of the overall evaluation design for the eLib programme, an annual reporting framework was put in place to ensure that a consistent and coherent set of data was collected from all projects about their activities and progress, the process of implementation, reflections on what has been learned, and their revised understandings and expectations about project innovation.

The analysis and synthesis of projects’ annual reports presented here represents a ‘stock-taking’ of the first year of the programme. It will inform the final summative evaluation of eLib and provide information and knowledge to those responsible for the overall direction and management of the programme, and to programme participants. An edited version of this synthesis will also be made available for wider circulation.

This report needs to be read in the context of the Policy Mapping Study [Policy Mapping Study: The Set-up, Operation and Content of the Electronic Libraries Programme, Tavistock Institute, October 1996], which sets out the starting intentions of eLib and tracks its evolving nature. Not only are the projects confronted with changing circumstances, but the programme as a whole functions within a dynamic environment, with rapid developments in technology and fast-changing user requirements.

Whilst our analysis is confined to the material included within the annual reports, it should be noted that most projects have produced other deliverables that amplify different aspects of their progress. Particularly noteworthy are the technical reports, which document the choices, decisions and changes made by projects in technical areas. This report does not deal with strictly technical matters or the technical learning that has accumulated within the programme, important though it is. We do, however, include a summary table that provides pointers to projects’ technical choices, and readers are directed to the individual projects for further information.

1.2 Structure of this Report

Projects were requested to structure their annual reports around four key areas, each with a set of open-ended questions to be addressed; the four areas and their questions are set out in the annual report format in Appendix 1.

This synthesis report follows the same framework in its overview of findings from the first year of the programme. Such is the diversity between projects, however, that it has proved necessary to present and discuss the findings at different levels of analysis. At the broadest level, we seek to synthesise findings across all the projects. A second level of analysis relates to the domain areas, each of which has a characteristic set of activities, concerns, experiences and future prospects that we report on. Finally, we comment at times on individual project findings where there are clear and strong implications for other projects or where they are noteworthy in some respect. There remains, however, a wealth of detailed material in the annual reports which is not captured in our synthesis because of its largely idiosyncratic nature. Reading this overview is therefore not a substitute for reading the individual project reports, most of which are available on the Web.

In the concluding section of this synthesis report, we offer our own reflections on what projects have reported, drawing attention to what we perceive to be the strengths and weaknesses in the development and implementation stages of projects during their first year.

1.3 Timeframe

Projects’ annual reports covered the period from the start of each project to August 1996 (when reports were due). For the large majority of projects, set up between May and September 1995, the reporting period covered the first eleven to fifteen months of the project lifecycle. A few projects funded in the initial round, however, only became operational at the end of 1995 or the start of 1996, and a further three included in the overview were second-round projects that began in April 1996. Thus the thirty-six projects are reporting on their activities, experiences and lessons for periods varying between four and sixteen months. Table 1 below summarises this information.

As well as noting this wide variation between projects in start-up dates and the phase they have reached in the lifecycle by the end of the first reporting period, account should also be taken of the varying numbers of projects in the different domain areas (see Table 2 below). Our overview is thus weighted towards the domain areas with a preponderance of projects, most of which are also the more established ones. Digitisation/Images is at the opposite extreme, with only two projects in our sample, both of which began in 1996.

Table 1: Project Timeframe Covered by Annual Reports (to August 1996)

Start date of project    Reporting period (months)    Number of projects
April 1995                         16                         1
May 1995                           15                         3
June 1995                          14                         2
July 1995                          13                         2
August 1995                        12                         7
September 1995                     11                         7
October 1995                       10                         2
November 1995                       9                         4
December 1995                       8                         4
January 1996                        7                         1
February 1996                       6                         2
March 1996                          5                         1
April 1996                          4                         1

Table 2: Numbers of projects in domain areas (included in our analysis)

Project Domain                  No. of projects
Electronic Journals                   10
Access to Network Resources            5
Electronic Document Delivery           6
On-Demand Publishing                   4
Digitisation/Images                    2
Training and Awareness                 9

1.4 Quality of the Data

In general, the annual reports display an openness about projects’ experiences of the start-up phase and the process of implementation, in the spirit of allowing others to learn as much from the difficulties and from what has not gone to plan as from what has worked well. This is in itself noteworthy, as funded projects are frequently under pressure to present themselves in the best possible light, which precludes the important sharing of lessons often painfully learned.

Notwithstanding this commendation, the reports are variable in their quality, both in the extent to which the ‘soft’ data gathered is systematically presented and in the nature of their reflections on what has been learned from their experiences so far. Those reports that reveal a depth of analysis and understanding of the underlying design process of technology innovation come, not surprisingly, from projects led by key actors experienced in the design, development and implementation of information and communication technology applications in real-world contexts. Thus, one contribution this overview can make is to identify ‘good practices’ and to distil the lessons and insights from which others might learn.

We would, however, emphasise that the eLib projects have been working for only a short period of their funded lifecycle and that, for many actors, this is their first experience of multi-partner working in an innovation context. We would expect projects to produce a different kind of report next year, not only because they will have more to report in terms of findings and results but also because of their increased familiarity with the innovation development process.
