Evaluation of the Electronic Libraries Programme
Synthesis of Annual Reports



4. EVALUATION FINDINGS

Projects were asked to report on the findings that are emerging from their evaluation activities, commenting in particular on any general outcomes, effects and impacts. Whilst it was recognised that the focus of most evaluative activity during the first year was likely to be formative, some projects were nonetheless likely to have interim evaluation findings of a more summative nature to report.

Most projects reported on the formative aspect of evaluation in describing their activities during the first year and their experience of implementation. Evaluation was commonly undertaken as an integral part of the development process, entailing a variety of methods for the collection of systematic, structured feedback. The lessons learned from ongoing formative evaluation of this kind have been described above in Section 2: Learning from the Process of Implementation.

In this section, we focus mainly on what projects had to say about outcomes, effects and impacts - whether there are early signs that projects are contributing to the wider programme goals of mobilisation, cultural change, cost-effectiveness and sustainability.

We are also interested in the findings from projects about demand for eLib products and services, their perceived usefulness and aspects of their performance.

Many projects were not in a position to report on evaluation findings at all, either because of delays in project start-up, because unfamiliarity with evaluation had led them to defer it to a later phase of the project lifecycle, or because they had only just begun to collect data or had not yet analysed it. Several of these commented on the need to re-assess their evaluation plans in the light of the Tavistock workshops on eLib evaluation, or mentioned that they were currently developing their plan to take account of the Guidelines and workshops.

The projects that were best placed to report on interim evaluation findings were those that were already piloting a prototype or demo in early field trials or were implementing a well-developed pilot (whether of a training module, software application, course pack, etc.). Such projects were most commonly found in the On Demand Publishing, Access to Network Resources and Training and Awareness domains.

Electronic Journals

Projects in the Electronic Journals domain had little to report by way of interim evaluation results. A common set of responses was that they were still developing their evaluation plans, that they had not yet begun any systematic data collection, or that it was too early for results. Others among the EJ projects had gathered monitoring data on their performance indicators, collected data on usage levels and patterns that were indicative of likely demand and perceived usefulness, or were using a variety of techniques aimed at understanding users’ behaviour, including disincentives to interaction.

Access to Network Resources

The Access to Network Resources projects were further advanced in the development of a pilot service or demo which could be used to elicit systematic feedback from users. The range of methods in use included on-line questionnaires, usage logs, observations of use, surveys of users, informal feedback from users in training workshops, and focus group discussions. Most projects have captured usage data from computer log files, although some reports acknowledge that these data can be very misleading and need to be filtered to provide a more accurate picture of usage levels; the evaluator of one project urges that many claims about levels of access be treated with caution. Several projects reported disappointment with the response rates to on-line questionnaires, indicating a need to combine this method with other forms of structured feedback.
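
The need to filter raw log data before drawing conclusions about usage can be illustrated with a short sketch. The code below is ours rather than any project's: it assumes a web server log in the combined log format and invents the robot signatures, internal network prefix and page suffixes purely for illustration. The point is that raw request totals overstate use until robots, internal testing traffic, errors and embedded page elements (images, etc.) are excluded.

    # Illustrative sketch only: filter a web server access log (combined log
    # format assumed) before counting usage; raw totals overstate real use.
    import re

    LOG_PATTERN = re.compile(
        r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
        r'"[^"]*" "(?P<agent>[^"]*)"'
    )

    ROBOT_AGENTS = ("crawler", "spider", "robot")   # assumed robot signatures
    INTERNAL_PREFIX = "192.168."                    # assumed internal network
    PAGE_SUFFIXES = (".html", ".htm", "/")          # count pages, not images

    def count_page_uses(lines):
        """Return the number of successful page requests from external users."""
        uses = 0
        for line in lines:
            m = LOG_PATTERN.match(line)
            if not m:
                continue
            if m.group("host").startswith(INTERNAL_PREFIX):
                continue  # internal traffic, e.g. project staff testing
            if any(sig in m.group("agent").lower() for sig in ROBOT_AGENTS):
                continue  # indexing robots inflate raw totals
            if m.group("status") != "200":
                continue  # errors and redirects are not uses
            if m.group("path").endswith(PAGE_SUFFIXES):
                uses += 1
        return uses

    if __name__ == "__main__":
        sample = [
            '158.125.1.1 - - [24/Nov/1998:14:21:03 +0000] "GET /gateway/ HTTP/1.0" 200 1042 "-" "Mozilla/4.0"',
            '192.168.0.5 - - [24/Nov/1998:14:22:11 +0000] "GET /gateway/ HTTP/1.0" 200 1042 "-" "Mozilla/4.0"',
            '10.0.0.9 - - [24/Nov/1998:14:23:42 +0000] "GET /gateway/logo.gif HTTP/1.0" 200 512 "-" "WebCrawler/3.0"',
        ]
        print(count_page_uses(sample))  # -> 1 (internal and robot hits excluded)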

The data gathered by these ANR projects addressed feedback on the user interface, users’ experiences of using the service, usage data and exploratory questions about evolving information search strategies. The feedback from users on their initial impressions of the pilot service, and on their behaviour in using it, was used primarily for iterative development of the full service. In some instances, though, the formative feedback has contributed to shaping the ongoing evaluation agenda relating to the longer term issues of sustainability, effectiveness and usefulness. In one project, for example, it has become apparent that a more detailed study is needed of the exact nature of the information that users are looking for, how they intend to use that information and what methods they are currently using to discover it. Another project is already working in close association with a cognitive researcher who is investigating users’ information seeking and information use behaviour and the cognitive processes involved in deciding whether or not to use information retrieved from the Internet. A preliminary finding is that users have developed a range of criteria for assessing the quality of sources, falling primarily into the areas of authority, accuracy, currency and presentation.

Beyond a common finding that most users are pleased with the services being provided and would wish for more extensive coverage, the feedback from users tends to relate to the specific design features or subject base of the gateway. The project reports provide various examples of interim evaluation findings that are being used to develop or refine the pilot service. For example, in one project it has become apparent that evaluating network resources is not yet a staple part of respondents’ daily work, even though most were information professionals. The project’s response has been to include a session on evaluation and selection strategies for networked information resources in the training workshops. In another, usage data suggest that users are employing relatively unsophisticated search techniques. The response in this case has been a project decision to provide more documentation and support as part of the service (e.g. a user guide and an email help desk).

Two projects whose target groups included practising professionals in addition to a university constituency were surprised, in one case by the lack of computer awareness among this practitioner sector, and in the other by the finding that this sub-group of their target audience had not yet found the service useful in a practical way.

Digitisation

The Digitisation projects reported on their evaluation methods, which have included feedback on an initial prototype, a questionnaire on image quality, and a walk-through of the prototype by steering committee members - but none has yet reported findings from these initial evaluation activities.

On Demand Publishing

In the On Demand Publishing domain, interesting evaluation findings are emerging from the field trials of the coursepacks and materials being developed and introduced into teaching environments. Drawing on a variety of methods, including questionnaires, interview schedules and observations as well as data on take-up and usage patterns, the annual reports of the various projects present interim findings that deal mainly with the barriers to successful implementation and the potential for diffusion of ODP.

Such barriers include resistance by academic staff to changes in the learning process, low levels of IT literacy in the target group, and cost factors. The high level of resistance encountered in one project was attributed to the time involved in preparing learning materials and the need to think one semester ahead whilst coping with the increased demands of managing the current semester. In other words, appropriate work processes and organisational arrangements are not yet in place to support these new modes of electronic delivery. A second project reported a disappointingly low level of take-up of the courseware packs produced, a finding which reflected the inadequate integration of IT materials into teaching: the IT course was the only module in which students had to engage with IT. Furthermore, the staff member responsible for teaching this particular course had not been involved in the development of the course materials and hence had no stake in the project. The challenge of the cultural change dimension to ODP was also highlighted by the poor attendance of academic staff at ODP awareness-raising seminars organised by the project.

The sustainability of ODP also depends on the cost structure of ODP coursepacks. The ODP projects are modelling the various price factors that are likely to affect take-up of ODP materials adversely, and interim evaluation findings in two or three projects seriously question their cost-effectiveness. Field testing of coursepacks in one institutional setting has identified three reasons why these materials have not proved to be cost-effective as per the original project plan (a simple illustrative calculation follows the list):
  • the difficulties of obtaining copyright permission and the high cost of copyright for materials to be included in the pack
  • the low demand for course packs from courses when set against the high costs of technology and maintenance
  • the unwillingness of students to pay for complete course packs when they might just photocopy one or two vital pages from someone else’s copy or the original text
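
These cost pressures can be made concrete with a toy break-even calculation. The sketch below is ours, not any project's actual model, and every figure in it is invented for illustration; it simply shows how high per-copy copyright fees and fixed technology and maintenance costs, divided over low demand, push the per-pack price beyond what students are willing to pay.

    # Illustrative sketch only: a toy break-even model for an ODP coursepack.
    # All figures are invented; real projects would substitute their own
    # copyright fees, production costs and demand estimates.

    def break_even_price(copyright_fee_per_copy, production_cost_per_copy,
                         fixed_costs, expected_copies):
        """Price per copy at which revenue just covers all costs."""
        variable = copyright_fee_per_copy + production_cost_per_copy
        return variable + fixed_costs / expected_copies

    price = break_even_price(copyright_fee_per_copy=4.50,   # assumed fee
                             production_cost_per_copy=2.00, # assumed cost
                             fixed_costs=3000.0,            # technology upkeep
                             expected_copies=60)            # low demand
    print(f"Break-even price per pack: {price:.2f}")        # -> 56.50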

Another ODP project has encountered a perception among students that coursepacks represent a displacement of library costs directly onto students, hence a certain resistance to take-up.

Electronic Document Delivery

The Electronic Document Delivery projects are, with one exception, not in a position to report on evaluation results. Several projects are either re-working their evaluation plan or do not yet have one in place; another project has undertaken a user survey but not yet analysed the results.

The one project which does have interim evaluation findings to report has put in place a monitoring system that gathers detailed transaction statistics, based on totals for incoming requests and totals for outgoing requests in terms of requests satisfied and unsatisfied. The statistical returns are collated at weekly intervals at each site, and a monthly bulletin provides a clear picture of demand flows, ‘market’ shares, levels of throughput, service growth and so on. The transaction statistics provide feedback on the overall satisfaction rate of the service as well as the reasons for non-supply, but the project acknowledges that more detailed work is required on the economics of cost-recovery or profitability in the shift from support funding to commercial cost-recovery. Other interim evaluation findings from this project concern the organisational arrangements, staffing patterns and work processes that need to be in place for the system to function adequately.
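
The kind of collation described here is straightforward to sketch. The code below is our illustration, not the project's actual system: it aggregates weekly per-site transaction returns into a monthly summary showing demand, ‘market’ share and satisfaction rate. The record layout and site names are assumptions.

    # Illustrative sketch only: collating weekly document-delivery transaction
    # returns into a monthly bulletin. Record layout and sites are assumed.
    from collections import defaultdict

    # One record per site per week: (site, incoming, satisfied, unsatisfied)
    weekly_returns = [
        ("Site A", 120, 110, 10),
        ("Site B",  80,  60, 20),
        ("Site A", 140, 126, 14),
        ("Site B",  90,  72, 18),
    ]

    def monthly_bulletin(returns):
        """Summarise demand, market share and satisfaction rate per site."""
        totals = defaultdict(lambda: {"incoming": 0, "satisfied": 0})
        for site, incoming, satisfied, unsatisfied in returns:
            totals[site]["incoming"] += incoming
            totals[site]["satisfied"] += satisfied
        grand_total = sum(t["incoming"] for t in totals.values())
        for site, t in sorted(totals.items()):
            share = 100.0 * t["incoming"] / grand_total
            rate = 100.0 * t["satisfied"] / t["incoming"]
            print(f"{site}: {t['incoming']} requests "
                  f"({share:.0f}% share), {rate:.0f}% satisfied")

    monthly_bulletin(weekly_returns)
    # Site A: 260 requests (60% share), 91% satisfied
    # Site B: 170 requests (40% share), 78% satisfied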

Training and Awareness

The various T&A projects have been monitoring their workshops and other events as part of a process of continuous improvement, or have plans to do so. The methods used for this purpose include surveys, focus group discussions and feedback gathered at the workshops themselves.

Participants have been generally favourable in the feedback they have provided on the relevance and effectiveness of the training materials and of the individual courses and events that have been organised. The longer term issues of the impact and cost-effectiveness of training will, however, be addressed by the IMPEL 2 project as part of its wider brief to consider the cultural change dimension.

Reference has already been made above to the interim evaluation findings of one T&A project which cast some doubt on the viability of its cascade model for diffusion. Survey data reveal that there will be problems in establishing formal accredited teaching qualifications in the majority of the institutions linked with the project in its present phase. The evaluation has identified the various factors inhibiting likely diffusion and goes on to propose remedial options and strategies open to the project.
