Evaluation and Impact assessment for NOF digitise projects

Peter Brophy and Susi Woodhouse

Introduction

The purpose of this section is to outline approaches to the measurement of the impact of resources created as part of the NOF-digitise programme. The section explores the concept of ‘impact’ and examines different approaches that may be taken to assessing it. Different stakeholder perspectives are considered, in particular the differences between demonstrating impact to politicians and using impact measures to manage services. There is a particular focus on the impact of nof-digi on learning, reflecting the overall objective of the programme.

NOF and the evaluation of the nof-digitise programme

The Fund is taking a three-fold approach to the evaluation and impact assessment of the nof-digitise programme, viz:

At the same time, the Fund recognises that individual projects will wish to develop their own evaluation and impact assessment plans in order to inform future development and sustainability.

What do we mean by ‘impact’?

Impact can be defined in different ways, but in this context it may be thought of as any effect of nof-digi resources on an individual or group. That effect may be positive, negative or neutral.

A good way to approach the concept is to think of ‘levels’ of impact which might, for example, be:

· Hostility: a user may be so disappointed that he or she decides that it is a total waste of time and money. Perhaps the result is a letter of condemnation to an influential third party such as a councillor. Hopefully, such impacts are very rare.

· Dismissive: the user is not actively hostile, but simply feels that the project is not worthwhile and not worth the personal effort of getting involved, even if he or she makes no attempt to undermine it.

· None: the user has neither positive nor negative feelings or views. It is almost as if the project didn’t exist.

· Awareness raised: here the project has had a positive impact, but only in the sense that the user has been made aware of something of which he or she was not previously aware. They know the project exists, do not dismiss it out of hand and might turn to it in the future if they feel a need.

· Better Informed: as a result of using project resources, the user has better information than before. This information may have been memorised or recorded for future use.

· Improved knowledge: the information obtained has been considered and the user is now more knowledgeable about the subject.

· Changed perception: the knowledge gained has resulted in a change to the way that the user looks at a subject. Real learning has taken place.

· Changed world view: here the user has been transformed through the use of project materials. His or her view of the world has shifted significantly, and constructive learning has taken place which will have long term effects.

· Changed action: the new world view has led to the user acting in a way he or she would not have done before. Learning has turned into action, so that the encounter with the project has changed not just that user, but - in some way - how they interact with the broader world.

Although nof-digi projects can aim to produce particular levels of impact, they need to be aware that what is achieved may very well be different from what was intended.

What impact is being sought?

Impacts do not, of course, just happen. They occur as a result of exposure to an influence, in this case to nof-digi project materials. Clearly, the project needs to be planned so as to achieve impacts on the identified target audience(s) (not just, for example, to expend funding!). One possible issue for nof-digi projects is that many have several aims and objectives, a diverse target audience, and content which overlaps with other available digital resources. It is all too easy, in these circumstances, to find that the impact of an individual project is unclear because the user may be working with a range of electronic resources. However, there are a number of aspects of a nof-digi project which should be aiming to create impact. Here are some examples of impacts for different elements of a nof-digi project:

The important thing is that there is absolute clarity as to what the objectives of a project are before any attempt is made to assess performance. Otherwise there is a real danger that the results of an assessment will not aid decision making at any level.

Why measure impact?

Before turning to the question of how impacts can be measured, it is worth asking why we should undertake impact measurement in the first place. There are a number of different reasons, and it is very important to be clear which apply in a particular case. Thus the aim of impact measurement might be:

The approach which is taken to impact measurement should reflect the purpose(s) for which the information is needed.

Where does impact measurement fit into the broader picture?

It is important to be clear about how impact assessment fits into the broader picture. One way to think of the place of impact is to consider it within the often-used systems model of organisations, in which inputs are transformed by processes into outputs, which in turn lead to outcomes and, ultimately, impacts.

This is transferable into the context of a nof-digi project where, for example:

Impact and Learning

The driving force behind the nof-digitise programme is to provide material that supports learning in its broadest sense. It is therefore useful to look at what learning impacts might be and how we might measure them. There is an enormous literature on this subject and many expert educational researchers have devoted time to it, so what can be said here can be no more than a very brief introduction.

It is useful to appreciate two different types of learning: ‘surface’ and ‘deep’.

· Surface learning is characterised by short-term memorising of facts, as they are presented and usually in order to re-present them for a specific purpose. Material may be skimmed but is unquestioned. There is little long-term impact.

· Deep learning occurs where the learner becomes involved with the subject, questions the basic premises and any conclusions which are presented and tries to relate new knowledge to what is already known. As a result the learner changes and the impact is long-term.

One explanation of the learning process suggests that successful learning is usually approached through five key stages (Garrison, 1992), and nof-digi project materials could usefully be deployed to reflect these - particularly project learning packages:

· Problem identification: a triggering event arouses interest.

· Problem definition: the subject is clarified; ways of exploring it are identified; links are made to personal experience.

· Problem exploration: new approaches and new solutions are explored; issues are understood; ideas are disentangled.

· Problem applicability: solutions are judged; practical knowledge is assessed.

· Problem integration: solutions are acted upon; ideas are applied.

Projects can be influential in a variety of different ways in supporting effective learning. For example, projects might design packages that provoke ‘trigger events’, arousing their users’ curiosity - at a simple level a selection of images or sound clips on a topical subject might qualify. As learning progresses, users will need different resources and support - problem exploration, for example, might need in-depth materials taking a multi-media approach. It is also useful to remember that different users will have different learning styles, so that alternative ways of delivering learning packages may be appropriate.

Impact and other policy agendas

There are other policy agendas apart from lifelong learning to which nof-digi projects will contribute. They include:

As with learning, measuring impact is a complex issue which requires considerable contextual understanding and appreciation. It is worth stressing, however, that in all cases the key matter is the impact of the resource on its intended users.

Impact on project staff

So far, most of what has been said relates to the impact of the project on users, but the management and delivery of nof-digi projects clearly impacts significantly on the staff and institutions involved. Aspects which will need to be understood include:

· organisational change: the effects of delivering and sustaining electronic content on organisations are profound as roles change and new responsibilities and relationships are developed. Impacts on staff need to be understood and managed, as do those on the organisation as a whole.

· the challenge of cooperation across organisational boundaries and the need for staff to coordinate their work with that of others employed elsewhere. NOF-digi consortia are exploring new ways of working which will provide many valuable lessons and models for the future.

· the need for skills to be continuously updated and deployed. Technologies change, standards develop and staff need to be in a position to use appropriate developments in the course of their work.

Techniques for Assessing Impact

It will be clear from the above discussion that when we attempt to assess impact we are nearly always forced to use surrogate measures. There are so many variables involved when impacts take place that systematic studies which focus on one aspect have to develop and use indicators which merely suggest whether or not impacts have resulted. For example, individual case studies are valuable, but it will never be feasible to interview every user in depth and then to measure precisely the positive and negative impacts. Bearing in mind the comments above about the role of input, process, output, outcome and impact measures, the following techniques may be useful.

Quantitative techniques

Even a crude input measure, such as resources expended, could suggest that - if we could assume that all else was equal - the impact of one project was likely to be greater than that of another. But such crude indicators are unlikely to be acceptable to most stakeholders these days. Nevertheless, collecting, analysing and presenting numerical data is an important part of assessing any service. For a nof-digi project, one of the major sources of quantitative data will be web server logs. These record every page that is requested by a user (or, more precisely, by a workstation address - which is not quite the same thing). There is a wide variety of software available to analyse web logs, including Webalizer ( http://www.mrunix.net/webalizer ), WebTrends ( http://www.webtrends.com ) and Analog (http://www.analog.cx) - click on ‘See a sample report’, which leads to ‘Web Server Statistics for the Statistical Laboratory’, or click on ‘Even prettier reports’ and note the differences in presentation, especially the difference that full-colour graphics can make.

Before making use of web log statistics, though, it is important to be very aware of their limitations, and to remember that they are only a part of the armoury. Two useful descriptions of how the web works and of the pitfalls of web statistics are at http://www.statslab.cam.ac.uk/~sret1/analog/docs/webworks.htm and http://www.goldmark.org/netrants/webstats/ - this last one is called Why Web Statistics Are (Worse Than) Useless! Presentations on the collection, analysis and presentation of statistical data for web-based resources are available via the UKOLN NOF-digitise support pages at http://www.ukoln.ac.uk/nof/support/workshops/evaluation-2002/
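To give a flavour of what such analysis involves, the short sketch below (written in Python purely for illustration; the log format, the file name access.log and the page-filtering rules are assumptions for the example, not anything required by the programme) counts successful page requests and distinct workstation addresses from a server log in the common log format. In practice a dedicated package such as Webalizer or Analog will do this far more thoroughly, and the caveat above still applies: a workstation address is not the same thing as a user.

# Minimal sketch: counting page requests and distinct client addresses from a
# web server log in the Common Log Format. Illustrative only - tools such as
# Webalizer or Analog produce far richer reports.
import re
from collections import Counter

# A typical Common Log Format line looks like:
# 192.168.0.1 - - [10/Oct/2002:13:55:36 +0100] "GET /collection/index.html HTTP/1.0" 200 2326
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)[^"]*" (\d{3}) (\S+)')

def summarise(log_path):
    pages = Counter()   # successful requests per page
    clients = set()     # distinct workstation addresses (not the same as users)
    with open(log_path) as log:
        for line in log:
            match = LOG_LINE.match(line)
            if not match:
                continue  # skip malformed lines
            address, _when, method, path, status, _size = match.groups()
            # Count only successful page requests, ignoring images, stylesheets etc.
            if method == "GET" and status == "200" and path.endswith((".html", "/")):
                pages[path] += 1
                clients.add(address)
    return pages, clients

if __name__ == "__main__":
    pages, clients = summarise("access.log")  # "access.log" is a hypothetical file name
    print(sum(pages.values()), "page requests from", len(clients), "addresses")
    for path, count in pages.most_common(10):
        print(count, path)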

Some more considered approaches to measuring the extent and impact of IT-based services are starting to emerge. In Europe the EQUINOX project (http://equinox.dcu.ie/) recommended a set of performance indicators for electronic libraries, while in the United States the Association of Research Libraries (ARL) has launched a ‘New Measures Initiative’ (http://www.arl.org/stats/newmeas/newmeas.html) - the report by Miller and Schmidt (http://www.arl.org/stats/newmeas/emetrics/miller-schmidt.pdf) is particularly useful.

Qualitative Techniques

When attempting to measure impact it will usually be found that qualitative techniques are most useful. Observation of user behaviour, case studies, interviews, focus groups and similar techniques help to provide a ‘rich picture’ of what is happening as a result of service delivery.

Firstly, it is very important to bear in mind the intended audience of the investigation. How you would conduct an interview with a user who has undertaken a nof-digi learning journey would depend on why you were investigating this particular case. If, for example, you want to contribute a report to the national debate on social inclusion, the questions you ask would be rather different from those you would use if the intention was to explore the details of how well the web service had been provided. The results of such interviews could usefully be turned into scenarios demonstrating the impact of a nof-digi project.

Broadly, there are two main approaches. Sometimes we measure the views of users directly – by asking them questions and encouraging them to talk – and sometimes we find indirect ways to measure, for example by observing their behaviour or inviting them to complete a web-based survey. Bear in mind that, whatever approach we take, qualitative assessment cannot be divorced from the experience of the users. For that reason, user satisfaction surveys are often prominent, though it is worth remembering that you will often need to dig much deeper than simple satisfied/dissatisfied answers and give users a context within which to provide their responses.
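As a purely illustrative sketch of handling such survey data (the questions, the five-point scale and the sample responses below are invented for the example, not drawn from the programme), ratings can be tabulated question by question while the free-text comments are kept alongside them, so that the figures retain some of the context discussed above:

# Minimal sketch: summarising a web-based satisfaction survey. The questions,
# scale and sample responses are hypothetical; a real survey should also capture
# the context within which users give their answers.
from collections import Counter
from statistics import mean

SCALE = {1: "very dissatisfied", 2: "dissatisfied", 3: "neither",
         4: "satisfied", 5: "very satisfied"}

# Each response holds a rating per question plus an optional free-text comment.
responses = [
    {"ease_of_use": 4, "usefulness_for_learning": 5,
     "comment": "Used the image packs with a local history group"},
    {"ease_of_use": 2, "usefulness_for_learning": 3,
     "comment": "Could not find material on my own town"},
    {"ease_of_use": 5, "usefulness_for_learning": 4, "comment": ""},
]

def summarise(question):
    ratings = [r[question] for r in responses if question in r]
    spread = Counter(ratings)
    print(question, "- mean", round(mean(ratings), 1), "from", len(ratings), "responses")
    for score in sorted(SCALE):
        print("  ", SCALE[score], ":", spread.get(score, 0))

for question in ("ease_of_use", "usefulness_for_learning"):
    summarise(question)

# Keep the free-text comments for qualitative follow-up, e.g. coding into themes.
comments = [r["comment"] for r in responses if r["comment"]]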

Among the techniques which will be found valuable are:

There is a wealth of sources which offer more detail on these techniques, including the Tavistock Institute evaluation guidelines and studies produced for the eLib programme (see below). A very good guide, albeit with an American slant, is provided by Hernon and Altman (1998).

As with quantitative approaches, there is useful work going on in relation to assessing the use of and impact of electronic library services. The EQUINOX project (see above) covered some of this ground, while in the USA the LibQUAL+ Project (http://www.libqual.org/) is developing tools for measuring library user perceptions and experiences.

Some Studies of Impacts of Digital Content

eLib

The Joint Information Systems Committee (JISC) established the Electronic Libraries programme (eLib) in 1995 to engage the HE community in developing and shaping the implementation of the electronic library. A range of different aspects was explored across the three phases, through over 70 projects and supporting studies with funding of £19 million. Evaluation and impact assessment was regarded as a critical element of the programme. The Tavistock Institute undertook a number of evaluation studies into projects, programme areas and the programme as a whole, as well as organising and participating in evaluation workshops and similar events. Their papers can be found at http://www.ukoln.ac.uk/services/elib/papers/tavistock/ and http://www.ukoln.ac.uk/services/elib/papers/other/intro.html#elib-evaluation, together with an evaluation of phase three of the programme by ESYS. Guidelines for eLib Project Evaluation, also prepared by the Tavistock Institute, are available at: http://www.ukoln.ac.uk/services/elib/papers/tavistock/evaluation-guide/intro.html

Stories from the Web

http://www.storiesfromtheweb.org/

This is an ongoing project which is integrating ICT and traditional service delivery methods to encourage children to read and to explore their creative talents in writing stories and poetry.

A report of the evaluation and impact assessment approach can be found in: Everall, A. et al. Stories from the Web: a project investigating the use of the Internet in children's libraries to stimulate creative reading among children aged 8-11 (LIC research report 77). London: Library & Information Commission.

DCMS ITChallenge Fund for museums

The Fund provided £500,000 over two years to 11 projects. Each project was developed in collaboration between three or more partners. The aim of the Fund was to encourage innovation in the application of new technologies to museums, and the documentation and evaluation provide a fascinating and valuable insight for anyone considering how ICT can be applied in their museum.

A key purpose of the IT Challenge Fund was to encourage and enable project managers to gain new skills and experience in the application of ICT to delivering museum services. Now that the project websites are up and running, a summative evaluation project has been conducted in order to capture these skills and experiences and use them to inform guidance for others. The initial results of the project are presented, as a work in progress, at:

http://www.peoplesnetwork.gov.uk/content/eval.asp