


Implementing A Quality Assurance Methodology For Digital Library Programmes

The JISC vision for the Information Environment seeks to provide users with seamless access to quality resources distributed across a range of providers, including JISC services, the institutions themselves and commercial vendors. The vision is based on the use of open standards, which give developers and end-user institutions freedom of choice in the applications they use to develop and provide access to resources, and which are essential for interoperability. This paper outlines the work of the JISC's QA Focus advisory service, which has been developing a quality assurance methodology and support service aimed at ensuring that project deliverables are interoperable.

Background

Although there is an awareness of the importance of open standards across many institutions, particularly those involved in development work for the JISC, there has not been a culture of rigorous checking to ensure that project deliverables comply with open standards. This is due in part to the developmental culture within the higher education sector, which is supportive of self-motivation and a willingness to experiment.

This approach was probably sensible in the early days of Web development: if the eLib programme [1] had begun in the early 1990s, use of Gopher rather than the Web could well have been mandated. We would then have faced difficulties similar to those which arose when use of the OSI networking standards and Coloured Book software was mandated and institutions were discouraged from using Internet protocols.

Fortunately, however, we are now in a more stable environment: the Internet and the World Wide Web have been accepted as the killer applications for the development of a rich set of distributed network services. The underlying architectural framework for the Web has also matured, and it is widely acknowledged that XML provides the meta-format for the development of new data formats.

In light of the growing maturity of the network environment infrastructure we are now in a position to progress from the experimental phase and seek to adopt more rigorous approaches to ensuring that project deliverables are interoperable and future-proofed.

Such an approach will be necessary in order to implement the seamless access to resources which the JISC's Information Environment [2] seeks to provide. The development of self-contained Web sites (the approach taken in the late 1990s) is no longer desirable; instead, resources will need to be capable of being processed in a consistent manner by automated tools. This is a significant contrast with the development of Web pages for processing by Web browsers: an environment in which browsers were tolerant of errors.

QA Focus

In light of the growing need for more rigorous compliance with standards, the JISC funded QA Focus, initially to support its 5/99 programme [3] and later the FAIR [4] and X4L [5] programmes. The aim was to develop a quality assurance methodology which, through the deployment of appropriate quality assurance procedures, would help ensure that project deliverables were interoperable.

QA Focus was launched in January 2002. Initially it was provided by UKOLN [6] and ILRT [7], University of Bristol. However, following ILRT's decision to refocus on its core activities, in January 2003 the AHDS [8] replaced ILRT, strengthening the work through the AHDS's broad range of service experience and extensive knowledge of digitisation and service provision.

A Developmental Approach

From the start QA Focus felt the need to take a developmental approach to its work. A hardline policing approach, in which project deliverables would be closely checked for compliance with standards and, in cases of non-compliance, recommendations made to the JISC that project funding should cease, was not felt to be appropriate.

The approach taken is developmental. We seek to ensure that projects have an understanding of the importance of open standards. Although we are not in a position to advise on best ways of implementing solutions, we have developed an infrastructure which allows projects to share their approaches. We also encourage projects to share the problems they have experienced and the limitations of their solutions.

User Feedback

Prior to beginning our work it was clearly important to talk to our users, the project developers funded by the JISC 5/99 programme, in order to gain an understanding of the challenges projects faced in implementing standards-based solutions.

A questionnaire and two focus group meetings sought to gain feedback on (a) the standards framework [9]; (b) implementation issues; and (c) service deployment. The responses indicated a number of concerns about implementing the standards:

Lack of awareness of standards:
In a small number of cases there appeared to be a lack of awareness of the Standards document.
Difficulties in choosing appropriate standards:
There was more widespread concern that it could be difficult to establish which standards were applicable to projects.
Concerns over maturity of standards:
There were concerns that in some cases the standards may not be sufficiently mature for deployment.
Concerns over change control of the standards document:
There were concerns that the standards framework may change during the project lifetime.
Concerns over lack of tools:
There were concerns that in some cases tools which implement the standards may not be widely available.
Difficulties in checking compliance with standards:
There were concerns over the difficulties in ensuring that standards were being used correctly.

The feedback on implementation issues had many overlaps with the concerns listed above. The poor support for standards by some browsers, for example, was identified as a concern for many.

Although useful feedback on standards and implementation challenges was provided, it was noticeable that the issue of deployment of project deliverables into a service environment did not appear to have been given as much thought. There was an exception in the case of projects being undertaken by JISC Services themselves. In other cases, there appeared to be a feeling that deploying project deliverables was an area to be addressed by the service providers and this was not a top priority for projects themselves.

Surveying The Community

The user feedback was complemented with a number of semi-automated surveys [10] which helped us to profile the approaches taken by the projects in the provision of their Web sites. The surveys also helped us to identify common problem areas. This helped us to prioritise the areas in which advice needed to be provided.

Providing Advice

Our approach to providing advice has been to produce brief, focussed documents. Each document seeks to provide one or more of: an explanation of a standard, approaches to using it, common problems encountered with it, and approaches to checking compliance.

To date (June 2004) we have published 70 briefing papers, covering the areas of standards, digitisation, Web provision, metadata, software development and service deployment.

Sharing Best Practices

The focus groups identified the need for specific advice on the deployment of standards and on appropriate implementation frameworks. However, given the wide range of areas being addressed by projects, the different approaches they may take, and the different organisational cultures found across the institutions and organisations involved in project work, it is neither possible nor desirable to recommend a particular implementation framework.

Our approach has been to encourage the community to document how they have approached the use of standards and best practices. The case studies we have commissioned are brief, describing the issue being addressed, the solution chosen, the effectiveness of that approach, and details of lessons learnt or things that would be done differently in the future.

To date (June 2004) we have published 34 of these case studies, covering the areas of standards, digitisation, Web provision, metadata, software development and service deployment.

The QA Focus Methodology

Although the feedback on the resources we have made available has been positive, our ultimate aim has been wider than this: to develop a quality assurance (QA) infrastructure which projects can deploy in order to embed best practices within their development work.

The QA methodology we have developed is based on well-established approaches to QA which can be implemented within the technical development framework for the projects. We are advising projects that they should adopt the following framework:

Documented policies:
Projects should document their choice of standards and architectural framework.
Compliance checking:
Projects should document their approaches to ensuring that they comply with their policies.
Audit trails:
Projects should provide an audit trail which documents their compliance monitoring.

We recognise that this framework may be felt to be time-consuming to implement. In order to address such concerns, and to illustrate that the framework can be implemented in a lightweight fashion, the following examples are provided.

Policy Area: Web Standards

Policy: The QA Focus Web site is primarily based on XHTML 1.0 and CSS 2.0. Web pages should comply with these standards.

Framework: The Web site uses PHP to include HTML fragments. Part of the Web site provides access to an SQL Server database.

Simple HTML editing tools (e.g. HTML-kit) are used to create and maintain the Web site.

Exceptions: Files automatically derived from other applications (e.g. MS PowerPoint) need not comply with HTML standards until conversion tools which generate compliant HTML are readily available.

Change Control: The project manager is responsible for the policy, for ensuring it is implemented, and for any changes to it.

In order to ensure that the policies in this document are implemented, it is necessary to document the compliance testing procedures. For example:

Compliance Area: Web Standards

Compliance Testing: When pages are created or updated they should be checked for HTML compliance using the ,validate tool.
When new CSS files are created or CSS is embedded within a page, the ,cssvalidate tool should be used.
At least quarterly a survey of the Web site should be carried out using the ,rvalidate (or equivalent) tool.
W3C's Log Validator tool should be run monthly to report on the ten most-accessed pages which are not compliant.

Audit Trail: The output from the periodic bulk audits should be published.
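By way of illustration, the periodic bulk audit described above can be automated with a short script. The following Python sketch is not part of the QA Focus toolset: the page list is hypothetical, and it uses the W3C Nu validator's JSON interface (a present-day counterpart to the ,validate and ,rvalidate services discussed later in this article) to count the errors reported for each page; its output could form the published audit trail.

    # Minimal sketch of a periodic bulk HTML compliance audit.
    # Assumes the W3C Nu validator's JSON interface; the page list is
    # hypothetical and would in practice be generated from the site.
    import json
    import urllib.parse
    import urllib.request

    PAGES = [
        "http://www.ukoln.ac.uk/qa-focus/",  # hypothetical list of pages to audit
    ]

    def validate(url):
        # Ask the validator to fetch and check a page; return its messages.
        api = ("https://validator.w3.org/nu/?out=json&doc="
               + urllib.parse.quote(url, safe=""))
        request = urllib.request.Request(api, headers={"User-Agent": "qa-audit-sketch"})
        with urllib.request.urlopen(request) as response:
            return json.load(response)["messages"]

    for page in PAGES:
        errors = [m for m in validate(page) if m.get("type") == "error"]
        # One line per page; the collected output forms the audit trail.
        print(page, "-", len(errors), "error(s)")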

We hope these examples illustrate that QA procedures need not be time-consuming to develop. We also hope that the implementation of such procedures will come to be seen as a normal part of ensuring that a Web site is functioning correctly; they need not be onerous, especially if they are put in place from the start of a project.

A Matrix Approach For Standards Selection

We have outlined our recommendations on QA policies and procedures for standards and best practices. In addition, we have produced a matrix to support the selection of standards and best practices.

Although in an ideal world the richest open standards and best practices would be deployed, in reality it is often necessary to make compromises: the best choices may be difficult to implement due to lack of time, skills or resources.

There is a need to acknowledge such issues without losing sight of the underlying principle of using open standards. In order to ensure that open standards are not ignored simply because projects cannot be bothered, and that compromise solutions are adopted only for legitimate, documented reasons, we recommend a matrix approach in which the following issues are addressed:

Openness of format: Is the file format to be used open or proprietary?

Openness of proprietary format: If the file format is proprietary has the specification been published openly?

Availability of viewers: Are viewers available for free and/or as open source? Are viewers available on all relevant platforms?

Availability of authoring tools: Are authoring tools available for free and/or as open source? Are authoring tools available on all relevant platforms?

Maturity of standard: Is the standard mature or new?

Richness of standard: Is the standard rich and capable of being used to support complex applications?

Complexity of standard: Is the standard complex or relatively simple to understand and use?

Resource implications: Does the organisation have the resources necessary to make effective use of the standard?

Organisational culture: Does use of the standard reflect the organisation's culture?

Clearly, addressing such issues has a subjective element, and there may be conflicts (e.g. richness versus complexity). However, if projects address these issues at an early stage, it can help ensure that there is an awareness of the decisions made, the reasons for the decisions and their implications.
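To make the matrix concrete, the assessment for a candidate format could be recorded in a simple structured form so that the reasons survive alongside the decision. The following Python sketch is purely illustrative: the field names, the example format and the notes are invented, not part of the QA Focus recommendations.

    # Hypothetical record of a matrix assessment for one candidate format.
    # The fields mirror the criteria listed above; the values are illustrative.
    from dataclasses import dataclass

    @dataclass
    class MatrixAssessment:
        file_format: str
        open_format: bool          # openness of format
        spec_published: bool       # openness of proprietary format
        viewers_available: bool    # free/open source viewers on relevant platforms
        authoring_tools: bool      # free/open source authoring tools
        mature: bool               # maturity of standard
        rich: bool                 # richness of standard
        simple: bool               # complexity of standard
        resourced: bool            # resource implications
        fits_culture: bool         # organisational culture
        decision_notes: str        # the reasons recorded for the decision

    example = MatrixAssessment(
        file_format="SVG",
        open_format=True,
        spec_published=True,
        viewers_available=False,   # e.g. a plugin is required on some platforms
        authoring_tools=True,
        mature=False,
        rich=True,
        simple=False,
        resourced=True,
        fits_culture=True,
        decision_notes="Adopt for diagrams, with PNG fallbacks until "
                       "viewers are more widely deployed.",
    )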

The QA Focus Toolkit

In order to help projects embed a QA approach within their work, we have developed a toolkit [11] which seeks to ensure that we provide more than a static repository of documents by adding an interactive aspect to our service.

An example of the toolkit is illustrated below.

Figure 1: The QA Focus Toolkit

Testing Tools

Although the remit of QA Focus is primarily the development of a QA methodology and does not cover software development, we have addressed the issue of tools and approaches for checking compliance with standards. This work has focussed on tools which can check that Web sites comply with standards and best practices, since this is an area in which remote testing can be carried out.

We have sought to overcome the lack of integration of many Web testing tools with the publication process by describing an approach which provides authors with an interface to a range of testing services which can be accessed using the URL area of a Web browser [12]. This has been implemented by a simple change to the Apache configuration file on the UKOLN Web server, enabling HTML validation to be carried out by appending ,validate to any URL on the UKOLN Web site.
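A minimal sketch of the kind of Apache mod_rewrite rule involved is shown below. The exact configuration used on the UKOLN server is not reproduced here; the host name and the choice of validator address are illustrative.

    # Illustrative mod_rewrite rule: a request for any page with ",validate"
    # appended is redirected to the W3C HTML validator, which is passed the
    # address of the original page.
    RewriteEngine On
    RewriteRule ^(.*),validate$ http://validator.w3.org/check?uri=http://www.ukoln.ac.uk$1 [R=302,L]

Equivalent rules can be defined for other suffixes, such as ,cssvalidate for CSS validation, each pointing at the appropriate checking service.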

In addition to documenting this approach, we have also highlighted the limitations of commercial tools. For example, some link-checking tools fail to check links to external resources such as JavaScript or CSS files; some link checkers and HTML validation tools cannot process resources which make use of features such as frames, redirects or personalised interfaces. There is a danger that use of such tools could give the impression that a Web site is compliant when this is not the case.
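To illustrate what checking such external resources involves, the following Python sketch collects stylesheet and script references as well as ordinary links, and reports any that fail to resolve. It is illustrative only, omitting the redirect handling, frames support and rate limiting a production tool would need; the starting page is hypothetical.

    # Minimal link-check sketch covering <a href>, <link href> and <script src>.
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    import urllib.request

    class RefCollector(HTMLParser):
        # Collect link targets, including CSS and JavaScript references.
        def __init__(self):
            super().__init__()
            self.refs = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag in ("a", "link") and attrs.get("href"):
                self.refs.append(attrs["href"])
            elif tag == "script" and attrs.get("src"):
                self.refs.append(attrs["src"])

    def check(page_url):
        with urllib.request.urlopen(page_url) as response:
            collector = RefCollector()
            collector.feed(response.read().decode("utf-8", errors="replace"))
        for ref in collector.refs:
            target = urljoin(page_url, ref)  # resolve relative references
            try:
                urllib.request.urlopen(urllib.request.Request(target, method="HEAD"))
            except Exception as exc:
                print("BROKEN", target, "-", exc)

    check("http://www.ukoln.ac.uk/qa-focus/")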

Service Deployment

The main purpose of quality assurance procedures is to ensure that project deliverables can be deployed easily in a service environment, are future-proofed against new developments and can be accessed in a wide range of environments.

We have been working with JISC services to ensure that the development community gains an awareness of the challenges which services face in taking on and deploying project deliverables. It appears not to be widely appreciated that even if projects comply fully with standards and best practices, there may still be difficulties in deploying the deliverables: for example, if a project makes use of a specialist content management system, it may be resource-intensive to deploy this application within a service environment. Use of open source software does not necessarily overcome such barriers, as there is still a potential learning curve to be overcome.

As well as deployment by JISC services, there are a number of other environments in which project deliverables may be deployed: within institutions, for example, as locally managed services or desktop applications; reports may need to be archived by a records management system; and learning objects may be deposited in a repository. The recipients need to consider issues such as security and performance implications, legal issues, resource implications and the relevance to the institution.

As well as the deployment of project deliverables there are also long term preservation and records management issues which need to be addressed. It has been observed that project Web sites funded under eLib and the EU's Telematics For Libraries programme have disappeared shortly after funding has finished [13] [14]. This is an area of relevance to QA Focus. We have provided a number of recommendations on the availability of project Web sites after funding finishes [15] and provided a case study illustrating various procedures which can be used prior to 'mothballing' a project Web site [16].

Acceptance Within The Wider Community

It is clearly desirable that the QA methodology outlined in this article is embedded within the working practices of the organisations involved in JISC project work. In order to help gain wider acceptance we have sought to disseminate our work across institutions. The main focus for this has been a workshop session at the Institutional Web Management Workshop 2003 [17], although a number of other seminars have also been given.

Gaining International Acceptance

We have sought international recognition of the approaches to QA outlined in this article, and we are pleased to report that papers have been accepted at several peer-reviewed international conferences. A description of the QA Focus work was given at the EUNIS 2003 conference in a paper on "Developing A Quality Culture For Digital Library Programmes" [18]; the approach to the selection of standards was described at the ichim03 conference in a paper on "Ideology Or Pragmatism? Open Standards And Cultural Heritage Web Sites" [19]; the deployment of the QA Focus methodology was described at the IADIS 2003 conference in a paper on "Deployment Of Quality Assurance Procedures For Digital Library Programmes" [20]; and a paper on "Interoperability Across Digital Library Programmes? We Must Have QA!" [21] will be presented at the ECDL 2004 conference to be held at the University of Bath in September 2004.

What Next?

QA Focus has developed a repository of support materials which can help projects ensure that their deliverables are compliant with standards and best practices. More importantly, we have developed a QA methodology which we feel can be deployed by projects without imposing too onerous a burden on them.

The JISC is looking to integrate aspects of the QA Focus work into the JISC Technical Standards Framework. The QA Focus outputs continue to be of relevance to ensuring the quality and interoperability of digital resources.

References

  1. eLib, UKOLN,
    <http://www.ukoln.ac.uk/services/elib/>
  2. Information Environment, JISC,
    <http://www.jisc.ac.uk/index.cfm?name=about_info_env>
  3. 5/99 Learning and Teaching Programme, JISC,
    <http://www.jisc.ac.uk/index.cfm?name=programme_learning_teaching>
  4. Facilitating Access to Institutional Resources, JISC,
    <http://www.jisc.ac.uk/index.cfm?name=programme_fair>
  5. Exchange for Learning, JISC,
    <http://www.jisc.ac.uk/index.cfm?name=programme_x4l>
  6. UKOLN,
    <http://www.ukoln.ac.uk/>
  7. Institute for Learning & Research Technology (ILRT),
    <http://www.ilrt.bris.ac.uk/>
  8. Arts and Humanities Data Service (AHDS),
    <http://www.ahds.ac.uk/>
  9. Standards and Guidelines to Build a National Resource, JISC,
    <http://www.jisc.ac.uk/index.cfm?name=projman_standards>
  10. QA Focus Surveys, QA Focus, UKOLN,
    <http://www.ukoln.ac.uk/qa-focus/surveys/>
  11. Self Assessment Toolkit, QA Focus, UKOLN,
    <http://www.ukoln.ac.uk/qa-focus/toolkit/>
  12. A Proposal For Consistent URIs For Checking Compliance With Web Standards, B. Kelly, IADIS Internet/WWW 2003 Conference,
    <http://www.ukoln.ac.uk/qa-focus/documents/papers/iadis-2003/poster/>
  13. WebWatching eLib Project Web Sites, Ariadne issue 26, 2001,
    <http://www.ariadne.ac.uk/issue26/web-watch/>
  14. URLs for Telematics for Libraries Project Page, Exploit Interactive issue 1, 1999,
    <http://www.exploit-lib.org/issue1/urls/>
  15. Mothballing Your Web Site, QA Focus,
    <http://www.ukoln.ac.uk/qa-focus/documents/briefings/briefing-04/>
  16. Providing Access to an EU-funded Project Web Site after Completion of Funding, QA Focus,
    <http://www.ukoln.ac.uk/qa-focus/documents/case-studies/case-study-17/>
  17. Catching Mistakes: Using QA to Address Problems on your Web Site,
    <http://www.ukoln.ac.uk/web-focus/events/workshops/webmaster-2003/sessions/#workshop-14>
  18. Developing A Quality Culture For Digital Library Programmes, B. Kelly, M. Guy and H. James, EUNIS 2003 Conference Proceedings,
    <http://www.ukoln.ac.uk/qa-focus/documents/papers/eunis-2003/>
  19. Ideology Or Pragmatism? Open Standards And Cultural Heritage Web Sites, B. Kelly, M. Guy, A. Dunning and L. Phipps, ichim03 Conference Proceedings,
    <http://www.ukoln.ac.uk/qa-focus/documents/papers/ichim03/>
  20. Deployment Of Quality Assurance Procedures For Digital Library Programmes, B. Kelly, A. Dawson and A. Williamson, IADIS Internet/WWW 2003 Conference,
    <http://www.ukoln.ac.uk/qa-focus/documents/papers/iadis-2003/paper/>
  21. Interoperability Across Digital Library Programmes? We Must Have QA!, B. Kelly, ECDL 2004 Conference,
    <http://www.ukoln.ac.uk/qa-focus/documents/papers/ecdl-2004/>

Contact Details

Brian Kelly
UKOLN
University of Bath
BATH
BA2 7AY
Tel: 01225 383943
Email: B.Kelly@ukoln.ac.uk