Selection criteria for quality controlled information gateways
Work Package 3 of Telematics for Research project DESIRE (RE 1004)

Appendix IV: Selection criteria of selective subject gateways

Introduction

The following are short accounts of the selection criteria used by the contacted organisations, based on information supplied and on published information.

1. ADAM

A draft document entitled Criteria for including resources was published in April 1996 (Bradshaw 1996a) and describes the criteria and methods used to determine the suitability of resources for inclusion in the ADAM database. The full guidelines were published later the same year (Bradshaw 1996b).

An important part of ADAM's selection process is that certain items are eliminated before the quality evaluation process begins. This includes resources that do not contain any unique information (e.g. only provide links to other resources), resources that have been created by individuals for personal use and those resources which are out-of-date, defunct, inaccurate or superseded.

ADAM's evaluation criteria are divided into three main areas:

Information content

To what level of detail does the resource go? How superficial or exhaustive is it?

Does the resource contain sufficient basic information, i.e., in a WWW document, contact details, last update details, etc.?

Is the information presented accurately?

Is the information composed well? Is the information within a resource phrased unambiguously?

Is the resource authoritative? Who is responsible for the resource? Are they a reputable source?

Structural design and navigability

Is the information arranged logically?

Is it easy to navigate the resource?

Are hyperlinks unambiguous, i.e., is it obvious where a link is leading you?

Are there good back and forward links between pages?

Do you ever find yourself in a position where there are no hyperlinks to anywhere else?

Is the information within a resource arranged consistently?

Are the grammar and spelling accurate?

Are images used effectively or are they over-done?

Is the resource 'viewable' effectively (i.e., without loss of essential information and navigability) in non-graphical browsers?

Overall appearance and usability

Are the aim and purpose of the resource obvious at first sight?

Is it attractive/functional?

Does it encourage you to explore further?

Is the balance of links and text good?

Is the balance of text, images and white space good?

How big is the resource?

If a WWW document is long, is it navigable? How long does it take to download?

Are there single document options for those resources that may be printed?

Are there alternative options for those WWW resources which contain Netscape-specific features such as tables?
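ADAM's two-stage process (elimination before quality evaluation) could be sketched roughly as follows. This is purely an illustrative assumption: the field names, the 0-5 scoring scale, and the acceptance threshold are invented for the sketch and do not appear in ADAM's published guidelines.

```python
# Hypothetical sketch of ADAM's two-stage selection process:
# resources with no unique information, personal pages, and defunct or
# superseded items are eliminated before any quality evaluation begins.

def eliminate(resource):
    """Stage 1: hard filters applied before quality evaluation."""
    return (
        resource.get("links_only", False)       # no unique information
        or resource.get("personal_use", False)  # created for personal use
        or resource.get("defunct", False)       # out-of-date or superseded
    )

def evaluate(resource):
    """Stage 2: sum scores (assumed 0-5) over the three criteria areas."""
    areas = ("content", "navigability", "usability")
    return sum(resource["scores"].get(a, 0) for a in areas)

def select(resources, threshold=9):
    """Return resources that survive elimination and score well enough."""
    return [r for r in resources
            if not eliminate(r) and evaluate(r) >= threshold]

catalogue = [
    {"name": "links-page", "links_only": True, "scores": {}},
    {"name": "good-site",
     "scores": {"content": 4, "navigability": 3, "usability": 3}},
]
print([r["name"] for r in select(catalogue)])  # → ['good-site']
```

The point of the two stages is economy: the hard filters are cheap yes/no checks, so reviewer effort is spent only on resources that could plausibly be included.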

2. EELS

Selection for EELS is carried out by one of ten to fifteen subject editors, who are usually subject librarians in one of the participating libraries. To ensure quality, consideration is given to "such factors as accessibility, maintenance, documentation and reliability of producer information" (Jansson 1996). There are no firm criteria used or published by EELS, but the guidelines are:

Accessibility

The resource has to be accessible: there should be no dead links. With a few important exceptions, resources should be free of charge. Commercial databases are only included if they contain rare or otherwise unreachable information.

Documentation and maintenance

There should be a minimum of information available about the resource itself: who is providing the information, and when was it published or last updated? Editors are expected to check the resources from time to time to see whether they are still 'alive'.

Reliability of producer organisation

The editors have to use their subject knowledge for this.

Interest

The resource must be of interest to the technical universities involved, from a research or educational point of view.

EELS have not tried to make the criteria more exact. The editors meet occasionally to discuss selection criteria questions and to make sure that they follow the same guidelines.

3. EEVL

Selection is currently carried out by EEVL staff and by additional voluntary team members who are all engineering librarians with an interest in the Internet. Selection criteria are currently under review, but the Project Officer says that when "resources are investigated for inclusion in the database a number of criteria are considered including information content, provenance, authority, usability, durability, reliability of access, and uniqueness within the context of the overall collection. Items which are out of date, inappropriate, strictly local in context or are no longer available are filtered out" (Moffat 1996a). More detailed criteria can be found in the EEVL team manual (EEVL 1996), which states that the following types of question should be asked of a resource:

Does the resource contain substantive information?

Is the subject matter appropriate for the EEVL target audience?

Is the resource unique within the context of the total collection?

Is the information durable in nature?

Is the information from a reputable source?

Is the information current?

Is there any form of quality control?

Is access reliable?

Is access free and unrestricted?

Is there on-line help, or contact details?

Is there printed documentation?

In practice, resource evaluation is a combination of many of the above factors, some of which may be in opposition even when the resource is still considered valuable enough for inclusion.

Content criteria would appear to be the most important on this list, but there is also a notable interest in the reliability and stability of the information provided.

4. NBW

The selection criteria are published on the NBW Working Home Page (NBW 1996) but can be summarised as follows (information supplied by Marianne Peereboom):

Content:

1. Relevance for the academic or scientific communities

2. Quality of content

Formal:

1. Bibliographic 'units' only

2. Full text, multimedia, and referential resources are all included.

These criteria are still being discussed, changed and expanded.

5. OMNI

OMNI have an Assurance Officer (Betsy Anagnostelis) and an Advisory Group on Evaluation Criteria comprising Alison McNab of the Pilkington Library, Loughborough University, and Alison Cooke, a Research Student at the Department of Information and Library Studies, University of Wales, Aberystwyth (OMNI Consortium 1996a).

In order to ensure the comprehensiveness of the OMNI database, resource descriptions can be created by volunteer helpers who monitor certain subject areas or inform the OMNI Consortium about important resources. These volunteers select resources and create resource descriptions for them. Although the resource descriptions are checked before they are transferred to the OMNI public databases, it is important that the quality selection criteria used are widely available. OMNI's quality selection criteria have therefore been published in a document entitled Evaluating resources for OMNI (OMNI Consortium 1996b). Fundamentally, OMNI will include a resource if it contains substantive information and is of relevance to the OMNI user community (Ibid.). The criteria used are broken down into fifteen sections:

Scope

Audience

Authority

Provenance

Accuracy of information content

Uniqueness / comparison with other sources

Currency / frequency and regularity of updating

Accessibility and usability

Charging policy

Special requirements

Software reliability

Copyright

Language

Design and layout / user interface

User support / documentation

The main emphasis is on content criteria, with some importance being given to design and ease of use issues. "... [OMNI] are primarily interested in the value of a resource in terms of information content; quality of design or appearance are of secondary interest, even though they may affect the overall usefulness of a resource" (Ibid.). OMNI also comment that although the evaluation process will take in a combination of the criteria listed above, the important thing is the assessor's "overall impression about the value of a resource to the OMNI user community" (Ibid.).

6. RUDI

The RUDI project intends to build up a collection of hypermedia materials on urban design - mostly stored on its own server. Selected resources from other sites will be included on the service. Selection will be carried out by team members (subject librarians) at Oxford Brookes University in collaboration with the RUDI Internal Advisory Group and Steering Committee.

As of the end of July 1996 selection criteria were still in the process of being formulated.

7. SOSIG

SOSIG only selects resources perceived to be of quality. An e-mail cited on the SOSIG home page states that "Given the amount of information on the net, the real value of a resource such as yours [SOSIG] is, paradoxically, not that it is comprehensive but that it is selective of high quality resources" (SOSIG 1996). The same document states that SOSIG "filters out resources that are of little or no use to our users. This process also weeds out material that is out of date, inappropriate, strictly local in context or refers to resources that are no longer available" (Ibid.). The SOSIG selection criteria can be summarised as follows:

Relevance

Is it relevant to the subject area and user profile (i.e. education and research)?

What is the scope of the resource (geographical limitations, etc.)?

Is the information substantive? A resource which consists of a collection of links to other resources will normally not be included unless there is substantial annotation or value-added information.

Note - UK academic departmental pages are included in the database (as researchers often want to make contact with other departments) but similar pages from other countries are not deemed suitable for inclusion.

Features

Is the information accurate and comprehensive? (This is often hard to validate.)

Is there any on-line help/information?

Perceived value

Is it from an authoritative source?

What is the reputation of the information provider?

Is the information peer reviewed?

Uniqueness

Is the resource available elsewhere (different formats, sites)?

Is there similar/better subject material available?

Does the material have any relation to other works?

Is it an often cited source?

Maintenance

Is the information being maintained/updated?

Is the information provider likely to be able to maintain the information (unlikely in the case of information provided by students)?

Meaning over time

Does the information have a time limit to its usefulness (e.g. timetables, schedules, conference announcements)?

Presentation of information

Is it presented well, easy to use and manage?

Physical access

Are the connections to the site providing the information reliable and stable?



Page maintained by: UKOLN Metadata Group,
Last updated: 2-Apr-1998