IMS Question And Test Interoperability



Introduction

This document describes IMS Question and Test Interoperability, an international specification for computer-based questions and tests, aimed at those wishing to use computer-based assessment in their courses.

What Is IMS Question And Test Interoperability?

Computers are increasingly being used to help assess learning, knowledge and understanding. IMS Question and Test Interoperability (QTI) [1] is an international specification for a standard way of sharing such test and assessment data. It is one of a number of such specifications being produced by the IMS Global Learning Consortium to support the sharing of computer-based educational material such as assessments, learning objects and learner information.

This new specification is now being implemented within a number of assessment systems and Virtual Learning Environments. Some systems store the data in their own formats but support the export and import of question data in IMS QTI format. Other systems operate directly on IMS QTI format data. Having alternative systems conforming to this standard format means that questions can be shared between institutions that do not use the same testing systems. It also means that banks of questions can be created that will be usable by many departments.

Technical Details

The QTI specification uses XML (Extensible Markup Language) to record the information about assessments. XML is a powerful and flexible markup language that uses 'tags' rather like HTML. The IMS QTI specification was designed to be pedagogy and subject neutral. It supports five different types of user response (item selection, text input, numeric input, xy-position selection and group selection), which can be combined with several different input techniques (radio button, check box, text entry box, mouse xy-position dragging or clicking, slider bar and others). It is able to display formatted text, pictures, sound files, video clips and even interactive applications or applets. How any particular question appears on the screen and what the user has to do to answer it may vary between systems, but the question itself, the knowledge or understanding required to answer it, the marks awarded and the feedback provided should all remain the same.
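
To give a flavour of the markup, the sketch below shows how a simple multiple-choice item might be expressed in QTI Version 1.2 XML. It is a simplified, illustrative example rather than a complete, validated item (the identifiers, question text and attribute details are invented for this sketch): the question and its options sit in the presentation section, and the scoring rule in the response processing section.

  <questestinterop>
    <item ident="example_item_01" title="Example multiple-choice item">
      <presentation>
        <material>
          <mattext>Which planet is closest to the Sun?</mattext>
        </material>
        <!-- A 'logical identifier' response rendered as a set of choices
             (typically radio buttons) -->
        <response_lid ident="RESPONSE" rcardinality="Single">
          <render_choice>
            <response_label ident="A">
              <material><mattext>Venus</mattext></material>
            </response_label>
            <response_label ident="B">
              <material><mattext>Mercury</mattext></material>
            </response_label>
          </render_choice>
        </response_lid>
      </presentation>
      <resprocessing>
        <outcomes>
          <decvar varname="SCORE" vartype="Integer" defaultval="0"/>
        </outcomes>
        <!-- Award one mark if option B is selected -->
        <respcondition>
          <conditionvar>
            <varequal respident="RESPONSE">B</varequal>
          </conditionvar>
          <setvar action="Set" varname="SCORE">1</setvar>
        </respcondition>
      </resprocessing>
    </item>
  </questestinterop>

Because the markup records only the structure of the item (the question, the permitted responses and the scoring rules), a conforming delivery system is free to render it with whatever on-screen presentation it chooses.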

The specification is relatively new. Version 1.2 was made public in 2002, and a minor upgrade to Version 1.2.1 early in 2003 corrected some errors and ambiguities. The specification is complex, comprising nine separate documents. Various commercial assessment systems (e.g. Questionmark [2], MedWeb, Canvas Learning [3]) have implemented some aspects of IMS QTI compatibility for their assessments. A number of academic systems are also being developed to comply with the specification. These include the TOIA project [4], which will have editing and course management facilities, the SToMP system [5], which was used with students for the first time in 2002, and a Scottish Enterprise system called Oghma, which is currently under development.

Discipline Specific Features

A disadvantage of such a standard system is that particular features required by some disciplines are likely to be missing. For example, engineering and the sciences need to be able to handle algebraic expressions, both the accuracy and precision of numbers, alternative number bases, randomised values, and graphical input. Language tests need better textual support, such as the presetting of text entry boxes with specific text and more sophisticated text-based conditions. Some of these features are being addressed by groups such as the CETIS assessment SIG [6].

What This Means To You

If you are starting, or planning to start, using computer-based tests, then you should be aware of the advantages of a standard-compliant system. It is clearly a good idea to choose a system that will allow you to move your assessments to another system at a later date with the minimum of effort, or to import assessments authored elsewhere.

A consideration to bear in mind, however, is that at this early stage in the life of the specification there will be differences between implementations. It will also remain possible with some 'compliant' systems to create non-standard question formats if implementation-specific extensions are used. The degree of conformance of any one system is therefore difficult to assess. Tools to assist with this are now beginning to be discussed, but it will be some time before objective measures of conformance are available. In view of this it is a good idea to keep in touch with those interested in the development of the specification, and the best way within UK HE is probably via the CETIS Assessment Special Interest Group Web site [6].

It is important that the specification should have subject specific input from academics. The needs of different disciplines are not always well known and the lack of specific features can make adoption difficult. Look at the examples on the CETIS Web site and give feedback on areas where your needs are not being met.

References And Further Information

  1. QTI Specification,
    <http://www.imsglobal.org/>
  2. Questionmark,
    <http://www.questionmark.com/>
  3. Canvas Learning Author and Player,
    <http://www.canvaslearning.com/>
  4. TOIA,
    <http://www.toia.ac.uk>
  5. SToMP,
    <http://www.stomp.ac.uk/>
  6. CETIS Assessment Special Interest Group,
    <http://www.cetis.ac.uk/assessment>
  7. CETIS,
    <http://www.cetis.ac.uk/>

Acknowledgments

This document was originally written by Niall Sclater and Rowin Cross of CETIS and adapted by Dick Bacon, Department of Physics, University of Surrey, consultant to the LTSN Physical Sciences Centre.

The original briefing paper (PDF format) is available on the CETIS Web site. The version available on this Web site was originally published in the LTSN Physical Sciences Centre News (issue 10).