
-------------------------------------------------------------

Cultivate Interactive Issue 4: Editorial

Welcome to the fourth issue of Cultivate Interactive!

Since our last publication there have been a fair number of changes in the DIGICULT world. In March, fifteen new projects from the last call for proposals were added to the 'IST projects in the cultural heritage area' Web page [1]. Of these new additions, the ARION and AMICITIA projects are both covered in this issue, with more to come. Cultivate Interactive is also pleased to welcome the recently established Cultivate CEE, which will be carrying on the good work of Cultivate throughout Eastern Europe. The kick-off meeting for Cultivate CEE is taking place in Torun, Poland during the launch weekend of Cultivate Interactive issue 4, so we wish them luck. Hopefully the establishment of Cultivate CEE will mean more articles from Eastern Europe in the future.

This issue as usual has lots to offer. Our first feature article is an important piece on the new dot-museum top-level domain that was approved by the Internet Corporation for Assigned Names and Numbers (ICANN) in November last year. It is written by Cary Karp, Director of Internet Strategy and Technology at the Swedish Museum of Natural History and President of the Museum Domain Management Association, and considers what the implementation of dot-museum will mean for those working in memory institutions in Europe.

Emma Wagner of the European Commission's Translation Service contributes an interesting article on the origins of the Euro-English that plagues the EC and offers some practical advice to those struggling to fight the Eurospeak disease.

Roger Smith, the founder and director of Global Museum, a highly successful international Webzine, offers us some insight into how he has established a museum-based compendium site. Global Museum boasts a current readership in 88 countries and maintains a weekly mailbase of more than six thousand museum professionals; something for Cultivate Interactive to aspire to!

Other feature articles include an outline by Rosa Botterill of the work carried out by the European Museums' Information Institute, a consortium of key organisations in the cultural heritage field; an introduction by Phill Purdy to the Visual Arts Data Service, an organisation with a role in the preservation of high-quality digital materials for Higher and Further Education; and a report by Steve Glangé of LIFT (Linking Innovation Finance and Technology) on how you can turn your Research and Development results into successful ventures.

In the regular column section, this issue's National Node column has been written by Pascale Van Dinter of the Belgian national node. The At the Event column covers the Internet Librarian International conference held in London in March and the Open Archives Initiative (OAI) open meeting held in Berlin.

We are also pleased to introduce a new regular column called 'Praxis' (thanks to Philip Hunter, editor of Ariadne, who contributed the name). Praxis aims to give advice on how to put various applications and theories into practice. The first two offerings give some insight into streaming video: the art of sending moving images in a compressed form over the Internet. The benefit is that the user does not have to wait for a large file to download before seeing the video or hearing the sound; the media is sent in a continuous stream. Neil Francis and David Cunningham's article offers an introduction to the technologies available and some of the problems encountered, while Jim Strom provides a number of examples and case studies to learn from.

Finally, in the metadata column, continuing our coverage of digitisation projects, Elhanan Adler and Orly Simon talk about the Jewish National and University Library's Ketubbot project, which aims to put a unique collection of some 1200 ketubbot (Jewish marriage contracts) online.

And of course don't forget our 'Spot the European City' Competition, which gets more popular and difficult to find photos for with every issue!

Thanks to everyone who carried on reading and supporting Cultivate Interactive during my absence. I was lucky enough to spend a month in Australia bushwalking round the Northern Territory and snorkelling on the Great Barrier Reef!

Marieke Napier (Editor)


References

  1. IST projects in the cultural heritage area
    URL: <http://www.cordis.lu/ist/ka3/digicult/en/projects.html>

-------------------------------------------------------------

Cultivate Interactive Issue 4: Features


-------------------------------------------------------------

DIGICULT Projects

ARION: An Advanced Lightweight Architecture for accessing Scientific Collections

By Catherine Houstis and Spyros Lalis - May 2001

Catherine Houstis and Spyros Lalis describe the work of the ARION project. ARION, an advanced lightweight architecture for accessing scientific collections, aims to provide a new generation of Digital Library services for the searching and retrieval of digital scientific collections that reside within research and consultancy organisations.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Introduction

Scientific data and programs have long been treated as ‘private’ resources, to be used only by the people or organisation who created them. This ‘private ownership’, however, is usually a situation arising from inaction rather than from policies restricting data reuse. Data and models are collected or developed as part of a scientific study, but after the study it is not a priori clear what should happen to them. There are literally thousands of scientific collections and data sets that are lost at the end of the studies that produced them. This is tremendously valuable information, which disappears because of non-existent cataloguing (metadata), the heterogeneity of the software and hardware on which it is stored, poor documentation and so on, all at great expense to the taxpayer. Research is expensive, as it requires specialised expertise to carry through; the resulting poor return on investment could be avoided if scientific collections were shared and reused. This is the premise of a digital library: making such resources electronically available to a large number of, possibly remote, users.

Internet-based techniques have been developed to make scientific resources available to the wider scientific community and improve this situation. However, even state-of-the-art systems typically come with four main flaws, which make them unattractive both to resource providers and to users. First, the procedure for exporting scientific data resources remains complicated, involving programming effort and expertise that is alien to the data providers. Second, users are offered a simple search interface with little guidance on how to track down or create specific information. Third, once a resource is found there is little support for flexible reuse, i.e. one can either take the resource as is or not at all; dynamic combination of several resources belonging to different providers to create new resources is thus virtually impossible. Last but not least, current solutions do not fit the existing practices and financing methods of the organisations that produce data, and as such they are regarded as a ‘burden’ rather than as an ‘assistance’.

ARION, a recently funded international research and development project, aims to provide a new generation of Digital Library services for the searching and retrieval of digital scientific collections that reside within research and consultancy organisations. This functionality will be achieved via an appropriate distributed system that can be easily installed and administered by the various participants.

ARION advances the findings of previous studies in areas such as the management of networked scientific repositories, metacomputing, intelligent information integration and digital libraries. ARION is a federated open system and is developed in association with national data providers, scientific researchers and SMEs to ensure that the project meets their needs. The ARION consortium is composed of research organisations: the Institute of Computer Science-Foundation for Research and Technology (GR) as the leader, the National Technical University of Athens (GR), the Consiglio Nazionale delle Ricerche CNR-IMA (IT), the Commission of the European Communities, Joint Research Centre (IT), and the University of Crete (GR); and the SMEs HR Wallingford Ltd (UK), the Oceanographic Company of Norway ASA and Enterprise LSE Limited (UK). ARION started in January 2001 and will be completed in three years.

Digital Libraries: State of the Art

The rapid development of distributed computing infrastructures and the growth of the Internet and the WWW have revolutionised the management, processing and dissemination of scientific information. Repositories that have traditionally evolved in isolation are now connected to global networks. In addition, with common data exchange formats, standard database access interfaces, and the information mediation and brokering technologies emerging in the context of the Digital Libraries Initiatives and I3 (Intelligent Information Integration), data repositories can be accessed without knowledge of their internal syntax and storage structure. Furthermore, search engines are enabling users to locate distributed resources by indexing appropriate metadata descriptions. Open communication architectures provide support for language-independent remote invocation of legacy code, thereby paving the way towards a globally distributed library of scientific programs. Finally, workflow management systems exist for coordinating and monitoring the execution of scientific computations. Standardisation and interoperability are pursued by the W3C.

This technology has so far been successfully used to address system, syntactic and structural interoperability of distributed heterogeneous scientific repositories [1]. However, interoperability at the semantic level is needed to overcome the problem of identifying the scientific resources that can be combined in a meaningful way to produce new data [2]. This is of key importance for providing widely diversified user groups with advanced, value-added information services.

Another body of work addresses the integration of heterogeneous information over a number of networked distributed repositories [3]. In this context the aim has been to build global environmental systems. Integration has also benefited from workflow technology, originally used for business processes.

Solutions for Scientific Collections

A Digital Library of Scientific Collections: Concept Innovation

A Digital Library of scientific collections is a new and unprecedented concept. It encompasses the characteristics of a traditional library and, in addition, it creates new content online. In traditional libraries, humans create new knowledge after having used the library content. In the case of scientific content (and in the ARION digital library), new content is created continuously upon user demand. Any scientific area is represented not only by means of multimedia document information but also in terms of data sets, programs and tools which can produce new information interactively, either by analysing data or by predicting physical phenomena through the simulation of physical processes. Data analysis can be, for instance, statistical analysis, extraction of information from satellite pictures, or data acquisition from databases belonging to the library content via data mining tools.

Another difference from traditional libraries is that the content of such a library is not within the walls of a building, nor can it be stored on a single centralised computer system. Scientific objects such as programs are in general not portable and may need specialised software or hardware to execute. In ARION they reside on the servers of the provider’s organisation and are remotely invoked via the ARION system. Thus, the content of the digital library is distributed over the providers’ servers. In addition, the library documents not only the scientific object descriptions (metadata) but also scientific expertise in terms of data production rules (workflows), to make reuse possible for the users. Visualisation tools, statistical tools, and any other tools scientists use with their data sets and programs are all supplied by the provider’s organisation to convey information to the users. A WWW interface makes the library services accessible from anywhere via a web browser and an Internet connection. Thus, it provides an international collaborative environment, which adds tremendous value for a worldwide community of users.

ARION has the potential of becoming an international forum for scientific content and of leading the effort to create digital libraries of scientific objects worldwide. To the best of our knowledge, the generalisation of the ideas presented in ARION has not been put forward previously. Previous work has addressed the management of scientific information for specific scientific areas, and as such is in all cases much simpler or very specific in context. In the case of ARION, the scale of the problem, the generality of the content, and the automated, and thus attractive, ways to add new content are all dealt with in the architecture.

A Digital Library of Scientific Collections: Technical Innovation

The ARION Digital Library provides lightweight and straightforward tools that allow repository providers to automate the publication and export of their collections. It provides the user with a fast, accurate, automated system to locate, retrieve and visualise data on demand. In scientific collections, the existence of scientific programs provides the possibility of computing data on demand by making complex combinations of data and programs from various heterogeneous, geographically distributed and autonomous collections. The ARION advanced architecture supports these functions. Support is based on the coupling of ontologies with metadata and workflows, in order to address the needs of multiple scientific collections.

This functionality yields several technical innovations, described in the following section.

ARION: An Advanced Lightweight Architecture for a Digital Library of Scientific Collections

ARION promotes advanced features of Digital Library technology and, in addition, features that take into account the content and characteristics of scientific collections. Specifically, it is based on an advanced middleware architecture that seamlessly integrates Digital Library, Intelligent Information Integration, and Workflow technologies. It comprises three main modules: the Metadata Search Engine, the Knowledge Base System, and the Workflow Runtime System, which co-operate to provide the user with the desired functionality. The architecture is shown in Figure 1. The functionality of each component is briefly described in the following.

The Metadata Search Engine is responsible for locating external resources, either data sets or programs. It may also retrieve complementary information stored in the repositories, e.g. user documentation on the available resources. The Search Engine accepts metadata queries on the properties of resources and returns a list of metadata descriptions and references. References point to repository wrappers, which provide an access and invocation interface to the underlying legacy systems (repositories) where the data and programs reside. The Knowledge Base System accepts queries regarding the availability of ontology concepts. It generates and returns the corresponding data productions based on the available resources and the constraints imposed by the ontology rules. These productions provide all the information that is needed to construct workflow specifications. The KBS regularly communicates with the Metadata Search Engine to update its database. The Workflow Runtime System monitors and coordinates the execution of workflows. It executes each intermediate step of a workflow specification, accessing data and invoking programs through the repository wrappers. Checkpoint and recovery techniques are employed to enhance fault tolerance.
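As a rough illustration of how these three modules might cooperate, the following Python sketch traces a hypothetical request from metadata query to workflow invocation. All class names, method signatures and catalogue entries are invented for this example; the article does not specify ARION's actual interfaces.

    from dataclasses import dataclass, field

    @dataclass
    class Resource:
        """Metadata description of a data set or program; wrapper_url points
        to the repository wrapper fronting the underlying legacy system."""
        name: str
        wrapper_url: str
        properties: dict = field(default_factory=dict)

    class MetadataSearchEngine:
        """Locates external resources from queries on metadata properties."""
        def __init__(self, catalogue):
            self.catalogue = catalogue

        def query(self, **constraints):
            return [r for r in self.catalogue
                    if all(r.properties.get(k) == v for k, v in constraints.items())]

    class KnowledgeBaseSystem:
        """Answers availability queries on ontology concepts by generating
        candidate data productions from the resources the search engine knows."""
        def __init__(self, search_engine):
            self.search = search_engine

        def productions_for(self, concept):
            programs = self.search.query(kind="program", produces=concept)
            datasets = self.search.query(kind="dataset")
            return [(program, datasets) for program in programs]

    class WorkflowRuntimeSystem:
        """Coordinates execution of a production; a real runtime would add
        checkpointing and recovery for fault tolerance."""
        def execute(self, program, inputs):
            print(f"invoking {program.name} via {program.wrapper_url} "
                  f"on {[d.name for d in inputs]}")

    catalogue = [
        Resource("wave-data-2000", "http://provider-a.example/wrapper",
                 {"kind": "dataset", "concept": "wave_heights"}),
        Resource("wave-model", "http://provider-b.example/wrapper",
                 {"kind": "program", "produces": "wave_forecast"}),
    ]
    search = MetadataSearchEngine(catalogue)
    kbs = KnowledgeBaseSystem(search)
    runtime = WorkflowRuntimeSystem()
    for program, inputs in kbs.productions_for("wave_forecast"):
        runtime.execute(program, inputs)

In this toy model, publishing a new resource amounts to appending a Resource record with its metadata and wrapper address, which mirrors the lightweight registration described below.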

In addition, a user interface, reached via a Web address from any browser with an Internet connection, provides access to the ARION system. A number of tools are being developed to let providers publish and install scientific collections into the digital library in a provider-friendly manner. These tools are part of the ARION architecture.

This architecture ensures the scalability and extensibility required in large, scientific collections systems. It allows operationally autonomous and geographically dispersed organisations to selectively “export” their resources. Publishing/installing a new resource with the system requires merely supplying appropriate metadata/ontology, workflow descriptions and wrappers.

Figure 1: A middleware architecture for distributed scientific repositories. The system consists of interoperable Knowledge Base, Metadata Search, and Workflow Runtime components.

To enhance performance and fault tolerance, the Metadata Search Engine can be distributed across several machines. Also, several knowledge units adhering to different domains of scientific knowledge can be plugged into the Knowledge Base System to support a wide variety of scientific applications and user groups.

Efficient execution and administration of the system are achieved via special data and program export wizards for wrapper generation, automated use of filters to transform data between different formats, and use of mobile code that is downloaded and used at the user’s request.

Conclusion

The ARION architecture has been presented, forming a library of data sets, programs and tools, all components of scientific collections. This library is a federation of heterogeneous systems, which interoperate to provide data services to its users. These services include access to data sets stored in the system archives and dynamic production of data sets on the fly, upon user demand. Retrieval occurs via special tools to visualise or statistically analyse the data sets.

Due to its modularity, participants may install only parts of the system on their premises, depending on their needs and limitations, both organisational and commercial. A provider may install the entire system architecture in order to organise an in-house collection, or various down-scaled versions of it, such as only the search engine or the metadata storage. The system architecture supports different versions with a variety of capabilities at the provider’s end, in addition to a system-wide server featuring all architectural components for everyone’s use. This important architectural feature of the ARION system addresses the scalability problem of global (Internet-accessible) digital libraries of scientific collections.

This work is supported by the EU Fifth Framework Programme (IST-2000-25289).

References

  1. THETIS: A Data Management and Data Visualization System for Coastal Zone Management of the Mediterranean Sea. Contact person: C. Houstis
    URL: <http://kos.ics.forth.gr:8000/>
  2. V. Christophides, C. Houstis, S. Lalis, H. Tsalapata. (1999) Ontology-driven Integration of Scientific Repositories, NGITS’99, New Generation Information Technologies and Systems, Lecture Notes in Computer Science, Springer, Israel, July 1999.
  3. C. Houstis, S. Lalis, N.M. Patrikalakis, W. Cho. (1999) Federated Scientific Information Systems, position paper for the invitational workshop for the EU-NSF cooperation on Large Scientific Database Systems.
    URL: <http://www.cacr.caltech.edu/euus/documents/houstis.html>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Catherine Houstis
Institute of Computer Science-Foundation for Research and Technology
Heraklion Greece

Houstis@ics.forth.gr

Catherine Houstis received her Ph.D. from the Electrical Engineering Department of Purdue University, USA, in 1977. In 1978 she was a postdoctoral associate at the EE Dept. of Purdue University. In 1979 she joined the National Cash Register (NCR) Corporation as a research scientist in the Advanced System Research and Development department. From 1980 to 1983 Catherine worked as an assistant professor at the Electrical and Computer Engineering Department of the University of South Carolina. In 1984 she became an associate professor. From 1984-1987 she was a visiting associate professor at the EE Dept. of Purdue University.

In 1987 Catherine joined the Computer Science Department of the University of Crete. She was also a research associate at the Institute of Computer Science of FORTH. She is now a full professor and the Leader of the Distributed Systems Laboratory at the Institute of Computer Science, FORTH. She has led and participated in research projects funded by the NSF in the USA and by the ESPRIT, AIM, RACE, Telematics and Digital Libraries programmes for scientific data collections in the EC. Her main research interests are in Internet-based scientific information systems, metacomputing, commercial aspects of scientific information systems and performance evaluation of global distributed systems.

Spyros Lalis
Institute of Computer Science-Foundation for Research and Technology
Heraklion Greece

lalis@ics.forth.gr

Spyros Lalis received a Diploma in Computer Engineering and a doctorate in Technical Sciences from the Swiss Federal Institute of Technology Zurich, in 1989 and 1994 respectively. Since 1997 he has been a Research Associate of the Institute of Computer Science at the Foundation for Research and Technology Hellas and an Adjunct Professor of Computer Science at the University of Crete.

Currently Spyros is Visiting Assistant Professor at the Computer and Communications Engineering department at the University of Thessaly. He is actively involved in the design of distributed systems, two of them developed through funded European projects. He is also leading a European research project in the area of ubiquitous computing. His interests include Programming Languages and Systems, Software Engineering, Distributed and Parallel Systems, Metacomputing, Ubiquitous and Pervasive Computing, and Economies of Electronic Services.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Houstis, C and Lalis, S. "ARION: An Advanced Lightweight Architecture for accessing Scientific Collections", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/arion/>

-------------------------------------------------------------

AMICITIA – New Solutions for Today’s Challenges in Digital Audiovisual Archives

By Stephan Schneider - May 2001

Stephan Schneider reports on the 'Asset Management Integration of Cultural Heritage In The Interexchange of Archives' (AMICITIA) project (IST-1999-20215). AMICITIA is a demonstrator project within key action III (Multimedia content and tools), action line III.2.4: “Digital preservation of cultural heritage”. AMICITIA started on 1 October 2000 and will run for two years.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Introduction

Audiovisual archives are among the most valuable cultural heritage resources of modern times. By preserving moving images, music and speech they keep a picture of life in the past. Like other archives, such as library or art collections, they are in need of protection. Modern archiving not only serves future generations but also allows us to turn the unique wealth of European cultural heritage into value for people today.

The AMICITIA project [1] aims to build the basis for continued and viable digital preservation of, and access to, television and video content. It will achieve this through the construction of the vital components that enable a digital archiving system to serve all required roles in the ingest, management, access and distribution of audiovisual material. Special focus has been placed on enabling remote, multilingual access to archival content stored in a distributed environment. The system is being designed to serve both the needs of professional users (regarding preservation, quality, access flexibility and usability) and the needs of public access (regarding simplicity of use, security and availability). As a demonstration project, AMICITIA aims to get its results into practical, marketable use as fast as possible.

Project Partners

Partners have been chosen to both develop innovative solutions and to test these solutions in real world environments. This has resulted in the consortium consisting of two groups:

i. A technology-providing and research group: Tecmath AG (DE, [2]) and Joanneum Research (AT, [3]).

ii. A strong user group of broadcasting companies and audiovisual archives: the British Broadcasting Corporation (BBC) (UK, [4]), the Austrian Broadcasting Corporation (ORF) (AT, [5]) and the South-West Broadcasting Corporation (SWR) (DE, [6]). One new partner from the archives sector is currently being integrated into the consortium.

The work is distributed in a straightforward way: Tecmath AG acts as coordinator and provides the basic technology for a content management system. Joanneum Research, an Austrian research institution, develops technologies for distributed access (see the Distributed Access section).

The user partners together analyse the current workflow, finding weaknesses and developing a new, idealised workflow using digital technologies. This work is led by the SWR. The user partners then define requirements for the new components to be developed in close collaboration with the technology partners. Some of the user partners have responsibility for several components: e.g. the ORF is responsible for the rights management system and for the storytelling interface (see the Access and Exchange Mechanisms section) and contributed to the multilingual thesaurus (see the Distributed Access section). Extensive testing of the components integrated by Tecmath AG is also a main task of the user partners. The BBC will coordinate the evaluation process.

Preceding and Concertating Projects

Most of the AMICITIA project partners were involved in the research project EUROMEDIA (ESPRIT 20636, [7]), which ran from 1995 to 1998. This project's stated and achieved objective was to design and implement an asset management system for use in a broadcast environment to enable cooperative and efficient television production. The results of EUROMEDIA have been commercialised and are being exploited by Tecmath under the brand name media archive®. This product is currently installed at several European broadcasters and a market expansion into North America and Asia is foreseen.

Two related IST projects have been started by one or more organisations involved in AMICITIA: PRIMAVERA [8] and Preservation Technology for European Broadcast Archives (PRESTO) [9]. The three projects supplement each other and will work together closely to ensure that no redundant work is done.

The areas of collaboration are shown in the graph below:

Figure 1: Areas of collaboration

Concertation is by no means limited to these projects. There is, for example, an exchange of knowledge and experience on metadata with the Forum for Metadata Schema implementers (SCHEMAS) [10] project. This collaboration is also based on personal contacts.

Working Areas

The AMICITIA project aims to develop and demonstrate new solutions in four working areas:

Figure 2: The four working areas
  1. Distributed Access
  2. Access and Exchange Mechanisms
  3. Preservation of Digital Content
  4. Protection in the Digital Shelter

The challenges and their respective solutions will be described in detail below.

Distributed Access

The Challenge:

Distributed access is now an issue that extends beyond the premises of a single company or institution. Producing content is very expensive and there is high pressure to reduce costs. There are several ways to reduce production costs:

  1. Reusing content several times
  2. Using or purchasing archive footage from others
  3. Selling content to others

All of these require distributed access to archives across company premises and across content owners. Many content producers, especially broadcasters, plan to reuse their content via online media. They intend to display and sell content not only to partners within the same business but also to other professionals and to the general public.

Another challenge within Europe is multilingualism. Content metadata is predominantly written in the native language of the content producer, which makes it difficult for non-native users to retrieve.

The Solutions:

The AMICITIA project responds to these challenges in two ways. The content management systems will be improved for distributed access across the Internet. For professional users there will be an interface which allows search and retrieval across the Internet, i.e. connections to external content management systems. This requires special protocols to ensure overall system security on the one hand and to work with existing security mechanisms such as firewalls on the other. To overcome the language barrier, a thesaurus is under development which aids querying archives catalogued in foreign languages.

A Web interface developed separately will present selected contents to the general public.
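To make the thesaurus mentioned above concrete, here is a minimal sketch of query expansion in Python. The vocabulary, the language codes and the expansion strategy are all invented for illustration; the article does not describe the actual AMICITIA thesaurus.

    THESAURUS = {
        "lighthouse": {"de": ["Leuchtturm"], "fr": ["phare"]},
        "harbour":    {"de": ["Hafen"], "fr": ["port"]},
    }

    def expand_query(term, target_languages):
        """Expand a search term with foreign-language equivalents so that
        archives catalogued in other languages can also be queried."""
        expansions = [term]
        for lang in target_languages:
            expansions.extend(THESAURUS.get(term, {}).get(lang, []))
        return expansions

    print(expand_query("lighthouse", ["de", "fr"]))
    # ['lighthouse', 'Leuchtturm', 'phare']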

Access and Exchange Mechanisms

The Challenge:

The distributed access described above raises new challenges for content management systems: while copyrights can be cleared easily within one company, distributing content requires a much higher level of accuracy in rights issues. Professionals won’t purchase content if the rights situation is not clear or if it is very difficult to obtain the rights. Rights management is no easy matter and it is usually done by specialised rights departments, e.g. at broadcasting companies.

Searching distributed archives requires special tools to store, select and sort the search results. Conventional search masks cannot fulfill these needs.

The Solutions:

AMICITIA is developing an integrated property rights management system. The existing traffic lights solution (“no problem”, “restricted”, “rights unclear”) will be improved to cover the regional and factual extent of licences and their timeframe. Existing rights management systems within the partner organisations are being analysed and will be interfaced wherever possible.
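One way to picture the refined traffic-light model is as a function of an asset's licences, the territory of intended use and the date, as in this hypothetical Python sketch (the data model is invented for illustration, not taken from the project):

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Licence:
        territory: str           # e.g. "AT", "DE", or "worldwide"
        valid_from: date
        valid_until: date

    def rights_status(licences, territory, on):
        """Map an asset's licences to the traffic-light states named above,
        refined by regional scope and timeframe."""
        if not licences:
            return "rights unclear"
        for lic in licences:
            if (lic.territory in (territory, "worldwide")
                    and lic.valid_from <= on <= lic.valid_until):
                return "no problem"
        return "restricted"

    asset = [Licence("DE", date(2000, 1, 1), date(2002, 12, 31))]
    print(rights_status(asset, "DE", date(2001, 5, 7)))   # no problem
    print(rights_status(asset, "UK", date(2001, 5, 7)))   # restricted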

To match the needs of searching distributed archives, a so-called “story-telling” interface will be developed, with which the user can sort, select and pre-arrange search results. The new tool will support complex research work, providing store-and-recall functionality for long-term work and a facility for sharing results to empower collaborative work.

Preservation of Digital Content

The Challenge:

Once digitised, content was often thought to be immortal. This is a popular fallacy: digitised content ages rapidly. Digital content is exposed to degradation mainly because the physical media, e.g. disks or tapes, degrade. In contrast to analogue recordings, where the degradation may be visible or audible, digital recordings degrade stealthily. Suddenly bits flip from “1” to “0” or vice versa, or they drop out and become unreadable. If such a bit error hits a vulnerable area of a digital recording, such as a file allocation table, a whole batch of assets may be lost. Restoration of damaged digital media is very tedious, expensive and often impossible, because assets such as video frames are coded and compressed using complex algorithms. Most digital recording devices therefore employ error-correcting codes and algorithms to overcome single bit errors. Although these do work, they work invisibly to the user of the media, who has no knowledge of the health of his or her media.

The Solution:

AMICITIA is developing a new strategy to protect digital contents. This strategy has 3 components:

  1. Early warning system
  2. Automatic migration
  3. Continuous Maintenance

The system alarms the operator before the number of defective bits exceeds the critical threshold beyond which errors can no longer be corrected. To achieve this, the system continuously monitors the bit error rate in the recording devices, e.g. the tape drives in the tape libraries. Using a statistical approach, the bit errors are analysed and compared with past results in order to estimate the threat to the digital content.

Once a real threat is detected the operator is warned and the threatened content is migrated to a new medium. If tapes are used the content of a damaged tape is copied to a new one. Restoring damaged media is not within the scope of the AMICITIA project but within the concertating PRESTO project.

Continuous maintenance tries to avoid bit errors by maintaining drives and media. This is done by periodically cleaning the drives, monitoring their head adjustment and caring for the media, e.g. by rewinding tape cartridges.
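Taken together, the early-warning and migration steps can be pictured as a simple decision over a tape's bit-error-rate history, as in the Python sketch below. The threshold values and the trend test are illustrative assumptions, not figures from the project:

    CRITICAL_BER = 1e-4        # illustrative limit of the correction codes
    WARNING_LEVEL = 0.5 * CRITICAL_BER

    def assess_medium(ber_samples):
        """ber_samples: chronological bit-error-rate readings reported by a
        drive for one tape. Returns a recommended action."""
        current = ber_samples[-1]
        rising = len(ber_samples) > 1 and ber_samples[-1] > ber_samples[0]
        if current >= CRITICAL_BER:
            return "migrate content to a fresh medium"
        if current >= WARNING_LEVEL and rising:
            return "warn operator: schedule migration"
        return "healthy"

    print(assess_medium([1e-6, 8e-6, 6e-5]))   # warn operator: schedule migration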

Protection in the Digital Shelter

The Challenge:

Most of the big broadcasting companies have been in operation for several decades. Their archives contain some 100,000 hours of analogue video material. This material is ageing and waiting to be digitised and annotated to protect it from further degradation. Even in the digital age, recordings are made on (digital) tapes which have to be re-read and transferred into the digital domain of a content management system.

It is clear that such an amount of work cannot be done manually, especially if it is to be done in parallel. If the digitising is done automatically, the quality of the digitised content needs to be supervised continuously; no human eye can watch these endless streams of digitised video.

The Solution:

The project is developing a robotic digitising station based on a tape library system capable of handling mixed media. The operator loads a batch of tapes into the robot system, then defines and starts the batch digitising process. The system does the rest while the quality of the digitised video is monitored continuously: a separate module analyses the video quality of the digitised material using digital signal processing techniques. If the quality of the digitised material is not sufficient, a fail-over process starts and the event is logged. The operator can supervise the batch process, retrieve the logs and initiate further actions.
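The supervision loop might look like the following Python sketch. Everything here, including the quality-scoring stub, is a hypothetical stand-in for the robot and the signal-processing module described above:

    import random

    def digitise(tape):
        """Stub for the robot's transfer of one tape into the digital domain."""
        return {"tape": tape, "frames": 90000}

    def video_quality(clip):
        """Stub for the signal-processing module scoring quality in [0, 1]."""
        return random.uniform(0.6, 1.0)

    def batch_digitise(tapes, threshold=0.8):
        log = []
        for tape in tapes:
            clip = digitise(tape)
            score = video_quality(clip)
            if score < threshold:
                log.append((tape, "fail-over", round(score, 2)))  # flagged for the operator
            else:
                log.append((tape, "archived", round(score, 2)))
        return log

    for entry in batch_digitise(["tape-001", "tape-002", "tape-003"]):
        print(entry)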

Conclusion

The AMICITIA project is now at the stage of completing the workflow analysis and requirements engineering phase. Coarse system designs have been drafted and the first graphical user interfaces have been discussed. The system architecture will now be refined to prepare the implementation of the components.

The first working prototype modules are expected by the end of this year. These modules will then be integrated into the content management system and extensively tested under real-world conditions at our broadcasting partners.

References

  1. AMICITIA Project’s Web site
    URL: <http://www.amicitia-project.de/>
  2. Tecmath AG Company’s Web site
    URL: <http://www.tecmath.com/>
  3. Joanneum Research Web site - Institute for Information Systems and Information Management
    URL: <http://www.joanneum.ac.at/ima/>
  4. BBC’s Web site
    URL: <http://www.bbc.co.uk/>
  5. Austrian Broadcasting Corporation (ORF) Web site
    URL: <http://www.orf.at/>
  6. South-West Broadcasting Corporation (SWR) Web site
    URL: <http://www.swr.de/>
  7. Distributed Multimedia Archives for Cooperative TV Production - EUROMEDIA Project’s Web site
    URL: <http://www.foyer.de/euromedia/>
  8. PRIMAVERA Project’s Web site (under construction, soon to be released)
    URL: <http://www.primavera-ist.de/>
  9. Preservation Technology for European Broadcast Archives (PRESTO) Project synopsis
    URL: <http://www.cordis.lu/ist/ka3/digicult/en/projects.html>
  10. Forum for Metadata Schema implementers (SCHEMAS) Project’s Web site
    URL: <http://www.schemas-forum.org/>
    See also: Application Profiles, or how to Mix and Match Metadata Schemas, Makx Dekkers, Cultivate Interactive, issue 3, 29 January 2001
    URL: <http://www.cultivate-int.org/issue3/schemas/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Stephan Schneider
Project Manager
Tecmath AG
Content Management Systems Division
Sauerwiesen 2
67 659 Kaiserslautern
GERMANY

stephan.schneider@cms.tecmath.com
<http://www.tecmath.de/>

Phone: +49 6301 606 200
Fax: +49 6301 606 209

Stephan Schneider is employed as Project Manager at the Research Department of Tecmath AG. He is responsible for the IST projects AMICITIA (IST-1999-20215) and PRIMAVERA (IST-1999-20408).

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Schneider, S. "AMICITIA – New Solutions for Today’s Challenges in Digital Audiovisual Archives", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/amicitia/>

-------------------------------------------------------------

Autonomous Acquisition of Virtual Reality Models from Real World Scenes

By Michal Haindl and Josef Kittler - May 2001

Michal Haindl and Josef Kittler provide an overview of the joint research INCO-COPERNICUS project no. 960174 VIRTUOUS (Autonomous Acquisition of Virtual Reality Models from Real World Scenes). The article describes the project objectives, introduces the partners and summarises its main achievements.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Introduction

VIRTUOUS [1] (Autonomous Acquisition of Virtual Reality Models from Real World Scenes) was an international research project financed (1997-1999) by the Commission of the European Communities in the frame of the INCO-COPERNICUS scheme.

Virtual reality systems can be used for a variety of applications in entertainment, medicine and manufacturing. Thus producing detailed models is of generic interest. Unfortunately the customary manual creation of virtual reality models of real world scenes is tedious and error-prone, particularly for scenes of high complexity. Any automation that can substantially reduce the laboriousness and consequently the cost of the whole process would be very beneficial. In this context visual sensors offer the ideal route to automation, especially when range and vision sensors are already common and their mutual registration can be accomplished using either standard photogrammetric techniques or an appropriate sensor setup.

Project Objectives

The objectives of this 3-year project were to build detailed texture mapped surface models of complex real world objects, to develop efficient ways of processing colour textures and to use these models in a robot arm trainer and simulator. The core of the project was to capture virtual reality models of real world robot cell scenes automatically, without interaction with a human observer and then to validate these models in a Virtual Reality Robot Arm Trainer application. To get a lifelike simulation of the manufacturing process, it was necessary to capture 3D graphic information about all objects located in the robot workcell and make it available to the trainer in a suitable form. The aim was to automate this process as much as possible and to avoid any errors. The key approach is to combine range and visual sensor data to build object and scene models. The models are processed by a scene properties extractor and used by the trainer.

The Partners

VIRTUOUS was a joint research project between the University of Surrey, Guildford, United Kingdom, Instituto Superior Tecnico, Lisboa, Portugal, Institute of Information Theory and Automation, Prague, Czech Republic, and the Institute of Control Theory and Robotics, Bratislava, Slovakia.

The University of Surrey (UoS) [2] was the coordinator of the whole project. Apart from the project management, the work at UoS was primarily aimed at building detailed surface models of complex real world objects from range images.

The Instituto Superior Técnico (IST) [3] used computer vision techniques to acquire scene models from video sequences taken from a mobile platform. Although the objective was the same as that pursued by the University of Surrey, this was a more challenging task. The advantage of scene reconstruction from video sequences is the low cost of the sensor; however, the software processing is considerably more complex.

The objectives of the Institute of Information Theory and Automation (UTIA) [4] in the VIRTUOUS project were to segment colour and range images of a single scene, to develop algorithms for the analysis of real textures found in this scene and to resynthesize these textures in an efficient way using appropriate mathematical models. Synthetic textures were finally fused with shape data and mapped to the faces of the corresponding virtual objects.

The Institute of Control Theory and Robotics (ICTR) [5] addressed the main project application - development of a Virtual Reality Robot Arm Trainer, which also provided a mechanism to validate the scene models.

Results

A tool for registering partial surface fragments prior to fusion into a single model was developed [UoS]. Several algorithms [6] were developed for improving the quality of registered and fused 3D data, based on surface refitting, surface decimation and data recalibration. Further improvements were achieved using newly developed methods for n-view registration [8] and joint centre extraction [7]. This approach significantly decreased the error accumulated by the traditional pairwise registration alternative. Because real world objects may have moving parts, work was also done on the extraction of joint centres; a novel technique based solely on marker measurements was developed [7].

A technique for building 3D models from a sequence of uncalibrated images was developed [IST]. A correspondence analysis method [9] has been developed based on robust matching criteria; the method has a breakdown point of 50% outliers. It allows both the integration of successive images into a mosaic and 3D reconstruction, which is accomplished either with a novel maximum likelihood estimation algorithm [10], [11] for jointly recovering the structure, camera motion and camera intrinsic parameters, or with an approximate method which is much faster. As the approximate reconstruction method is sensitive to missing data, an algorithm has been devised for segmenting the input data into subsets in which a set of features is visible in all images. The reconstruction results obtained for the different image subsets are then fused to obtain a single model. Another method, described in [12], computes a dense disparity or velocity field between two images captured from different viewpoints.

Three novel range image segmentation algorithms [13], [14], [15] and two algorithms [16], [17] for colour texture segmentation were published [UTIA]. One of the range image segmentation algorithms [13], [14] is based on a combination of recursive adaptive regression model prediction, for detecting step discontinuities in the range image, and region growing on surface lines. The algorithm [14] assumes scene objects with planar surfaces, but its segmentation quality is higher on noisy range data while keeping the numerical efficiency of the simpler method [13] published in 1997. This algorithm outperforms most of the existing range image segmentation algorithms of its category.

Figure 1: Range image and its segmentation.

Colour texture segmentation methods are based on underlying Markov random field models. One of them uses a novel recursive maximum pseudo-likelihood Gaussian Markov random field parameter estimation method [17]. Due to this new estimator the method is significantly faster than a similar method recently published in IEEE PAMI.

Figure 2: Natural texture mosaic (marble, sand, grass, stone) and its segmentation.

Several multiscale colour Markov random field based texture models [18] were derived in the project. The main advantage of these models is that texture data can be synthesized using fast, non-iterative computations, while the models remain flexible enough to represent a large set of natural colour textures. The models assume spectral factorisation of the original colour texture data space into an orthogonal Karhunen-Loeve space, where each spectral component can be independently modelled by its own dedicated 2D (mono-spectral) multi-scale MRF. Multiple resolution decomposition is based on the Laplacian pyramid technique. The resulting band-pass mono-spectral factors can be efficiently modelled with lower order MRF models.
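As a small illustration of the spectral factorisation step only (the pyramid and MRF stages are omitted), the following numpy sketch projects RGB texture pixels into their Karhunen-Loeve basis and back. It is a generic PCA-style decorrelation written for this article, not the project's code:

    import numpy as np

    def kl_factorise(texture):
        """texture: H x W x 3 float array. Returns decorrelated mono-spectral
        components plus the basis and mean needed to invert the transform
        once each component has been synthesised by its own MRF model."""
        pixels = texture.reshape(-1, 3).astype(float)
        mean = pixels.mean(axis=0)
        _, basis = np.linalg.eigh(np.cov((pixels - mean).T))  # orthogonal KL basis
        components = (pixels - mean) @ basis
        return components.reshape(texture.shape), basis, mean

    def kl_invert(components, basis, mean):
        pixels = components.reshape(-1, 3) @ basis.T + mean
        return pixels.reshape(components.shape)

    rgb = np.random.rand(64, 64, 3)            # stand-in for a colour texture
    comps, basis, mean = kl_factorise(rgb)
    assert np.allclose(kl_invert(comps, basis, mean), rgb)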

Figure 3: Natural textures (upper row) and their synthetic counterparts.

Finally, the trainer [19], which consists of a PC running real-time robot control software, was connected to a workstation used as a scene viewer. Virtual reality models acquired using the algorithms mentioned above are displayed by a dynamic viewer providing high quality real-time visualisation of the robotics scene. The more advanced the robot workcell or other environment displayed on the workstation monitor, the more realistic the impression experienced by the robot user.

Conclusion

The VIRTUOUS project was concerned with the development of technology for building detailed texture mapped surface models. During the project we developed an advanced methodology for 3D surface registration, a method for 3D object model acquisition from video sequences, several techniques for colour texture modelling and synthesis, a feedback control strategy for registering 3D surface and texture models and, finally, a robot trainer.

Figure 4: Original colour scene, range image, and its virtual model in the original and upside-down rotated view directions.

The project research resulted in more than 20 publications, apart from project research reports. These achievements have been accomplished with EU project funds but also with a significant contribution of funding from complementary sources made available at each partner's home institution.

References

  1. VIRTUOUS Web server
    URL: <http://www.ee.surrey.ac.uk/EE/VSSP/3DVision/virtuous/virtuous.html>
  2. University of Surrey Web site
    URL: <http://www.ee.surrey.ac.uk/Research/VSSP/index.html>
  3. The Instituto Superior Técnico
    URL: <http://www.isr.ist.utl.pt/>
  4. The Institute of Information Theory and Automation
    URL: <http://www.utia.cas.cz/>
  5. The Institute of Control Theory and Robotics
    URL: <http://savba.savba.sk/sav/inst/utrr/intro.html>
  6. Cunnington, S. J. and Stoddart, A. J. (1998) Self-calibrating surface reconstruction for the ModelMaker: British Machine Vision Conference, Vol 2, Southampton, UK, 1998, 790-799.
  7. Stoddart, A. J. Mrazek, P. Ewins, D. and Hynd, D. (1999) A Computational Method for Hip Joint Centre Location from Optical Markers: British Machine Vision Conference, Vol 2, Nottingham, UK, 1999, 624-632.
  8. Cunnington, S. J. and Stoddart, A. J. (1999) N-View Point Set Registration: A Comparison: British Machine Vision Conference, Vol 1, Nottingham, UK, 1999, 234-244.
  9. Gracias, N. and Santos-Victor, J. (1997) Robust estimation of the fundamental matrix and stereo correspondences: In 5th International Symposium on Intelligent Robotic Systems Stockholm, Sweden, July 1997.
  10. Grossmann, E. Santos-Victor J. (2000) Uncertainty Analysis of 3D Reconstruction from Uncalibrated Views. Image and Vision Computing, 2000.
  11. Grossmann, E. and Santos-Victor, J. (1998) The Precision of 3D Reconstruction from Uncalibrated Views: British Machine Vision Conference, Vol 1, Southampton, UK, 1998, 115-125.
  12. Grossmann, E. and Santos-Victor, J. (1997) Performance evaluation of optical flow estimators: Assessment of a new Affine flow method. Journal of Robotics and Autonomous Systems, vol. 21, no. 1, 1997.
  13. Haindl, M. and Zid, P. (1997) Fast Segmentation of Range Images. In: Image Analysis and Processing. Alberto Del Bimbo Ed., Lecture Notes in Computer Science 1310, ISBN: 3-540-63507-6, Springer-Verlag, Berlin, 1997, 295 - 302.
  14. Haindl, M. and Zid, P. (1998) Fast Segmentation of Planar Surfaces in Range Images: Proceedings of the 12th IAPR Int. Conf. on Pattern Recognition, Brisbane 1998, eds. Anil K. Jain, Sveth Venkatesh, Brian C. Lovell, ISBN: 0-8186-8512-3, vol. II, IEEE Press, 1998, 985 - 987.
  15. Haindl, M. and Zid, P. (1998) Range Image Segmentation by Curve Grouping: Proceedings 7th Int. Workshop RAAD'98, ed. K. Dobrovodsky, Bratislava: ASCO Art & Science, ISBN: 80-967962-7-5, 1998, 339 - 344.
  16. Haindl, M. (1998) Unsupervised Texture Segmentation, In: Advances in Pattern Recognition. Adnan Amin, Dov Dori, Pavel Pudil, Herbert Freeman Eds., Lecture Notes in Computer Science 1451, ISBN: 3-540-64858-5, Springer-Verlag, Berlin, 1998, 1021 - 1028.
  17. Haindl, M. (1999) Texture Segmentation Using Recursive Markov Random Field Parameter Estimation: Scandinavian Conference Image Analysis, Vol 2, 1999, 771 - 776.
  18. Haindl, M. and Havlicek, V. (1998) Multiresolution Colour Texture Synthesis: Proceedings 7th Int. Workshop RAAD'98, ed. K. Dobrovodsky, Bratislava: ASCO Art & Science, ISBN: 80-967962-7-5, 1998, 297 - 302.
  19. Kittler, J. Stoddart, A. J. Santos-Victor, J. Costeira, J.P. Haindl, M. Dobrovodsky, K. Andris, P. and Kurdel,P. (1997)
    VIRTUOUS: Autonomous Acquisition of Virtual Reality Models from Real World Scenes: 6th Int. Workshop on Robotics in Alpe-Adria-Danube Region. M. Ceccarelli Ed., Studio 22 Edizioni, ISBN: 88-87054-00-2, Cassino, Italy 1997, 487 - 492.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Dr. Michal Haindl
Senior Researcher
Institute of Information Theory and Automation
Academy of Sciences of the Czech Republic
18208 Prague
Czech Republic

haindl@utia.cas.cz
<http://www.utia.cas.cz/>

Phone: +420 2 66052350

Dr. Michal Haindl is employed as a Senior Researcher at UTIA (Institute of Information Theory and Automation, Prague). From 1990 to 1992, he was a visiting researcher at the University of Newcastle, Newcastle; the Rutherford Appleton Laboratory, Didcot; the Centre for Mathematics and Computer Science, Amsterdam; and the Institut National de Recherche en Informatique et en Automatique, Rocquencourt, working on several image analysis and pattern recognition projects. From 1992 to 1995, he joined the Centre for Mathematics and Computer Science, Amsterdam to work on a multimedia ESPRIT project. His present research interests concern random field applications in pattern recognition and image processing. He holds Ph.D. and Doctor of Science degrees and is the author of about 140 papers published in books, journals and conference proceedings.

Professor Josef Kittler
Director of the Centre for Vision, Speech, and Signal Processing
University of Surrey
Guildford
GU2 7XH
United Kingdom

j.kittler@surrey.ac.uk
<http://www.ee.surrey.ac.uk/Research/VSSP/index.html>

Phone: +44 1483 879294

Professor Josef Kittler (Ph.D., ScD) is the director of the Centre for Vision, Speech and Signal Processing of the University of Surrey. He has been a Research Assistant in the Engineering Department of Cambridge University (1973-75), SERC Research Fellow at the University of Southampton (1975-77), Royal Society European Research Fellow, Ecole Nationale Superieure des Telecommunications, Paris (1977-78), IBM Research Fellow, Balliol College, Oxford (1978-80), Principal Research Associate, SERC Rutherford Appleton Laboratory (1980-84) and Principal Scientific Officer, SERC Rutherford Appleton Laboratory (1985). His current research interests include Pattern Recognition, Neural Networks, Image Processing and Computer Vision. He has co-authored a book entitled 'Pattern Recognition: A Statistical Approach', published by Prentice-Hall, and has published more than 200 papers. He is a member of the Editorial Boards of IEEE Transactions on Pattern Analysis and Machine Intelligence, Pattern Recognition Journal, Image and Vision Computing, Pattern Recognition Letters, and Pattern Recognition and Artificial Intelligence. He has served as the President of the International Association for Pattern Recognition (IAPR).

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Haindl, M. and Kittler, J. "Autonomous Acquisition of Virtual Reality Models from Real World Scenes", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/virtuous/>

-------------------------------------------------------------

Other Areas

The Sign on the Door: Establishing a Top-level Museum Domain on the Internet

By Cary Karp - May 2001

In November 2000 the Museum Domain Management Association (MuseDoma) announced the approval of its proposal to establish dot-museum as a restricted top-level domain name on the Internet. The approval was made by the board of directors of the Internet Corporation for Assigned Names and Numbers (ICANN), the nonprofit organisation that provides oversight for domain names.

In this article Cary Karp, Director of Internet Strategy and Technology at the Swedish Museum of Natural History, and the President of the Museum Domain Management Association, explains why we need this new top-level domain, details its evolution and outlines the implications for Europe.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

The massive brick-and-mortar edifices that are normally associated with the word "museum" are an easily recognised attribute of the urban landscape. Any such institution housed in quarters that do not correspond to the stereotype can easily reassure the public about its identity with a label near the doorway containing the familiar "museum" string of letters (or, of course, any number of equivalents to it using other languages and character sets). Museums have long since established annexes in the utterly intangible world of the Internet and, although such things as Web sites have an even greater need for clear labels, there is no digital way to convey the authority of the name of a museum that is graven in stone on its grand façade. Translated into technojargon, if on the Internet nobody knows you're a dog, on the Internet nobody knows you are a museum, either.

There is probably little reason to worry about the consequences of an inability to distinguish between, say, Web sites operated by pretend dogs and sites operated by real ones (bona fido canines). There may be greater need to take a less casual approach to material provided by organisations claiming to be museums. The ability to include the letter string "museum" in an Internet domain name can be purchased by anyone having about USD 10 per year to spend on it. There is no requirement whatsoever that the activity subsequently conducted in such a domain bear the slightest relationship to anything that the professional museum community might regard as a legitimate museum purpose. Indeed, any one of the well over ten thousand MUSEUMSOMETHING dot-COMs, dot-ORGs and dot-NETs can as easily be used deliberately to conceal antisocial activity as it can to designate what indeed bears the best attributes of museumness.

The Domain Name System (DNS) was never intended to provide more than a convenient means for equating the names that people commonly use to identify the various computers that are connected to the Internet, with the numerical addresses that these computers use to identify each other. There was initially a clear semantic basis for differentiating among what have latterly come to be termed the generic top-level domains (gTLDs) dot-com, dot-org, and dot-edu (with dot-net coming later). This was expressed in rules that have never been more than loosely applied when evaluating requests for registration (with the erratic exception of dot-edu). The traditional response to the concerns expressed in the preceding paragraph would be to dismiss them with reference to their being based on an ascription of significance to domain names that they were never meant to have. A lot has happened, however, since the early days of the DNS.
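This core name-to-number mapping is easy to see at first hand. The short Python sketch below performs a single lookup; the host name is a placeholder rather than a registered dot-museum name:

  import socket

  # The DNS maps a human-readable host name to the numerical address
  # that machines use to reach each other; it says nothing about what
  # the name means. The host name below is a placeholder.
  hostname = "www.example.museum"

  try:
      address = socket.gethostbyname(hostname)
      print(hostname, "resolves to", address)
  except socket.gaierror:
      print(hostname, "is not registered in the DNS")

Nothing in the lookup conveys what the name stands for - which is precisely the gap that a restricted top-level domain is intended to fill.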

The removal of all restrictions on commercial participation in the Internet resulted in a staggering inflation in the value of domain names, most particularly in dot-com. This gave rise to the identically named and ever so peculiar dotcom economic phenomenon. Domain names were no longer being used as loosely derived IDs for computers; they were being used to "brand" both products and corporate Web sites. Attractive domain names acquired monetary value of lunatic proportions and domain name disputes generated incessant legal action (lucrative, in turn, to specialised legal professionals). The basis for asserting that domain names were devoid of significant meaning eroded utterly despite persistent hard-line assertions to the contrary. At the same time, all pretence at enforcing the original meaning of the three-letter gTLDs was abandoned toward the unmasked end of generating as much revenue as possible from their operation.

These difficulties were seen looming on the horizon fully five years ago by the late Jon Postel, one of the principal architects of the DNS, who proposed their mitigation through the establishment of a large number of new gTLDs, each intended to serve a clear purpose that could be recognised from the domain's name. This marked the start of an extraordinarily contentious and protracted discussion about the basis for what was termed Internet governance, with clear focus on modes for anchoring this on an international platform rather than leaving it in the control of its initial sole guardian, the United States Government.

This process is far from over but it has passed at least two milestones. The first was the creation in 1998 of the Internet Corporation for Assigned Names and Numbers (ICANN) [1] and the second was a decision made by ICANN in November 2000 to introduce seven new gTLDs into the DNS. Although the negotiations necessary to formalise these domains are currently in progress, there appears to be little doubt that an expansion of the generic top level of the DNS is imminent.

One of the seven new gTLDs is dot-museum, intended to provide the community of Internet users with a means of recognising bona fide museums on the basis of their being registered in a gTLD specifically restricted to such use. The dot-museum charter will be on public record and anyone wishing to know the basis for entitlement to registration in the domain can easily find it.

Although this does nothing to provide the DNS with the ability to assist people who are trying to locate Net-based resources, it does allow for the recognition of the desired resources during the course of the search for them. Peering beyond the formal and narrow constraints of the DNS, having a shared name space for the global museum community can provide significant impetus and support for the development of a directory service that could permit unprecedentedly comprehensive searches for museum information in Net-based repositories. Although domain names traditionally designate computers and named services, the dot-museum nomenclature can easily be extended to provide name space for individual objects in museum collections. This would allow for a name such as monalisa.collections.louvre.museum.
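To make the naming idea concrete, here is a small Python sketch of how a museum's object identifiers could be projected into such a name space. The labels are purely illustrative and imply nothing about actual registrations:

  # Hypothetical sketch: projecting object identifiers into the
  # dot-museum name space. DNS labels read from the most specific
  # to the most general.

  def object_name(object_label, collection, museum_domain):
      """Compose a DNS-style name for a catalogued object."""
      return ".".join([object_label, collection, museum_domain])

  print(object_name("monalisa", "collections", "louvre.museum"))
  # prints: monalisa.collections.louvre.museum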

One of the primary reasons for ICANN having selected dot-museum in the "first wave" of new TLDs is its suitability as the pioneer initiative in the envisioned creation of a larger number of TLDs, each dedicated to one sector of the cultural community. Looking ahead to a future in which other sectors operate such domains, each sector that houses catalogable objects could use an identically structured name space - dot-library, for example, containing magnacarta.manuscripts.british.library.

The DNS was never intended to be applied to the management of such information structures and it would be egregiously misused by any attempt at incorporating it in the implementation of what is being suggested here. That notwithstanding, the name constructs initially devised for the DNS can be extended in far-reaching regards. Museums have a fundamental mandate to describe and catalogue their holdings. The extreme utility of the various repositories of resulting information being interoperable has long been recognised, and means of implementing this have been the focus of much cost and effort. The availability of a single coherent name space into which every single museum object can be placed has the potential to bring the realisation of this goal immeasurably closer. The potential utility of extending this across the boundaries of adjacent cultural sectors should be apparent.

ICANN has entrusted the establishment and enforcement of dot-museum policy, as well as responsibility for the operation of its registry, to the Museum Domain Management Association (MuseDoma) [2]. Although it currently consists of no more than its founding members, the International Council of Museums (ICOM) [3] and the J. Paul Getty Trust [4], MuseDoma has been incorporated as an open membership organisation providing all interested parties with the ability to participate in the ongoing discussion of the refinement and development of domain policy. The core elements of this policy are a statement of the basis for entitlement to registration in the domain, and the principles used for the naming of subdomains. The primary normative instrument underlying the first of these concerns is the ICOM Definition of Museum as stated in that organisation's statutes [5]. The naming principles are being devised at the time of writing. Anyone who is interested may follow this activity as it unfolds via MuseDoma's Web site and its e-mail distribution lists [6]. Relevant developments on ICANN's side of the fence may be followed via their equivalent channels.

Although currently absorbed entirely by the legal, administrative and technical aspects of getting the new TLD up and running, MuseDoma looks forward to turning its attention as soon as possible to the development of value-added services for dot-museum registrants. Primary among these is participation in the development of directory services that will allow us to harness the potential residing in the broad name space that is at our disposal.

The Internet's architects have clearly indicated that they regard the DNS as inadequate for many of the requirements that users, lacking anything better, impose on it. The directory services to be devised for use with dot-museum will need to be coordinated with the central initiative, in turn calling broader attention to the needs and potential of this domain (unique among the New Seven in belonging to a sector with a centuries-long tradition of devising and managing systematic nomenclatural hierarchies).

Once it is moderately comfortably in business, MuseDoma looks forward to sharing its experience in the manifold aspects of the creation of a TLD with its siblings in the cultural community. One of the more daunting aspects of creating a TLD is the prosaic but vital need for a robust technical infrastructure. This includes the various database servers needed for the DNS, for the internal administration of the domain, and for the public availability of key bits of information about subdomain holders -- the so-called WHOIS data. These servers need absolutely reliable high-speed connections to the Internet and, to avoid "single points of failure", need to be maintained redundantly at separate and distant sites. Establishing this technical infrastructure involves enormous headache and expense. Fortunately, multiple domain registries can be operated on a shared platform with each newcomer necessitating an incremental cost that is a fraction of the initial investment.
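The WHOIS service itself, incidentally, is a modest protocol: a client sends the query string followed by a carriage return and line feed over TCP port 43 and reads back a plain-text reply. The Python sketch below implements that client side; the registry server name is a placeholder, since dot-museum's own arrangements were still being settled at the time of writing:

  import socket

  def whois_lookup(domain, server):
      """Minimal WHOIS client: send the query plus CRLF over TCP
      port 43 and read the registry's plain-text reply."""
      reply = []
      with socket.create_connection((server, 43), timeout=10) as sock:
          sock.sendall((domain + "\r\n").encode("ascii"))
          while True:
              data = sock.recv(4096)
              if not data:
                  break
              reply.append(data)
      return b"".join(reply).decode("utf-8", errors="replace")

  # The server name is a placeholder; each registry operates its own.
  # print(whois_lookup("example.museum", "whois.example.net"))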

It would be inappropriate at the time of writing to discuss the various means by which MuseDoma may elicit the support of operators of such pre-existing infrastructure. (The matter is subject to negotiations that are currently in progress.) What can be noted, especially given the nature of this publication, is that the options are all centered in Europe. In fact, the initial four years of work toward establishing a museum top-level domain - starting with the Postel proposal and ending with the creation of MuseDoma - were also centered in Europe. The legwork and lobbying were financed primarily by the Swedish Museum of Natural History (NRM) [7] in Stockholm, in which city ICOM's central Internet host is also located.

This activity is now being formalised by the establishment of the dot-museum network information centre at NRM. Every top-level domain has its so-called NIC [8], serving as the central point for the coordination of various aspects of the domain's daily operation. With due pride in the dot-museum NIC being created in the capital of the country currently holding the Presidency of the EU, it is being given an acronym that highlights its European basis - musEnic.

This European connection is probably not as coincidental as it first appears. Europe may well be alone in the world as an area that simultaneously houses rich repositories of cultural property, shares them across many language and cultural borders, and is an extremely sophisticated participant in the technological arena on both the consumer and industrial levels. Europe is thus ideally suited as a development and initial deployment arena for the cross-domain initiatives mentioned above. We hope that we will be able to lash musEnic firmly to European ground, and that we may then see the ripples radiate outward from Europe to the rest of the world as we undertake the exhilarating task of building a cultural sector on the Internet. The cradle of the Internet's technological development was the United States of America, which demonstrated its ability to do massive good work in the process. As the Net embarks on another grand phase of its development it would be entirely fitting for Europe to be at the helm.

References

  1. Internet Corporation for Assigned Names and Numbers (ICANN)
    URL: <http://www.icann.org/>
  2. Museum Domain Management Association (MuseDoma)
    URL: <http://www.musedoma.org/>
  3. International Council of Museums (ICOM)
    URL: <http://www.icom.org/>
  4. J. Paul Getty Trust
    URL: <http://www.getty.edu/>
  5. ICOM's statutes
    URL: <http://www.icom.org/statutes.html>
  6. MuseDoma's e-mail distribution lists
    URL: <http://listserv.musedoma.org/archives/musedoma-discuss.html>
  7. Swedish Museum of Natural History (NRM)
    URL: <http://www.nrm.se/>
  8. A Network Information Centre (NIC) registers domain names for its top-level domain.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Cary Karp
Director of Internet Strategy and Technology
Swedish Museum of Natural History

ck@nrm.se
<http://www.nrm.se/>

Phone: +46 8 5195 4055

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Karp, C. "The Sign on the Door: Establishing a Top-level Museum Domain on the Internet", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/museum/>

-------------------------------------------------------------

Eurospeak – Fighting the Disease

By Emma Wagner - May 2001

One of the key issues when working with the European Commission and in Europe in general is getting to grips with Eurospeak. Eurospeak can be confusing, complicated and sometimes elitist. It could also be avoided.

Emma Wagner discusses what she calls 'the disease of Eurospeak' and details guidelines for improvement, which the European Commission's Translation Service are trying to get across to authors inside the EC.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Eurospeak comes in all languages, believe it or not, and in all cases the symptoms and causes are the same. In this article I'll talk about the English variant, Euro-English.

Linguists love to be tolerant about the way languages grow. Just as we accept all sorts of regional accents, the argument goes, we must accept and even celebrate all variants in written language… Very politically correct, but it overlooks one simple fact: when speaking, you can see immediately if your listener doesn't understand, and rephrase your statement or adjust your accent on the spot; when writing, you don't get that instant feedback. So it is perfectly possible to churn out reams of incomprehensible writing that no-one will understand – or read!

Anyone trying to communicate in writing, and who wants their message to end up in their readers' brains rather than their bins, is well advised to follow a few rules and stay anchored in the reality of a real language.

One linguist of the tolerant school, David Crystal, writes in English as a Global Language: "There is even a suggestion that some of the territories [...] in which English is learned as a foreign language may be bending English to suit their purposes. 'Euro-English' is a label sometimes given these days to the kind of English being used by French, Greek and other diplomats in the corridors of power in the new European Union, for most of whom English is a foreign language" [1].

I work in those corridors of power – or in one of the offices at the European Commission, to be precise – and the prospect of Euro-English acquiring special status because it is spoken by a powerful elite fills me with dread. Surely that would spell the end of the European Union, because it would cut us off from the public, who have a right to read Commission documents in real English? In a bid to prevent the spread of Eurospeak, Euro-waffle and plain bad English in Commission documents some fellow-translators and I started the Fight the FOG [2] campaign in 1998. We wanted to encourage Commission writers and translators to write clearly, in real English (and real French, real German, real Finnish, etc.). We also instructed them to KISS - Keep It Short and Simple.

I spend much of my working time trying to eradicate Eurojargon and bad English from texts written in the European Commission. Here's a sample: a paragraph from the 35-page (!!) minutes of an important committee meeting. This came into my department last week, for translation into the 10 other official languages of the EU.

"Mr A welcomed the participants to the ZZZ meeting, in particular to the Malta delegation, that attended the meeting for the first time. He passed the floor to Mrs B who was going to intervene on behalf the French Presidency of the European Union. […]

Mr A informed about the present stage of the works on the Directive on scaffolding and works in height. He said that in October the Council had agreed a common position. In the other hand, the Parliament had presented comments to the project of Directive. A meeting between the Parliament's reporters and the Presidency of the Council had taken place for establishing a more official position in the agreement. There had been a second meeting between the Commission and the political groups of the Parliament for discussing the contents of some of the amendments. He said that the differences between the Parliament and the Council were small and that the Parliament wished scaffolds below the normal height to be included."

Why does this sort of Euro-English get written? Here are some of the causes of the disease:

1. Drafting by Non-native Speakers

Drafting by non-native speakers is unavoidable, for organisational reasons, and some of them do an excellent job. But it inevitably causes problems of interference in vocabulary and syntax. Non-native speakers can't be expected to know what sounds natural in English. Even native speakers lose this sensitivity when working outside their mother-tongue environment. When you've heard words like "eventual" and "payment delays" misused hundreds of times, you can lose touch with their real meaning.

2. Growth of English and Tolerance of Defective English

English has taken over from French as the main language used for communication inside the EU institutions. Of course, concessions have to be made for spoken communication in an organisation where fifteen different nationalities work together. But as the above example shows, the standard of "English" is often simply too low for written communication. It is certainly more defective than the French written here by non-natives. Why? Because Brussels is a partly French-speaking city? Because the French have stricter grammar and an Académie to outlaw barbaric imports, whereas English is a very flexible language that belongs to everyone and seems to know no rules? Or maybe (unfashionable view coming up here - sorry, Professor Crystal) because English grammar has not been taught in British schools for the past 40 years, so most native English speakers can't even explain to their non-native colleagues why paragraphs like the one quoted above are not real English? Only those of us who learnt foreign languages were lucky enough to acquire any grammar.

3. Fear of Brevity

Many authors in the EU institutions come from a tradition or a culture where concision is not a virtue. Recently the French arm of a highly respected firm of management consultants did a study for us on one aspect of the Translation Service's operation. Their report ran to 186 pages and paralysed our e-mail system. When I asked them to produce a summary, they did - 50 pages!

4. Eurojargon

Specialised language, or jargon as it is less politely called, aids communication between specialists. But if it spills over into the wrong context, it is irritating and sounds ridiculous. Acronyms such as CFSP, SANCO, SLIC and PECO are all pregnant with meaning for those who understand them, but alienating for those who don't [3]. We encourage authors to spell them out when first used, or to avoid them completely. Another nasty habit of Eurocrats is to use the names of towns to mean something quite different. "Schengen" is no longer a sleepy village in Luxembourg, but an agreement on a passport-free zone; "Amsterdam" is a Treaty, and "Gymnich" is an informal meeting of foreign ministers.
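The "spell out on first use" rule is, incidentally, simple enough to automate. As an illustration only - the one-entry glossary below is a stub, and no suggestion is made that the Commission's tools work this way - a few lines of Python can expand each known acronym at its first occurrence:

  import re

  # A toy glossary; a real one would be maintained by the editors.
  GLOSSARY = {"CFSP": "Common Foreign and Security Policy (CFSP)"}

  def expand_first_use(text, glossary):
      """Spell out each known acronym at its first whole-word occurrence."""
      for acronym, expansion in glossary.items():
          text = re.sub(r"\b" + re.escape(acronym) + r"\b",
                        expansion, text, count=1)
      return text

  print(expand_first_use("The CFSP budget grew; CFSP priorities did not.",
                         GLOSSARY))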

5. "Consensus Building"

In the desire to secure agreement at any cost, documents are sometimes inflated - and their logic distorted - by the inclusion of disparate material. The motives are excellent, but the result is a kind of patchwork, which is not. Foggy language helps to achieve an appearance of political consensus. But it invariably creates problems for the future, when foggy Treaties and laws have to be put into effect.

Some Antidotes to Eurospeak

The European Commission has recently started work on several solutions.

Perhaps, in addition, there should be a major cutback in the number and length of publications, based on reader surveys to see which ones are really useful and which could be dispensed with. We could also use the power of the Internet to improve the quality of written communication from the Commission. For example, texts on the Europa server [4] could incorporate:

  1. hyperlinks to definitions of key terms and acronyms
  2. short, clear citizen's summaries
  3. an invitation to submit feedback on substance and style
  4. testing of draft publications on focus groups, before they are finalised.

The Cure for Eurospeak

There is a simple cure for this disease called Eurospeak. Let people speak it, by all means, in the interests of cooperation and in-house communication with each other. But encourage them not to write it, if they want outsiders to get the message.

The Fight the FOG campaigners are trying to highlight these key principles of good writing:

Audience awareness. Remember that the defective language we use when tired and rushed is not good enough for the outside world. We must try to prevent jargon spilling over into general writing.

Honesty. Resist the tendency to be pompous, as if status and dignity could be increased by using long words and convoluted syntax.

Responsibility. Beware of "patchwork drafting". Someone must retain overall responsibility for the structure and logic of a document. This is also called accountability.

Planning ahead. Allow enough time for drafting and translation.

Expert editing. Allow experts to rewrite documents before they are translated into 10 and soon 22 languages. Experts can be outside consultants or editors - or translators can do the rewriting. Don't say "they don't know enough about our field to understand our documents". If intelligent, interested readers don't understand, that proves that the documents need to be rewritten.

KISS: Keep It Short and Simple.

References

  1. Crystal, D. (1997) English as a Global Language, Cambridge University Press, page 136.
  2. Fight the FOG campaign
    URL: <http://europa.eu.int/comm/translation/en/ftfog/>
  3. Napier, M. "Book Review: EUROJARGON", Cultivate Interactive, issue 3, 29 January 2001
    URL: <http://www.cultivate-int.org/issue3/review/>
  4. Europa server
    URL: <http://europa.eu.int/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Emma Wagner
Head of Department
Translation Service
European Commission

Emma.Wagner@cec.eu.int

Emma Wagner studied Modern Languages at Cambridge and received her MA in Translation and Interpreting from Bath University. She has worked for the European Commission since 1972 as a translator and translation manager. She is currently head of a translation department with 250 staff translating into and out of the 11 official languages of the European Union. In 1998 she started the Fight the FOG campaign at the European Commission because "foggy language is alienating for the general public and difficult to translate well".

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Wagner, E. "Eurospeak – Fighting the Disease", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/eurospeak/>

-------------------------------------------------------------

Bridging the Digital Divide in Asia: a European Commission Initiative

By Mike Robbins - May 2001

Mike Robbins gives a further introduction to Asia IT&C, a programme first mentioned in the news and events section of issue 2 of Cultivate Interactive. Asia IT&C is a five-year European Commission programme which co-finances projects in the Information Technology and Communications (IT&C) sectors. The projects must be joint activities between non-profit-making partners in at least two EU member states and at least one of the participating Asian countries/territories, which are: Afghanistan, Bangladesh, Bhutan, Brunei, Cambodia, East Timor, India, Indonesia, Laos, Malaysia, Maldives, Nepal, Pakistan, Philippines, Sri Lanka, Thailand and Vietnam. The lead partner can be either Asian or European. Co-financing of up to 80% and €400,000 is available, depending on the programme component.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Asia IT&C began in late 1999. Its purpose is to address the 'digital divide' not only between rich and poor nations, but also between knowledge-rich and knowledge-poor members of the same society. It does this by co-financing co-operation between Asia and Europe. This is the core philosophy underpinning the programme; its intention is to strengthen links between the IT&C sectors of both continents, rather than just handing out grants. Like all EU activities, the programme is intended to involve more than one EU member country, which is why there must be partners from at least two different countries. If just one EU country were involved, bilateral assistance from that country's Government would be more appropriate.

The partners must be non-profit-making; typically they are government departments, colleges, universities or NGOs (non-governmental organisations). However, this does not mean that the programme is irrelevant to the private sector: consideration is also given to projects that intend to strengthen the IT infrastructure for business. The programme is also happy to consider proposals from Chambers of Commerce and industrial or commercial associations and federations.

There are currently two Project Management Offices (PMOs); one in Belgium, and the other in Thailand. The Bangkok office is hosted by the Kingdom of Thailand through its National Electronic & Computer Technology Centre (NECTEC), a component of the Ministry of Science, Technology and the Environment (MOSTE).

The Digital Divide

A survey of delegates to the World Economic Forum (WEF) in Davos, Switzerland in January 2000 found that 50% believed information technology would widen, not narrow, the gap between rich and poor. A recent BBC report claimed that more than 80% of the world’s people have never used a telephone [1]; however, it quoted no source for this. The UNDP Human Development Report (HDR) puts the figure at about half the population, and indeed UNDP has recently set up mechanisms to try and reduce the digital divide [2].

According to the HDR, 26.3% of the United States population use the Internet, but just 0.8% of the population of Latin America and the Caribbean, and only 0.5% in South-East Asia, with even lower percentages in the Arab world, Eastern Europe and Sub-Saharan Africa.

The divide is within, as well as between, countries, and not simply in the developing world. The BBC quotes the OECD as saying that in France, the highest income bracket had 74% PC penetration in 2000, against just 11% for the lowest income bracket [3]. However, the internal gap is more dramatic in developing countries. The number of computers in India has now reached 4.3 million [4], but this is a tiny fraction of the country's one billion people, even though the country is a leader in information technology. Statistics must be interpreted with care; in 1998 there were six times as many Indian-registered Hotmail accounts as there were Internet subscribers [5].

Poverty can make a terrible mockery of the information revolution. The point was made forcibly in May 1998 by the distinguished Indian scientist Dr M.S. Swaminathan, whose role in introducing dwarf wheat varieties to India in the 1960s was a key part of the Green Revolution. Speaking to scientists in the Middle East, he pointed out that birthweights as low as 2.4 kg were common in South Asia, causing what the UN had called "the cruellest form of inequity" - retarded intellectual development which would prevent people in the developing world from coming to grips with the new Information Age. How are such people supposed to compete in a world of the Internet and galloping information technology [6]?

But information technology can be used to benefit the poor, given sufficient imagination. A recent EC report quoted the case of "one poor village in southern India… where two solar-powered computers were installed in a room at the side of the village temple, giving access to a wealth of data. Sometimes, computers are obtained by an NGO through a donor agency; in other cases, they are bought by the village and franchised to an operator who charges a modest fee for use. Examples of results cited include finding a local veterinarian to cure a sick cow, to downloading a local map from the US Navy website, showing wave heights and wind directions in the nearby Bay of Bengal. This information was communicated to the local fishing village, which broadcast the daily weather report from loudspeakers fixed to poles along the beach" [7].

Strengthening Co-operation

In 1996, at the first Asia-Europe Meeting (ASEM), held in Bangkok, it was decided that strengthening links between the IT&C sectors in Asia and Europe was an important part of strengthening economic links between the two regions in general. Three years later the European Commission launched the Asia IT&C programme, giving it an initial budget of €19 million to fund about 100 projects; so far (April 2001) it has selected 16, with further rounds of selection to follow.

Applications must be from non-profit-making concerns. This is liberally interpreted; a limited company is acceptable, for example, provided it is clearly not intended to make a profit and this is clearly stated in its statutes. State-owned corporations in which small shares are held by the private sector may also be eligible, provided there are no payments of dividends to shareholders. And although profit-making concerns are not eligible, associations of such companies – such as Chambers of Commerce, or trade or industry federations – certainly are. Asia IT&C regards them as useful partners in the development of the IT&C infrastructure for SMEs. A business cannot be funded, however; venture capital must be obtained elsewhere!

In line with the spirit of the programme, applicants must be part of a consortium of partners from at least two EU member states and one from Asia. This is a minimum. The programme wishes to encourage the broadest level of co-operation possible, building links between European and Asian countries as well as between the continents themselves, so the more partners, the better, up to a point. There is no upper limit on the number of partners, but they should all participate actively in, and benefit from, the project.

The projects must genuinely be in information technology and/or communications development. The programme has received several proposals which were really general development projects, with IT&C 'bolted on' as a component. Such proposals should seek funding from more suitable sources.

The programme funds co-operative projects that apply IT&C to a defined list of areas of activity.

Applicants have to specify the Area of Activity.

They also have to specify a Programme Component. There are six of these, and they are described below – together with the percentage of co-financing, and the grant amounts, allowable. These vary between the components, so it’s important for applicants to be clear about the component under which they are seeking funding.

Get-In-Touch and Keep-In-Touch Activities

These join organisations together so that each knows what the other is doing. The methodology can include conferences, seminars and other ways of swapping information. A partnership will often (but not necessarily) start with this component, and can then apply for further funding under a second programme component if it has been a success. Funding available: maximum 50%, between €100,000 and €200,000.

Short (University Level) Courses

Courses and workshops in either a business or university environment. Funding available: maximum 50%, between €100,000 and €200,000.

Information Society Interconnectivity

Proposals to improve connectivity between Europe and Asia, either globally or in a specific business or professional context. Funding available: maximum 50%, between €100,000 and €200,000.

Liaise with European IT&C Initiatives and Programmes

This is intended to fund liaison, through workshops and taskforces for instance, between Asian bodies and IT&C initiatives taking place within the European Union. It is chiefly meant to help Asian partners link with, or participate fully in, European Commission initiatives such as those under the Community Research and Technological Development (RTD) framework. However, programmes or initiatives not from the European Commission are also eligible, provided they are non-profit-making and originate from at least two EU member states.

Understanding European and Asian Regulatory and Legislative Organisation Structures

This is intended both to help partners understand the way each other's regulatory structures have evolved, and their strengths and weaknesses, and also to help them understand the regulatory context in which they might enter each other's markets. This can be done through workshops, seminars and task forces.

Practical Demonstration Projects

These should consist of demonstrations of European IT&C technology, showing what it can achieve. Applicants should normally have completed a Get-In-Touch and Keep-In-Touch activity, or something similar, first. Maximum funding: 25%, between €100,000 and €200,000.
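As a rough illustration of how these ceilings interact, the eligible grant under a given component can be modelled as the co-financing share applied to the project budget, checked against the component's band. The figures below are invented, and the Python sketch reflects only the rules as summarised above, not an official calculator:

  def eligible_grant(budget_eur, max_share,
                     floor_eur=100_000, ceiling_eur=200_000):
      """Apply a component's maximum co-financing share, then check
      that the resulting grant falls inside the component's band.
      Returns the grant in euros, or None if the proposal would
      need rescaling."""
      grant = budget_eur * max_share
      if floor_eur <= grant <= ceiling_eur:
          return grant
      return None

  # A 300,000 euro project under a 50% component: a 150,000 euro grant.
  print(eligible_grant(300_000, 0.50))
  # The same budget under the 25% demonstration component yields 75,000,
  # below the 100,000 floor, so it is not fundable as proposed.
  print(eligible_grant(300_000, 0.25))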

There are specific conditions attached to each Programme Component; these are described in full in the Call for Proposals 2001 and Guidelines to Applicants 2001, which can be downloaded from the Asia IT&C Web site [8].

Projects may last up to 36 months in all cases except for Short (University Level) Courses, where the maximum duration is 12 months.

A Broad Range

Asia IT&C has already accepted proposals for a number of different types of project. The types of partners are just as diverse.

Examples include a partnership led by the University of Liège, which is co-operating with the National University of Hanoi in Vietnam, along with other partners from France, Sweden and Vietnam. This is a university-level course which will lead to a European Master in Modelisation and Design of Engineering Sciences, with the objective of strengthening Vietnam’s capacity to design flood-protection measures. It is a key question for Vietnam, where in November 1999 catastrophic flooding killed 592 people and caused $235 million worth of damage.

A very different type of project is being led by the Centre for the Development of Advanced Computing in Pune, India, in collaboration with French and Spanish partners. The objective is the development of text-to-speech conversion, Internet-compatibility and optical character recognition (OCR) software for Indian languages and scripts. This could hugely increase the applicability of information technology for non-English-speaking groups.

Different again is the development of a centre of intelligent manufacturing and rapid prototyping, which will assist in the application of IT to manufacturing industry. Led by the University of Cardiff, Wales, it involves partners from Thailand and Malaysia on the Asian side, and Germany and Greece in Europe.

As stated above, Asia IT&C cannot give grants to private companies, but it can certainly support business and industry federations in their efforts to improve the business infrastructure. Thus the programme is also co-financing a project of the Chambre de commerce et d'industrie de Paris, France, in collaboration with partners in India, Malaysia and Ireland, which aims to integrate e-commerce into the worldwide network of Chambers of Commerce.

All the projects are based around activities, not equipment or capital investment, which Asia IT&C is not really meant to fund. It is quite happy to include the cost of a few PCs if they are needed for the project, but this should not be a big part of the budget.

Applying for Co-financing from Asia IT&C

Before any steps are taken towards applying for co-financing, it’s very strongly recommended that the Call for Proposals 2001 and Guidelines to Applicants 2001 be read with great care.

Then, find your partner! Many Applicants already have a clear idea of whom they would like to work with. But those who don’t, and are seeking partners, can upload their details to Asia IT&C’s Partner Search database via the Programme’s Web site [8]. Having done so, they will appear on the database and will receive a username and password which they can use to modify their entry. Whether they register or not, they are welcome to trawl the database for suitable partners in both Asia and Europe.

The application procedure can then begin. It looks more complicated than it is. The grant application procedure is based on the European Commission’s standard format. This in turn is related to the tendering procedure, so conditions are stringent. But there is a reason for everything. Partners will, for example, be asked for the statutes of their organisation; but these are needed to prove that it is non-profit-making. Similarly, they will be asked for their latest set of accounts; these are necessary because the European Commission would not wish to transfer funds to an organisation that is heavily in debt or even bankrupt. There are numerous other documentation requirements, but all have a rationale, and if a proposal does not meet them, the programme’s management is not allowed to review it further [9]. The Commission does try to be reasonable and, in particular, fair in its procedures. Where the requirements are unclear, potential Applicants are very welcome to contact the Programme Management Offices in Brussels or Bangkok for advice.

In practice, most proposals arrive with full documentation, but they can still be rejected at this stage, simply because they are not eligible. Asia IT&C has received beautifully-prepared proposals, correct in every administrative detail, for projects that it could never have considered - either because the partner(s) were not eligible, or because there were not enough of them; or because the proposal wasn’t really an IT&C project at all.

Assessing a Proposal – the Keys to Success?

If the proposal meets the administrative requirements but still fails, it is usually for one of a small number of common reasons.

The Programme’s staff prefer accepting proposals to rejecting them. They are also aware that the application procedures can be complicated. Potential Applicants are extremely welcome to get in touch with the Programme Management Office (PMO) in either Brussels or Bangkok, to find out whether their project is suitable and to get informal advice and encouragement. Visitors are also welcome in person, and should contact the PMO in advance for an appointment. Bridging the digital divide is a difficult task, and Asia IT&C is always happy to meet new partners with which to share it.

Full information and documentation on the Asia IT&C programme may be found on its Web site [8].

References

  1. BBC, January 23 2001
    URL: <http://news.bbc.co.uk/hi/english/business/newsid_1119000/1119936.stm>
  2. United Nations Development Programme (Communications Office)
    URL: <http://www.undp.org/dpa/frontpagearchive/october00/19oct00/>
  3. BBC, January 23 2001, OECD statistical information on IT
    URL: <http://www.oecd.org/dsti/sti/stat-ana/>
  4. BBC, January 23 2001
    URL: <http://news.bbc.co.uk/hi/english/business/newsid_1119000/1119936.stm>
  5. Robbins, M. (1999) Information technology in Asia, internal discussion paper for Asia IT&C, December 1999.
  6. Swaminathan at ICARDA: the evergreen revolution, in ICARDA Caravan No. 8, Winter/Spring 1998, ICARDA, Aleppo, Syria.
    URL: <http://www.icarda.cgiar.org/Publications/Caravan/Caravans/8Article3A.Html>
  7. The information society and development, European Commission, Directorate-General for External Relations, January 12 2001
    URL: <http://europa.eu.int/comm/external_relations/info_soc_dev/doc/review.pdf>
  8. Asia IT&C Web site
    URL: <http://www.asia-itc.org/>
  9. Web site of the EuropeAid Co-operation Office
    URL: <http://europa.eu.int/comm/europeaid/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Mike Robbins
Asia IT&C

Mike has now left Asia IT&C. Further enquiries can be made to Guy Franck, Director of the Programme Management Office.
guf@asia-itc.org

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Robbins, M. "Bridging the Digital Divide in Asia: a European Commission Initiative", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/asia/>

-------------------------------------------------------------

Global Museum: a Personal Vision that became an Online Success

By Roger Smith - May 2001

Roger Smith, the founder and director of this highly successful international Webzine, shares his experiences of online publishing and the rationale for establishing a museum-based compendium site. Global Museum is currently read in 88 countries and maintains a weekly mailbase of more than six thousand museum professionals and those with an interest in museums [1] [2]. Global Museum remains a free Webzine available to all and has developed a style that encourages and maintains a loyal readership. It aims for immediacy of communication rather than in-depth analysis of museological issues. As a Webzine its role is to be a one-stop shop for museum news, views, vacancies and products without being perceived as overtly commercial - which it is not.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Introduction

Where do all the best ideas start? Usually, with an entrepreneurial thread or an observation that simply won’t go away.

In the case of Global Museum the gestation period from observation to action took several months. As the then Chairman of ICOM's Marketing and Public Relations Committee (ICOM MPR), one of the challenges I faced was communicating with several hundred members in many countries. The traditional forms of communication often meant that notices of importance and articles of interest took far too long to reach those for whom they were intended. Having worked in a variety of museums over many years and at various levels of the profession, I was quick to appreciate that there was a marketing opportunity and a niche that could be catered for.

This might at first glance suggest an altogether mercenary approach. It is true that, like many other budding e-entrepreneurs, I used an Amazon.com bookshop affiliation to provide the seed funding to get things underway, but reaching a discerning audience takes more than e-commerce activity.

Global Museum commenced as an email newsletter – an ezine. As the publisher's Web skills developed, so did the publication, until it reached its current status as a fully-fledged Webzine. Adopting a business-like approach to this endeavour has meant that Global Museum has never lost its primary focus. I would suggest to anyone contemplating any form of publication (and especially an online initiative) that they undertake a thorough business and market planning process. Equally, I would suggest that one is not too modest in one's ambitions in this regard, but such enthusiasm and vision needs to be tempered by reality. The world is, after all, littered with dot.com carcasses!

Global Museum has this vision: To be the international museum compendium site on the Internet.

Our mission is to provide services and up-to-date cultural and scientific information to the international museum profession, and to those with an interest in museums.

We are achieving this mission through a set of practical goals.

With regard to funding, it should be noted that Global Museum is in essence a nonprofit activity and very much a 'labour of love', even though it is guided by business rules. As publisher, I readily understood that before seeking grant funding it would be necessary to build site credibility, backed by thorough statistical and market analysis.

Global Museum is published from Auckland, New Zealand. The time zone differences between the country of origin and the Webzine's readership work to its advantage. The nature of the Internet and contemporary technology means that the physical address is largely immaterial to the success of such initiatives. I am also happy to reveal for the first time that, unlike its competitors, Global Museum does not have a Cecil B DeMille 'cast of thousands' in support. It is a solo operation supported by an understanding wife and a study full of computer equipment!

With anticipated future grant support this scenario will change modestly, and one of the aims is to create multilingual versions on a monthly basis. Global Museum's readership demographic shows considerable interest from its North American audience, with a growing readership from the European Union. It is probably opportune also to restate that the Webzine welcomes news releases from all museums, and we would like to think that readers of Cultivate Interactive might convey this opportunity to their local museums and governing authorities.

The Global Museum site

Process

As a weekly online publication Global Museum maintains sophisticated search options to keep abreast of museum news from around the world. The stories are collated mid-week and subject to editorial review so that a stimulating balance of stories is presented to the readership. Wherever possible we attempt to encourage participation and feedback. Two examples of this are the introduction of a humorous caption contest and the use of real-time chat technology. Regular online polls of the readership contribute to site development, and we have found that the best ideas come from the users. The Webzine quite deliberately sacrifices some downloading time for a highly visual and graphic style. While such a decision doesn't necessarily receive universal acclamation, the steady increase in the speed of Internet access largely negates any inconvenience in the medium term. This visual profile also assists in the market delineation between Global Museum and its competitors.

Global Museum sections include: international news, a career and resume postings service, an international museum studies database, museum product listings, a dedicated international travel service in association with a reputable and accredited travel provider, an online bookshop and virtual mall, a museum resources database with direct file downloads and links of interest, a forum section and much more.

Job and other postings are maintained on a daily basis and the physical design and publishing takes place in-house. Complementary to this activity is an active and ongoing promotional campaign for the Webzine using search engine placement software and directory placements. Writing for the Internet is of course a specialist activity, especially when one considers that the average viewing time for a Web page is a mere 57 seconds.

The various sections of the Webzine are treated differently. Some, such as the International News and Careers sections, are more dynamic in content, while others, such as the Museum Studies database listing, are updated on a roughly annual cycle.

The lesson to be learnt from publishing online is that technology should serve to support the communication, not rule it. With this philosophy in mind, Global Museum is constructed using NetObjects Fusion software and is based on style templates that can be easily modified. Being image-rich, the Webzine pages require careful image editing and selection. The pages are published to the server and a news alert 'teaser' composed. This announcement is then dispatched to the subscriber base as an email message giving headline details of the current edition.
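That final dispatch step could, in principle, be scripted along the following lines. The server, addresses and wording are invented for illustration, and nothing is implied about the tools Global Museum actually uses:

  import smtplib
  from email.message import EmailMessage

  # All names and addresses here are placeholders.
  msg = EmailMessage()
  msg["Subject"] = "Global Museum - this week's edition"
  msg["From"] = "alerts@example.org"
  msg["To"] = "subscriber@example.org"
  msg.set_content("Headline details of the current edition ...")

  # Send one teaser message via a local or hosted SMTP relay.
  with smtplib.SMTP("mail.example.org") as server:
      server.send_message(msg)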

Online user surveys confirm patronage of the various site sections and we retain a policy of always following up user suggestions as to how the site might be improved and expanded.

What do they Say?

As a reader you might expect the owner of a Webzine to be naturally biased in favour of his project's success. It is probably appropriate, therefore, to record a small selection of unsolicited reader comments and site reviews in support of the above. Endorsements of this sort enhance the credibility of a Web site, and I would suggest to all Web publishers that they consider including them in their content and promotion. Naturally all such statements must be attributed and not of a fictitious nature. Here are a couple of Global Museum examples:

"I did a survey of international museum Web sites in February of this year as part of my work for the Public Relations department at Museum Victoria, Australia, and I would have to say that the Global Museum website is easily one of the best. It is easy to navigate and is a pleasure to use due to the layout, which contributes to the usability as well as to the aesthetic of the site. The content appeals and is of relevance to both museum professionals and the general public, and the material is always current and regularly updated. I would certainly rate this site very highly; I use it on a regular basis and would not hesitate to recommend it to anyone, irrespective of their level of involvement in the museum industry." - Melinda Viksne, Public Relations, Museum Victoria, Australia

"Published by Roger Smith, Global Museum is a free weekly newsletter gathering information from a large pool of museums located all over the world. With a clean, uncluttered layout, Global Museum acts as a wonderful resource of current information. From bizarre and unbelievable news to noteworthy historical facts, this newsletter will plug you into information that is just slightly off the beaten media track. You might be surprised to read some of the articles that didn't make worldwide headlines." - List-A-Day.Com review

Conclusion

Global Museum is an ongoing success because it has creatively used the medium of the Web as a communication device. Successful Web sites, and in particular online publications, need to exemplify the basics: the Web is all about relationship building and engaging an audience, adopting a 'user-driven' philosophy and writing in a style (and selecting content) that both challenges and entertains.

Global Museum is quite deliberately not a traditional museum journal. The Webzine aims at immediacy, providing a selection of online services of interest to its readership. It is proactive in presenting stories from around the world and has adopted a magazine-format style.

The future looks bright for the Webzine. It has seen competitors come and go in the three years of its existence and has constantly refreshed both style and content to meet the expectations of a discerning audience. Because Global Museum is not directly affiliated with any non-governmental organisation or museum organisation it has been able to retain its editorial independence, unencumbered by the demands of stakeholders or a paying membership. It is this clarity and lack of vested interest that allows it to challenge traditional views and preconceptions.

Global Museum has remained nimble in its ability to identify and capitalise upon opportunity and interest, an essential element in producing a successful online publication.

References

  1. Global Museum
    URL: <http://www.globalmuseum.org/>
  2. Global Museum International
    URL: <http://www4.wave.co.nz/~jollyroger/gmi_introduction.html>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Roger Smith
Director
Global Museum International
2/70 St Johns Road
Meadowbank
Auckland
New Zealand

director@globalmuseum.org
<http://www4.wave.co.nz/~jollyroger/president/president.html>

Phone: +64 9 578 1011
Fax: +64 021 695322

Roger Smith is the former Chair of ICOM MPR and ICOM New Zealand. His career path includes museum directorships, executive directorships and consultancies, marketing and public relations. He is the founder and publisher of Global Museum and has recently moved from the physical management of museums back into education, where he is employed as Manager – Web Centre for the Auckland University of Technology. Roger has chaired and given keynote addresses at Museum conferences in Washington DC, Stavanger, Cologne and Calcutta.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Smith, R. "Global Museum: a Personal Vision that became an Online Success", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/global/>

-------------------------------------------------------------

Overview of EMII - the European Museums' Information Institute

By Rosa Botterill - May 2001

Rosa Botterill describes the work of the European Museums' Information Institute (EMII), a consortium of key organisations in the cultural heritage field.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

The European Museums' Information Institute [1] is a unique consortium based on a very successful partnership of key organisations. It was established to strengthen the position of the cultural heritage sector, and more specifically the work of museums in Europe. EMII is a hosted organisation, with associates in 14 EU member states and two European Economic Area countries. The EMII Secretariat currently benefits from shared resources, technical support and professional expertise through working under the auspices of mda [2], EMII's UK partner, based in Cambridge, England.

As a project, EMII received funding for a year, from October 1999 to September 2000, from the previous DG X - Raphaël Programme. During that period EMII's first achievement was the delivery of the European Standards Survey. It was generally accepted that before embarking on new initiatives the consortium should first identify the current status of the use of information management standards in museums across Europe. Ten member states participated in the original survey, which also included information on partners' national overviews and details of their future vision for EMII. Results of the survey can be found on the EMII Web site [3].

The EMII Survey is now seen as an important business tool by the cultural heritage sector. Updated statistical information on cultural heritage promotes improved understanding of the needs of the sector and contributes to better governmental policies and a more focused distribution of resources. Maintaining the survey offers EMII an excellent opportunity for further development work in the future. The consortium's objective is to improve on EMII's original initiative by seeking co-operation with other relevant European networks and cultural organisations, with a view to offering comprehensive research on the cultural heritage sector in the future.

EMII has evolved significantly in the year since it was first launched in 1999. The consortium now expects to be able to take a more prominent role as a vehicle for the co-ordination of cultural digitisation programmes supported by the European Commission. Its strategic objective is to extend the existing network and to convert it into a dynamic distributed centre of expertise.

The EMII Steering Committee has recently approved a model for the future sustainability of the consortium, with funding to be derived from two primary sources.

The approval by the Information Society Directorate of EMII's latest project proposal demonstrates a commitment to support further work from the EMII consortium. The new project, entitled EMII Distributed Content Framework, will evaluate the issues (including standards and licensing arrangements) related to the future creation of digitised content by content holders within the cultural sector and beyond, for use, in the first instance, for research purposes in projects funded by the European Commission.

Other project initiatives are on the way. The strength of the EMII consortium is beginning to make its mark. The network has evolved and established itself as an essential element amongst initiatives supporting cultural heritage organisations in Europe. It is clear that EMII has a role to play in Europe. It is also evident that the consortium must rise to the opportunities by continuing to work co-operatively to deliver solutions to the demands of users throughout the European Union.

References

  1. European Museums' Information Institute
    URL: <http://www.emii.org/>
    For further information on EMII contact r.botterill@emii.org
  2. mda
    URL: <http://www.mda.org.uk/>
  3. EMII European Standards Map
    URL: <http://www.emii.org/map/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Rosa Botterill
EMII Co-ordinator
c/o mda,
Jupiter House,
Station Rd,
Cambridge
CB1 2JD, UK

r.botterill@emii.org

Phone: +44 (0)1223 315 760
Fax: +44 (0)1223 362 521

Rosa Botterill is currently Standards Co-ordinator for the EMII Consortium. Rosa has extensive professional experience in information management and the implementation of standards in a variety of contexts. Her professional career has developed through work in libraries and museums in Brazil, the USA and the UK. Rosa has a BA in Librarianship and Documentation from the University of Rio de Janeiro and subsequently attained diplomas in computer studies and scientific documentation. She was awarded a Technical Co-operation Award from the British Council and came to the UK, where she obtained her MA in Archives, Library, Information Studies and Education from Loughborough University. She was also awarded a UNESCO/Information Programme scholarship to undertake a programme of studies in Europe on on-line retrieval of information.

Rosa's career began as a librarian in Brazil, followed by a period of research in Texas, USA, before she finally settled in the UK, where she worked at Plymouth City Museums and Art Gallery, the National Maritime Museum, and Oxfordshire County Council. She is now based with mda in Cambridge, UK, as Standards Co-ordinator for the European Museums' Information Institute.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Botterill, R. "Overview of EMII - the European Museums' Information Institute", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/emii/>

-------------------------------------------------------------

Making the PIE ...GEL

By John Paschoud - May 2001

John Paschoud from the London School of Economics Library explains how the HeadLine 'Personal Information Environment' for academic library users will evolve into part of ANGEL's 'Guided Environment for Learning'.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

The HeadLine Project [1] was funded under the Joint Information Systems Committee (JISC) eLib Programme from January 1998, and concluded its technical development phase in February 2001. An extension to the project is undertaking further user studies until July 2001.

HeadLine has produced a number of working software components (freely available on an Open Source basis to the higher education (HE) community), and also detailed studies of various aspects of the hybrid library, such as resources data-modelling [2] and user authentication and authorisation [3]. However, what has been widely perceived as 'the HeadLine model' of the hybrid library is the Personal Information Environment, or ‘PIE’. This article takes a first look (from a very first-hand point-of-view) at the ways in which this model could be broadened to encompass all the information needed by learners in higher and further education, and how the Authenticated Networked Guided Environment for Learning (ANGEL) Project [4] will address these objectives.

The HeadLine PIE is a Web-server-based portal, first proposed in detail in November 1999 [5], providing user-centred views of collections of library resources. Each individual PIE user can configure their own library collection, so that it contains all of the resources that they want to use, and none of those that are not relevant or useful. The PIE and the thinking and machinery behind it are more fully described elsewhere [6], but for the purposes of this article the screenshot below will illustrate one example of the interface and the types of resource entry-points that it presents to an end-user.

Figure 1: PIE Screenshot

HeadLine also explored the idea of 'multi-personal' resource collections, shared by several people with common interests. The most obvious candidate groups are students on the same course, but once a mechanism to enable shared access to PIE pages had been implemented, the idea could be extended to any group, and any individual user could quite easily create and administer a new group, as the sketch below illustrates.
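
As an illustration only - the PIE itself was written in Perl, and none of the names below come from the HeadLine code - a minimal Python sketch of the data model behind personal and 'multi-personal' collections might look like this:

    from dataclasses import dataclass, field

    @dataclass
    class Resource:
        title: str
        url: str

    @dataclass
    class Collection:
        name: str
        owner: str                        # the user who created and administers it
        members: set = field(default_factory=set)
        resources: list = field(default_factory=list)

        def visible_to(self, user: str) -> bool:
            # A personal collection has no extra members; a group collection has many.
            return user == self.owner or user in self.members

    # A student's personal PIE collection and a shared course collection:
    personal = Collection("My library", owner="student42")
    personal.resources.append(Resource("Economics abstracts", "http://example.org/econ"))

    course = Collection("EC201 reading", owner="lecturer1", members={"student42"})
    assert course.visible_to("student42") and not course.visible_to("student99")

The point of the sketch is how little separates the personal case from the group case: sharing is just a membership test, which is why extending PIE pages to arbitrary user-created groups was straightforward.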

Like many fixed-budget projects, HeadLine ran out of time and money before fully implementing all the ideas it had generated. These included features like integrated, themed, online 'chat' (within group PIE pages) and user-controlled current-awareness alerting facilities. As far as possible, the project team has tried to preserve these ideas for the use of future developers, and has included documented specifications for them, and appropriate 'hooks', in the completed PIE program code. A potential PIE developer could implement many of these features by selecting and adapting suitable open source components from available libraries such as the Comprehensive Perl Archive Network (CPAN) [7].

The HeadLine PIE is paralleled by similar developments undertaken by the HE library community worldwide [8] and also (at least in some features) by commercial products like OCLC's WebExpress [9], MetaLib from Ex Libris [10] and Pica's Picarta product [11]. This generic model for the presentation of personalised views of library resources has collectively come to be known as "my-library" (a term originally coined, I believe, by Eric Lease Morgan of North Carolina State University [12]).

Soon after we had started constructing individualised views of information resources traditionally held in libraries, it didn't take a colossal leap of our collective imagination to see the possibilities of extending these tools to encompass other personal information of use to typical users in HE, and to start visualising how "my-library" could be extended to become "my-university". In general, these other information resources fall into two main categories: administrative information (such as personal details and timetables) and pedagogic learning resources. Both are usually (and, unlike library resources) highly specific in form and content to the individual university with which the user has a relationship as student, teacher or researcher.

Will the Library become the University? ...or will the University become the Library?

As the Distributed National Electronic Resource (DNER) [13] supersedes the eLib Programme [14] as the JISC funding vehicle for new developments in this field (for higher and further education in the UK), the focus has widened from just the library to a more inclusive treatment of "learning resources".

For the benefit of those readers who have been calling their library a "learning resource centre" (or something more esoteric) for some time, a good way to illustrate this distinction is as follows. Most 'traditional' library resources are open-ended information spaces, in which the user is not guided through any particular course of study other than the (possibly optional) sequential order imposed by the author. Learning materials, by contrast, are more constrained and directed information, guiding the end-user (possibly with the help of some courseware management tool, such as WebCT [15], BlackBoard [16] and many others) through one or more specific courses of study.

The trend, reflected by recent calls for research and development proposals, is for educational systems to become ever more integrated and seamless in operation. This entails sharing databases and metadata systems, and allowing different users selective access to those databases depending on their needs and their authority. For example, the latest generation of courseware development tools, Virtual Learning Environments, provides integrated systems for course development, delivery and management. Digital library developments have progressed to the point where the DNER will create a single, integrated information access and management environment for the HE community. However, these developments are not yet interconnected, and neither are they integrated with university management information systems.
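
To make the idea of 'selective access' concrete, here is a minimal Python sketch - the roles and records are invented for illustration, and no actual university system is being described - of one shared metadata store serving different users according to their authority:

    # One shared store of records, each tagged with the roles allowed to see it.
    RECORDS = [
        {"id": 1, "title": "Course timetable", "audience": {"student", "teacher", "admin"}},
        {"id": 2, "title": "Exam marks",       "audience": {"teacher", "admin"}},
        {"id": 3, "title": "Budget report",    "audience": {"admin"}},
    ]

    def visible_records(role):
        """Return only the records this role is authorised to see."""
        return [r for r in RECORDS if role in r["audience"]]

    print([r["title"] for r in visible_records("student")])  # ['Course timetable']
    print([r["title"] for r in visible_records("admin")])    # all three titles

The same store serves everyone; what varies is the filter applied at access time, which is the essence of the integration the calls for proposals are asking for.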

Internet-based services of all types in all walks of life are developing versions that are customised to the individual, as net technologies mature and rich integration of content strands becomes achievable. In the context of learning and information services in HE, this trend is emerging in commercial products, but there is just as much requirement to provide customised services within the environment of the developing DNER.

"A next stage might be to create several exemplar institutional environments where information, learning and other resources are brought together in a user's normal working environment, together with rich communications and other tools" [17].

Although they are characterised by high levels of guidance, feedback and support for users, "learning environments … imply a model with a closed resource base of learning resources" (JISC circular 5/99). Digital library developments have put increasingly rich and powerful data sets at the disposal of the academic user, but the problem students and staff face when they switch to these richer, open resources is that they leave behind the supportive learning environment designed to help them make best use of the resource.

The aim of the ANGEL Project is to create a system that brings together Digital Library technology with Learning & Teaching resources in a way that:

Digital libraries typically help the user to identify and access resources from a wide range of online databases. These resources, however, are "context free": they are found according to the specific search criteria employed, and do not in themselves embody any pedagogic strategy.

The ANGEL Guided Environment for Learning (or 'GEL') will utilise the search and retrieval capabilities of digital libraries to identify specific clusters of resources that, when combined with contextualising material, would form the resource base for a specific "learning episode" or activity. This learning activity would be delivered online using a virtual learning environment (VLE) such as WebCT, or alternative proprietary or 'open' solutions. It will facilitate access to DNER resources for users, removing from their navigation the frequent authorisation challenges which are such a familiar and frustrating part of the current environment, and will allow any authenticated member of a participating institution to connect, using a single personal identifier and password or a digital certificate, from anywhere on the Internet, to:

Of course, recognition of these potential synergies didn't take colossal imagination from the many groups who had started by working on better end-user access to management information and pedagogic resources, either. Historically (at least in UK HE) there has not been a great deal of communication between the three communities involved in university administration, teaching and learning technologies, and library services. Indeed, there is a strong possibility that within the same university several independent groups may each be developing their own 'one portal for everything', leading to potential battles for global supremacy as they all attempt to engulf the traditional information territory of the others. Rather than joining these battles, we should battle against the academic politics that tends to enforce these divisions, and try working towards becoming 'joined-up universities'.

Finally, it's worth mentioning that the ANGEL Project is tasked with a further, and highly complementary, strand of development (through one of those quirks of the JISC committee processes that is probably best left to drift into the mists of history). In addition to developing the Guided Environment for Learning, ANGEL is also producing the first working implementation to satisfy the requirements for the 'next generation' of user authentication and access management. The specification, currently code-named 'Sparta', is being developed by JCAS, the JISC Committee for Authentication & Security. A national-scale service based on the Sparta specification will eventually supersede Athens [18] as the authentication and access-management infrastructure connecting users in higher (and further) education with information and learning resources mediated by the DNER, and from many commercial suppliers. But that's the subject for an entirely different article...

References

  1. The HeadLine Project
    URL: <http://www.headline.ac.uk/>
  2. John Paschoud, The filling in the PIE - HeadLine's Resource Data Model, Ariadne Issue 27
    URL: <http://www.ariadne.ac.uk/issue27/paschoud/>
  3. Authentication and authorisation
    URL: <http://www.headline.ac.uk/public/diss/je-conc-day/>
    URL: <http://www.headline.ac.uk/public/diss/jp-SCURL-Apr00/>
  4. ANGEL Project
    URL: <http://www.angel.ac.uk/>
  5. HeadLine PIE
    URL: <http://www.headline.ac.uk/publications/aslib/PIE.htm>
  6. PIE
    URL: <http://www.headline.ac.uk/public/diss/>
    URL: <http://www.lita.org/ital/ital1904.html>
    URL: <http://www.headline.ac.uk/public/diss/jp-PIE-HybLib-model/>
  7. Comprehensive Perl Archive Network (CPAN)
    URL: <http://www.cpan.org/>
  8. Information Technology and Libraries, vol 19, no 4, Special Issue: User-Customizable Library Portals
    URL: <http://www.lita.org/ital/ital1904.html>
  9. OCLC's WebExpress
    URL: <http://www.oclc.org/Webexpress/>
  10. MetaLib from Ex Libris
    URL: <http://www.aleph.co.il/metalib/>
  11. Pica's Picarta product
    URL: <http://www.pica.nl/>
  12. Eric Lease Morgan, Personalized Library Interfaces, Exploit Interactive, issue 6, 26 June 2000
    URL: <http://www.exploit-lib.org/issue6/libraries/>
  13. DNER
    URL: <http://www.jisc.ac.uk/dner/>
  14. eLib Programme
    URL: <http://www.jisc.ac.uk/elib/>
  15. WebCT
    URL: <http://www.webct.com/>
  16. BlackBoard
    URL: <http://www.blackboard.com/>
  17. Dempsey, L. (1999) EDINA and the DNER, EDINA Newsline 4.1, Spring 1999
  18. Athens
    URL: <http://www.athens.ac.uk/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

John Paschoud
London School of Economics Library

j.paschoud@lse.ac.uk
<http://www.headline.ac.uk/public/people/people-john.html>

John Paschoud is the project manager of the national HeadLine and ANGEL projects, and is an information systems engineer working at the Library of the London School of Economics.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Paschoud, J. "Making the PIE ...GEL", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/pie/>

-------------------------------------------------------------

UKISHELP: helping the UK understand the IST Programme

By Peter Walters - May 2001

Peter Walters introduces UK Information Society Help (UKISHELP) [1]. UKISHELP is a UK Department of Trade and Industry initiative set up to help newcomers understand and evaluate the funding opportunities of the European Information Society programmes in the context of their business. Although UKISHELP is specifically aimed at encouraging interest in the UK, there are many lessons the rest of Europe can learn from its work.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Introduction

UKISHELP was established early in 1999 by the UK Department of Trade and Industry (DTI) as an information resource for organisations bidding for IST funding. More than 5,000 calls to the support line later, the level of interest continues to look good for UK involvement.

IST Involvement in the UK

To date the UK has been particularly successful in recruiting academic participants, which is important because of the vital role academic institutions play in researching developing technologies. UKISHELP is now working hard to match UK academic success with commercial involvement and product development. "It's about turning the economic mill in the country and that's about products, services, applications, industry and commerce".

For organisations with an innovative idea that would benefit from working with other like-minded European organisations, the IST Programme is worth serious consideration. European funds are allocated to help at every stage of product development, from the very first research to user trials and marketing.

Funding

January 2001 saw the announcement of newly released funds in the Sixth Call of the IST Programme. Currently £700 million of funding is still to be awarded to participating organisations.

The European Commission has allocated some £10 billion for its current research and development programme designed to promote industrial competitiveness and improve quality of life in Europe. Prospective applicants are becoming increasingly aware of the benefits of IST Programme funding. It can support company growth and fund innovative and risky development projects. IST Programme funding should be readily considered alongside other forms of longer-term finance.

There are many benefits to using EC money for research: if you go to the bank, you borrow the money; if you go to the European Commission, they will give you the money. They won't give you all the money, but they will give you the money and not charge you interest. You can apply for 100% of your manpower costs where you are just starting to appraise your market and undertake trials. If you go further down the development line you can get into a funding programme called Ten Telecom [2], which offers half the cost of your business plan and some trial marketing. You can then take your results to a bank and, if you're successful in securing a bank loan, you can even get help with the interest.

Examples of Successful Projects

There have been many successful EU-funded projects based in the UK.

A recent project examined how IT affects the way we work. In a world of laptops, modem links and advanced telecommunications, more and more people are choosing to work remotely (away from the office), whether as employees or self-employed. This steady release of office-bound 'battery hens' to the freedom of the 'teleworker', as they are known, has brought with it a new agenda of issues. In Europe there are around 2 million teleworkers and the number is increasing, but finding work is not always easy. In response, research by the EU-funded project TeleMart has helped to establish an online brokerage service enabling organisations to quickly locate the services and teleworkers they need. The project ended in March 1999, and TeleMart co-ordinator David Horne, of Middlesbrough-based Tradezone International Ltd., has licensed the software developed; businesses can now visit a virtual market place and locate suppliers of teleworked services at the Web site [3]. TeleMart's evolution will now focus on broadening the range of services available whilst improving the quality of life of teleworkers throughout the continent.

Another EU-funded project, TAPPE, based in Northern Ireland, has examined the relationship between suppliers and purchasers. The resulting software is now helping both private and public sector purchasing departments to make objective decisions about suppliers. About 60p in every £1 earned in Northern Ireland is accounted for by the public sector, so the software has had a significant influence on the economy.

In November 2000 the Northern Ireland company MINEit Software Limited [4] scooped a Grand Prize (one of only three available) at the European Information Society Technologies (IST) Awards held in Nice. The Web analytics company was the first based in Northern Ireland (NI) to win the prestigious award (referred to as the Oscars of IT Europe), which carries the greatest possible recognition for information and technology companies across the continent.

The Grand Prize was awarded for MINEit's Easyminer software product, which takes the guesswork out of e-business marketing by analysing and building predictive models of visitor behaviour at Internet sites. The product was selected from over 200 submissions from 26 countries and reached the finals, competing against 20 other companies for a Grand Prize.

It’s success stories like these that the DTI believes will inspire even more organisations to apply. And whilst there are certain procedures, acronyms and terminologies which applicants should master, UKISHELP can assist.

Procedures

The procedures are there for a good reason. The Commission allocates huge amounts of money, so it's careful not to award funds for the same project twice, or for projects with conflicting goals. It achieves this using administrative procedures such as invitations to apply for the money, or Calls, that happen regularly. This means that you can't go to the Commission on any day and say, for example, 'I'd like to work on intelligent houses!' There are certain times during the Programme when funds for your area of interest will be allocated. Part of the UKISHELP service is to ensure that applicants know when the Calls are likely to happen, and which organisations will have most to gain from applying when they do.

The Commission fosters a greater understanding of European markets by encouraging you to work with partners from other countries. If working with a European partner is already part of your business plan this is good news. If not, you don’t have to worry about finding partners on your own. Partner search services exist in the form of SingleImage [5] and UKISHELP can put you in touch. Once you establish the right partnership, the joint venture or consortium approach presents many important benefits. You can network with partners, use them to reach new markets and exchange ideas of best practice.

Each consortium's funding proposal is read by independent evaluators with knowledge of the area that you are working in. They establish a concerted view of its value and rank it alongside the others. The proposals that receive the most marks get the money. Once funding has been secured and the project is under way, you should remain clearly focused on your project's end results.

To maximise the benefits of involvement you should understand, even before you start, how the output is going to help you and your business, and how you are going to exploit it. This will enable you to show your partners why you’re in the business and what you want to achieve. If you understand the exploitation route you’ll increase your chances of a successful proposal and your project will flourish.

Successful Proposals

UKISHELP aims to help people put in successful bids for projects. They have recently published a free guide for those interested, called 'Fast Track Guide to Successful Proposals', which you can obtain by visiting the Web site. European funding success involves following a number of serious do's and don'ts. Here are a few of the most important ones!

References

  1. UKISHELP (UK Information Society Help)
    URL: <http://www.ukishelp.co.uk/>
    Support line - 0870 606 1515
  2. Ten Telecom
    URL: <http://156.54.253.12/tentelecom/>
  3. Telemart
    URL: <http://www.telemart.org/>
  4. MINEit Software Limited
    URL: <http://www.mineit.com/>
  5. SingleImage
    URL: <http://www.singleimage.co.uk/Database.htm>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Peter Walters
UK’s National Contact Point for the IST Programme
UKISHELP

help@ukishelp.co.uk

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Walters, P. "UKISHELP: helping the UK understand the IST Programme", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/ukishelp/>

-------------------------------------------------------------

Digital Image Archiving and Advice: in Tandem with the Visual Arts Data Service (VADS)

By Phill Purdy - May 2001

The creation of digital image archives can be described as a cycle, from planning to preservation, taking in all points in between: rights management, archival image capture, data management and delivery systems. VADS, as a UK Higher and Further Education data service, works throughout this cycle in tandem with visual arts digital image collection creators to assist the planning, production, delivery, use and preservation of high quality digital materials.

This article by Phill Purdy illustrates how VADS works with its depositors and outlines some of the issues and methodologies employed by VADS to create its cross-searchable catalogue of archived visual arts digital image collections.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

VADS Background

The Visual Arts Data Service (VADS) is based at The Surrey Institute of Art & Design, University College. VADS is a part of the Arts and Humanities Data Service (AHDS) [1]. VADS and AHDS are funded by the Arts and Humanities Research Board (AHRB) [2] and the Joint Information Systems Committee (JISC) [3] to support research, learning and teaching in UK Higher and Further Education, by providing digital archiving and advisory services.

The AHDS was established in 1995 to support digital resources across the arts and humanities. The AHDS consists of five 'Service Providers', each with subject and technical specialisms, managed by an 'Executive Service' based at King's College London. VADS began operating in March 1997 to serve the full gamut of visual arts disciplines, including fine art; design; architecture; applied arts; history and theory; media; museum studies and professional practice.

VADS provides an on-line catalogue of its archived collections and advice on the creation, management and use of visual arts digital resources. VADS aims to accession all forms of digital data, from text to multimedia, and provides advice through an outreach programme that includes publications, workshops and consultation. VADS services are freely available to the UK Higher and Further Education sectors, and VADS seeks to work collaboratively with other sectors, both nationally and internationally. VADS accessions collections via a variety of means, from formal relationships with funding bodies, as in the case of AHRB projects, to individual negotiations with data creators.

VADS' collections of third-party created resources are promoted and preserved for broad and long-term academic and educational use through the VADS on-line catalogue. Items within the catalogue are individually 'branded' to identify and acknowledge the original collection owners and the source of the collection. The catalogue represents a growing body of visual arts material, searchable across as well as within individual collections, and provides a significant resource for all involved with research, learning and teaching.

Resources currently delivered by VADS include: image databases from the Imperial War Museum, London College of Fashion, National Arts Education Archive and The Design Council Archive; student degree show Web sites, and Computer Aided Learning packages.

Figure 1: A montage of items from VADS current and forthcoming image collections

As of 31 March 2001, the VADS catalogue provides access to five disparate image collections totalling over 6,500 digital images. These are shortly to be joined by another five collections, resulting in a total of over 15,000 images and their full catalogue records. VADS' image holdings are then due to more than double over the next eighteen months through several projects with which VADS is currently working.

VADS Image Catalogue: Functions and Features

VADS Catalogue of Image Collections is accessible from the Web site [4]. It offers free-text searching using Boolean and other search operators, within any or across all image database collections.

Figure 2: VADS Catalogue page

Results are returned as thumbnail to 'screen-size' images with accompanying 'brief', 'core' and 'full' catalogue records. At the level of individual image records, items carry collection-stakeholder logos to identify their original source.

Figure 3: Example of VADS search results page
Figure 4: Example of VADS core record page

Also accessible from the VADS Catalogue page are 'Search Help' pages, which give details of search operators and strategies; 'Collection Information' pages, outlining the content and history of individual collections; and an 'Advanced Search' facility, which allows graphical Boolean searching and choices as to how results are returned. The catalogue page also presents links to VADS' non-image database collections.

Linking Collecting and Advice

As noted above, VADS is an archive of third-party created image collections. This means that VADS does not itself undertake scanning and cataloguing of image collections. However, VADS does have a significant role supporting those engaged in the creation and management of visual arts digital image collections. VADS offers subject-specific support to creators and managers of visual arts data, particularly those committed to, or envisioning, archiving with VADS.

VADS support throughout the digital creation life cycle, from planning to preservation, promotes the use of standards and good practices for data creation, ensuring maximum return on the investments made in data creation. VADS undertakes its image digitisation advisory role in collaboration with other UK Services in this field, such as the Technical Advisory Service for Images (TASI) [5] and the Higher Education Digitisation Service (HEDS) [6].

VADS works in an advisory capacity with the majority of collections it accessions; one example is the delivery and archiving of the JISC Image Digitisation Initiative (JIDI) visual arts collections [7]. VADS was a member of the JIDI Steering Committee and has worked collaboratively with JIDI depositors throughout the process of accessioning material with VADS.

VADS On-line Delivery of JIDI collections

The JIDI project, started in 1996, was an important research and development project within UK Higher Education, establishing digital image capture standards and procedures [8], along with metadata creation guidelines [9]. The project benefited from major input from TASI and HEDS, providers of advice and digitisation services respectively.

There are 11 visual arts collections resulting from the JIDI project, totalling over 15,000 images in all. As of 31 March 2001, VADS is delivering four JIDI collections, numbering between c.300 and 3,000 images each, with the remainder scheduled for delivery within the first half of 2001. Further JIDI collections to be delivered by VADS are: Design Council Slide Collection, Manchester Metropolitan University; African and Asian Visual Arts Archive, University of East London; John Johnson Collection of Political and Trades Prints; Central St Martins Museum Collection; Spellman Collection of Music Covers, University of Reading; and Fawcett Library Suffragette Banners, London Guildhall University.

The four JIDI collections VADS is currently delivering are the Design Council Archive, University of Brighton; the London College of Fashion: College Archive; and the AE Halliwell Collection and the Basic Design Collection, both from the National Arts Education Archive, Bretton Hall College, University of Leeds. These collections vary in content, from fine art learning materials within the Basic Design Collection to black and white archival photographs and negatives within the Design Council Archive and London College of Fashion: College Archive. Conjoining them, alongside the Imperial War Museum (IWM) Concise Art Collection, within robust delivery systems was the task in hand for VADS.

To enable these collections to be fully text-searchable not only individually but also across one another, VADS had to make some important decisions about its systems. These included image specifications, a suitable data structure and delivery systems, as well as managing the administrative tasks involved in creating such an archive.

The administrative tasks include managing the Intellectual Property Rights (IPR) involved with image collections. VADS operates a licence agreement that establishes the rights situation and obligations of each party within the archiving relationship: the depositor (collection owner) and the archive (VADS). A standard licence [10] was created as a pro forma to be used across the AHDS; it allows for non-exclusive depositing of material by collection owners for VADS to deliver and preserve for educational purposes. This means that under the standard terms of deposit, the original rights owner retains all rights they have in the original collection and is free to use the material for any other purposes. VADS is licensed solely to provide enhanced delivery and preservation of the collection for academic and educational purposes.

Systems

VADS delivery hardware is hosted at Bath University Computing Services (BUCS), a major UK Higher Education computing centre. VADS image catalogue is delivered using Index+ technology developed by Systems Simulation Ltd [11], a British software engineering company specialising in interactive text, image and multimedia information systems, with in-depth experience in the cultural heritage field.

VADS produces surrogate Web delivery images in JPEG format, derived from the archival TIFFs housed off-line by VADS. The Web images are either produced in-house by VADS or obtained directly from data creators. The standard-size JPEG images VADS delivers are: 'thumbnail', max. 90x90 pixels; 'core record', max. 400x400 pixels; and 'large' reference image, max. 600x600 pixels.
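
A minimal sketch of this derivation step, using the Python imaging library Pillow (the file names are placeholders, and this is not a description of VADS' actual production workflow), might read:

    from PIL import Image

    # Maximum bounding boxes for the three surrogate sizes named above.
    SIZES = {"thumbnail": (90, 90), "core": (400, 400), "large": (600, 600)}

    def make_surrogates(tiff_path, output_stem):
        for name, box in SIZES.items():
            img = Image.open(tiff_path).convert("RGB")  # JPEG has no alpha channel
            img.thumbnail(box)   # resize in place, preserving the aspect ratio
            img.save(f"{output_stem}_{name}.jpg", "JPEG", quality=85)

    make_surrogates("archive/poster_0001.tif", "web/poster_0001")

Note that thumbnail(), unlike a plain resize, never distorts the image: a 3:2 archival original constrained to 400x400 comes out at roughly 400x267, which matches the 'max.' phrasing of the specifications above.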

For its data structure, VADS implements the Visual Resources Association Core Categories, Version 3.0 (VRA Core 3.0) [12] in its on-line delivery systems, and thereby promotes this standard to the wider community. VRA Core 3.0 is an image-cataloguing standard developed to describe "works of visual culture and the images that document them" [12]. For instance, using VRA Core 3.0, a painting would be documented using field titles such as record type, measurements, material, creator and style/period. In all, the published VRA Core 3.0 standard has 73 fields, providing very rich and thus user-friendly descriptors of visual arts materials and their image-documents.
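
As a hedged illustration of what such a record looks like (the values are invented, and only a handful of the 73 fields are shown; see the published standard [12] for the full set), a VRA Core 3.0-style description of a painting, and of the image documenting it, could be sketched in Python as:

    work_record = {
        "Record Type": "work",                    # the painting itself
        "Type": "painting",
        "Title": "Study in Blue",                 # hypothetical work
        "Creator": "Unknown (British school)",
        "Measurements.Dimensions": "76 x 51 cm",
        "Material.Medium": "oil on canvas",
        "Style/Period": "early twentieth century",
    }

    image_record = {
        "Record Type": "image",                   # the photograph documenting the work
        "Relation.DocumentOf": work_record["Title"],
        "Measurements.Resolution": "600 x 600 pixels",
        "Material": "digital image (JPEG)",
    }

The work/image distinction - one record for the object of visual culture, another for each image that documents it - is the heart of the standard.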

VRA Core 3.0 was chosen by VADS as the basis for its image collection data structure not only because it is one of the foremost standards for works of visual culture, but also because it promotes the use of terminology controls, which aid quality and consistency when creating data. More significantly, however, VADS adopted VRA Core 3.0 because it relates directly to other visual arts and more generic electronic resource description standards, such as the Categories for the Description of Works of Art (CDWA) [13] and Dublin Core [14]. This inherent 'mapping' of VRA Core 3.0 to other standards allows for potentially increased integration of digital image records across diverse systems, a vital benefit given VADS' goal of building an interoperable on-line catalogue of digital resources. VADS will initially implement interoperability of its image collections database through a gateway for all AHDS collections, using Dublin Core and the Z39.50 protocol, which will allow all AHDS collections to be cross-searched simultaneously, creating a powerful Internet tool for accessing Arts and Humanities digital resources. This technology could then be extended to other national and international integrated systems to increase access and usability.
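
The practical value of that mapping can be shown with a short Python sketch of a field crosswalk (the particular correspondences below are illustrative, not VADS' published mapping):

    # Crosswalk from a few VRA Core 3.0 fields to Dublin Core elements.
    VRA_TO_DC = {
        "Title": "dc:title",
        "Creator": "dc:creator",
        "Material.Medium": "dc:format",
        "Style/Period": "dc:coverage",
        "Type": "dc:type",
    }

    def to_dublin_core(vra_record):
        """Project a VRA Core record onto Dublin Core, dropping unmapped
        fields rather than guessing at them."""
        return {VRA_TO_DC[k]: v for k, v in vra_record.items() if k in VRA_TO_DC}

    print(to_dublin_core({"Title": "Study in Blue", "Creator": "Unknown (British school)"}))
    # {'dc:title': 'Study in Blue', 'dc:creator': 'Unknown (British school)'}

A record catalogued once in the richer VRA Core scheme can thus also be served automatically, for the mapped fields, to a Dublin Core-based cross-search gateway of the kind described above.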

Conclusion: VADS future

VADS' systems will continue to evolve as collections expand in quantity and type, necessitating modifications to delivery and archiving systems. VADS is also undertaking a JISC-funded project to investigate tools and provide resources to Promote the use of Image Collections for Learning and Teaching in the Visual Arts (PICTIVA) [15], the results of which should be available on the VADS Web site from March 2002. VADS will also continue to investigate opportunities for collaboration to increase access to and use of digital collections for research, learning and teaching.

References

  1. Arts and Humanities Data Service (AHDS)
    URL: <http://ahds.ac.uk/>
  2. Arts and Humanities Research Board (AHRB)
    URL: <http://www.ahrb.ac.uk/>
  3. Joint Information Systems Committee (JISC)
    URL: <http://www.jisc.ac.uk/>
  4. VADS Catalogue of Image Collections
    URL: <http://vads.ahds.ac.uk/search.html>
  5. Technical Advisory Service for Images (TASI)
    URL: <http://www.tasi.ac.uk/>
  6. Higher Education Digitisation Service (HEDS)
    URL: <http://heds.herts.ac.uk/>
  7. JISC Image Digitisation Initiative (JIDI), 1998
    URL: <http://www.ilrt.bris.ac.uk/jidi/>
  8. Tanner, S. and Robinson, B. (1998) A Feasibility Study for the JISC Image Digitisation Initiative (JIDI)
    URL: <http://heds.herts.ac.uk/resources/papers/jidi_fs.html>
    [accessed 1 August 2000]
  9. Metadata creation guidelines
    URL: <http://www.ilrt.bris.ac.uk/jidi/metadata.html>
  10. Standard Licence
    URL: <http://vads.ahds.ac.uk/depositing/deposit_licence.pdf>
  11. Systems Simulation Ltd
    URL: <http://www.ssl.co.uk/>
  12. Visual Resources Association, 2000. VRA Core Categories, Version 3.0
    URL: <http://www.vraweb.org/>
    [accessed 21 February 2001]
  13. Categories for the Description of Works of Art (CDWA)
    URL: <http://www.getty.edu/research/institute/standards/cdwa/>
  14. Dublin Core
    URL: <http://dublincore.org/>
  15. Promote the use of Image Collections for Learning and Teaching in the Visual Arts (PICTIVA)
    URL: <http://vads.ahds.ac.uk/learning/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Phill Purdy
Manager
Visual Arts Data Service
The Surrey Institute of Art & Design, University College
Farnham
Surrey, GU9 7DS
United Kingdom

Phill@vads.ahds.ac.uk
<http://vads.ahds.ac.uk/>

Phone: +44 (0)1252 892724
Fax: +44 (0)1252 892725

Phill Purdy has been working with visual resources in the academic and commercial sectors since 1991. He has been with VADS since November 1998, following completion of an MA in Computer Applications for the History of Art at Birkbeck College, London.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Purdy, P. "Digital Image Archiving and Advice: in Tandem with the Visual Arts Data Service (VADS)", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/vads/>

-------------------------------------------------------------

Introduction to Innovation Finance

By Steve Glangé - May 2001

Steve Glangé reports on how you can turn your Research and Development results into successful ventures with just a little 'LIFT'. LIFT (Linking Innovation Finance and Technology) [1] is a free service sponsored by the European Commission's Innovation/SMEs Programme that can give you help and guidance through the IT financial maze.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Introduction

This article looks at three different European success stories in the fields of Information Technology and Life Science in order to distinguish the key factors for successful ventures.

In examining the cases of Gemplus, RheinBiotech and Lernout & Hauspie, we will show what made them successful in our opinion and how LIFT can help you to succeed.

We will end with the free services offered by the LIFT team and how you can benefit from these services.

The Key Factors for a Successful Venture

Let's consider each of the three case studies.

What made these ventures successful? Was it the market, the product, the drive…? Or was it “entrepreneurship”?

The question is: how can you find the entrepreneur in you? With competition for funding fairly high, it is imperative that you assess yourself and your venture in order to establish whether you are ready for external funds, or to accept outside involvement in your project. You will need to assess yourself on each front, e.g. commercially and financially.

Business Plan

When you have decided to go forward, it is time to prepare a quality business plan, which should include the following sections:

Now that you are aware of your needs and your objectives, it is time to tell your potential investor(s) about them. Beforehand you should study your potential partners: each is interested in a different aspect of your venture, e.g. a business angel, a venture capitalist.

The next step is to approach them effectively. You may approach them indirectly, e.g. through intermediaries such as consultants or a mutual acquaintance, or through more or less public events, e.g. networking events such as First Tuesday [5], auctions like those organised by the European Investment Forum, or other seminars.

How can LIFT Help You?

LIFT is a free service sponsored by the European Commission’s Innovation/SMEs Programme. LIFT's mission statement is to help you:

LIFT's experienced experts offer a wide range of comprehensive services including:

LIFT Toolkit

The LIFT toolkit includes a document, Assessing your Venture, that lets you answer some of the questions any venture capitalist may ask and shows you how well prepared you are. Preparing a Technology Business Plan shows you how to prepare a successful business plan, and finally Financing Innovation - A Guide gives you an overview of all the sources of finance and how they operate [6].

LIFT Workshops

LIFT runs a number of workshops throughout Europe with hands-on instruction and guidance.

LIFT Helpdesk

The LIFT Helpdesk provides individualised support, help and direction - via phone, fax, email or post - on a wide range of high technology business financing issues. The service is free of charge, fully confidential and currently available in English, French and German.

Conclusion

What's next? Now it is up to you to:

References

  1. LIFT (Linking Innovation Finance and Technology)
    URL: <http://www.lift.lu/>
  2. Gemplus
    URL: <http://www.gemplus.com/>
  3. RheinBiotech
    URL: <http://www.rheinbiotech.com/>
  4. Lernout & Hauspie
    URL: <http://www.lhsl.com/>
  5. First Tuesday
    URL: <http://www.firsttuesday.com/>
  6. Brochures from the LIFT information pack (Assessing your Venture, Preparing a Technology Business Plan, Sources of Finance)
  7. To contact the LIFT info line email info@lift.lu

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Steve Glangé
LIFT (Linking Innovation Finance and Technology)
11 rue de Bitbourg
L-1279 Hamm
Luxembourg

steve.glange@lift.lu
URL: <http://www.lift.lu/>

Phone: +352 428 001
Fax: +352 428 00344

Steve Glangé is employed as a Senior Consultant at INBIS Luxembourg Ltd (a management consultancy branch of INBIS Ltd UK). His responsibilities include assisting entrepreneurs on their way to commercialisation and implementing technology transfer.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Glangé, S. "Introduction to Innovation Finance", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/lift/>

-------------------------------------------------------------

Cultivate Interactive Issue 4: Regular Articles

At the Event:

Praxis:

Metadata:

-------------------------------------------------------------

DIGICULT Column

By Concha Fernández de la Puente - January 2001

This section aims to provide news of the European Commission's initiatives in the field of digital heritage and cultural content. Its objectives are to summarise the developments in programmes, projects and activities since the last Cultivate Interactive issue and to give a clear picture of progress in the area. It certainly does not pretend to be a comprehensive account of what the EC is doing in the area but rather a short summary of some of the key items. The content is based largely on the information provided in the e-Culture Newsletter, published by the European Commission, DG Information Society, Cultural Heritage Applications Unit, that can be found on the Web [1].

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

In the months since the last issue of Cultivate Interactive, and therefore since the last DIGICULT column, we have been very active and many initiatives have been launched.

In June 2000, an action plan for eEurope was agreed by the European Council at the Feira Summit. In order to follow up on some of the agenda issues, a panel of national experts from most Member States was invited by the European Commission to a meeting in Luxembourg on 15-16 November 2000. The main objective was to develop strategies to improve the co-ordination mechanisms for Member States' digitisation programmes, in particular in the field of cultural heritage. The participants, agreeing that improved co-ordination was needed, came up with the idea of creating an inventory of the existing initiatives. As a first step, a questionnaire on digital policies was established.

In January 2001, Sweden took up the Union's Presidency for six months. They supported our initiative in the field of digitisation by holding our follow-up meeting in conjunction with a meeting under the Swedish Presidency in April in Lund. The meeting in Lund was extremely successful, achieving major objectives. The initial results of the responses to the questionnaire were presented and discussed, and the Member States' representatives proposed tools to keep the data updated over the longer term. Benchmarking was identified as a sound mechanism for exchanging good practice across Member States and improving national practices in the field of digitisation policy. The countries that will next hold the Presidency (Belgium and Spain) have already expressed their wish to continue supporting this important initiative.

Meanwhile we have continued with the day-to-day running of our cultural heritage programme. As a result of the opening of Action Line III.1.5, Trials on new access modes to cultural and scientific content, of the 2000 Work Programme, a total of 53 proposals were received last December, of which 25 have been selected for funding. On 6 February 2001, a meeting was held in Luxembourg with all the successful projects to brief them on the next steps in the contract negotiation. The meeting also gave the participants the opportunity to exchange experiences and discuss co-operation options for the future. We expect these projects to start this summer.

As mentioned in the last DIGICULT column, the IST work programme 2001 contains two action lines for the digital heritage sector: AL III.1.2 Heritage for All and AL III.1.3 Next Generation of Digital Collections. Both action lines were published in a call that opened in January and closed in April [2]. The projects selected from this call are expected to start at the beginning of 2002. This will probably be our last major call under the 5th Framework Programme.

In the framework of this call, and especially of the action line on digital libraries, the Cultural Heritage unit has signed a co-operation agreement [3] with the Digital Libraries programme of the National Science Foundation (NSF), USA, for collaborative work on RTD projects.

Other interesting action lines open in this call were AL III.5.1 Content futures, which aims to provide opportunities for high-payoff breakthrough research within the scope of the Key Action, yet with a focus on issues not covered at present by its Action Lines, and AL III.5.2 Competence building, which aims to support the acquisition of multimedia skills and to enhance access to competence in multimedia, providing access to advanced emerging technologies and services, knowledge and competence relevant to multimedia systems and services, via world-class competence centres already existing or emerging in Europe. These access measures could be very useful for cultural actors wishing to reinforce their portfolio of expertise.

The EU-funded DELOS network of excellence organised an EU-DL All Projects concertation meeting on 7 and 8 February 2001 in Luxembourg, in co-operation with the Cultural Heritage Applications unit. Planned as part of a series of meetings to bring together representatives of relevant initiatives in the field of digital libraries, its objectives were to exchange information about the projects, to identify areas of synergy, to jointly promote standardisation and dissemination activities, and to provide the IST Programme with a global view of the evolution of digital library technologies.

We have launched a study on Technological Landscapes for Tomorrow's Cultural Economy (DigiCULT) [4]. This is a strategic study on the state of the art of the use, development and research of information and communication technologies in the cultural (and associated) sectors in Europe. The objective of the study is to provide a clear set of action recommendations for cultural institutions Europe-wide. To achieve this goal, DigiCULT will provide an in-depth analysis of the state of the art of technologies, content, cultural services and applications, as well as (user) demands and policies, in the European cultural sector. Starting on 1 January 2001, the study will be co-ordinated by Salzburg Research over a period of 12 months, and will involve a consortium of nine highly regarded European cultural organisations.

The eCulture newsletter is now in its 8th issue. This has proved to be a very efficient communication tool for us, triggering many encouraging comments. The newsletter Web page [5] is among the most visited in CORDIS!

Also within the communication area, during the first quarter of 2001 we have restructured the DIGICULT Web site [6]. It now offers new features such as a clustered approach to our projects and a more dynamic home page. Have a look and send us your comments.

The Commission is already working on the preparation of the next Framework Programme for RTD (FP6). In order to obtain wide input from the IST community, a Web-based consultation system has been set up [7]. You are invited to reflect on future priorities and express your opinion using this tool.

Another interesting EC activity, in the field of electronic records management, is the DLM-Forum. Its goal is to investigate possibilities for wider co-operation in the field of electronic archives, both between the EU Member States and at Community level. The 1st and 2nd DLM-Forums (Brussels, December 1996 and October 1999), organised by the European Commission and the EU Member States, hosted some 800 experts and decision-makers from public administration, archives, the ICT industry and research. The DLM Monitoring Committee and its special working party plan to organise the 3rd DLM-Forum in 2002, during the forthcoming Spanish EU Presidency (1st half of 2002). This will provide an interdisciplinary European platform to present best practices and concrete solutions and to promote, with the support of DG Information Society, the European Network on Electronic Archives.

The EU eContent [8] programme was finally adopted by the Council on 22 December 2000, for the period covering the years 2001 to 2004 and with a total budget of EUR 100 million. The programme covers three main strands of action: improving access to and expanding use of public sector information, enhancing content production in a multilingual and multicultural environment, and increasing the dynamism of the digital content market. As a result of the call for proposals [9] for preparatory actions published on 20 April 2000, 28 projects have been chosen to stimulate the development and use of European digital content on the global networks and to promote linguistic diversity in the Information Society. Some of the selected projects relevant to the cultural heritage area are MNM (Minority Newspapers to New Media) and MUDICU (Multilingual Digital Culture Web Project). A complete list of selected projects is also available [10].

As we have already informed you, Culture 2000 [11] is a European Union financial support programme established to support cultural co-operation projects in all artistic and cultural sectors. In 2000, this Community programme provided aid for 219 projects amounting to over EUR 32 million [12]. The programme published a new call for proposals in January 2001, closing at the beginning of May. This year the focus is on projects which are aimed at young people, people with disabilities and disadvantaged sections of society, and which promote their social integration, and on projects which combine artistic, cultural and scientific quality and are accessible to the general public. In order to extend the impact of the programme, the Commission presented, on 12 March 2001, a proposal for a Council Decision on the participation of Central and Eastern European Countries in the programme. This requires a decision by association boards, which the Commission hopes will be taken soon enough for CEECs to participate in cultural projects backed by the European Union in 2001. The same conditions and procedures will apply to CEECs as to Member States.

On 14 February 2001, the European Parliament approved the Common Position on the EU Copyright Directive [13]. Libraries were successful in lobbying their interests during the preparation of this directive and managed to have the amendments most harmful to libraries rejected. The next stage, after adoption by the Council of Ministers (which should go through), will be the process of the Directive's enactment in the different Member States.

As in previous issues, we have reviewed the main developments of the work done by the Commission in the cultural heritage sector over the past months. In the next issue we will be able to tell you more about the Sixth Framework Programme developments and some of the initial results of the projects funded under our current programme. Keep in touch with this column.

References

  1. e-Culture: a newsletter on cultural content and digital heritage
    URL: <http://www.cordis.lu/ist/ka3/digicult/en/newsletter.html>
  2. Action lines for the digital heritage sector
    URL: <http://www.cordis.lu/ist/calls/200101.htm>
  3. Co-operation Agreement
    URL: <ftp://ftp.cordis.lu/pub/ist/docs/digicult/eu-nsf-call.pdf>
  4. Technological Landscapes for Tomorrow's Cultural Economy (DigiCULT)
    URL: <http://www.salzburgresearch.at/fbi/digicult/>
  5. eCulture Newsletter
    URL: <http://www.cordis.lu/ist/ka3/digicult/en/newsletter.html>
  6. DIGICULT
    URL: <http://www.cordis.lu/ist/ka3/digicult/>
  7. Consultation System
    URL: <http://www.cordis.lu/ist/fp6/fp6consult.htm>
  8. eContent Programme
    URL: <http://www.cordis.lu/econtent/home.html>
  9. Call for proposals
    URL: <http://www.cordis.lu/econtent/calls.htm>
  10. Projects
    URL: <http://www.cordis.lu/econtent/projects.htm>
  11. Culture 2000
    URL: <http://europa.eu.int/comm/culture/c2000-index_en.html>
  12. Cultural cooperation projects / Projets de coopération culturelle
    URL: <http://europa.eu.int/comm/culture/cp2000listebis.pdf>
  13. EU Copyright Directive
    URL: <http://europa.eu.int/comm/internal_market/en/intprop/intprop/news/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Concha Fernández de la Puente
European Commission
DG Information Society
Cultural Heritage Applications

concha.fpuente@cec.eu.int
<http://www.cordis.lu/ist/ka3/digicult/>

The information provided does not necessarily reflect the official position of the European Commission.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Fernández de la Puente, C. "DIGICULT Column", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/digicult/>

-------------------------------------------------------------

National Node Column: Belgium

By Pascale Van Dinter - May 2001

How is the CULTIVATE project dealt with in Belgium? Which project tasks have been carried out to date? What kind of information is available via the national Web site? And what kind of information is sent via the national e-list? Pascale Van Dinter, the Belgian National Node, attempts to answer these questions and more.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

This article is divided into three sections: how Archives, Libraries and Museums are organised in Belgium; the Belgian partner in CULTIVATE; and the services provided by CULTIVATE Belgium.

How are Archives, Libraries and Museums organised in Belgium?

Libraries

In Belgium public competences in the sector of libraries are widely spread among the federated entities of the country, the Cultural Communities, with the exception of the Royal Library (and the libraries of the federal scientific and cultural institutions) [1], which is placed under the responsibility of the Federal Office for Scientific, Technical and Cultural Affairs (OSTC) [2]. Responsibility for the other scientific and public libraries lies with the ministries of the Flemish, French and German-speaking Communities or even with the provincial and local authorities.

The policies regarding public libraries have been defined since 1978 by Community decrees and implemented by a special administration for public libraries located within the ministries of culture of the Communities. Higher Councils for public libraries (advisory bodies) have also been established in the Flemish and French Communities.

University libraries are placed under the authority of the leading bodies for each university. There is no central administration responsible at Community level for these libraries.

Cooperation between libraries has been steadily increasing in recent years.

Museums

Public competences in the sector of museums are also largely exerted by the Communities. Museums are organised and managed by various authorities: municipalities, provinces, the Communities themselves (a limited number of institutions), the federal State (mainly the major federal museums located in the Brussels region), learned societies and private organisations. The principal federal museums are scientific establishments with a large autonomy, placed under the administrative control of the OSTC.

A central administration is installed in the ministry of culture for each Community (Flemish Community: Unit Visual Arts and Museums; French Community: Unit Patrimony, Visual Arts, Craft and Folklore). The Communities have established a regulatory framework with regard to the public funding of museums. Higher Councils for museums (advisory bodies) have also been installed in the Flemish and French Communities.

Although museums do in general enjoy a large amount of autonomy, cooperation is now increasing. New collaborations are being developed with regard to training, learning, publishing, research projects (in particular for telematics developments) and the digitisation of collections.

Archives

The public archives are governed by the law of 24 June 1955, which organises the General State Archives and the State Archives in the Provinces (16 deposits). Together these institutions form a federal scientific establishment placed under the administrative authority of the OSTC. The 1955 law prescribes the conditions under which all public archives must be transferred to this establishment. It will soon be replaced by a law that will take into account, among other things, the new federal structure of the country.

The Communities have established a regulatory framework with regard to the approval and public funding of private archives. The General State Archives (AGR) are de facto a national focal point as regards archival activities.

Cooperation Between Memory Institutions in Belgium

There are no permanent structures for cooperation between ALMs, nor are there coordinated programmes. Some collaborations and synergies may occur between institutions that have the same organising/funding body or that work within the framework of thematic programmes (e.g. the federal programme of scientific support for the development of the information society).

One of the main aims of CULTIVATE is to strengthen national cooperation between ALMs within the framework of European programmes, and the project is as such a useful initiative for the memory institutions in Belgium.

More detailed information on the situation of ALMs in Belgium can be found on the Cultivate Web site [3].

The Belgian Partner in CULTIVATE

The Scientific and Technical Information Service (STIS)

The STIS [4] is a separate government agency within the Federal Office for Scientific, Technical and Cultural Affairs (OSTC). The mission of the STIS is to retrieve and disseminate scientific, medical and technical information and documentation; to promote the use of electronic information sources, especially from international providers of scientific and technical information; and to disseminate information about European research and innovation programmes.

Government agencies, universities, research centres and businesses, as well as the non-profit sector, can call upon the STIS, whose main priorities include scientific research, innovation and science policy.

How the STIS became a Partner in CULTIVATE

The National Focal Point (NFP) for European Libraries was created by the Belgian authorities in October 1990 in order to exchange and disseminate information and to follow up the Telematics for Libraries Programme of the European Union. The OSTC was responsible for the secretariat and presidency of the NFP. From 1990 to 1998 the NFP represented Belgium in the European network of national focal points created for the Telematics for Libraries activities of the 3rd and 4th framework programmes for RTD of the European Union.

The 'International Cooperation' Commission of the Interministerial Science Policy Conference decided in January 1999 to enlarge the NFP to include representation of the archives and museums communities, in order to reflect in an appropriate manner the new place of these memory institutions in the 5th framework programme. The NFP became the National Focal Point for European Archives, Libraries and Museums (NFP/ALM) and the STIS was asked to act as chairperson and to run the secretariat.

In 1999, when some members of the "old" European network of National Focal Points decided to submit the CULTIVATE proposal under the IST-programme, the STIS became the Belgian partner in the new network.

What are the services provided by CULTIVATE Belgium?

Tasks of CULTIVATE Belgium

The STIS plans to continue its information and assistance activities beyond 2002, after CULTIVATE has ended, within the new context of the European Research Area.

Information Sessions

On 5 February 2001, CULTIVATE Belgium organised an information day on European Archives, Libraries and Museums in the Royal Library Albert I in Brussels. This event attracted 100 participants from the ALM world. In the morning session presentations of EU programmes such as IST-DIGICULT, Culture 2000 and eContent were given by representatives of the EU. The afternoon session focused more directly on the 6th IST Call for proposals and on a presentation of the new Belgian CULTIVATE Web site. Belgian ALMs that have already worked on European projects presented their experiences in relevant areas. The Austrian partner of CULTIVATE also gave a presentation.

Belgian CULTIVATE Web site

Navigation bar of the Belgian CULTIVATE Web site

The Belgian CULTIVATE Web site [5] is multilingual: French, Dutch and English. It contains 5 main items:

Under CULTIVATE one can find detailed information about the CULTIVATE network and the services provided at European level.

National Focal Point presents the National Focal Point: a short history, the organisations that are represented in this national coordinating body, its tasks and its members. The subitem "Belgium, a Federal State" is especially designed for foreign visitors, to explain how the Federal State is organised.

EU-programmes contains the following subitems: general information about EU-programmes and funding, electronic tools for partner searching, and specific information on the different EU-programmes which are of interest to ALMs.

Via the item National Activities one can find an explanation of the national programmes, activities and research projects for ALMs implemented at federal and Community level. This section will be developed with the aim of giving a broad picture of the research and innovation activities of the Belgian ALMs, and it should help establish new collaborations with foreign partners.

Archives/Libraries/Museums gives a list of URLs of professional associations, advisory bodies, institutions and projects. The subitem "Report" presents a state-of-the-art overview of ALMs in Belgium and is especially interesting for foreign users who want to know more about the status and the institutional framework of the memory institutions in Belgium.

Belgian CULTIVATE e-list

Since March 2001 it has been possible to subscribe to the BE.CULTIVATE-list, an electronic list for those who want to keep informed about CULTIVATE and related European activities for ALMs.

Subscription to this list is possible from the Belgian CULTIVATE Web site [6]. The information distributed via this list is provided both in Dutch and in French. At the beginning of April 2001 more than 100 people had already subscribed to the list.

References

  1. Royal Library
    URL: <http://www.kbr.be/>
  2. Federal Office for Scientific, Technical and Cultural Affairs (OSTC)
    URL: <http://www.belspo.be/>
  3. Belgian Cultivate Web site section on ALMs
    URL: <http://www.be.cultivate-eu.org/reporte.htm>
  4. Scientific and Technical Information Service (STIS)
    URL: <http://www.stis.fgov.be/>
  5. Belgian CULTIVATE Web site
    URL: <http://www.be.cultivate-eu.org/elistcultivatee.htm>
  6. Belgian CULTIVATE e-list
    URL: <http://www.be.cultivate-eu.org/elistcultivaten.htm>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Pascale Van Dinter
Scientific Worker, STIS, and Project Manager, CULTIVATE Belgium
Keizerslaan 4 Bd de l'Empereur
1000 Brussels

Pascale.vanDinter@stis.fgov.be

Phone: +32 (0)2 519 56 42
Fax: +32 (0)2 519 56 45

Pascale Van Dinter worked between April 1995 and June 2000 for the Central Public Library Leuven (Belgium) on a number of European projects.

Since July 2000 Pascale has worked at the Scientific and Technical Information Service as a project manager for CULTIVATE Belgium amongst other tasks.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Van Dinter, P. "National Node Column: Belgium", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/nodes/>

-------------------------------------------------------------

At the Event:

-------------------------------------------------------------

OAI Open Meeting

By Rachel Heery - May 2001

The Open Archives Initiative (OAI) develops and promotes interoperability standards that aim to facilitate the efficient dissemination of content. In February Rachel Heery attended their Open meeting held in the Berlin State Library (Staatsbibliothek zu Berlin). The meeting marked the start of a validation period for the specification.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Open Archives Initiative (OAI) designers and early adopters launched the recently released OAI Metadata Harvesting Specification to a packed meeting in the Staatsbibliothek zu Berlin in February. Following on from a parallel event in Washington, DC in January, this meeting marked the start of a ‘validation period’ for the specification. Over the next year experimental implementations of the specification will inform the OAI and the wider community about the possibilities offered by the OAI model for metadata exchange. This brief article can only give a short summary of the many presentations from the interesting and varied programme; readers are referred to the OAI Web site [1], where copies of the presentation slides are available. In this short report I will merely highlight some of the themes that emerged and note some issues of particular interest.

After a warm welcome from Diann Rusch-Feja, Max Planck Institute, one of the European members of the OAI steering committee, the programme got underway. Presentations for the day included views from a number of stakeholders: the OAI executive, implementers from both information service and software development backgrounds, existing e-print archives, and vendors of library management systems.

Carl Lagoze, executive director of the OAI, led off with an overview of its history and an account of progress to date. The origins of the Open Archives Initiative lie in the e-print community, and the impetus for the initiative was a desire for effective interworking between e-print archives. In the early days the e-print community’s efforts were concentrated on enhancing interoperability between e-print archives, culminating in the Santa Fe convention in 1999 [2]. The work of the initiative continues to be relevant to this community; however, as time went on it became clear that the fundamental enabling technology for simple metadata exchange is relevant in a much wider context.

Carl explained that its harvesting protocol positions the OAI independently of any specific content or economic model. OAI’s future ambitions promise to have much broader relevance in opening up access to a wide range of digital materials. The ambition is to enable 'interoperability that will work', and at a low cost, so that the entry level for providing interworking services is lowered.

Paul Ginsparg, director of arXiv.org, the well-known e-print archive at Los Alamos National Laboratory (LANL), gave a perspective from the longest-established open archive. Serving the physics community, this pre-print archive is central to scholarly information exchange and has been successfully built on the model of author self-archiving. His analysis of both author and end-user interactions with the archive gave a fascinating insight into the patterns of user behaviour that can be gleaned from statistics. The LANL archive does not provide open access to robots at present and has no plans to change this policy. Paul explained this was primarily to exclude adverse impact on performance, but also indicated that such 'diffusion' of the target audience might not be beneficial. If this policy were to change it would be interesting to compare the way users of search engines, for example Google, interacted with the site with the behaviour of users who made direct access.

The rapid development of the OAI specification is certainly impressive, as is the early focus on a very specific, well-scoped implementation area. Carl went on to give a detailed consideration of the harvesting protocol and how it fits into the overall OAI interoperability framework. Drawing on work carried out with Herbert Van de Sompel, Carl gave a detailed presentation of the core concepts in the OAI metadata harvesting specification and how these are built into the protocol. The model is of a number of 'service providers' using the OAI protocol to harvest metadata from 'data providers', the protocol allowing a limited number of simple requests to be made within the gathering transaction. In order to facilitate interoperability, data providers must provide their metadata in simple Dublin Core using XML encoding, although they may choose to provide metadata compliant with other schemas in addition if they wish.
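
To make the request style concrete, here is a minimal sketch in Python of how a service provider might issue a ListRecords request and pull identifiers and titles out of the Dublin Core response. The base URL is hypothetical; the verb and parameter names follow the harvesting specification, but error handling and the resumption of long result sets are omitted.

  # Minimal OAI harvesting sketch: one ListRecords request against a
  # hypothetical data provider, yielding (identifier, title) pairs.
  from urllib.request import urlopen
  from urllib.parse import urlencode
  import xml.etree.ElementTree as ET

  BASE_URL = "http://archive.example.org/oai"  # hypothetical repository

  def list_records(metadata_prefix="oai_dc"):
      query = urlencode({"verb": "ListRecords",
                         "metadataPrefix": metadata_prefix})
      with urlopen(BASE_URL + "?" + query) as response:
          tree = ET.parse(response)
      identifier = None
      for element in tree.iter():
          tag = element.tag.rsplit("}", 1)[-1]  # ignore XML namespaces
          if tag == "identifier":               # from the record header
              identifier = element.text
          elif tag == "title":                  # from the oai_dc metadata
              yield identifier, element.text

  for identifier, title in list_records():
      print(identifier, title)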

The emphasis within OAI is on simplicity, and it will be interesting to see how far this simplicity will be retained in operational services, or whether there will be an imperative to provide refinements and differentials which will require adding complexity to the simple exchange of simple metadata.

The next part of the programme involved a number of first-hand accounts of implementation experience from representatives of the group of alpha testers of the OAI specification. The alpha test period ran from November 2000 to early 2001 and involved participation from institutions in a variety of domains. There were a number of approaches to alpha testing: some testers looked at making metadata available for harvesting (acting as data providers), others looked at the role of service providers gathering metadata from repositories, and some focused on developing compliant software building on existing systems.

Kurt Maly, Old Dominion University, gave an account of the experience of testing OAI harvesting from the perspective of a federated service of e-print archives. The alpha test involved harvesting data from arXiv, CogPrints, the Virginia Tech Thesis/Dissertation collection and several other institutional repositories. In a summary of lessons learned Kurt noted that the expense of maintaining a quality federation service is highly dependent on the quality of metadata declared by data providers. Using a unified controlled vocabulary, or at least defining mapping relationships, is important in a federated archive service. He also noted that in XML syntax a single character encoding error can affect a large set of data, and such errors occur frequently among data providers. Service providers also need to consider the trade-off between data freshness and harvest efficiency.
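
A simple defensive measure follows from this observation. The sketch below, in Python, shows the kind of check a service provider might run on each harvested response before loading it, rejecting anything that is not valid UTF-8 and well-formed XML so that one bad record cannot poison a whole batch. The function name is ours, not part of any OAI software.

  # Sketch of a pre-load check on harvested responses: reject anything
  # that is not valid UTF-8 and well-formed XML.
  import xml.etree.ElementTree as ET

  def is_clean(raw_bytes):
      try:
          ET.fromstring(raw_bytes.decode("utf-8"))
          return True
      except (UnicodeDecodeError, ET.ParseError):
          return False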

Heinrich Stamerjohanns and Susanne Dobratz explained testing of the protocol at the Humboldt University, Berlin, which runs an e-print archive service for theses, dissertations and scientific publications. The archive contains text in a variety of formats (SGML, XML, PDF, PS, HTML) as well as non-text data (video, simulations). The archive is now compatible with OAI version 1.0.

Jean-Yves Le Meur told of his experience at the CERN library. This involved a test collection of books and e-prints, for which metadata was provided in three formats: the mandatory Dublin Core, plus MARC and RFC 1807. One issue was scoping the collection to limit the metadata declared for the OAI repository, which meant trying to identify a sub-set of the whole CERN collection. Within the declared metadata there was also some question as to the best identifier to use. CERN also considered how full text identified by OAI metadata might be exchanged; at present the OAI protocol does not specify procedures for linking to full text.

Eva Krall, Ex Libris, outlined implementation of the OAI protocol in the library management system Aleph 500. Ex Libris were successful in using the OAI protocol to provide a simple means of maintaining a union catalogue as an alternative to message-based replication of data between systems. However, Eva noted that in the context of libraries there were some issues, such as the lack of authorisation mechanisms and the need to transfer holdings data, so some refinements and enhancements might be required.

Andy Powell, UKOLN, carried out a test implementation of the OAI protocol within the Resource Discovery Network, a co-operative network of UK subject gateways giving access to high quality Internet resources. Within the RDN cross-searching has been implemented using Z39.50, but because of performance issues and the difficulty of building a flexible browse interface there is interest in looking at a record-sharing solution. One of the issues that emerged from testing was the richness, or lack of richness, of the simple Dublin Core schema. Simple Dublin Core does not support all the elements included within RDN records; for example, it does not indicate the subject classification scheme in use. It may be that a richer metadata schema would be more appropriate. Andy indicated that issues of authentication and branding might also need further exploration.

Les Carr, University of Southampton, reviewed the eprints.org software which facilitates institutional and author self-archiving. The CogPrints Cognitive Sciences E-print Archive alpha tested the OAI protocol and is now OAI v1.0 compliant. The eprints.org software is freely available and is designed to be as flexible and adaptable as possible, so that universities world-wide can adopt and configure it with minimal effort for their institutional self-archiving needs. Les went on to consider ideas for building a citation database derived from analysis of use of e-print archives, and considered how analysis of use of archives might suggest the tools needed to support archive administration and user interfaces.

Future plans for implementation of the OAI protocol are being drawn up in different application areas. Donatella Castelli, Istituto di Elaborazione della Informazione, gave a brief overview of the Cyclades project, recently funded as part of the EC IST programme. Its aim is to support scholars in interacting with multi-disciplinary archives as members of networked scholarly communities. The project intends to develop a working space for groups to have shared access to their own documents, to other collections, and to related links and annotations. It will test whether such a quality service can be built on the OAI low-barrier interoperability framework.

Jeff Young, OCLC, then outlined activity within the ALCME project at OCLC. ALCME is working with the Networked Digital Library of Theses and Dissertations (NDLTD) to develop a name authority linking mechanism. Participants will create authority records in their local repositories and share them with other repositories using the OAI protocol for metadata exchange. Jeff intends to explore the use of RDF to enable participants to annotate each other's records. Further details of this ambitious application are available from the OCLC Web site.

During question time there was some reflection on existing alternatives to the OAI framework. Inevitably a comparison with Z39.50 was suggested. But how can one compare OAI and Z39.50? Considering all the functionality of Z39.50 would be far too wide a scope; however, a realistic comparison should include more than the capability of both protocols to gather all metadata instances from a compliant repository.

The event also gave the audience some insights into options for 'OAI next steps'. Presentations during the day prompted ideas (in this member of the audience) ranging from facilitating shared metadata creation, in effect collaborative cataloguing, to more specific implementation matters such as working towards recommendations for the optimal size of a metadata repository. Identifying criteria to guide the harvesting process also seems of significant importance, in order to achieve a balance between distributed and centralised repositories. Of major interest is the business impact of the OAI model: where is the burden of work located for services following the OAI 'technical framework'? The next year will give data providers and service providers the opportunity to explore some of these issues. Further meetings have already taken place, and more are planned to take this work forward.

References

  1. Open Archives Initiative Web site
    URL: <http://www.openarchives.org/>
    Please note that links to information about the alpha testers are available from the OAI site, so they are not listed here.
  2. The Santa Fe Convention of the Open Archives Initiative, Herbert Van de Sompel and Carl Lagoze, D-Lib Magazine, February 2000.
    URL: <http://www.dlib.org/dlib/february00/vandesompel-oai/02vandesompel-oai.html>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Rachel Heery
Research and Development Team Leader
UKOLN
University of Bath
BATH
BA2 7AY
United Kingdom

r.heery@ukoln.ac.uk
<http://www.ukoln.ac.uk/>

Phone: +44 1225 826724

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Heery, R. "OAI Open Meeting", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/oai/>

-------------------------------------------------------------

Internet Librarian International 2001

May 2001

The third annual Internet Librarian International Conference [1] was held between 26 and 28 March 2001 at Olympia 2 in London, with pre-conference workshops given on Sunday 25 March and post-conference workshops on Thursday 29 March. Caroline Milner of Rubicon Communications gave a preview of the conference in the last issue of Cultivate Interactive [2]. Members of the Cultivate Interactive team were at ILI and attended a number of the presentations.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

The Conference

The actual Conference ran over three days and offered three parallel programme tracks. Track A was named the 'Intranet professional’s institute' and covered issues surrounding Intranets, portals and knowledge management. Track B was the 'Webwizards’ symposium', covering tools and systems, Web design and management, and navigating the net. Finally, Track C dealt with e-resources and looked at content management, e-roles and e-learning. The running of three tracks meant that delegates could not see all the presentations given at the Conference; however, the choice of three speakers at all times meant that there should have been something of interest to most delegates at any point in the day.

Here is a mixed bag of the more memorable presentations:

William Hann gave a talk about the technicalities of running a portal. Hann is the managing director of Free Pint, a free email newsletter and Web site for information professionals [3]. His presentation covered how Free Pint had dealt with the realisation that it needed to generate income somehow. Initially William had gone for raising cash through venture capitalists, but had had a 'funny feeling' that it wasn't a good idea and backed out at the last minute, at a cost of several tens of thousands of pounds of his own money. It was probably a sensible decision given the current state of other dot.com companies. The approach Free Pint opted for in the end was one of 'organic growth', a process which has included adding adverts to the newsletter, getting sponsorship, adding company financials to the home page and spending six months on technical development. William was very open about the mistakes Free Pint had made and the lessons they'd learned. The presentation provided useful advice for any portal or Web publication looking for ways to improve its financial position.

Brian Kelly of UKOLN took us on a lightning tour of the rights and wrongs of hosting advertisements on public-sector Web sites. The issue of whether this form of income generation is acceptable in today’s political climate is a pertinent, yet still controversial one. UKERNA have recently published a fact sheet dealing with advertising on Janet that is very helpful [4].

Brian's polemical presentation was followed by the Web masters' roundtable panel session. The panel consisted of Brian Kelly again, Greg Notess from Montana State University-Bozeman Library, USA, and Mary Peterson from the Royal Adelaide Hospital, Australia. The Web masters initially gave some interesting tips on deconstructing your Web site and then discussed how libraries fit into the concept of the Web. It was pointed out that when Tim Berners-Lee developed the three key architectural components of the Web (data format (HTML), transport (HTTP) and addressing (URIs/URLs)) he forgot one of the most important components: metadata. The conclusion drawn was that the library community does have a number of valuable roles to play in the Web world, one of the more important being in driving metadata, but that its developments should be based on sound architectural principles which avoid vendor lock-in. Libraries should also keep up to date with new Web technology developments, such as XML, or face being left behind. The panel session involved a fair amount of interaction, with a profusion of comments from the audience.

There were also some worthy sessions on search engines given by Danny Sullivan and Greg Notess. Both discussed 'the death of search engines', a prediction made a few years ago that finally seems to be taking shape. An introduction to the 'Invisible Web' was given by Gary Price and Chris Sherman. The invisible Web is the area not indexed by search engines and believed to be between 2 and 50 times larger than the visible Web [5]. It mainly consists of a number of very important databases, many of them from the government sector [6]. The presenters have a book on their research coming out in July [7].

Steve Coffman and one of his colleagues gave a live presentation of the Virtual Reference Desk [8]. The VRD is a way of providing live online reference assistance to users in need of support. A number of US libraries have started running VRD programmes and Steve got some of the librarians to log on in their pyjamas (so to speak). Three slide sessions were given, one from London and two from different places in the USA. The whole presentation was very live and very dynamic. It was refreshing to see something different from the standard PowerPoint, and Steve's enthusiasm was enough to get us all excited.

The Exhibition

The conference ran in parallel with an exhibition. Exhibitions at conferences generally tend to be helpful only if you are on the lookout for a particular product or service, such as library equipment or a new content management system; the freebies given out usually only justify one lap of the exhibition hall. However, this year the ILI Advisory Committee had the insight to include a number of free workshops. Some of the workshops echoed the presentations given at the actual conference whilst others were totally unique. Our very own UK National Node, Rosalind Johnson, gave a tour of the new UK Portal & Cultural area of the Cultivate Web site [9]. Stephen Abram of IHS, Canada, and Bonnie Burwell, Burwell Information Services, Canada, gave a number of workshops on Intranet toolkits and e-learning. Steve Coffman also gave his global broadcast again, using innovative technology to join together attendees from all over the globe.

Conclusion

On the whole the Internet Librarian International Conference did seem to be lacking a certain something. With 45 presentations given over the three days it became difficult for the delegates (and the library world) to establish clear themes or threads arising from the event. The general nature of the Conference seemed to become a negative aspect, though many of the delegates seemed happy with the generic level of content. It is possible that the Conference could benefit from more involvement from UK and European speakers and a broadening of the remit to include the wider cultural heritage sector. This would, however, reflect changes in the UK and Europe which are not happening in the US, from where a large proportion of the delegates came.

References

  1. Internet Librarian International 2001
    URL: <http://www.internet-librarian.com/>
  2. Caroline Milner, Internet Librarian International 2001 Preview, Cultivate Interactive, issue 3, 29 January 2001
    URL: <http://www.cultivate-int.org/issue3/ili/>
  3. Free Pint
    URL: <http://www.freepint.co.uk/>
  4. Advertising on Janet
    URL: <http://www.ja.net/documents/factsheets/advertising.pdf>
  5. Google has indexed over 1.3 billion pages, 1,346,966,000 at the time of writing.
  6. Invisible Web databases
    URL: <http://www.invisibleweb.com/>
    URL: <http://www.completeplanet.com/>
    URL: <http://beta.profusion.com/>
  7. The Invisible Web
    URL: <http://www.invisible-web.net/>
  8. Virtual Reference Desk
    URL: <http://www.lssi.com/virtual/>
  9. Cultivate UK Web site
    URL: <http://www.uk.cultivate-eu.org/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Cultivate Interactive "Internet Librarian International 2001", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/ili/>

-------------------------------------------------------------

Praxis

-------------------------------------------------------------

An Introduction to Streaming Video

By David Cunningham and Neil Francis - May 2001

David Cunningham and Neil Francis report on the technologies available, as well as some of the problems encountered when trying to stream video content across the Internet.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Introduction

Advances in computing and networking technology mean that it is now feasible to deliver sound and video across the Internet. However, there are still many users with old computers and slow network connections, and care needs to be taken to ensure that streaming technology is not used inappropriately or without due regard for the target audience.

This article is based on our experiences at the University of Bath where we have experimented with both live and recorded video. There is a Web page [1] with examples of some videos streamed from the University of Bath.

Why Use Streaming Video?

Streaming video can be used for live or recorded events. The main reason for broadcasting live is to reach a wider and/or more dispersed audience. Typical live broadcasts could be lectures, sports or entertainment events, and academic or other ceremonies. For a major academic lecture given at a university the number of people who could actually attend would be limited by the size of the lecture theatre, whilst the potential audience could be anywhere in the world. Live video is essential if the aim is to give a remote audience an experience as close as possible to being physically present at the event.

If an event is broadcast live it is relatively simple to make a recording which can then be published on the Web for later viewing. However, there are many more possibilities with non-live broadcasts. A streamed broadcast should be considered to be a multimedia event, which could include full motion video if appropriate. Good examples of multimedia lectures can be found at the Boxmind site [2]. Boxmind use synchronised broadcasts with several streams containing video, audio, scrolling text, pictures or diagrams, and hypertext links. The synchronisation ensures that the text display corresponds exactly to the spoken commentary. Although the pictures and diagrams would be generally considered to be an integral part of the lecture, it could be argued that the audio and video are not essential. It is however widely accepted that information retention rates are much higher when a student can see and hear a lecture in addition to being able to read the text.

Figure 1: A sample e-lecture from Boxmind

What is Streaming Video?

Streaming technology is not new. Most people are familiar with it from an audio-only point of view, dating back to Marconi's invention of radio in 1897. Streaming video followed with TV from the mid-1930s onwards. Most people would refer to this as broadcasting, a concept that is well understood: a continuous stream of information is transmitted, and receivers are able to tune in and receive the information in real time.

Streaming across the Internet, although similar in concept, has its own specific issues that must be addressed. The Internet was not designed for real-time streaming: it is a shared medium and uses a best-effort delivery mechanism, the Internet Protocol (IP), to deliver content. There is no dedicated path between the source and the sink; IP breaks content up into self-contained packets and these packets are routed independently. Limited bandwidth, latency, noise, packet loss, retransmission and out-of-order packet delivery are all problems that can affect real-time streaming over the Internet.

In the main, traditional Internet traffic is not sensitive to these problems - or handles them higher up the protocol stack at its leisure. Live or on-demand streaming is a time-critical application which is sensitive to the variation in delay that is normal for a shared access network like the Internet. Not only does the amount of bandwidth you have access to matter, but also the consistency or quality of this bandwidth. All Internet streaming technologies get around this by buffering a certain amount of content before actually starting to play. The buffer irons out the natural traffic variations inherent in the Internet. Many seconds' worth of content can be buffered, and in excess of 30 seconds' worth is not uncommon. Note that after the initial buffering the streamed broadcast will start to play while more content is being downloaded. This is an improvement over earlier technologies, where the whole file had to be downloaded before playing could commence.
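
The effect of the buffer can be illustrated with a toy simulation, sketched below in Python. None of the numbers are taken from a real player; the point is simply that once a few seconds of content are in hand, jittery delivery draws the buffer down rather than stalling playback.

  # Toy playout-buffer simulation: playback starts only once the buffer
  # holds BUFFER_TARGET seconds of content; jitter then draws the buffer
  # down instead of interrupting the picture.
  import random

  BUFFER_TARGET = 5.0          # seconds of content to buffer before playing
  buffered, playing = 0.0, False

  for second in range(60):     # one loop iteration = one second of wall time
      buffered += random.uniform(0.5, 1.5)   # jittery delivery rate
      if not playing and buffered >= BUFFER_TARGET:
          playing = True                     # enough in hand: start playback
      if playing:
          buffered -= 1.0                    # consume one second of content
          if buffered < 0.0:
              buffered, playing = 0.0, False # underrun: rebuffer
      print(second, round(buffered, 2), playing)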

It is probably still safe to say that the majority of end users access the Internet over very narrowband dial-up links. Comparatively few people have the luxury of access at anything over 2Mbps; however, over the last couple of years cable and DSL access has been increasing, making bandwidths of between 128kb/s and 512kb/s available to end users. At these bit rates near-VHS quality rich media can be achieved through modern compression techniques and sophisticated codec technology.

It is not all about quality of bandwidth, however. Content creation, serving, usability and availability are also challenges that need to be addressed.

Standard Compression Technologies

There are a variety of compression systems in use today. The Motion Picture Experts Group (MPEG) [3] has three open (ISO/IEC) standards that can be used for streaming: MPEG-1, MPEG-2 and MPEG-4.

MPEG-7 (Multimedia Content Description Interface) is scheduled for release in July 2001, and work has started on MPEG-21 (Multimedia Framework).

Proprietary Compression Solutions - The Big Three

Despite the open standards of MPEG, most people use one of the big three proprietary formats: RealMedia, Quicktime and Windows Media. All three have specific advantages which have allowed them to gain ground in the market - mainly that they are free and that they support the Real Time Streaming Protocol (RTSP).

RealMedia [4]

RealPlayer is a very popular player which is very widely distributed and available for all major OS platforms. RealNetworks claim over 70% of the Internet streaming market, with the player installed on over 90% of home PCs.

RealPlayer is up to version 8, and the latest-generation codecs (developed with Intel), coupled with the SureStream technology, will probably keep RealNetworks in a dominant position. RealSystem 8 supports over 40 media formats. SureStream is an automatic multi-bit-rate technology that adjusts the streamed data rate to suit the client's connectivity; in practical terms this means that a single encoding will suit all users from dial-up to corporate LAN. Also supported is the Synchronised Multimedia Integration Language (SMIL), which allows mixed multimedia content to be delivered in a synchronised way.

RealServer is also available for most OS platforms but is only free for a basic 25-user licence. Streaming is RealNetworks' core business, so they cannot subsidise the technology in favour of market share as Apple and Microsoft do. Serving more than a couple of hundred simultaneous streams can become quite expensive, which is one major drawback of the system.

Quicktime [5]

Originally developed in 1991, Quicktime is now at version 4, with more than 100 million copies claimed to have been distributed world-wide. Quicktime's major advantages are its maturity and the large number of codecs available for it. It features an open plug-in architecture to allow third-party codecs to be added; MPEG-1 and MPEG-4 codecs are currently available.

The plug-in architecture has allowed over 200 digital media formats to be supported by Quicktime 4, with companies such as Sorenson Labs [6] producing very impressive codecs. As with RealPlayer, SMIL is available, and RTSP is now also supported (prior to version 4 only progressive streaming, not true real-time streaming, was available in Quicktime). Quicktime 5, currently in beta, also has support for immersive virtual reality.

The Quicktime server is supported natively in Mac OS. The open source Darwin Streaming Server [7] is available for other platforms and is free.

Windows Media Player [8]

Windows Media Player (currently at version 8) is the newcomer to the streaming world, and because of this there are fewer codecs available for it. There is an MPEG-4 codec and Microsoft's proprietary but very good ASF format. Microsoft have put some work into their RTSP implementation and it is considered more efficient than others. SMIL is supported, but only at a basic level.

Microsoft give the player away free and the company's marketing might means that the format is quickly gaining popularity. There are currently 220 million players in 26 languages in existence.

Microsoft's streaming server (called Microsoft Media Services) is free, supplied as standard with Windows 2000 Server and as a free download for Windows NT Server. Microsoft have not open-sourced the code, which means that other platforms are not supported. This is considered a major disadvantage as far as flexibility is concerned.

The Streaming Model

The components of an end-to-end streaming system are the client or player, the server and some sort of content creation process. As always, content is king, so the greatest amount of time will probably be spent on the creation process.

Content Creation

The designer of the content will use various production tools to create it. These tools convert audio, video or animation to a data format that the server can stream. Because most servers can deliver content in many different formats, there are a number of tools that people can use in creating content. Production tools can optimise the content for efficient delivery over the Internet, based on the nature of the material and the capabilities of the client computers.

Each of the big three provides tools for creating or converting content into a format that can be handled by their servers and optimised for Internet streaming. RealNetworks' RealProducer 8 will convert from a number of raw formats (AVI, MPEG-1, AU, AIFF etc.) and is free in its basic version. Apple's Quicktime Player (free and pro versions) also provides content authoring and import/export facilities, and Media On-Demand Producer is available free from Microsoft.

Sonic Foundry's [9] Stream Anywhere can be downloaded for about 125 US dollars and provides for the creation of streaming content in Real and MS Media formats. A more complete (and more expensive - upwards of 600 US dollars) product is Terran's Media Cleaner 5 [10]. Cleaner 5 is a complete suite of tools for preparing video and audio for the Web and is considered the industry leader in this field.

The content creator can also create a Synchronised Multimedia Integration Language (SMIL) file to synchronise several clips within a presentation. A SMIL file co-ordinates the layout and playing of two or more media clips in parallel (simultaneously) or in sequence. A typical example of this is a lecture or presentation with associated slides where the presentation of the slides can be synchronised with the audio content of the lecture.

RealNetworks have put the most effort into developing SMIL for the Web and have created the proprietary formats RealText, RealPix, RealVideo, RealAudio and RealFlash for use within a SMIL script. SMIL version 2.0 is currently in draft and will enhance the language significantly.

Creating content with SMIL [11] (which is based on XML) takes more time and effort, but the results are worth it. RealNetworks can supply the Oratrix Development program GRiNS [12] for the creation of SMIL texts. There are many examples on the Web showing how, for very little bandwidth, excellent media-rich presentations can be compiled which are much more informative and interesting than statically presented video.
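
As an illustration of how little is involved, the sketch below uses Python to write a minimal SMIL 1.0 file that plays a video clip and a slide stream side by side. The file names and region sizes are invented for the example; a real presentation would add timing and further media elements.

  # Write a minimal SMIL file pairing a video clip with a slide region.
  # File names and layout values are purely illustrative.
  SMIL_DOCUMENT = """<smil>
    <head>
      <layout>
        <root-layout width="640" height="240"/>
        <region id="video" left="0" top="0" width="320" height="240"/>
        <region id="slides" left="320" top="0" width="320" height="240"/>
      </layout>
    </head>
    <body>
      <par>
        <video src="lecture.rm" region="video"/>
        <img src="slides.rp" region="slides"/>
      </par>
    </body>
  </smil>
  """

  with open("lecture.smil", "w") as handle:
      handle.write(SMIL_DOCUMENT)

The <par> element is what delivers the synchronisation: everything inside it plays in parallel, while a <seq> element would play its children one after another.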

Local experience has shown that it is not usually sufficient simply to encode existing video content for streaming. Content producers need to be cognisant of the tremendous compression ratios that are common in this arena: subtle visual information is lost and picture sizes will be small. Limited camera movement is important, as are good lighting, simple backgrounds and close-ups of the subject material.

All the systems have ways of making it easy to provide a single link for users encompassing multiple data rates. This means that your files can stream without the user having to specify a particular bandwidth. Quicktime's approach is to create a different file for each data rate. This complicates the encoding process and does not address the issue of fluctuating bandwidth; however, having each file individually encoded does provide enormous flexibility.

RealNetworks' SureStream technology and Microsoft's Intelligent Streaming let you put multiple tracks in a single file, each with a different bit rate for delivery. Of the two, Real's SureStream is the more sophisticated and flexible, and if bandwidth fluctuation is an important factor in the delivery of your content it will deal with it best. Combining SureStream with SMIL is also possible.
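
The underlying idea can be sketched conceptually in a few lines of Python: given several encodings of the same clip, serve the richest one the client's measured bandwidth can sustain. The track rates and headroom factor below are invented for the example; the real products make this choice (and re-make it mid-stream) inside the server.

  # Conceptual multi-bit-rate selection: pick the richest track that
  # fits within the client's measured bandwidth, with headroom for jitter.
  ENCODINGS_KBPS = [20, 34, 80, 150, 300]    # hypothetical tracks in one file

  def pick_track(measured_kbps, headroom=0.8):
      usable = measured_kbps * headroom      # leave margin for jitter
      fitting = [rate for rate in ENCODINGS_KBPS if rate <= usable]
      return max(fitting) if fitting else min(ENCODINGS_KBPS)

  print(pick_track(56))    # dial-up modem -> the 34kb/s track
  print(pick_track(512))   # DSL or cable  -> the 300kb/s track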

The content creator can either prepare media clips in advance or encode a live event as it happens. Here the term encoder refers to the software (such as RealProducer, for example) that converts live or pre-existing media into a format that the server can deliver.

Streaming Servers

Just as a Web server delivers pages to Web browsers over the Internet, streaming servers deliver media clips to clients (clips are created with the production tools described above). Real-time streaming requires specific servers; RealNetworks, Microsoft and Apple all provide them. These servers give you a greater level of control over your media delivery but can be more complicated to set up and administer than a standard HTTP server. Also, real-time streaming uses special network protocols, such as RTSP or MMS (Microsoft Media Server).

The Client

Ideally the user should have a simple hypertext interface and have to do no more than click on a link. Any upgrade or download of a client player should be automated and transparent. In practice client downloads tend to be large, complicated procedures with too many options for the average user.

Availability

Streaming availability on the global Internet should ideally mean a server ready to stream content to any client with an interest in receiving it. Unfortunately the demand for, and availability of, media-rich content has led to a breakdown of the traditional client-server model. Single servers streaming content to diverse groups of clients distributed across the Internet are ineffective in terms of both server load and network congestion.

Over the last couple of years strategies have evolved in the commercial sector to address these problems. Content Delivery Networks (CDNs) are an attempt to introduce a coherent approach to building an infrastructure of caching proxies, mirror servers and proxy accelerators to enable more efficient and speedier delivery of streamed content to end users. The ultimate goal is to replicate content and bring it closer to the end user in a transparent fashion. In this way the user sees no URI changes and has no knowledge of (nor interest in) the actual source of the content.

There are several commercial CDNs already offering these services. Probably best known is Akamai [16] with its FreeFlow technology. Adero [17] use what they call their GeoTraffic Manager and Omnicast technology to move fresh content closer to the audience. Digital Island [18] does very much the same with its Footprint technology. They claim a 10-fold speed increase by distributing content to their world-wide network of servers.

iBeam [19] use their MaxCaster media serving system located in points-of-presence around the world. They use proprietary software and satellite networks to push content through their network. They claim more than 500,000 simultaneous streams now and will be capable of serving millions of streams in the future. Edgix [20] use their Edge Delivery Platform which includes edgeMedia, edgeNews, and edgeStream to ensure high performance delivery of content to end users.

Although the above are commercial ventures, the notion of and requirement for the CDN model has been appreciated generally. Over the last 12 months work has been carried out within the United Kingdom Education & Research Networking Association (UKERNA) [21] to look at providing a similar distributed resource for delivering streamed content within UK academia. The recent upgrade to JANET (SuperJANET4 [22]), providing 2.5Gbps backbone links increasing to 10Gbps over the next two years, is a huge leap in bandwidth availability. This offers excellent opportunities to experiment with streaming media, but is also cause for concern: without proper management even bandwidths as large as this can be swamped.

UKERNA intend to pilot an implementation of a content management system using the JANET core network. Content will be replicated at the edge of the (core) network and clients automatically directed to their nearest edge node. In this way core network resources are far more efficiently managed than in a centralised server model, and the end user should benefit from better and faster access to the resources they require.

The new SuperJANET4 backbone contains eight Core Points-of-Presence (C-PoPs) geographically located throughout the UK [23]. Here bandwidth and switching converge and offer the capacity to accommodate additional services and opportunities beyond pure transmission and routing.

Problems and Solutions

Given a good network connection streaming video works well, although in many ways it is surprising that it works at all. As mentioned earlier, the nature of the Internet and its use of IP means that a broadcast is competing with other data transmissions, and in general there is no way of guaranteeing sufficient bandwidth to ensure an uninterrupted broadcast. Video conferencing systems usually use other network technologies such as ISDN, which has a relatively low but guaranteed end-to-end bandwidth, or ATM, which can be set up with channels offering guaranteed Quality of Service (QoS). Bandwidth over the Internet is increasing rapidly, but unfortunately demand seems to be keeping up with supply, so increasing bandwidth alone is unlikely to solve the problem. Various developments are taking place which should ultimately result in QoS being available over IP [24] and this, together with the emergence of CDNs, should result in a rapid growth in the use of video over the Internet.

A more mundane, but nevertheless important, difficulty in our experience is that many users have trouble setting up their client machines to receive audio and video. In the case of PCs most users seem to need a PC expert to help them install, for example, Real Player. A more fundamental problem in many educational establishments is that teaching rooms have frequently been set up without audio hardware. Where there are a large number of machines in one area it is generally necessary to use headphones.

Because "live" broadcasts are not really live but are typically delayed for around 30 seconds it is difficult to set up remote feedback. For example where an on-line lecture has been publicised in advance it would be beneficial to allow questions from the remote audience. One way to achieve this would be to set up a Web page so that people could type in their questions, which could then to relayed to the lecturer by another person in the room.

Many organisations have their networks protected with a firewall and, even if normal Web traffic is allowed, special provision may have to be made to allow access to the ports used to receive streaming video. The same applies when serving video to the Internet from inside a firewall.

Despite the problems, our experience of streaming has shown that it is practicable to deliver multimedia broadcasts across local and wide area networks, provided the end user is connected to the network with a reasonably fast connection such as Ethernet, DSL or cable modem. We do not consider it feasible to use a dial-up modem connection to view full motion video streamed broadcasts, although it should be adequate for audio-only or slide-show presentations.

This article has concentrated on the technology needed to produce and deliver multimedia, and in particular video presentations. However, careful consideration should be given as to whether video is needed as part of a multimedia presentation. Although easy to produce, a continuous shot of someone talking direct to camera is technically demanding on bandwidth and probably adds relatively little to the presentation. In particular, when producing material for education and training a combination of slide shows, animation and recorded computer session together with a commentary is easier to deliver and in many cases more effective than full motion video.

References

  1. Video Streaming at the University of Bath
    URL: <http://www.bath.ac.uk/bucs/multimedia>
  2. Boxmind
    URL: <http://www.boxmind.com/>
  3. Motion Picture Experts Group (MPEG)
    URL: <http://www.cselt.it/mpeg/>
  4. RealNetworks
    URL: <http://www.real.com/>
  5. Quicktime
    URL: <http://www.apple.com/quicktime>
  6. Sorenson
    URL: <http://www.sorenson.com/>
  7. Darwin
    URL: <http://www.opensource.apple.com/projects/streaming>
  8. Microsoft Windows Media
    URL: <http://www.microsoft.com/windows/windowsmedia>
  9. Sonic Foundry
    URL: <http://www.sonicfoundry.com/>
  10. Media Cleaner 5
    URL: <http://www.terran.com/>
  11. SMIL
    URL: <http://www.w3.org/TR/REC-smil/>
  12. Oratrix Development (GRiNS)
    URL: <http://www.oratrix.com/>
  13. Ligos
    URL: <http://www.ligos.com/>
  14. e-Vue
    URL: <http://www.e-vue.com/>
  15. Project Mayo
    URL: <http://www.projectmayo.com/>
  16. Akamai
    URL: <http://www.akamai.com/>
  17. Adero
    URL: <http://www.adero.com/>
  18. Digital Island
    URL: <http://www.digitalisland.com/>
  19. iBeam
    URL: <http://www.ibeam.com/>
  20. Edgix
    URL: <http://www.edgix.com/>
  21. UKERNA
    URL: <http://www.ukerna.ac.uk/ukerna.html>
  22. SuperJANET4
    URL: <http://www.superjanet4.net/>
  23. SuperJANET Colocation
    URL: <http://www.superjanet4.net/colocation/>
  24. QoS Forum
    URL: <http://www.qosforum.com/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

David Cunningham
Assistant Director
BUCS
University of Bath
BATH
BA2 7AY
United Kingdom

D.Cunningham@bath.ac.uk
<http://staff.bath.ac.uk/ccsdc/>

Phone: +44 1225 826288

 

Neil Francis
Team Leader
BUCS
University of Bath
BATH
BA2 7AY
United Kingdom

N.J.Francis@bath.ac.uk
<http://staff.bath.ac.uk/ccsnjf/>

Phone: +44 1225 323571

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Cunningham, D. and Francis, N. "An Introduction to Streaming Video", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/video/>

-------------------------------------------------------------

Streaming Video: A Look Behind the Scenes

By Jim Strom - May 2001

Over the last few years there has been a dramatic improvement in the quality of IP-based network media technologies. Both real time and on-demand media can now be created, served and played at the desktop using PC-based platforms and software freely available across the Internet.

In our second article on this topic, Jim Strom, who was responsible for the multimedia presentations at The Future is Hybrid event in Manchester [1], gives us a behind-the-scenes look at what can be achieved with streaming video, using a number of examples and case studies.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Introduction

If you want to see a good example of streaming video then have a look at the TERENA (Trans-European Research and Education Networking Association) conference held in Lund, Sweden in June 1999 [2]. The whole conference was streamed out live across the Internet. When I watched it, I realised that streaming technology for distance learning had finally arrived. The quality of the video and audio was much better than anything I had ever seen or heard before. It was as good as actually being there at the conference. In fact it was better, because during parallel presentations I could pull down the separate streams and easily switch between sessions without having to shuffle between rooms. That's the real benefit of interactive Web media - doing things that can't be done in real life.

Behind the scenes, a great deal of technical preparation and resource was needed to achieve the quality of the video stream from Lund; however, the techniques can all be replicated on PCs using freely available software. The stream from Lund shows the presenter's slides running alongside the video. This was done using SMIL (Synchronised Multimedia Integration Language), which defines the layout and synchronisation of the different media clips in the stream. When we set up the Advanced Telematics Centre (ATC) [3] at the University of Manchester in 1999, we recognised the importance of this technique for e-learning and have since used it extensively in both live and on-demand developments with streaming media. This paper illustrates two case studies of streaming productions that the ATC has carried out using SMIL.

Some Basic Concepts

In order to play smoothly, video data needs to be available continuously and in the proper sequence without interruption. Until fairly recently, it had to be downloaded in its entirety to the PC before it could be played. With streaming, the file remains on the server. The initial part is copied to a buffer on the PC and then, after a short delay called the 'preroll', the clip starts to play and continues as the rest of the file is being pulled down. Streaming provides a steady method of delivery controlled by interaction between the PC and the server. The server regulates the stream according to network congestion and thereby optimises the presentation on the PC.

There are three software components involved in streaming: an encoder to create the content, a video server to deliver it, and a player on the client PC to view it.

Content can be On-demand or Broadcast

On-demand content is controlled by the client. The user can select a pre-recorded stream and also freely choose when to view it. Furthermore the user can control the video stream - pausing, jumping ahead/back, restarting, etc – just as with a video recorder. On the other hand, broadcast content is controlled and scheduled by the server. The content is only made available for viewing at selected times. The viewer can only watch the stream as it is being transmitted without any control over it, just as with a television or radio broadcast. Broadcast content can be scheduled to come from an archived file or can be a live transmission from an external audio/video device such as a camera or video recorder.

Streams can be Unicast or Multicast

A unicast stream refers to a single link from the video server down to the client for access to either on-demand or broadcast content. A single video server is able to handle many simultaneous unicast links to clients accessing the same or different content. Unicast streams can range from 20Kbps to more than 1Mbps, so an important consideration here is the aggregate bandwidth needed by the video server. 60 unicasts, for example, coming down together at 20Kbps each, are going to amount to a total bandwidth of 1.2Mbps (60x20Kbps) over the video server's local network.

Multicast gets around this problem by sending out a single stream that can be picked up by any number of clients, thus saving network bandwidth. In the case of the previous example, that would mean 20Kbps bandwidth usage for 1, 60 or 6000 simultaneous viewers watching the same broadcast. The number of viewers is immaterial. However, multicast only offers the client the opportunity to join a live or scheduled broadcast. The user has no control over the stream and cannot stop or restart the transmission. Multicasting is controlled by the network infrastructure itself and is dependent on the routers being multicast-capable. Institutions may be able to offer this internally over their own intranets if their routers are enabled. Also, over the academic network, we have access to Mbone - the academic multicast backbone. But outside of this, across the commercial Internet, there is very little opportunity to use multicast at the moment. Unicast is therefore the predominant form of streaming for global access.
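
The saving can be made explicit as a worked comparison, simply restating the figures from the example above:

\( B_{\mathrm{unicast}} = N \times r = 60 \times 20\,\mathrm{Kbps} = 1.2\,\mathrm{Mbps} \qquad B_{\mathrm{multicast}} = r = 20\,\mathrm{Kbps},\ \text{independent of}\ N \)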

Web Server vs Video Server

On-demand content can be delivered from a Web server rather than having to set up and manage a separate video server. When a user requests the video file, it starts to be copied down to the PC using HTTP like any other Web data. Control is passed to the player and the stream plays as the rest of the file is being brought down. However, HTTP and its underlying TCP/IP protocol are designed simply to transfer text and images reliably to the client and offer no inherent control over the stream. This leads to a bursty form of transfer rather than a steady delivery. In practice it is better to use a video server to regulate the transfer and give a smooth playback at the client end without rebuffering interruptions. For a useful overview of the pros and cons, see Microsoft's white paper on Streaming Methods [4].

Implementing Streaming Video

The two main contenders in the streaming arena are RealNetworks’ RealSystem G2 and Microsoft’s Windows Media Technologies. Both suppliers provide the basic three streaming software components for free, downloadable across the Internet. Apple’s QuickTime is also now capable of streaming, with the release of QuickTime 5. Table 1 shows the platforms and provision of the player, server and encoder software.

Product: RealSystem G2
URL: <http://www.real.com/>
  Player: Free (RealPlayer 8 Basic)
    Platforms: Windows 98/NT/2000 + Mac
  Video Server: Free (RealServer 8 Basic; up to 25 streams); $2,000 (RealServer 8 Plus; up to 60 streams); $6,000+ (RealServer 8 Professional; 100-2,000 streams)
    Platforms: Windows 98/NT/2000 + others (Linux, FreeBSD, Solaris, HP/UX, IRIX)
  Encoder: Free (RealProducer 8 Basic); $199 (RealProducer 8 Plus)
    Platforms: Windows 98/NT/2000 + Mac + Linux + Solaris

Product: Windows Media Technologies
URL: <http://www.microsoft.com/>
  Player: Free (Windows Media Player Version 7)
    Platforms: Windows 98/NT/2000 + Mac + Solaris
  Video Server: Free (Windows Media Server Version 7; up to 2,000 streams on a single-processor server)
    Platforms: Windows NT/2000
  Encoder: Free (Windows Media Encoder Version 7)
    Platforms: Windows 98/NT/2000

Product: Apple QuickTime
URL: <http://www.apple.com/>
  Player: Free (QuickTime Player 5)
    Platforms: Mac + Windows 98/NT/2000
  Video Server: Free (QuickTime Streaming Server 3)
    Platforms: Mac + Windows NT/2000 + others (Linux, FreeBSD, Solaris)
  Encoder: £18 + VAT (QuickTime Pro)
    Platforms: Mac + Windows 98/NT/2000

Table 1: Streaming Media Software from RealNetworks, Microsoft and Apple.

Both RealSystem G2 and Windows Media Technologies embody a rich set of streaming management features including multicast capability, security and authentication resources and tools for streaming slide presentations. Both video servers provide Dynamic Stream Control so that the stream rate is adjusted according to network conditions. Also both products offer Multiple Bit Rate Encoding. This is the ability to encode a single file that can be streamed out to clients at different data rates according to their access bandwidth. The server will automatically select the appropriate encoding suited to the client’s bandwidth when the connection is made.

In an ideal world you might expect to view any content on any player. However, in general, content generated by one supplier's encoder is only viewable through that supplier's player. Also watch out for version incompatibility between players and content: content encoded with the latest version of a supplier's encoder may not be viewable with earlier versions of the player. This problem occurs with some versions of RealPlayer.

Streaming Protocol and File Format

The video server uses a streaming protocol to manage the flow of data and control information with the client. RealSystem uses RTSP (Real Time Streaming Protocol), a standards-based client-server protocol for streaming media. Windows Media uses a proprietary protocol called Microsoft Media Server (MMS). Each system has its own file format for the streamed data. RealSystem encodes into a .rm file format. Windows Media encodes into an .asf or .wmv file format. You can see examples of these with the following references: [5], [6], [7], [8], [9], [10], representing both RealSystem and Windows Media encodings for 56Kbps, ISDN and DSL access rates. [You will need to have RealPlayer 8 and the Windows Media 7 Player installed to view these.] In each case, the same content has been encoded so a direct comparison of video and audio quality can be made. The playback window size is 320x240 pixels.
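
For illustration, the difference also shows up in the metafiles used to launch the players from a Web page: a RealSystem .ram metafile is a one-line text file giving the RTSP address of the stream, while a Windows Media .asx metafile is a small piece of markup pointing at an MMS address. A minimal sketch, with hypothetical server and file names:

Contents of clip.ram:

rtsp://media.example.ac.uk/antwerp56k.rm

Contents of clip.asx:

<asx version="3.0">
  <entry>
    <ref href="mms://media.example.ac.uk/antwerp56k.asf"/>
  </entry>
</asx>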

The content used in these examples is a 1-minute clip of Professor Frank Sumner from the University of Manchester (Figure 1). The clip was taken from a videoconference link with an assembled group in Antwerp where Professor Sumner was talking about his experiences at the time of the Manchester Baby (the first stored-program computer, which ran its first program on 21 June 1948).

Figure 1: Antwerp video clip used to compare the quality of Real Video and Windows Media

Buffering

The audio and video will come down as separate streams. During the preroll, the player will buffer a small amount before starting to play with the aim of bringing down the remainder as the clip is playing. If the client’s bandwidth is not sufficient to sustain the stream then there will be frequent pauses as rebuffering takes place. You can see this happening in the ‘Manchester Baby’ examples if the ISDN or DSL clip is pulled down over a modem link. The server will always try to maintain the audio stream at the expense of the video stream since interruptions in the audio are more noticeable than changes in the video.

Capacity Planning

Media files can become very large and therefore take up considerable disc space. In producing the 'Manchester Baby' clip, the original video was first captured for editing in an uncompressed AVI (Audio Video Interleave) format. This 1-minute clip at 24-bit colour depth, 15 fps and a 320x240 pixel frame size took up 267MB! When compressed for the two encoding systems, however, it came down to around 350KB (56K), 800KB (ISDN) and 4,300KB (DSL). Projecting from these figures, a 1-hour stream for access rates of up to 250Kbps is going to take up around 260MB of storage on the video server.
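
The 260MB figure can be cross-checked by scaling up the largest of the measured per-minute sizes (a rough projection that ignores any container overhead):

\( 4{,}300\,\mathrm{KB/min} \times 60\,\mathrm{min} = 258{,}000\,\mathrm{KB} \approx 260\,\mathrm{MB} \)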

SMIL you’re on Video

SMIL (Synchronised Multimedia Integration Language) is an XML-compliant markup language that provides a time-based, synchronised environment to stream audio, video, text, images and animation. SMIL is an officially recognised standard of the World Wide Web Consortium. RealSystem G2 and QuickTime 5 are SMIL compliant. The current Windows Media 7 player is not; however, the next release of Windows Media, due out later this year, will be compliant, albeit with a Microsoft flavour. So SMIL is set to become the common media mark-up language across all three streaming platforms.

A SMIL file (with a .smi extension) defines the layout and sequencing of the media clips. Sequential (<seq>) and parallel (<par>) tags allow you to specify that clips should be played either one after another or at the same time.
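
For example, a minimal fragment (the clip names here are hypothetical) that plays an introductory clip first, then runs a video and a caption stream together, would look like this:

<seq>
  <video src="intro.rm"/>
  <par>
    <video src="lecture.rm"/>
    <textstream src="captions.rt"/>
  </par>
</seq>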

Using a SMIL file to control a presentation means that a slide stream can be run in parallel with a presenter's video stream, scrolling or ticker text can appear and live links to other Web pages or media can be built in. RealNetworks have added some extensions of their own to the SMIL standard and have defined two additional data types: RealText (.rt), for streaming text such as tickers and captions, and RealPix (.rp), for streaming timed slide shows of images.
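
To give a flavour of the first of these, a RealText ticker is itself a small markup file. A minimal sketch (the attribute values and text here are illustrative only, not taken from the case studies below):

<window type="tickertape" width="318" height="30" duration="60">
<time begin="0"/>Welcome to the live Webcast ...
</window>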

Some Examples of Production

The following two case studies, together with some general guidance on production, provide examples of using SMIL with RealSystem G2.

Case Study 1

This event was a student fashion show held by the UMIST Department of Textiles. The requirement was to stream it out live over the Web and to provide an archived copy on the server. The encoder was a Windows 98 PC (Celeron 400, 64MB) with a Hauppauge WinTV capture card, running the free RealProducer Basic software. The encoded stream was fed over a LAN to a video server (WinNT4, dual PII 400, 128MB + SCSI drive) running the free RealServer Basic software. The event was advertised on the Department's Web site with a link to a SMIL file on the video server. The SMIL file produced a presentation (Figure 2) with three media inserts: the UMIST logo, some ticker text produced using RealText, and the live video window. The stream was archived onto the encoder PC as it was broadcast. The source video was also captured onto digital videotape and used to generate an edited version of the Webcast to go up on the server. The resulting stream can be seen at [11].

Figure 2: UMIST fashion show live Webcast

The SMIL code used for this is shown in Table 2. Basically the code defines the layout for the different areas in the presentation window and then, in the ‘body’, specifies the source and relationship of the three streams. The surrounding ‘par’ tags cause the streams to be played back in parallel.

A further example of a live Webcast done by the ATC can be seen at [12]. This was an outside event for the Mottram Millennium festival (Cheshire) where a civil war re-enactment was captured in the field (literally) and relayed back over a wireless link to a nearby cybercafe for encoding. From there it was exported over ISDN to the video server in Manchester. The example shows the use of scrolling text to support the video presentation.

<smil>
<head>
<meta name="title" content="UMIST Fashion Show 2000"/>
<meta name="author" content="Advanced Telematics Centre"/>
<layout>
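<!-- three regions: logo down the left-hand side, video top right, ticker text below the video -->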
<root-layout width="390" height="270"/>
<region id="logo" z-index="1" width="72" height="270" left="0" top="0" />
<region id="ticker" z-index="1" width="318" height="30" left="73" top="241" />
<region id="media" z-index="1" width="318" height="240" left="73" top="0" />
</layout>
</head>
<body>
<par>
<img region="logo" src="umist.gif" fill="freeze"/>
<textstream region="ticker" src="fashion_ticker.rt" fill="freeze"/>
<video region="media" src="fashion_video.rm" fill="freeze"/>
</par>
</body>
</smil>

Table 2: SMIL code used for the UMIST student fashion show.

Some things to bear in mind with live streaming

Case Study 2

This event was a public seminar on xDSL broadband access technologies with speakers from industry. The requirement was to obtain a good quality video of the presentations and then to create the streaming media afterwards. The set-up used a camera crew with two cameras, lights, fixed and roving mics, audio and video mixing, and so on.

The videotapes were edited and then used to generate the streaming video on an NT workstation running RealProducer Basic. The resulting presentation is shown in Figure 3. The speaker's slides were converted to JPEGs and encoded using RealPix so that they could be synchronised with the presenter's video.

Figure 3: xDSL Presentation by Patrick De Boeck, Telindus K-Net

The SMIL file used for this is shown in Table 3. There are six media inserts covering the presenter's video, the slides, Web site links (via the logos) and email links for the speaker and the ATC. The body of the code contains a switch element to differentiate three streaming bands - 150Kbps, 80Kbps and 20Kbps - and to stream a different encoding of the presentation slides and the presenter's video according to the access bandwidth at the client end. The resulting presentation can be seen at [13].

<smil>
<head>
<meta name="title" content="Telindus K-Net xDSL Presentation"/>
<meta name="author" content="Advanced Telematics Centre"/>
<layout>
<root-layout background-color="black" width="650" height="400"/>
<region id="background" z-index="1" left="0" top="0" width="650" height="400" />
<region id="atc-email" z-index="2" left="75" top="362" width="500" height="20" />
<region id="media" z-index="2" left="20" top="124" width="180" height="140" />
<region id="presentation" z-index="2" left="203" top="15" width="440" height="330" />
<region id="speaker-email" z-index="2" left="12" top="95" width="180" height="20" />
<region id="speaker-logo" z-index="2" left="50" top="33" width="111" height="50" />
<region id="atc-logo" z-index="2" left="30" top="350" width="30" height="36" />
</layout>
</head>
<body>
<par>
<img region="background" src="background.gif" fill="freeze" />
<ref region="speaker-email" src="speaker.rt" fill="freeze" />
<ref region="atc-email" src="atc.rt" fill="freeze" />
<img region="speaker-logo" src="telindus.gif?url=http://www.telindusk.net/" fill="freeze" />
<img region="atc-logo" src="atc.gif?url=http://www.telematics.eu.org/" fill="freeze" />
<switch>
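<!-- the player tests each group in order and plays the first whose system-bitrate its connection can sustain -->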
<par system-bitrate="150000">
<img region="presentation" src="telindushighband.rp" fill="freeze" />
<video region="media" src="telindushighband.rm" fill="freeze" />
</par>
<par system-bitrate="80000">
<img region="presentation" src="telindusmidband.rp" fill="freeze" />
<video region="media" src="telindusmidband.rm" fill="freeze" />
</par>
<par system-bitrate="20000">
<img region="presentation" src="telinduslowband.rp" fill="freeze" />
<video region="media" src="telinduslowband.rm" fill="freeze" />
</par>
</switch>
</par>
</body>
</smil>

Table 3: SMIL code used for the Telindus K-Net xDSL presentation.

Some things to bear in mind with on-demand encoding

Click and Go Video

Using video streaming in a live teaching situation and linking in presentation slides and other Web material is not straightforward; generation tools that make this process more automated and easier have yet to appear. The Click and Go Video project is devising a methodology for using video streaming in real-time (lecture-based) and collaborative (tutorial and distance-based) learning. One of the technical underpinnings of the project is to look at automating the set-up process using a Web-based SMIL generator. Figure 4 illustrates the type of options that may be presented to the user.

Figure 4: 'Click and Go' SMIL generator

Click and Go Video is being run under the JISC/DNER Programme: Enhancements for Learning and Teaching [14], and will evaluate its methodology with three teaching departments in Manchester - the Department of Textiles, UMIST (teaching of fashion marketing), Manchester Royal Infirmary (teaching of surgery) and the Department of Hospitality and Tourism, MMU (teaching of catering). Each of these case studies requires a different set of video production techniques and interaction for the learner covering:

Over 2001/2002, the project will be running a series of workshops on video streaming and producing ‘how to do it’ guides together with advice on learning and teaching best practice. Further details are available on the Click and Go Video Web site [15].

Viewpoint

As this paper was being finished, the Advanced Telematics Centre completed another live Webcast, this time for the International Association of TEFL Conference in Brighton. The opening keynote speaker, Professor Carol Chapelle from Iowa State University, came in over an Internet videoconference link to Edinburgh University and from there over an ISDN6 link to the conference in Brighton, with the videoconference being picked up by Manchester University and streamed out over the Web. Despite the rather complex connection set-up, the streaming worked. There were even email responses coming back to Brighton from people who were picking up the stream and who were able to have their questions relayed to the speaker. It was particularly fitting since the theme of the presentation was the use of Web technology in TEFL. See [16] for the recorded session.

Streaming video technology has clearly reached a state where it can start to be used effectively. As the process of capturing and exporting media clips becomes easier and more straightforward, and set-up tools become more widely available, we will start to see a rapid take-up and expansion of its use. Take a look at the Scottish Parliament site [17]: it is a vision of the sort of media-enriched Web site that we can increasingly expect to see.

Acknowledgements

Many thanks to Paul Hammond-White from the Click and Go Video project, and to Jules Newgrosh and Fotis Zapheiropoulos at the Advanced Telematics Centre, for all their help and advice in researching this paper.

References

  1. The Future is Hybrid (Manchester event, February 2001): Multimedia Presentations, Ariadne and Jim Strom, 23-Mar-2001.
    URL: <http://www.ariadne.ac.uk/issue27/hl-multimedia/>
  2. TERENA-NORDUnet Networking Conference 1999, Lund, Sweden.
    URL: <http://video.ldc.lu.se/terena_prog.htm>
  3. Advanced Telematics Centre, University of Manchester.
    URL: <http://www.telematics.eu.org/>
  4. Microsoft's Windows Media Technologies: Streaming Methods - Web Server vs Streaming Media Server
    URL: <http://www.microsoft.com/windows/windowsmedia/en/compare/webservvstreamserv.asp>
  5. Real Video Example, Antwerp clip, 56K
    URL: <http://www.telematics.eu.org/streaming/real56k.ram>
  6. Real Video Example, Antwerp clip, ISDN
    URL: <http://www.telematics.eu.org/streaming/realisdn.ram>
  7. Real Video Example, Antwerp clip, DSL
    URL: <http://www.telematics.eu.org/streaming/realdsl.ram>
  8. Windows Media Example, Antwerp clip, 56K
    URL: <http://www.telematics.eu.org/streaming/windows56k.asx>
  9. Windows Media Example, Antwerp clip, ISDN
    URL: <http://www.telematics.eu.org/streaming/windowsisdn.asx>
  10. Windows Media Example, Antwerp clip, DSL
    URL: <http://www.telematics.eu.org/streaming/windowsdsl.asx>
  11. Live Webcast Example: UMIST Student Fashion Show 2000
    URL: <http://khalibar.mcc.ac.uk:8080/ramgen/fashion2/fashion.smi>
  12. Live Webcast Example: Mottram Millennium Festival
    URL: <http://khalibar.mcc.ac.uk:8080/ramgen/mottram2000/mottram.smi>
  13. RealPix Example: xDSL Presentation by Patrick De Boeck from Telindus K-Net
    URL: <http://khalibar.mcc.ac.uk:8080/ramgen/telindus/telindus.smi>
  14. JISC/DNER (Distributed National Electronic Resource)
    URL: <http://www.jisc.ac.uk/dner/>
  15. Click and Go Video
    URL: <http://www.clickandgovideo.ac.uk/>
  16. Keynote Speaker: Carol Chapelle, International TEFL Conference, Brighton, April 2001.
    URL: <http://khalibar.mcc.ac.uk:8080/ramgen/iatefl/iatefl.smi>
  17. Scottish Parliament
    URL: <http://www.scottishparliamentlive.com/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Jim Strom
Project Director
Advanced Telematics Centre
University of Manchester
MANCHESTER
M13 9PL
United Kingdom

jim.strom@man.ac.uk
<http://www.telematics.eu.org/>

Phone: +44 161 868 0545
Fax: +44 161 868 0565

Jim Strom is employed as a Team Leader for Telematics Applications at Manchester Computing, University of Manchester. His responsibilities cover management of the Advanced Telematics Centre, which provides advice and practical support to regional SMEs and public sector organisations in the use of interactive Internet technologies for business development and innovation. He is also the Project Director for the Click and Go Video project. Over the last 15 years he has been involved in university teaching for communications networks and distributed systems. Recent research covers Internet-based electronic commerce for SMEs in the local economy and the evaluation of IP-based desktop videoconferencing and streaming resources for deployment across city and regional network infrastructures.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Strom, J. "Streaming Video: A Look Behind The Scenes", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/scenes/>

-------------------------------------------------------------

900 Years of Jewish Marriage Contracts at the Jewish National and University Library

By Elhanan Adler and Orly Simon - May 2001

Elhanan Adler and Orly Simon describe how the Jewish National and University Library (JNUL) has begun a digitisation project aimed at making many of its unique collections accessible to remote users [1]. The first stage of this project, recently completed, is the digitisation and cataloging of the JNUL's unique collection of some 1200 ketubbot (Jewish marriage contracts). This collection contains both manuscript and printed ketubbot from a wide variety of countries and time periods, many of them beautifully illuminated. The search engine allows access by country, city, date, and all persons named (bride, groom, witnesses), and retrieves colour images in several resolutions.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

The Jewish National and University Library

The Jewish National and University Library (JNUL), founded over 100 years ago, today serves a threefold purpose as the Central Library of the Hebrew University of Jerusalem, the National Library of the State of Israel and the National Library of the Jewish People. In its latter role (chronologically its initial one) the JNUL strives to collect materials of all types which reflect or represent the history of the Jewish people and its culture throughout the world. The JNUL collections of Hebraica and Judaica are the largest in the world.

Figure 1: Ketubba. Jerusalem, Eretz Israel
Paper 64 x 56 cm.

In the last two years the JNUL has embarked upon a major digitisation project aimed at making significant parts of its collections accessible worldwide. With the aid of a multi-year grant from the David and Fela Shapell Family, digitisation of several entire collections at the JNUL is underway covering manuscripts, rare and ancient maps, Jewish music and fragile printed materials. The ketubbot collection is the first one to be completed and made Internet accessible.

Jewish Marriage Contracts (Ketubbot)

For over 2000 years Jewish law has required that every husband present his wife, at the time of their marriage, with a marriage contract or ketubba, guaranteeing the wife's financial rights in case of the husband's death or divorce. While the core text of the ketubba has changed very little over the ages (and much of the text is still written in the ancient Aramaic language), over generations various local customs found their way into the legal text of the ketubba. Their decoration often reflects the Jewish art of the locality and period. Some ketubbot are ornate, illuminated manuscripts which are considered valuable works of art and can be found in museums and private collections throughout the world. Even today many couples will forgo the standard printed ketubba provided by rabbinic authorities in favor of a personalised, illustrated one prepared by a calligrapher which is subsequently framed and prominently displayed in the couple’s home. Ketubbot are therefore a rich source of material on Jewish history, customs, genealogy and art. The fact that, as legal documents, ketubbot always contain exact dates and place names also allows their absolute identification with specific communities and periods.

Figure 2: Ketubba. Herat, Afghanistan, 1812
Paper 59.5 x 42.5 cm.

The JNUL Ketubbot Collection

The collection of Ketubbot in the Jewish National and University Library numbers over 1200 items, arguably the largest such collection in the world. The ketubbot originate in over 50 different countries, providing a wide gamut of both textual and artistic variation. While most are entirely handwritten, some are printed forms with the personal details added by hand, and there are even some blank forms used in specific localities. The earliest ketubba in the collection is from Eretz Israel and dates from 1024; it was preserved in the Cairo Geniza, one of the most important sources of medieval Jewish manuscripts. The most recent is from Jerusalem in the year 2000. From the start of the project it was decided to digitise the entire collection and not just selected items as it was felt that researchers would benefit most from access to as many of these items as possible. For the same reason the JNUL has invited other libraries and collections to join in the project by depositing digital images of their ketubbot at the JNUL site, and adding their metadata to the project catalog. The JNUL hopes that this project will ultimately expand to be a world repository of ketubbot.

Textual Variation in Ketubbot

The JNUL collection provides an excellent opportunity to study the local customs and legal stipulations which were practiced in various Jewish communities. The JNUL collection includes many ketubbot with additions relating to such topics as dowry, inheritance, polygamy (practiced for many years in some Oriental communities) and care of children from previous marriages. In North Africa and Yemen, the ketubba often contained an obligation by the groom not to force his wife to move to another city without her consent, and in some ketubbot from Syria and Eretz Israel, there is a clause barring the husband from going on distant journeys without leaving his wife a conditional bill of divorce, to spare her the status of abandoned wife (aguna) in case of his disappearance. An interesting local custom of the town of Ioannina, Greece, has the groom subsequently attesting to the bride’s virginity as an addendum to the ketubba (this addition is found in all five Ioannina ketubbot in the collection, spanning a period of 100 years).

Artistic Variation in Ketubbot

The JNUL collection contains hundreds of illuminated ketubbot whose border decorations reflect both Jewish texts and symbols and the prevailing art of the locality. Italian illuminated ketubbot show the strong influence of secular artistic preferences (human figures, nature scenes), while in ketubbot from Islamic countries geometric designs are prevalent.

Figure 3: Ketubba. Rome, Italy, 1771
Parchment 83 x 51 cm.

Technical Details

Since many of the ketubbot are works of art, it was decided to film and scan each ketubba to a single high standard appropriate to the quality of the originals. Accordingly, the ketubbot were filmed using Cibachrome 35mm colour microfilm, and subsequently scanned using Kodak Photo-CD Pro. The resulting 72MB TIFF images were then individually processed using Photoshop to produce three public images: a "thumbnail" of 9.5cm height (average size: 31KB), a screen-width image 750 pixels wide (average size: 200KB), and a large, no-reduction conversion from TIFF (average size: 1.5MB). The latter image, equivalent to viewing the ketubba through a magnifying glass, almost always allows exact reading of the text. In several cases where the text was particularly difficult to read, an additional, specially enhanced image was added.
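
As a cross-check on the quoted master file size: the top Photo-CD Pro resolution is 6144 x 4096 pixels (an assumption on our part; the authors state only the file size), and at 24-bit colour (3 bytes per pixel) this gives

\( 6144 \times 4096 \times 3\,\mathrm{bytes} \approx 75.5 \times 10^{6}\,\mathrm{bytes} \approx 72\,\mathrm{MB} \)

which matches the quoted TIFF size.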

The photography was done in-house by the JNUL Photographic Department. Scanning, post-scanning image processing and other graphic needs were outsourced to a commercial service bureau. Quality assurance of the images was done by the JNUL Automation Department.

The above public images are all saved at a resolution of 72 DPI and bear a copyright notice. This resolution provides excellent viewing but is not of sufficient quality for commercial reproduction. High-quality copies can be produced by the JNUL from the archival TIFF master files upon request.

Metadata and Presentation

Detailed cataloging of the ketubbot was done by the Manuscript Department of the JNUL (previously only brief cataloging had been done). Cataloging was based on the MARC format with a few local fields added. In order to be both accessible worldwide and faithful to the original text, many fields were entered twice - once in English or Romanized form, and again in the original Hebrew characters. In order to maximize the usefulness of the project for genealogical research, all names (bride, groom, witnesses, etc.) were recorded.

The data was entered into an ALEPH-300 database (ALEPH-300 being the standard library system of all Israeli university libraries). The various fields are searchable using the ALEPH-300 WWW public catalog software and the bibliographic records display the URLs of each ketubba.

Figure 4: ALEPH record

Since the use of library catalog software requires some expertise, and since we expect that most casual browsing of the collection will be by country or city, we have also created a single HTML geographic listing with direct access to each ketubba.

Figure 5: Access by country list

The initial display page of each ketubba provides brief bibliographic information and a “thumbnail” image of the ketubba. Links on this page lead to full bibliographic information (the ALEPH record) as well as to the additional, larger images.

Figure 6: Example of ketubba page

A series of virtual exhibitions based on the collection and prepared by relevant scholars is planned. The first exhibition, on Jerusalem in ketubba decoration, is now in preparation.

Summary

The JNUL ketubbot collection digitisation project is but the first of a series of projects aimed at making the Library’s collections accessible worldwide. Further stages of the digitisation program will involve additional JNUL collections such as manuscript books, ancient Holy Land maps, the Albert Einstein Archive and the National Sound Archives.

References

  1. The collection can be accessed via the JNUL Web site
    URL: <http://jnul.huji.ac.il/>
    Or directly at:
    URL: <http://jnul.huji.ac.il/dl/ketubbot/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Author Details

Elhanan Adler
Director, Israel Center for Digital Information Services
c/o Jewish National and University Library
P.O.B. 34165
Jerusalem,
Israel

Elhanan@libnet.ac.il

Orly Simon
Head, Computation Department
Jewish National and University Library
P.O.B. 34165
Jerusalem,
Israel

orlysi@savion.huji.ac.il

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

For citation purposes:
Adler, E. and Simon, O. "900 Years of Jewish Marriage Contracts at the Jewish National and University Library", Cultivate Interactive, issue 4, 7 May 2001
URL: <http://www.cultivate-int.org/issue4/ketubbot/>

-------------------------------------------------------------

Cultivate Interactive Issue 4: News and Events

The content on this page is current at the time of publication (May 2001), but will become out of date. To reach a more recent issue of Cultivate Interactive use the 'Current Issue' link in the top green navigational bar.

News

Vucedol Orion and the 'Oldest European Calendar'

An exhibition showing what the Zagreb Archaeological Museum has called 'the Oldest European Calendar' is now taking place and will run until the end of June 2001.

"The right bank of the Danube River in eastern Croatia was settled by members of the Vucedol Culture at the beginning of the third millennium BC. This predominant cultural phenomenon had a great influence on other contemporary cultures, and it also left behind traces in European heritage as a whole. One pot from the Vucedol layer in Vinkovci, dated prior to 2600 BC. displays the most complete European (Indo-European) calendar based on astral symbolism representing the relevant constellation characteristic for all four seasons. The calendar is synchronous with the Sumerian and Egyptian calendars. The year at Vucedol began with the spring equinox, when the Sun symbolically supplanted the most important winter constellation of Orion. To be more exact, on that night Orion's Belt would appear for only a short while in the winter sky before disappearing for several months. This chance circumstance no longer exists today, but it helped the Vuedol people to determine the first day of the new year."

The exhibition has been organised by the Archaeological Museum of Zagreb and the Municipal Museums of Vinkovci and Vukovar. The initiator of the exhibition is Aleksandar Durman.

Further Information?: Those wishing to find out more should contact Jacqueline Balen or look at the exhibition Web site.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

The EC's Fight the Fog Campaign - or how to write clearer documents

The European Commission's Translation Service is running a campaign called "Fight the FOG" to encourage authors and translators to write more clearly. This light-hearted campaign draws attention to the dangers of FOG - that vague grey pall that descends on EU documents, obscuring meanings and messages, causing delays and irritation.

The campaign activities include:

Further Information?: See the Fight the FOG Web site or the Eurospeak article in this issue of Cultivate Interactive.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Comparative Literature and Culture Journal

For a journal detailing work in the humanities - cultural studies, comparative literature, comparative cultural studies, culture theory, literary theory, communication and media studies - have a look at Comparative Literature and Culture: A WWWeb Journal, CLCWeb for short. CLCWeb is published quarterly in free-access mode by Purdue University Press. The journal is archived by the National Library of Canada. Material submitted for publication is peer reviewed (blind). In addition to new work in scholarship, the journal maintains a LIBRARY with cumulative and selected bibliographies for work in the humanities (postcolonial studies, ethnic minority writing, film and literature, etc.) and selected research and course materials, such as those for audience studies.

Further Information?: For more information see the CLCWeb Web site.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Photos of the Renardus DDC Mapping Workshop

Photos of the Renardus DDC Mapping Workshop, held at SUB Göttingen on 22 and 23 February 2001, are now available for viewing.

Further Information?: For more information view the UKOLN Web site.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

The Asia IT&C Programme

The Asia IT&C Programme supports mutually beneficial partnerships in IT&C between Europe and Asia. The Programme's contact database is now accessible from the "Search for partners" page of the Asia IT&C Web site. A Call for Proposals 2001 has also now been published on the Web site of the European Commission. The Call for Proposals 2001 and the Guidelines for Applicants 2001 (and annexes) can be downloaded from the "How to apply" page of the Asia IT&C Web site.

Further Information?: For more information see the article in this issue of Cultivate Interactive - Bridging the Digital Divide in Asia.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

The Renaissance Library Calendar

The new Renaissance Library Calendar for 2002 will feature:

The 2002 edition will be available from June.

Further Information?: For more information send an email to info@isim.org putting the word 'calendar' in the Subject field, and leaving the rest of the message blank. To find out about the 2000 calendar see the Creation of the Renaissance Library Calendar article in issue 2 of Cultivate Interactive.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

The Intellectual Property Rights Helpdesk (IPR-Helpdesk) CD-ROM

The Intellectual Property Rights Helpdesk (IPR-Helpdesk), a project funded by the European Commission Innovation Programme of Enterprise DG, has produced a CD-ROM for the EU-RTD community. Entitled It all starts with an idea, the CD-ROM incorporates a complete offline copy of the multilingual IPR-Helpdesk Web site: more than 10,000 pages of IPR information in English, French and German.

The CD-ROM It all starts with an idea is available free to all current and potential EU-RTD contractors.

Further Information?: If you would like to order a copy then either send an e-mail to promo@ipr-helpdesk.org, telephone +352 47 11 11 1, or fax your request to +352 47 11 11 60. In all cases please provide your full details. To find out more about IPR-Helpdesk see the articles in Cultivate Interactive issues 1 and 2:
URL: <http://www.cultivate-int.org/issue1/ipr/>
URL: <http://www.cultivate-int.org/issue2/ipr/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

COVAX News

COVAX (Contemporary Culture Virtual Archive in XML) has produced a newsletter for those in libraries, museums and archives interested in Internet developments.

Further Information?: For more information view the COVAX Web site.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Events

Libraries Without Walls 4
The Delivery of Library Services to Distant Users: Distributed Resources - Distributed Learning

Where?: Aegean Island of Lesvos, Greece
When?: 14 September - 18 September 2001
Source?: Email from Zoe Clarke

The fourth Libraries Without Walls conference is organised by CERLIM - The Centre for Research in Library and Information Management. The conference continues the tradition of the LWW Conferences by bringing together international perspectives on the delivery of library services to users distant from the physical library. When the first LWW Conference was held in 1995, the focus was primarily on distance learning and geographical dispersion. Since then, however, rapid advances in the development of ICT (Information and Communications Technologies) based infrastructures and services have led to a situation where many library users now routinely access services remotely - even when 'remotely' means 'within sight of the library building'. As a previous conference attendee observed, "we are all distance learners now".

Conference themes:

Further Information?: See the conference Web site

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Cultural Heritage Support Actions Concertation Event

Where?: Austrian Bundesministerium für Bildung, Vienna, Austria
When?: 21 June 2001
Source?: Email from Concha Fernández de la Puente

DG Information Society, Cultural Heritage Applications unit, is preparing a concertation event for support actions in the area of cultural heritage. The event will be hosted by the Austrian Bundesministerium für Bildung, Wissenschaft und Kultur, and organised in collaboration with the Cultural Service Centre Austria, the Austrian CULTIVATE node. The objectives are to discuss issues related to the current IST support actions and to look into emerging trends and the future needs of this type of action.

Further Information?: Contact Concha Fernández de la Puente

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Institutional Web Management Workshop

Where?: Queen's University, Belfast
When?: 25 June - 27 June 2001
Source?: UKOLN Web site

The Fifth Institutional Web Management workshop will cover a range of topics of interest to members of Web management teams in Higher and Further Education Institutes, and will include multimedia, dynamic content, personalisation, Web design, e-business, Web strategies and general management issues.

Further Information?: See the Workshop Web site

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

There are more News Items and Events listed on the Cultivate Web site.
URL: <http://www.cultivate-eu.org/newsandevents/>

For information on European Jobs currently available see the Jobs section.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Cultivate Interactive Issue 4: Jobs Section

The content on this page is current at the time of publication (May 2001), but will become out of date. To reach a more recent issue of Cultivate Interactive use the 'Current Issue' link in the top green navigational bar.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Position?: Assistant Web Editor (Graphic Design)
Company?: Information Services Directorate - Learning Support Services, The University of Nottingham, UK
Closing Date?: 14 May 2001

The University is seeking to appoint a member of the web team, helping to develop the University’s web corporate identity, producing graphics in the University’s web corporate identity style and applying graphic design to other sites across the University. The successful candidate will work with University Schools and administrative departments to implement the University’s web corporate identity and to provide technical input into the Web Support team.

Candidates must have a good honours degree in graphic design and the required blend of communication, technical and design skills. A working knowledge of mark-up languages including HTML is essential, and knowledge of PHP is desirable. Candidates should also have excellent communication and interpersonal skills. Experience of proofreading, general editorial skills and an awareness of the academic use of the Web are also essential.

Please send applications to:
Personnel Office
Highfield House
The University of Nottingham
University Park
Nottingham
NG7 2RD

URL: <http://www.nottingham.ac.uk/personnel/vacancies/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Position?: Development Officers
Company?: North East Museums Libraries and Archives Council (NEMLAC), UK
Closing Date?: 31 May 2001

North East Museums Libraries and Archives Council (NEMLAC) is the new regional development agency for this sector in the North East region. NEMLAC is appointing four full time Development Officers to assist in the following areas:

Please send applications to:
Christine Peacock
NEMLAC
House of Recovery
Bath Lane
Newcastle upon Tyne
NE4 5SQ
UK
Phone: 0191 - 222 1661

URL: <http://www.nemlac.co.uk/>

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

-------------------------------------------------------------

Cultivate Interactive Issue 4: Misc.

Articles

Other Misc. Items

-------------------------------------------------------------

Cultivate Interactive Competition - Spot the European City

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Time again for a bit of light relief with the Cultivate Interactive Competition.

Below are two sets of four pictures. Each set of four represents a different European city. All you have to do is decide which cities are being shown.

City 1

City 2

The answers should be sent to cultivate-editor@ukoln.ac.uk before the closing date of 1st August 2001. Names will be drawn out of a hat and the winner will receive a book token. Good Luck!!

Issue 3 Winner

The winner from issue 3 was Seamus Keating from Barcelona, Spain. Congratulations!! A book is on its way to you.

The answers were:

For more fun try the Cultivate Interactive Scramble game.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

-------------------------------------------------------------

Cultivate Interactive Scramble Game

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

The Cultivate Interactive Scramble Game is here to stay. It is very easy to play. All you need to do is start the game using the link below, then click on the 'Scramble' button to mix up the pieces of the picture below and then rearrange them all back again by hovering your cursor over the piece you would like to move. Those who can manage the game in under 2 minutes deserve a pat on the back.

Start the Scramble game.

The picture is of a ketubba, a Jewish marriage contract, from Jerusalem. More ketubbot can be found at the Jewish National and University Library's Ketubbot Digitisation Project.

The Cultivate Interactive Scramble Game has been provided courtesy of Dynamic Drive.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

-------------------------------------------------------------

Cultivate Interactive Issue 4: Links Section

Who is linking to Cultivate Interactive Web magazine?

In this section we will review some of the sites that have chosen to link to us. If you would like to be mentioned in the next issue then please Contact Us.

If you would like to see how many sites are linking to Cultivate Interactive have a look at Link Popularity.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Web Site Name?: Horizon
URL?: <http://horizon.unc.edu/history/>
Description?: Horizon's mission is to inform educators about the challenges that they will face in our increasingly technically enabled world and the steps they can take to meet these challenges. They offer a number of services including the Horizon site, the Horizon mailing list, seminars and workshops, and conferences. These aim to explore and extend our thinking as an educational community about the implications of a rapidly changing world and what we can do to make educational organizations and programs more effective in the future. They provide a wealth of links in the Educational On-Ramp section to valuable Web data sources that provide historical data and informed discussion related to the future of education.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Web Site Name?: Preserving Access to Digital Information
URL?: <http://www.nla.gov.au/padi/>
Description?: PADI is a subject gateway to digital preservation resources. It is based at the National Library of Australia and aims to provide mechanisms that will help to ensure that information in digital form is managed with appropriate consideration for preservation and future access.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Web Site Name?: Digital Libraries Initiative Phase Two
URL?: <http://www.dli2.nsf.gov/>
Description?: Digital Libraries Initiative Phase Two is a multiagency initiative which seeks to provide leadership in research fundamental to the development of the next generation of digital libraries, to advance the use and usability of globally distributed, networked information resources, and to encourage existing and new communities to focus on innovative applications areas.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Web Site Name?: The Library Association's IST Programme Pages
URL?: <http://www.la-hq.org.uk/directory/prof_issues/oppsineurope/what.htm>
Description?: The Library Association has provided a very useful introductory site to opportunities in Europe. The site lists the IST Programme and other Fifth Framework Programme opportunities. There are also links to other possible funding sources such as the TEN-Telecom Programme, eContent Programme, PROMISE Programme and Structural Funds.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

-------------------------------------------------------------
