This evaluation looked at the version of the Oxford Digital Library (ODL) website current as of September 2005, focusing specifically on the user interface and usability of the system. The evaluation of the ODL website was funded by the Oxford Digital Library Development Fund. The website was at the prototype stage, but with enough content and functionality to inform a full evaluation of the system. The evaluation focused primarily on the existing usability of the ODL website, including issues such as accessibility and browser compatibility. Additionally, the functionality of the system was benchmarked against the basic and advanced services that potential users expected a digital library system to offer. This evaluation did not examine the effectiveness of the digital library in any particular context of use, for example as an embedded part of research or teaching and learning processes.

Research Questions and Definitions

To evaluate the ODL website, three questions were posed:

  1. What services do potential users of the ODL website expect from a digital library, and are these requirements met by the ODL website?
  2. How usable is the current ODL website for its potential users?
  3. What recommendations can be made to improve the usability and functionality of the ODL website?

For the purposes of this evaluation, a digital library was defined as “a virtual system that provides central access to digital collections”. Usability was defined as “a measure of how easy it is for a user to complete a task. In the context of Web pages this concerns how easy it is for a user to find the information they require from a given Web site.”

Survey on “What is a Digital Library?”

To assess how potential users of the ODL site understood the concept of a “digital library”, its benefits and disadvantages, and what they thought such a system should offer, a selection of high-level potential users chosen by the Digital Library Services Committee (the body which provides oversight of ODL activities) was surveyed. A questionnaire consisting of four open-ended questions and a usability feature rating scale was hosted online using SurveyMonkey, and the sample was informed via e-mail.

Task List Interface Evaluation Followed by Questionnaire

Due to the timing of the evaluation (at the beginning of September, out of term time), it was difficult to gather a sample representative of the user population of the ODL site. Advertisements were sent via e-mail asking for volunteers to participate in the study, with a £20 book voucher offered as an incentive. The first 16 volunteers were accepted to take part in the user testing exercise. Although the sample may not have been representative or particularly large, this was not considered to have an adverse effect on the results, given the nature of the exercise, which simply required each subject to offer an individual perspective.

All volunteers were asked to come to a central computing cluster and were given a task list covering the various features of the ODL site. They were then asked to fill in an online questionnaire. The questionnaire was divided into seven sections, each representing a different category of usability and interface design:

G1 : Overall reaction to the digital library

G2 : Digital library page layout

G3 : Terminology and digital library site information

G4 : Digital library site capabilities

G5 : Navigation

G6 : Information retrieval

G7 : Completing tasks

The questions largely asked participants to give their opinions on certain features via rating scales. At the end of each section, participants could add further comments via an “Any other comments” box, providing a more qualitative insight into specific usability issues. Additionally, participants were asked to record the steps they took when using the search and browse features to find particular items in the collections, giving an insight into how users performed certain tasks with the system.
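
As an illustration of how such rating-scale responses might be summarised per section, the following is a minimal sketch, assuming a hypothetical CSV export in which each row is one participant and rating columns are named by section (e.g. G1_q1, G1_q2, ..., G7_qN); neither the file name nor the column scheme comes from the actual study:

    import csv
    from collections import defaultdict
    from statistics import mean

    # Hypothetical export: one row per participant, rating columns named
    # "G1_q1" ... "G7_qN", holding values on the questionnaire's rating scale.
    def mean_rating_per_section(path):
        ratings = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                for column, value in row.items():
                    if column.startswith("G") and value:
                        section = column.split("_")[0]  # e.g. "G3"
                        ratings[section].append(int(value))
        return {section: mean(values) for section, values in sorted(ratings.items())}

    print(mean_rating_per_section("odl_questionnaire.csv"))  # hypothetical file

A summary of this kind would give one mean score per design category (G1 to G7), while the free-text comment boxes would still need to be read qualitatively.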

Heuristic Evaluation

A heuristic evaluation, a method derived from “Usability Engineering” (NIELSEN, 1994), involves the systematic examination of an ICT system by a group of evaluators to judge its compliance with recognized usability principles (the "heuristics"). A group of evaluators is desirable because a single individual is unlikely to spot all the usability problems in an interface; using multiple evaluators therefore significantly improves the reliability of the method. For the purposes of this evaluation, three evaluators conducted a heuristic evaluation of the ODL site. The evaluators were also asked to look at issues of accessibility and browser compatibility that would not have been picked up by the user testers.
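
The case for multiple evaluators can be quantified. Nielsen and Landauer's widely cited model (not part of the original report, but consistent with the reasoning above) estimates the proportion of usability problems found by n independent evaluators as

    P(n) = 1 - (1 - \lambda)^n

where λ is the proportion of problems a single evaluator typically finds, often taken to be around 0.31. Under that assumption, three evaluators would be expected to uncover roughly 1 - 0.69^3 ≈ 67% of the usability problems in the interface.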

A heuristic evaluation sheet was made available online for the evaluators to record their findings, and each evaluator was asked to inspect the interface alone to ensure unbiased findings. The results of the individual evaluations were then aggregated and conclusions drawn. The heuristics used for this evaluation were derived from JACOB NIELSEN'S 10 usability heuristics, with reference to BENSEN ET AL.'S Heuristic Evaluation Instrument for e-learning (2001).
