The full report is available at http://www.jisc.ac.uk/uploaded_documents/Tools%20Focus%20Study%20Report%20v3-2.doc

The evaluation took a summative approach, focusing on the effectiveness of the finished tools, or tools still under development, with a view to recommending to the project teams and JISC whether the tools could be used in wider settings and how they might be implemented. To perform this evaluation we needed to be clear about the types of learning and functionality the tools were designed to support, and therefore about the general aims and goals of the individual projects and the part the tools played within them. To obtain the information needed to assess the tools effectively, the evaluation process made use of a number of data collection methods:

Document Analysis

To identify the aims and objectives of the projects, and the part that the tools played within them, the original project proposals and quarterly reports were gathered from each of the four projects. In addition, some projects supplied papers and presentations that they had published. The evaluation team identified the tools from this documentation, and these formed the basis of a questionnaire to be issued. Contacts were also identified to complete the questionnaire or to be interviewed.

Questionnaire

A questionnaire was hosted online using SurveyMonkey to gather initial information about each of the tools developed, adapted, and used within each project. The questionnaire was sent to the project managers and key developers at each UK and US institution. As the aim of the questionnaire was to identify the tools and their main functionalities, the questions were open-response. Respondents were given two weeks to submit their responses, and a reminder was issued prior to the deadline.

An issue arose within a number of projects concerning who would take responsibility for completing the questionnaire. In response, we suggested that the institution that developed each individual tool should submit a questionnaire response; in some cases this meant that there were several responses for a single project.

After the questionnaire responses were received, the tools and contact details were collated for each project. This information was then passed back to each project for verification.

Interviews

Following the questionnaire, interviews were held to obtain elaboration on important points that had arisen from the survey responses. The interviews were semi-structured, with the sequence and wording of questions organised in advance by means of a schedule. They were conducted face-to-face with the UK project teams and by telephone with the US partners; the data collected were largely qualitative and in-depth, allowing exploration of the development process, user evaluation, and situations of use. Where possible, the evaluation team gained access to the tools before the interview for hands-on experience. In addition, all tools were demonstrated during the interview process, and in many cases the team was able to see student work that had been completed using the tools.

Interoperability Matrix

A matrix was drawn up to make the interoperability features of each tool easily viewable.