next up previous
Next: Colorado Component of Up: INTERFACE DESIGN AND Previous: INTERFACE DESIGN AND

UCSB Component of the Interface Evaluation Team

Membership: Hill (leader), Carver, Dolin, Frew, Kemp, Larsgaard, Rae, Simpson. (Montello of the Geography Department at UCSB and Green of the School of Education at UCSB also served on this team for much of 1996.)

Mission Statement of Team: The goal of the UCSB component of the Interface Evaluation Team is to evaluate the effectiveness of the ADL system from the perspective of potential users of the system. Knowledge gained from these evaluation activities is used (1) to inform the design and implementation of the ADL system on the Web, and (2) to document in detail the effectiveness of the ADL and areas calling for improvement, both in the interface design and in the underlying system functionality and content.

Beta testers: Demographics

The ADL Beta-Tester program began during the spring/summer of 1996. At that time, interested persons were able to sign up with ADL to gain access to the library. Each person had to fill out a short Web Access Request Form in order to receive a username and password, which enabled access to the full functionality of ADL on the Web. The original intention of the Web Access Request Form was to provide the information needed to decide who would be chosen as a beta tester. The responses to the questions on the Form were therefore allowed to be free-form; that is, not bounded by a limited set of choices and not verified. It was subsequently decided, however, to let everyone who requested access become a beta tester. Over 2000 people signed up. The responses were analyzed manually to see what could be learned about the set of interested users who wanted to test the ADL web interface.

The Request Form contained five mandatory fields and a set of optional fields. The required fields were Name, Email, Organization, Occupation, and Referral. None of these fields was validated against a controlled domain or checked for accuracy. Everyone who submitted the form with a valid email address received a confirmation notice with a username and password.

A total of 2287 beta testers signed up. An analysis of the access logs indicates that beta testers came in from 906 different IP addresses, with a total of 1340 sessions, between August 12, 1996 and February 4, 1997. Unfortunately, the usage logs before August 12th were lost, as were the logs for fourteen days after August 12th, so a full count of the number of accesses by beta testers is not possible. A smaller number of beta testers (109) submitted the online survey form (see the description of the survey below).
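Counts of this kind (unique IP addresses and sessions) can be derived from standard web access logs. The sketch below, which does not use the actual ADL logs, assumes Common Log Format input and treats requests from the same IP address separated by less than 30 minutes as one session; the sample log lines, the 30-minute timeout, and the function name are illustrative assumptions.

```python
from datetime import datetime, timedelta

# Hypothetical sample in Common Log Format (the actual ADL logs are not shown here).
SAMPLE_LOG = """\
128.111.1.10 - - [12/Aug/1996:09:15:00 -0700] "GET /adl HTTP/1.0" 200 512
128.111.1.10 - - [12/Aug/1996:09:20:00 -0700] "GET /adl/query HTTP/1.0" 200 2048
128.111.1.10 - - [12/Aug/1996:11:00:00 -0700] "GET /adl HTTP/1.0" 200 512
192.168.4.22 - - [13/Aug/1996:14:05:00 -0700] "GET /adl HTTP/1.0" 200 512
"""

SESSION_GAP = timedelta(minutes=30)  # assumed idle time that separates two sessions

def analyze(log_text):
    """Return (unique_ip_count, session_count) for Common Log Format text."""
    hits = {}  # ip -> list of request timestamps
    for line in log_text.splitlines():
        ip = line.split(" ", 1)[0]
        stamp = line.split("[", 1)[1].split("]", 1)[0]
        # Ignore the timezone offset for this simple sketch.
        ts = datetime.strptime(stamp.split(" ")[0], "%d/%b/%Y:%H:%M:%S")
        hits.setdefault(ip, []).append(ts)
    sessions = 0
    for times in hits.values():
        times.sort()
        sessions += 1  # first request from this IP starts a session
        for prev, cur in zip(times, times[1:]):
            if cur - prev > SESSION_GAP:
                sessions += 1  # long gap: count a new session
    return len(hits), sessions

ips, sessions = analyze(SAMPLE_LOG)
```

On the sample above, the first IP produces two sessions (the 100-minute gap exceeds the timeout) and the second IP one, giving two unique IPs and three sessions.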

A high-level summary of the analysis of the beta-tester data from the Access Form is that most of them discovered ADL by chance while browsing the Web. Presentations, publications, and personal visits to UCSB were cited as the referral source by approximately 25% of the beta testers. There was worldwide participation from 49 countries, with 65% from the U.S. The email domain analysis was inconclusive because many users came through commercial network vendors. Scientists comprised 42% of the group by occupation; geographers alone made up 9% of the total.

The full report of this analysis is available at http://www.alexandria.ucsb.edu/~kemp/UI/BTest/report.html.

Beta Testers: Survey Results

The ADL User Feedback Survey is a WWW-based online survey (http://www.alexandria.ucsb.edu:3366/ADR/FillOutSurvey). The survey is a six-part questionnaire that combines free-text questions with Likert-scale multiple choice questions. It was developed from a combination of previous surveys and results reported in the literature, with specialized components added that are oriented toward our particular system. Questions about the user's background were also included, covering his or her familiarity with computers and with the type of information our system is designed to retrieve.

There were three main goals of the survey. The first was to provide a mechanism for acquiring detailed and directed feedback about users' experiences with the interface. The second was to learn something about the ADL community of users. Finally, it was hoped that the survey would allow us to study relationships between users' experiences with the user interface (UI) and their backgrounds.

The resulting survey consisted of a set of multiple choice questions and open-ended questions. The multiple choice questions were formatted with six possible answers: strongly agree, agree, neutral, disagree, strongly disagree, and no opinion. These questions were randomly worded in either a positive or a negative sense to counteract the tendency of users to give the same answer repeatedly and to encourage a more careful reading of the statements. Each major section of the survey included a question for a free-text response. This served the purpose of providing information that can be analyzed non-numerically, as well as allowing users to point out areas of the UI that the survey might not sufficiently cover.
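When items are randomly worded in a positive or negative sense, the negatively worded ones must be reverse-coded before averaging so that a higher score consistently means greater approval. A minimal sketch of that step (the 1-5 score mapping and the function name are illustrative assumptions, not the survey's actual coding scheme):

```python
# Map Likert responses to numeric scores; "no opinion" is treated as missing.
SCORES = {"strongly agree": 5, "agree": 4, "neutral": 3,
          "disagree": 2, "strongly disagree": 1}

def approval_score(response, negatively_worded):
    """Return a 1-5 approval score, or None for "no opinion".

    Negatively worded statements are reverse-coded (1 <-> 5, 2 <-> 4)
    so that a higher score always indicates greater approval.
    """
    value = SCORES.get(response)
    if value is None:  # "no opinion" or unrecognized answer
        return None
    return 6 - value if negatively_worded else value
```

For example, "strongly agree" scores 5 on a positively worded item but 1 on a negatively worded one, since agreeing with a negative statement expresses disapproval.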

There were 96 usable surveys (completely or nearly completely filled out) from 109 surveys submitted. Demographically, the group was 80% male; 87% had college degrees; 63% had a Master's or Ph.D. degree; the average age was 35-36; 24% said that English is not their native language; and they were, as a group, frequent users of libraries, geographical data, the WWW, and online catalogs.

Using the incomplete usage logs that are available, it appears that between approximately 40% and 70% of the beta testers actually used ADL. If it is therefore assumed that between 1000 and 1600 beta testers used the system at least once, the submitted survey responses represent approximately 6% to 10% of the active beta testers and approximately 4% of the total number who signed up. The low response rate and uncertain extent of any non-response bias suggest caution in generalizing the results.
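These response-rate figures can be reproduced with a short back-of-the-envelope calculation; the 1000-1600 range of active users is the report's own assumption, and the rates are computed from the 96 usable surveys.

```python
SIGNED_UP = 2287          # total beta testers who signed up
USABLE_SURVEYS = 96       # usable (completely or nearly completely filled out) surveys
ACTIVE_LOW, ACTIVE_HIGH = 1000, 1600  # assumed range of beta testers who used ADL

rate_low = USABLE_SURVEYS / ACTIVE_HIGH    # lower bound: 96 / 1600 = 6%
rate_high = USABLE_SURVEYS / ACTIVE_LOW    # upper bound: 96 / 1000 ~= 10%
overall = USABLE_SURVEYS / SIGNED_UP       # 96 / 2287 ~= 4% of all sign-ups
```

This reproduces the 6%-to-10% range for active beta testers and roughly 4% of all who signed up.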

The data are still being analyzed, but the following high-level summary highlights the strongest results from the multiple choice questions. Moderate approval is the average reaction to the ADL. The strongest reactions of approval are to the exit procedure and session-saving capability, the consistency of terminology, the degree to which the ADL is stimulating, the degree to which terminology could be understood, the degree to which the needs of experienced users are taken into consideration, the ease of reading screen characters, the degree to which exploration of features is encouraged, and the map browser. The strongest reactions of disapproval, small in magnitude though significant, are to the slowness of the ADL's performance, the degree to which it is frustrating, and the difficulty of correcting mistakes. Respondents also indicated strong approval of the survey itself.

The data are also being analyzed to see whether reactions to ADL vary in any meaningful way as a function of characteristics of the respondents. So far, the only significant relationship is with the respondents' sex: females were less approving of ADL than males. Given the small number of female respondents, however, very little can be made of this difference.

Nearly all respondents made at least a few free-text comments, resulting in well over a hundred in total. This summary can do no more than give a flavor of the comments; however, the many detailed comments will provide very useful feedback to interface and system designers. The comments paint a somewhat more negative picture of ADL than do the Likert scales; the great majority express problems or difficulties. This is not surprising insofar as people are more motivated to comment when they encounter problems than when something works well. Nonetheless, the comments clearly suggest that many important problems exist in the current implementation of the ADL. Several comments mentioned the lack of data, places where the interface was cluttered or unclear, slow performance, and the lack of a clear introduction and motivation to the system. It is apparent that attempts to communicate the capabilities of the system more clearly will be appreciated. The system was particularly difficult to understand for those respondents who did not attempt to use the Tutorial. Several respondents thought the survey was too long. Especially notable was the difficulty several respondents had in getting the system to produce anything in response to query attempts; this made the detail of the survey seem even more excessive and inappropriate.

Fuller versions of the survey analysis can be found at http://www.alexandria.ucsb.edu/lhill/uie_paper_schedule.htm.

Ethnographic Studies

A team from the UCSB Graduate School of Education conducted ethnographic studies to inform the ongoing development of the web interface and the underlying library. The Education Team decided to describe and analyze user activities and interactions both in the physical Map and Imagery Laboratory (MIL) in the UCSB library and while users were working with the ADL web interface, noting the similarities in use patterns and expectations.

To study user interactions in the MIL, reference staff solicited participation in the study from those who asked them for assistance during the study period. When users agreed to participate, the reference interviews were audiotaped, resulting in thirteen recorded and transcribed sessions. Two determinants were noted that influenced the mode of the interview: (1) the user's familiarity with the MIL (system knowledge) and (2) the user's familiarity with the geospatial materials held by the library (domain or task knowledge). The resulting patterns of interaction are:

  1. Unfamiliar with geospatial information or its use/Inexperienced with MIL: Pattern 1: User depends on librarian to frame question and guide outcome;
  2. Unfamiliar with geospatial information or its use/Experienced with MIL: Pattern 2: User knows questions and depends on librarian to select appropriate outcome;
  3. Familiar with geospatial information or its use/Inexperienced with MIL: Pattern 3: User and librarian frame question together for desired outcome;
  4. Familiar with geospatial information or its use/Experienced with MIL: Pattern 4: User directs the framing of the question to procure the desired outcome.
Concurrent with the collection and analysis of the MIL reference interviews was the videotaping of a range of users interacting with the Alexandria Digital Library's web interface prototype. As with the users of the MIL, there were significant differences in how people interacted with ADL depending on the user's background knowledge in several areas. Unlike the MIL experience, however, users did not have the equivalent of a reference librarian to whom they could turn to reshape a question or to be guided toward a successful outcome. The chief determinants influencing a user's interaction with the ADL web interface, apart from general computer knowledge, were the user's knowledge of:

  1. the World Wide Web and the Netscape browser;
  2. library search strategies;
  3. maps as representations of geographic information;
  4. geo-spatially referenced data;
  5. programming and interface design.

The Education Team also conducted a domain analysis of the feedback comments made by beta testers while they were using the web prototype and of the comments submitted as part of the online survey of beta testers. These comments were categorized by the section headings of the online survey so that they could be easily correlated with the survey results. The Team is in the process of clustering the data in other ways to characterize user reactions and expectations.

Many problems were identified from the patterns seen in the data. Users expressed impatience with the introductory material and with the slowness of the system itself. However, they recognized the problem ADL was grappling with: making a complex system understandable, that is, useful and efficient, while still providing sophisticated functionality. It is a design issue to determine how transparent or opaque to make that functionality: to decide how much users need to know as they use the system and how much can occur hidden from their sight (which may also mean removed from their control). Many of the problems identified may be seen partly as resulting from ADL's failure to express its purpose and potential adequately in the interface. The intended audience of the prototype and the knowledge and skills needed to use ADL successfully are not obvious. It is clear that many of the users studied did not understand what ADL was trying to do or where it was in the development process.

When people enter the MIL, they are requested to sign in and are met by a reference librarian, a person who will act as a guide and facilitate their use of the library. The user's experience with the library and knowledge of the use of the materials can be appropriately supplemented by the guidance of the librarian. Currently there is no comparable service provided by ADL, no experienced guide who can act as liaison between the user's needs and expectations and Alexandria's potential. One alternative is to develop ways to inform users of the knowledge and skills needed to interact successfully with the web interface. New users, who are outsiders to the Alexandria community, undergo a rite of passage through their interaction with the system. They may eventually gain membership in the Alexandria community of informed users, either through concerted effort and persistence or through the assistance of a more experienced guide. However, the interface can also be revised to acknowledge various groups and their expected uses, making the holdings more accessible and the interface less demanding on the users' expertise, knowledge, and technological skills.

Results of these three studies (analysis of MIL audiotapes, web interface videotapes, and user comments in reaction to the web interface) include lists of functional requirements that will be used in the design of the next ADL user interface.






Terence R. Smith
Thu Feb 20 13:50:53 PST 1997