Indiana University Saves Testing and Analysis Time and Thousands on Lab Expenses
User Experience Challenge
To design the Sakai Assessment Manager (SAM), an online assessment-and-survey software module being developed as part of the Sakai Project.
The Sakai Project is a $6.8 million software development project funded by more than 20 colleges and universities, including the University of Michigan, Indiana University, Massachusetts Institute of Technology, Stanford University, the uPortal Consortium, and the Open Knowledge Initiative (OKI). Project funding includes a $2.4 million grant from the Andrew W. Mellon Foundation. The goal of the Sakai Project is to develop reusable software that schools can share with one another at no charge. This could potentially save schools millions of dollars each year by eliminating the need to buy many commercial software products.
SAM is one of the reusable software modules created by the Sakai team. SAM is a course management system module that enables professors and students to create, administer, participate in, and evaluate the results of online surveys, quizzes, and tests.
Indiana University will be the first school to use SAM. The software will be integrated with the university's Oncourse system, which provides a wide range of computer-based learning tools. Oncourse has over 80,000 users (undergraduates, graduate students, faculty, researchers, and part-time students). That figure represents about 90 percent of the university community. In the future, SAM will be integrated into similar course-management systems at other participating schools.
Morae’s All Digital Approach Revolutionizes Indiana’s User Experience Program
Indiana's User Experience Group lab was launched in 1998. The lab conducts as many as 50 projects annually, primarily supporting enterprise and department-wide application development initiatives. Replacing and upgrading the lab's eight-year-old hardware with next-generation technology would have cost over $20,000. Although the lab didn't have the budget for it, after years of running complex, proprietary hardware, the university knew a change was needed. But before purchasing another hardware-based solution, the school investigated more affordable and flexible alternatives.
The school ultimately decided to abandon its clunky hardware in favor of Morae, a software-based solution from TechSmith Corporation. Powered by TechSmith's patent-pending Rich Recording Technology (RRT), Morae is the industry's first tool that synchronizes real-world actions, such as user speech and facial expressions, with detailed application and computer system data. With Morae, usability teams can virtually "look over the shoulder" of a person using an application to see and hear the human-computer interaction.
Morae gives user experience professionals, application developers, user interface designers, and others a unique view into the way desktop software, Web sites, and e-business applications are experienced. This makes Morae the only solution that captures the total user experience by recording and synchronizing the user, the application, and the events. Test administrators can save and index data, which makes searching easy, and create highlight clips to share key moments with stakeholders and decision makers.
Testing SAM with Morae
To gain a realistic understanding of how students and faculty members use SAM, the lab tested five students and five faculty members. The faculty members were given a one-hour task: create an online exam consisting of true/false, multiple-choice, and short-answer sections, plus an essay portion. The instructors were also told to incorporate a video file into the test, and were then asked to rearrange, add, delete, and edit certain portions of the exam. The students' thirty-minute task was to take the exam by selecting and re-selecting answers, typing text into the short-answer and essay sections, and then submitting the exam to their instructors.
Morae’s Recorder component was loaded on the test machine. Participants could not tell that Morae was running in the background, so they acted naturally as they attempted to perform their tasks. Because Morae records all screen activity in real time, every time a faculty member or student pressed a button, typed a sentence, or spoke a phrase, Morae captured it. Morae also uses TechSmith’s lossless codec, so the recording is crystal clear, making it easy to see everything from mouse clicks and dialog boxes to cursor movements. Such quality could never be matched by older technology, such as scan converters.
During the tests, the stakeholders, such as SAM’s interface designers, were in an observation room watching all the on-screen activity through Morae’s Remote Viewer. Remote Viewer enables observers to set “markers,” or points of interest, that highlight both the positives and the negatives as the user interacts with the application. The markers are then automatically synchronized with the entire session in the Morae Manager component, so they can be found easily later during analysis.
Markers serve several time-saving and organizational purposes. For example, finding a specific three-minute sequence within a faculty member's one-hour test session would have been challenging with the lab's old analog system. But because all the data is digitally indexed and searchable in Morae, the test administrator can quickly locate the beginning and ending markers of the sequence to calculate time-on-task.
As each participant finished the predetermined set of tasks, the data was saved to the desired location, such as a network drive, or, if the client requested, burned onto a CD-ROM, DVD, or flash drive so the client could watch the session in its entirety at their own pace.
Morae automatically created a synchronized Picture-in-Picture movie of each student or faculty member as they used the SAM module. From this, designers and other stakeholders evaluated how participants performed the tasks. They could see exactly where the mouse cursor moved, hear what the subject said, and even witness signs of frustration on their faces. During the usability testing, it became clear that SAM’s interface required clearer labeling to reduce trial and error. Designers also realized that the overall presentation of the test needed to be modified to make it easier for faculty to administer the exams and for students to take them.
Using Morae for the first time on such a critical project proved to be a very rewarding exercise for the User Experience Group. Not only did it expose the group to the enormous advantages Morae offers, but it also allowed the designers to quickly rebuild SAM’s interface, confident in the enhancements and changes the usability testing uncovered. SAM was rolled out gradually, beginning in July 2004 under the name Oncourse Assessment Manager, and will be used during the fall semester at Indiana University.
During the test period, Morae erased a long-standing complaint from the IT department that usability testing was too expensive and time-consuming to be feasible. In fact, the university’s User Experience Group now has a cost-effective, proof-positive tool to justify additional development expenditures that otherwise would be overlooked.
Another goal of the User Experience Group is to spread the word about its new usability capabilities to developers across all the campuses in the Indiana University system, so that they will test early and often to save money downstream on redesigns after an application is launched. Because of Morae’s flexibility and speed, the User Experience Group is confident it will meet that goal. Additionally, the group is now considering establishing a similar Morae-based lab at the Indianapolis campus to better serve the northern part of the state.
Indiana University’s UXG
The User Experience Group (UXG) is a division of University Information Technology Services (UITS) that assists information technology managers both inside and outside Indiana University in increasing the usability of their projects. UXG works to make systems easier to use by applying a user-centered approach to systems design, development, and evaluation. The group uses a variety of methods to help increase the efficiency, effectiveness, satisfaction, and ease of learning for users of a particular system. This approach involves incorporating user feedback in all phases of development, including requirements analysis, conceptual design, development, and testing.
Wiring diagram that used to hang in Indiana University’s Usability Lab as a reference for hooking up all the required hardware pieces.