Page Last Updated Sunday, April 27, 2008 10:29 PM

Formative Evaluation Report

Overview

A web-based performance support system for second year medical students was designed and created specifically for the Office of Medical Education at the University of Tennessee Health Sciences Center.  The system provided users with on-demand access to clearly audible pronunciation models for a series of generic drug names presented according to specified categories of pharmaceuticals.  The core product was composed of four web pages: an introduction, an illustrated guide, the database presentation page, and a resources page.  The database page offered the user a list of drugs according to categories.  When a drug was selected from the list, its information (applicable brand names and a phonetic pronunciation guide) and an audio control were presented on the right side of the screen.

The formative evaluation of the product collected data at key points during the final stages of the development process.  These data were used to improve the product before its final release to the client.

The primary objective in the evaluation was to determine product usability.  However, it was also of interest to determine the extent to which users expected to find the product supportive of their efforts to correctly pronounce a specified set of drug names.  While the ultimate objective of the owners of the performance system was to have students use it to enhance their pronunciation skills, the extent to which users master such skills was explicitly outside the scope of this evaluation.

Based on the evaluation objectives, the following questions informed the evaluation.

  1. Does the system operate as designed? That is,
  • Does the initial splash screen appear in the browser window of the user’s choice?
  • Are the links from one page to another functional?
  • Does the system support user selection of a specified drug category?
  • Do the drug categories and drug names cleanly load from the backend database to the front end interface?
  • When a user selects a specified category, does the system present the list of drugs that belongs to that category?
  • Does the system permit the user to select any drug within the specified category?
  • Does the system legibly present information about the selected drug, including applicable brand names and a phonetic pronunciation guide for the generic name?
  • Does the system operate without error?
  2. Can typical users utilize the system for its intended purposes?
  • Does the system provide the user with an onscreen method to play an audio pronunciation of the specified drug?
  • Do the audio files play on the user’s system?
  • Is the system successful in permitting the user to play a given audio file repeatedly?
  • Is the audio audible to the user?
  • Does the system permit the user to select a different drug from the same category, without having to restart the system?
  • Is the interface intuitive enough so that users navigate the system without needing outside assistance?
  • Do users experience any confusion in how to use the system?
  • Does the system permit users to recover gracefully from user error?
  • What user errors occur during system use?
  3. To what extent do the users believe the system will support their efforts to learn to correctly pronounce the names of drugs?

The evaluation was conducted in the following successive phases: (1) expert usability review by eLearning experts (heuristic approach); (2) one-on-one learner testing of usability; (3) small group learner testing of usability.  The eLearning expert reviews occurred April 3-4, 2008.  These results were summarized by the project manager. The three one-on-one reviews took place April 9-10, 2008. The five small group reviews were conducted April 11, 2008.  Both the one-on-one and small group review results were summarized by the data collector.

Design

Phase 1: Expert usability review by eLearning experts using a heuristic approach

To identify any problems with the usability of the product before it was introduced to the intended users, a heuristic evaluation was undertaken as the initial phase in the formative evaluation.  Two eLearning experts, professors of instructional design and technology at The University of Memphis, were invited and agreed to critically examine the system’s interface in terms of its usability.

Instrument. A three-part evaluation instrument was designed for this purpose, based on Nielsen’s “Ten Usability Heuristics” (Nielsen, n.d.b). The first section of the instrument asked testers to document test conditions (platform, OS, browser, date, and duration). The second section of the instrument was divided into five dimensions derived from Nielsen’s principles. This section contained a total of 20 items to be scored using a 5-point Likert scale (1=Strongly Disagree to 5=Strongly Agree). Each individual dimension was assigned from 2 to 7 separate rating items plus a single open-ended response item. The third section was used to collect tester comments and ratings of additional issues not otherwise addressed in the response form. The tester was asked to (a) briefly describe the issue, (b) specify where it was encountered, and then (c) rate it in terms of severity and extensiveness. A 3-point scale was used for each rating, using the following scale values: Severity (3=Critical; 2=Non-critical but attention recommended; 1=Fix if time permits) and Extensiveness (3=Occurs in 3 or more places; 2=Occurs twice; 1=Occurs only once). A complete copy of the instrument (including transcribed results) is included in Appendix A.
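The third section's two 3-point ratings lend themselves to a simple triage. The sketch below is a hypothetical illustration: sorting issues by severity and then extensiveness is an assumption of this sketch, not a procedure described in the report, and the example issue entries paraphrase tester comments quoted later.

```python
# Hypothetical triage of Section 3 issue reports, using the instrument's
# two 3-point rating scales. The sort order is this sketch's assumption.
from dataclasses import dataclass

SEVERITY = {3: "Critical", 2: "Non-critical but attention recommended",
            1: "Fix if time permits"}
EXTENSIVENESS = {3: "Occurs in 3 or more places", 2: "Occurs twice",
                 1: "Occurs only once"}

@dataclass
class Issue:
    description: str
    location: str
    severity: int       # 1-3, per the instrument's Severity scale
    extensiveness: int  # 1-3, per the instrument's Extensiveness scale

issues = [
    Issue("Audio player did not appear", "database page", 3, 3),
    Issue("Prose may need some tweaking", "orientation pages", 1, 2),
]

# Highest severity first; ties broken by extensiveness.
for issue in sorted(issues, key=lambda i: (-i.severity, -i.extensiveness)):
    print(f"[{SEVERITY[issue.severity]}] {issue.description} ({issue.location})")
```

A critical, widespread problem thus surfaces ahead of a cosmetic one that occurs in only a couple of places.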

Methodology. To evaluate the system, the URL for the system was emailed to the experts together with the evaluation instrument and a brief statement of the purpose of the system and its intended audience.  The experts were asked to complete the evaluation within a 48-hour testing window and to return the completed instrument either electronically or in hard copy format.  The experts were asked to test the system using whatever operating system or browser they chose.  If they wished to test the product with multiple browsers or operating systems, they could do so, but were requested to annotate any differentiated results on the response instrument.  The evaluations occurred on April 3 and 4, 2008, and both experts returned their review forms via e-mail.

Phase 2: Learner usability testing using one-on-one trials

The second phase of the formative evaluation was conducted using one-on-one trials, following revisions to the system based on the analysis of the eLearning expert review.  This phase was important because the participants provided valuable information as they worked through the online performance support system.  Their impressions helped to identify necessary changes to the system.  By observing individuals go through the system, the developers could see the kinds of problems that occurred, and the observations also provided evidence of the system’s effectiveness.

Methodology. For the one-on-one evaluations, the second draft of the product was used. In each of the three trials, the participants were given a guide which indicated the purpose of the evaluation was to determine the usability of the system.  The guide presented the URL for the system and listed five drugs for which participants were to locate and play the audio pronunciation model.  At the conclusion of the trial, each participant was asked to complete three Likert-style response items (strongly disagree to strongly agree) designed to ascertain the participant’s perception of the usability and utility of the program.

Procuring participants. Despite the client’s repeated efforts to procure volunteers for this and the next phase of the evaluation from among the medical student body, only two were obtained.  The testing window unfortunately coincided with a month of study during which the third year medical students were preparing for their first Board exam, which meant these students were absolutely unavailable.  Second year medical students were likewise engaged in intensive study for their final exams.  As a result, only first year medical students were potentially available.  The client emailed this group of students and procured a total of two volunteers to participate in the test.

Since this was an insufficient number for conducting both the one-on-one and small group evaluations, it was decided to reserve these two volunteers for the final review process with the small group testing.  The design team subsequently sought help from the project sponsor to identify a viable source for additional test subjects.  Through the assistance of the Staff Development Educator in the Department of Nursing Education at St. Jude Children’s Research Hospital, a volunteer roster of ten registered nurses was made available to the team.  Angela Macklin contacted all volunteers by e-mail to set the evaluation schedule.

Actual participants.   In the end, three participants similar to the target audience were used in the one-on-one trial.  These participants were nurses from St. Jude.  In their jobs, they regularly read doctors’ prescriptions and administer medications to patients.  They all had taken a Pharmacology course in nursing school.

Instrument. This evaluation collected data using an Observation Note Log and a Participant’s Guide. (See Appendix B.)  The Participant’s Guide comprised a set of instructions, the list of target drug names, and a 3-item Likert-style attitude response section.  Draft two of the system was used.

Phase 3: Learner usability testing using small group trials

The third phase of the formative evaluation was conducted using small group trials.  The users for the third phase were identified in the manner described earlier.

Methodology. The same version of the product was used for the small group trials as was used with the one-on-one testing.  Although this was not the ideal arrangement, it was necessary in order to capitalize on the availability of the volunteers.  Since the revisions needed after the one-on-one testing were only minor, this compromise was deemed acceptable.

The small group trials were conducted in the same manner as the one-on-one trials.  That is, they were conducted individually and observed by one of the design team.  One of the tests was conducted in the UTHSC computer laboratory provided by the Director of Instructional Technology.  The rest took place at St. Jude Children’s Research Hospital. The observation protocol and testing guide were the same as those used for the one-on-one trials. 

Participants.  Five participants, one from the target audience and four from a similar group, were used in the small group trial.  The participants consisted of one M2 student and four nurses from St. Jude.  The M2 student would soon be taking a Pharmacology course; the nurses had already taken one.

Instrument. The same set of instruments used for the one-on-one trials was used for the small group trials.  See Appendix B.

Results

PHASE 1 RESULTS: Expert usability review by eLearning experts using a heuristic approach.

A complete transcription of results and comments is provided in Appendix A.

Context: Platform, OS version, and Browser:  One expert used a computer with the Windows operating system and the Firefox browser (version 2.0.013).  The other expert used a Macintosh computer (OS 10.5) and tested the product with two different browsers: Firefox (versions 2 and 3 beta) and Safari 3.1.  The first expert did not report the total testing time; the second reported the test took one hour.

Results from each of the five dimensions will be presented in turn. As shown in Figure 1, results were very supportive of the product.  However, there were specific areas where revisions were recommended.  These will be summarized in the following sections.

Figure 1 : eLearning Expert review: Means for Dimensions

Dimension 1: User Control and Interaction

Results: The overall rating for user control and interaction was 3.75, with ability to exit and reasonable feedback time given the highest scores (M=4.5).  Mean responses regarding appropriate feedback and shortcuts were 3.0 (neutral), although one respondent who marked “disagree” on shortcuts commented that while there were none, none were needed since the system only required a single click.  One of the respondents indicated the audio player and pronunciation guide did not appear, even after checking “about 30” of the drug entries.  The other respondent remarked that the video “took a bit too long to download and begin playing.”  Item responses are indicated in Table 1.

Table 1: User control and interaction

 | Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree | Mean*
The system gives the user control over the ability to exit from the system at any time | 0% | 0% | 0% | 50% | 50% | 4.5
Feedback is provided within a reasonable period of time | 0% | 0% | 0% | 50% | 50% | 4.5
The system provides users with appropriate feedback | 0% | 50% | 0% | 50% | 0% | 3.0
The system incorporates shortcuts that can cut down user navigation time | 0% | 50% | 0% | 50% | 0% | 3.0

*Scale: Strongly Disagree (1) to Strongly Agree (5)
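The item means reported in these tables follow directly from the response distributions: each scale value (1 to 5) is weighted by its share of respondents, and a dimension's overall rating is the average of its item means. A minimal sketch (the function names are illustrative, not part of the evaluation instrument) reproduces the Dimension 1 figures:

```python
def likert_mean(shares):
    """Weighted mean of a 5-point Likert item.

    shares: fractions of respondents choosing Strongly Disagree (1)
    through Strongly Agree (5), in order.
    """
    return sum(value * share for value, share in zip(range(1, 6), shares))

def dimension_mean(item_means):
    """Overall dimension score: the average of its item means."""
    return sum(item_means) / len(item_means)

# Items from Table 1 (User control and interaction):
items = [
    likert_mean([0.0, 0.0, 0.0, 0.5, 0.5]),  # ability to exit: 4.5
    likert_mean([0.0, 0.0, 0.0, 0.5, 0.5]),  # reasonable feedback time: 4.5
    likert_mean([0.0, 0.5, 0.0, 0.5, 0.0]),  # appropriate feedback: 3.0
    likert_mean([0.0, 0.5, 0.0, 0.5, 0.0]),  # shortcuts: 3.0
]

print(dimension_mean(items))  # 3.75, the overall rating reported for Dimension 1
```

The same calculation yields the 3.93 reported for the seven Verbal and Visual Quality items and the 4.2 for Instructions and Help.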

Assessment of Results and Changes: A follow-up conversation with the tester who reported the audio did not play revealed the individual looked at drug names that were NOT in the Autonomic category.  Since the system was designed to present audio and phonetic spellings for only the Autonomic category, this was not considered a performance issue.  To avoid this potential confusion in the remaining tests, the database was abbreviated so that only the Autonomic category of drugs was presented.  The final decision regarding the disposition of the “extra” drug names was referred to the client, who requested that the final product include all drug names.  Thus, the complete database was subsequently added back to the product.

However, this problem did suggest issues with the usability of the instructions, which clearly stated that pronunciations and spellings were available only for this specific group of drugs.  As a result, the orientation language in two pages of the web site was streamlined, reducing the volume of text and adding emphasis via bulleted points.  A downloadable copy of the Illustrated Guide was also added, with its link placed on the guide page, so the users might have a printed copy available for reference during their use of the product.

With respect to the video download time: the video clip was included only to demonstrate to the client that the system had reserved space for later inclusion of video, should the client desire to do so.  This was not therefore considered an issue requiring attention. 

Dimension 2: Verbal and Visual Quality

Results: The overall score for the seven items in the Verbal and Visual Quality dimension was 3.93.  While respondents indicated the dialog covered the essentials, was written in real-world language, and used consistent language, one commented that the prose was clear but “may need some tweaking.”  Most of the concern was with the visual elements, which received only neutral-to-agree ratings on internal consistency and a noncommittal response on aesthetic quality.  Indeed, considerable comment and several suggestions were offered on this specific quality.  Specifically: the font and table presentation of the database were outdated; the bottom of the window needed something to visually mark its boundary; and frame sizes needed adjusting.  While the reviewer indicated corrections would be “quick and easy,” the comment that the graphic design looked more like a school project than a client-worthy product indicated they should be made.  A summary of individual item responses is presented in Table 2.

 

Table 2: Verbal and visual quality

 | Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree | Mean*
Dialog covers the essentials. | 0% | 0% | 0% | 50% | 50% | 4.50
The system does NOT present irrelevant information. | 0% | 0% | 0% | 50% | 50% | 4.50
The material is presented in real-world language | 0% | 0% | 0% | 100% | 0% | 4.00
Specialized knowledge of technology and databases is NOT required in order to use this system. | 0% | 0% | 0% | 100% | 0% | 4.00
The system is consistent in its use of language | 0% | 0% | 0% | 100% | 0% | 4.00
Visual elements in the system are internally consistent | 0% | 0% | 50% | 50% | 0% | 3.50
Visual elements are aesthetically pleasing | 0% | 0% | 100% | 0% | 0% | 3.00

*Scale: Strongly Disagree (1) to Strongly Agree (5)

Assessment of Results and Changes: Six modifications were made as a result of this section of findings:  (1) The prose was “tweaked” throughout, as has already been indicated: made more straightforward and succinct, and bullets used to emphasize key points.  (2) A colored footer was added to each of the web pages to signal “end of page.”  (3) Font changes were made: the font in the drug table was changed to match the sans serif font in the rest of the web content, and the heading sizes were scaled down. (4) Table design was modernized by adding cell shading for alternating rows and changing the lines to white.  (5) A colored line added emphasis to the category name in the drug list.  (6) Frame size issues were addressed by decreasing the size of the category headers, and giving users control over the width of the frames. This feature was also noted in the Illustrated Guide page.

Dimension 3: Errors

Results: Neither respondent replied to the two questions in this dimension (error messages are easily understood; users are able to recover from errors), and one commented these questions were not applicable.  The other respondent indicated there was “not much chance for operator error” as you “just click on the drug.”  He indicated he did not see any error message, “unless you count a Quicktime symbol instead of an audio player.”  As noted earlier, this situation occurs when there is no audio available. 

Assessment of Results and Changes:  Appearance of the Quicktime symbol was not considered an error, as only a subset of drugs were to be presented in the product.

Dimension 4: Instructions and Help

Results: This dimension received the highest score of the five areas of interest, with a mean of 4.2 on a 5-point scale.  Subjects either agreed or strongly agreed with each of the items: instructions and help were visible, readily available, provided when needed, concise and to the point, and the space devoted was proportional to the rest of the screen content. One subject observed “there’s not really a help system in this site.  Instead, there are instructions, and they are clear.”   A summary of individual item responses is presented in Table 3.

Table 3: Instructions and help

 | Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree | Mean*
Instructions are visible. | 0% | 0% | 0% | 50% | 50% | 4.50
Instructions are readily available. | 0% | 0% | 0% | 50% | 50% | 4.50
Help is provided when needed. | 0% | 0% | 0% | 100% | 0% | 4.00
When provided, help is concise and to the point. | 0% | 0% | 0% | 100% | 0% | 4.00
Space devoted to help is proportionate to the rest of the screen content. | 0% | 0% | 0% | 100% | 0% | 4.00

*Scale: Strongly Disagree (1) to Strongly Agree (5)

Assessment of Results and Changes: Despite the high rating in this section of the evaluation, other findings suggested that the instructions were not fully effective.  That is, one of the two participants tried to “play” audio for drugs that were not in the designated category.  As indicated earlier, the solution was to remove those additional categories from the system, since they were not part of the product deliverables.

Dimension 5: Organization

Results: Both eLearning experts agreed (M=4.0) that the structure was logical and the interface intuitive.  Added comments indicate “based on my understanding, I think you have delivered what the client requested,” and “just fine.  No problems.”

Table 4: Organization

 | Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree | Mean*
The structure is logical. | 0% | 0% | 0% | 100% | 0% | 4.00
The interface is intuitive | 0% | 0% | 0% | 100% | 0% | 4.00

*Scale: Strongly Disagree (1) to Strongly Agree (5)

Assessment of Results and Changes: Based on these results, no changes were deemed needed.

Summary of comments by dimension

Comments on user control and interaction:

  • Video took a bit too long to download and begin playing.
  • Tried firefox 2, firefox 3 beta, and safari, all on Mac OS X 10.5.
  • In all 3 browsers the audio files (audio player) did not appear, and the phonetic pronunciation guide did not appear.  I also saw no video connected with any of the drug choices I made, and I checked about 30.

Comments on verbal and visual quality:

  • The database generated information (under the middle two tabs) looks 1997-ish.  I suggest sans-serif fonts and a “cleaner, more modern looking” table/cells.
  • Consider a footer, band of color, or something to signify the bottom of the window.  I always feel like I have to do some scrolling (and I don’t) on the 2 database pages.
  • The size of the left frame of the ePharm page should be reduced and the headings better formatted.  This frame is creating lots of empty space and putting too much distance between the list and the database generated information (table).
  • Overall, from a graphic design standpoint I think this looks like something to be submitted as a (really good) school project rather than something to be presented to a client.  However, I think the edits are quick and easy.
  •  Some of the prose may need some tweaking.  It’s clear though.

Comments on errors: 

  • Not applicable
  • Didn’t see any error messages, unless you count a quicktime symbol instead of an audio player.
  • Not much chance for operator error.  Just click on the drug.

Comments on Instructions and help:

  • NA = not applicable (This was noted for “When provided, help is concise and to the point” and “Space devoted to help is proportionate to the rest of the screen content.”)
  • There’s not really a help system in this site.  Instead, there are instructions, and they are clear.

Comments on Organization:

  • Based on my understanding, I think you have delivered what the client requested.
  • Just fine.  No problems.

PHASE 2 RESULTS: Learner usability testing using one-on-one trials

Context. Phase 2 testing took place on the campus of St. Jude Children’s Research Hospital on April 9-10, 2008.  One team member scheduled and observed the trials, using the observation protocol provided in Appendix B.  Each test took from three to seven minutes, from the time the participant keyed in the web address to the time the fifth drug audio file was played.  All three subjects used a Windows platform computer; two browsers were used: Firefox and Internet Explorer.

Results.  See Table 5 for transcribed results from the one-on-one observations, by participant. 

Participant one could not use the Firefox browser because it needed a plug-in.  In the Internet Explorer browser, the “spacebar” message kept appearing.  The participant did not have any problem locating and hearing the audio.  She noticed that Guanabenz was misspelled on the Participant’s Guide.  She thought the system would be a useful tool.

Participant two clicked the video first.  On the second attempt, she clicked the audio link.  This participant also used Internet Explorer and received the “spacebar” message.  After figuring out the system, the participant did not have any problems locating and hearing the audio.

Participant three used the Firefox browser.  It appeared the participant did not read the instructions.  She tried to locate the drug using the Illustrated Guide and became confused when the scrollbar would not work.  The observer clicked the ePharm link for her, and the participant then clicked the audio link.  She noticed that Guanabenz was misspelled on the Participant’s Guide.  After figuring out the system, the participant did not have any problems locating and hearing the audio.

All the participants strongly agreed that the system would support efforts to learn to correctly pronounce the names of drugs, the system was easy to use, and they would recommend the system to other College of Medicine students.

Assessment of Results and Changes: It was recommended that the plug-in be installed on the testing laptop for future users, that the spelling of Guanabenz be corrected on the Participant’s Guide, and that a solution be found to eliminate the “spacebar” message.

PHASE 3 RESULTS: Learner usability testing using small group trials

Context. Small group trial evaluations were used to identify the strengths and weaknesses, other missing parts or improvement areas in the system, following the eLearning expert reviews and one-on-one trials.

Phase 3 testing took place in various locations: on the campus of St. Jude Children’s Research Hospital, in the Library at UTHSC, and in the University of Memphis library.  All testing occurred on April 11, 2008.  A Macintosh computer with Firefox was used in one case, with Windows and Internet Explorer used in the others.  In one case, unexplained technical difficulties caused the computer to shut down whenever the audio link was activated.  This situation (with the fifth and final subject) appeared to be influenced by the connectivity in the testing location, although no complete explanation was ever determined.  Except for the fifth test, each test took between two and six minutes.

Results.  See Table 6 for transcribed results from the small group observations, by participant.
Four out of five participants strongly agreed that the system was easy to use.  Three out of five participants strongly agreed that they would recommend the system to others.  Participant one was neutral.  Two agreed the system would support efforts to learn drug names while two others strongly agreed. 

Participant five had trouble with the computer and did not get to hear the drugs being pronounced.  She did not answer the three questions.  However, she did feel the system was a good idea.

Assessment of Results and Changes:  It is recommended that a Portable Document Format (PDF) link be included so that users can print out the instructions for navigating the ePharmacology system.  This was also suggested by the SME.

Table 5: Summary of one-on-one observation results

 | S1 | S2 | S3
Date | 4/9/2008 | 4/10/2008 | 4/10/2008
Browser | Firefox; IE | IE | Firefox
Platform | Windows | Windows | Windows
Context | St. Jude, using observer’s laptop | U of M using wireless connection | Not reported
TIMES | | |
Keys in web address | 2:00 p Firefox; 2:03 p IE | 2:00 | 2:11:05
Website fully loads | 2:03:15 | 2:00:15 | 2:11:15
Reads instructions | 2:04 | 2:00:30 | 2:12
Specified category located | 2:04:40 | 2:01 | 2:12:15
Audio 1 played | 2:05 | 2:01 | 2:15
Audio 2 played | 2:05:15 | 2:02 | 2:16
Audio 3 played | 2:05:45 | 2:02:15 | 2:17
Audio 4 played | 2:06:15 | 2:02:30 | 2:18
Audio 5 played | 2:06:45 | 2:02:45 | 2:18
Audio quality | Good | Good | Good
Number of times played each drug name | 2 | 1 | 2

Remarks:

S1:
  • Thinks this should be a good tool since it is difficult to learn how to pronounce drug names.  Couldn’t do the test on Firefox; needed a plug-in.
  • Easily found the next drug; message appears asking to press “spacebar” or “enter” to activate and use.
  • No problem finding other drug; same message appears when audio link is pressed.
  • Noticed that Guanabenz was spelled incorrectly (on testing form); message appears.

S2:
  • Clicks on video first.  Then locates audio.  Spacebar message appears.  Participant has to clear message every time.
  • Only played audio once.
  • The spacebar message prevents a smooth transition.

S3:
  • Tried to locate drug using the Illustrated Guide; did not choose the correct drug from list; kept clicking audio and not getting feedback as a result.  Had trouble getting scrollbar to work.  Had to start over.
  • Transitioned from one drug to another without any problems.
  • Noticed drug 3 spelled wrong on participant’s guide.

Table 6: Summary of small group observation results

 | SG1 (Medical student) | SG2 (RN, St. Jude) | SG3 (RN, St. Jude) | SG4 (RN, St. Jude) | SG5 (RN, St. Jude)
Date | 4/11/2008 | 4/11/2008 | 4/11/2008 | 4/11/2008 | 4/11/2008
Browser | Firefox | IE | IE | IE | IE; Firefox
Platform | Macintosh | Windows | Windows | Windows | Windows
Context | UTHSC lab | Not reported | Not reported | Not reported | St. Jude computer*
TIMES | | | | |
Keys in web address | 12:02 | 12:10 | 4:10 | 4:24 | 4:30
Website fully loads | 12:02 | 12:10 | 4:10:50 | 4:24 | 4:30:15
Reads instructions | 12:03 | 12:11 | 4:11 | No | 4:32
Specified category located |  | 12:11:15 | 4:11 | 4:25 | 4:32
Audio 1 played | 12:03 | 12:12 | 4:13 | 4:25 | 5:00
Audio 2 played | 12:04 | 12:12 | 4:14 | 4:26 | 5:01
Audio 3 played | 12:04 | 12:13 | 4:14 | 4:26 | 5:01:15
Audio 4 played | 12:04 | 12:14 | 4:15 | 4:27 | 5:01:30
Audio 5 played | 12:04 | 12:14 | 4:16 | 4:27 | 5:01:45
Audio quality | Good | Good | Good | Good | Did not play
Number of times played each drug name | 2 | 2 | 2 | 1 | 0

Remarks:

SG1 (Note: This was the first year medical student):
  • Participant hesitated before playing audio; kept reading message that appears in video area (download additional plug-in).
  • Transitioning from one drug to another was smooth.
  • Transition was smooth.
  • Forgot to play audio 3 twice; transition was smooth.
  • Transition was fine.

SG2:
  • Clicks video.  Then asks if video is not available.  Observer responds, “No, it’s not available.”  Realizes she is looking for audio and clicks the audio.  States that the system is needed because it is so hard to pronounce drug names.

SG3:
  • Clicks video; asks about video not loaded.  Observer states that focus is on audio.  Read instructions.
  • Participant had no problem.
  • The transition was fine.
  • It took a “minute” for the last drug to load.

SG4:
  • Clicks video first.  Then audio; each once.  Did not read instructions; rushed through.
  • Transition was fine.

* SG5: Tried to click drugs on guide.  Computer shut down when she clicked the video; could not play audio files.  Had to start over.  Clicked video again and the computer shut off.  Tried to click audio, but no sound.  Used IE on the desktop, but it wouldn’t connect to the Internet; had to use a laptop computer to get an Internet connection.  Located all the drugs but couldn’t hear audio due to the computer shutting off.  States that the system seems like a good idea.  Read all instructions.

Phase 2 and 3 survey results

As already presented in the one-on-one and small group evaluation discussion, and seen in Figure 2, survey results from the observation subjects were quite positive.

Figure 2: Survey Responses

Recommendations/Conclusions

Valuable information was gained from the eLearning expert reviews and the one-on-one and small group evaluations.  Changes that were deemed necessary were made to the product, as summarized in Table 7.

 

Table 7 : Summary of formative evaluation issue resolutions

Issue

Resolution

eLearning expert review

  • Attempts to play audio on non-autonomic categories failed
  • Removed all but Autonomic category from remaining tests, as only Autonomic was part of original scope.
  • Usability of the instructions
  • Streamlined language using bullet points; add a downloadable copy of the Illustrated Guide, with link placed on the guide page.
  • Video download time
  • No change made to video download time, as video placeholder simply reserves space for future additions.
  • Prose required tweaking
  • Prose edited for succinctness.
  • Outdated font and table presentation of database table
  • Changed font to sans serif; modified table with alternating color rows and white borders.
  • Bottom of screen requires delimiter
  • Added colored band with copyright as footer to delimit end of page.
  • Left frame size too wide.
  • Adjusted left frame by (a) changing category style to a smaller heading size (H4), (b) adding a bottom border color to aid visual tracking to the right side of the frame, and (c) making the frame border movable so the user could control frame width.
  • Appearance of Quicktime symbol
  • No change was made as this was considered an acceptable situation.

One-on-one Trials

  • Some users mistook the Illustrated Guide for the actual live web page
  • Added at the top of the Guide page (a) a prominent link to the ePharm page and (b) a link to download a new pdf version of the guide. Also placed the link to the ePharm page on the home page.
  • Missing plug-in on observer’s test machine
  • The plug-in was installed before the next observation.  To the downloadable guide, added a notation about checking for plug-ins. 
  • Press spacebar message appears on IE browsers, interrupting the flow of the page.
  • The design team determined to continue to monitor the occurrence of the “spacebar” message to determine whether it persisted.  If so, code would need to be researched to suppress this message.  During research into a solution for this Windows problem, Microsoft pushed out an automatic update that appeared to have corrected this issue.  No further action was considered necessary.
  • User error in double-clicking the audio start button.
  • Appears to be a browser-specific and plug-in issue.  Made no modification, anticipating users who double-clicked would realize that a single click would suffice.  The issue of whether to single or double-click is a recurring issue with Windows applications, and users generally recognize that if one does not work, the other will.
  • Drug name misspelled on observation guide.
  • Unfortunate typo, but did not impact product’s effectiveness.  No action taken.

Small group trials

  • Tried to click drugs on guide.
  • This issue was noted in the one-on-one trials.  The corrective plan of action was implemented following the small group trials.
  • Computer shut down when one subject clicked the video.
  • This issue could not be replicated and thus a resolution was not found.  It may have been an anomaly peculiar to the workstation, the Internet connection, or the firewall.

 

Three global questions informed the formative evaluation process: system performance, usability, and utility.  The following discussion presents findings for each of these questions, summarized across three phases of the formative evaluation.  In addition, recommendations for future consideration are presented.

System Performance

Through the evaluation, it was concluded that the system did operate as designed:

  • The initial screen did appear in the browser window (Firefox, Internet Explorer, and Safari were used).
  • The links from one page to another were functional.
  • Drug categories and drug names loaded from the backend database to the front-end interface.
  • The system permitted the user to select any drug within the specified category.
  • The system legibly presented drug information, including applicable brand names and a phonetic pronunciation guide for the selected generic name.
  • The system operated without error.

Two additional questions regarding system operation which were posed in the original evaluation plan were no longer applicable once the product reached the testing phase (i.e., Did the system support user selection of a specified drug category? When a user selected a specified category, did the system present the list of drugs that belonged to that category?).  The original plan was to let the user select a category from a list, see the list of drugs belonging to that category, and then select a drug to see its information.  This feature was abandoned as unnecessarily complex, requiring extra clicks from the user as well as additional programming on the technical side.  The process was simplified by presenting a scrolling list of drugs grouped by category, from which the user needed only make a single selection.  (This design change also negated the need for an evaluation question in the next section.)
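The simplified design can be sketched with a single query against a hypothetical drugs table. This is only an illustration; the table name, column names, and sample rows are assumptions, not the actual ePharm schema.

```python
import sqlite3

# Hypothetical schema; the real ePharm database layout may differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE drugs (category TEXT, generic_name TEXT)")
conn.executemany(
    "INSERT INTO drugs VALUES (?, ?)",
    [("Autonomic", "atropine"),
     ("Autonomic", "bethanechol"),
     ("Cardiac", "digoxin")],
)

# One query produces the entire scrolling list, grouped by category,
# replacing the original two-step category-then-drug selection.
rows = conn.execute(
    "SELECT category, generic_name FROM drugs "
    "ORDER BY category, generic_name"
).fetchall()
for category, name in rows:
    print(category, name)
```

With a single ordered query the front end can render the whole list at once, and the user makes one selection instead of two.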

System Usability

The next series of questions for the evaluation focused on usability.  These findings were also very positive.

  • The system did provide users with an onscreen method to play audio pronunciation of a specified drug.
  • The audio files did play on the user’s system (when they selected a drug from the appropriate category).
  • The system was successful in permitting the user to play a given audio file repeatedly.
  • The audio was audible to the user.
  • The interface was intuitive enough for users to navigate without needing outside assistance.
  • The system did permit users to select a different drug from the same category without having to restart the system.
  • All subjects in the one-on-one and small group trials strongly agreed that the system was easy to use.

A user “error” which occurred was an attempt to “play” audio from the Illustrated Guide page instead of going to the ePharm page. As discussed earlier, a link to “Begin using ePharm” was added prominently on the home page as well as on the guide page (above the screen shot).  Additionally, a download link for the guide was added, for off-line reference if needed.  The design team discussed repositioning the ePharm page as the first page on the website.  Since the system was intuitive to use, and not all users appeared to read the instructions, this seemed to be a viable option.  However, this change was not made, since the team believed there were sufficient cues in the system for users to realize on their own which page had the active content.  

Further recommendations

Consider adding code to check for the presence of the required plug-in, with an advisory to the user if it is missing.
If users continue to try to “play” audio from the Illustrated Guide, consider reordering the four web pages so that the ePharm page is the first page users encounter.

There was also some confusion about how to use the system, specifically with regard to selecting drugs that did not have audio files associated with them.  This issue arose from including drugs that were outside the scope of the project.  Once the owners of the product fully populate the database with audio links, this will not be a problem.  Until that time, however, the problem may continue. 

Further recommendation

If this confusion resurfaces, one solution might be to present only those drugs for which audio is available.  This can be done by modifying an existing query to add another condition to display an abbreviated version of the database.  The other solution would be for the course director to review this fact with the class while conducting an orientation to the product.
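The first solution above can be sketched by adding one condition to the query that populates the drug list. Again, the schema below is a hypothetical stand-in for the actual database.

```python
import sqlite3

# Hypothetical schema; the real ePharm database layout may differ.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE drugs (category TEXT, generic_name TEXT, audio_file TEXT)"
)
conn.executemany(
    "INSERT INTO drugs VALUES (?, ?, ?)",
    [("Autonomic", "atropine", "atropine.mp3"),
     ("Autonomic", "bethanechol", None),   # no audio recorded yet
     ("Cardiac", "digoxin", None)],        # outside original scope
)

# Adding one WHERE condition hides drugs without an audio file,
# presenting the "abbreviated version" of the database.
with_audio = conn.execute(
    "SELECT generic_name FROM drugs "
    "WHERE audio_file IS NOT NULL ORDER BY generic_name"
).fetchall()
print(with_audio)
```

As the owners populate the database with audio links, the same query automatically surfaces the newly covered drugs without any further code change.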

System Utility

The final question of interest in the evaluation was the extent to which the users believed the system would support their efforts to learn to pronounce the names of drugs correctly.  Participants in the one-on-one trials unanimously agreed this was the case.  Small group participants also strongly agreed, although two (the single medical student and one of the St. Jude nurses) were not as strong as the others in their agreement.  The medical student was also equivocal about recommending the system to other medical students at the College of Medicine.

 

Appendix A
eLearning Expert Review Instruments, with transcribed results

Product: ePharm
Heuristic usability evaluation

Instructions: Please specify your test conditions:  (If you test the product with more than one platform or browser, and your responses differ accordingly, please use the space provided for each dimension to indicate which conditions are relevant to your responses.)

SECTION 1: Platform, OS version, Browser

Platform and OS version

Browser

 Windows, version:

Subject 1

 Firefox, version:

Subject 1: version 2.0.0.13
Subject 2: versions 2 and 3 beta

 Macintosh, version:

Subject 2 (10.5)

 Internet Explorer, version:

 


 

 Other: (specify, with version)

Subject 2: Safari 3.1

Test date:

Subject 1: 4/3/8
Subject 2: 4/4/8

Duration of test:

Subject 1: not specified
Subject 2: 1 hour

 

 

SECTION 2: 5 Dimensions


Instructions: For each of the following FIVE dimensions, please indicate your agreement using the scale provided.  Then, add any comments relevant to the overall dimension, including specific details that will help us better identify areas we need to address.  THANK YOU for your time!

 

Dimension: User control and interaction

Strongly Disagree

Disagree

Neutral

Agree

Strongly Agree

The system provides users with appropriate feedback

 

S2

 

S1

 

Feedback is provided within a reasonable period of time

 

 

 

S1

S2

The system gives the user control over the ability to exit from the system at any time

 

 

 

S1

S2: I just close the browser, right?

The system incorporates shortcuts that can cut down user navigation time

 

S2: there are no shortcuts, but none are needed.  It’s just one click.

 

S1

 

Comments on user control and interaction:
S1: Video took a bit too long to download and begin playing.
S2:

  • Tried firefox 2, firefox 3 beta, and safari, all on Mac OS X 10.5.
  • In all 3 browsers the audio files (audio player) did not appear, and the phonetic pronunciation guide did not appear.  I also saw no video connected with any of the drug choices I made, and I checked about 30.

Browser:

 

Platform:

 


Dimension: verbal and visual quality

Strongly Disagree

Disagree

Neutral

Agree

Strongly Agree

The material is presented in real-world language

 

 

 

S1;S2

 

Specialized knowledge of technology and databases is NOT required in order to use this system.

 

 

 

S1;S2

 

The system is consistent in its use of language

 

 

 

S1;S2

 

Visual elements in the system are internally consistent

 

 

S1

S2

 

Visual elements are aesthetically pleasing

 

 

S1; S2 not bad, just fine

 

 

Dialog covers the essentials.

 

 

 

S1

S2

The system does NOT present irrelevant information.

 

 

 

S1

S2

Comments on verbal and visual quality:
S1:

  • The database generated information (under the middle two tabs) looks 1997-ish.  I suggest sans-serif fonts and a “cleaner, more modern looking” table/cells.
  • Consider a footer, band of color, or something to signify the bottom of the window.  I always feel like I have to do some scrolling (and I don’t) on the 2 database pages.
  • The size of the left frame of the ePharm page should be reduced and the headings better formatted.  This frame is creating lots of empty space and putting too much distance between the list and the database generated information (table).
  • Overall, from a graphic design standpoint I think this looks like something to be submitted as a (really good) school project rather than something to be presented to a client.  However, I think the edits are quick and easy.

S2: Some of the prose may need some tweaking.  It’s clear though.

Browser:

 

Platform:

 

 

Dimension: Errors

Strongly Disagree

Disagree

Neutral

Agree

Strongly Agree

Error messages are easily understood by ordinary users.

 

 

 

 

 

Users are able to recover from operator errors.

 

 

 

 

 

Comments on errors:  NOTE: when errors are encountered, please document where they occur and, if possible, what actions led to the error.

S1: Not applicable
S2:

  • Didn’t see any error messages, unless you count a quicktime symbol instead of an audio player.
  • Not much chance for operator error.  Just click on the drug.

Browser:

 

Platform:

 


 

Dimension: Instructions and help

Strongly Disagree

Disagree

Neutral

Agree

Strongly Agree

Instructions are visible.

 

 

 

S1

S2

Instructions are readily available.

 

 

 

S1

S2

Help is provided when needed.

 

 

 

S1;S2

 

When provided, help is concise and to the point.

 

 

S2: NA

S1

 

Space devoted to help is proportionate to the rest of the screen content.

 

 

S2: NA

S1

 

Comments on Instructions and help:

S2: NA = not applicable
There’s not really a help system in this site.  Instead, there are instructions, and they are clear.

Browser:

 

Platform:

 

 

Dimension: Organization

Strongly Disagree

Disagree

Neutral

Agree

Strongly Agree

The structure is logical.

 

 

 

S1;S2

 

The interface is intuitive

 

 

 

S1;S2

 

Comments on Organization:

S1: Based on my understanding, I think you have delivered what the client requested.
S2: Just fine.  No problems.

Browser:

 

Platform:

 

 

SECTION 3: Additional Issues


Instructions: Using the space provided, please document each additional issue you identify in the product.  Briefly describe the issue, specify where you encountered it in the system, and then use the two scales to rate its (a) severity and (b) extensiveness.   If needed, add more rows to the table.

Issue

 

(a) Severity

 

(b) Extent

Description

Location

 

3
(Critical)

2
(Non-critical but attention recommended)

1
(Fix if time permits)

 

 

3
(Occurs in 3 or more places)

2
(Occurs twice)

1
(Occurs only once)

S2:
No additional issues, but the ones I mentioned are very important.

 

 

Additional comments:

  • S1 returned the response form with the following message in e-mail: My feedback is attached.  I think it would be a good idea to get 1-2 non-tech-types to also provide feedback.  As far as the feedback goes, I very well may be an outlier with my data related to the graphic design, but I figured you’d want me to be honest.  Overall, very, very good!!
  • S2 returned the response form with the following coming in the e-mail:
    Here’s my experience.  Couldn’t view pronunciations or video unfortunately.  Call me if you like.
  • The project manager spoke with S2 by phone within twelve hours.  The reviewer had NOT linked to the Autonomic drugs, which were the only drugs for which there were pronunciations, video, or phonetic spellings.

References

Nielsen, J. (n.d.). How to conduct a heuristic evaluation. Retrieved March 3, 2008, from http://www.useit.com/papers/heuristic/heuristic_evaluation.html

Nielsen, J. (n.d.). Ten usability heuristics. Retrieved March 3, 2008, from http://www.useit.com/papers/heuristic/heuristic_list.html

 

Download now Appendix A: Heuristic usability evaluation data collection form

Download now Appendix B: One-on-one and small-group evaluation testing forms
