User-centered Design of a PHR: Traditional Web Forms vs. Wizard Forms
Abstract
Background: This work is part of an ongoing feasibility study aimed at designing and deploying a PHR system for the citizens of the Province of Trento (NE Italy). The system is intended both to let patients store health information that is important to them and to improve relationships and communication between patients and their health care providers. With regard to the first goal, data-entry forms and navigation are crucial elements that will be used constantly and must not compromise use of the system. For this reason, considerable effort was dedicated during the design phase to guiding and supporting the management of health information. We adopted a citizen-centered approach within an iterative design-evaluation process. This work is supported by the Department of Health and the Department of Research and Innovation of the Autonomous Province of Trento (NE Italy).
Objective: Our objective was to compare the effectiveness of traditional web forms versus a wizard (step-by-step) structure and to gather information about the problems users may encounter when interacting with such interfaces. We focused on testing the usability and user experience of two user interfaces within an interaction design approach [1]. The two prototypes were designed and developed to support citizens in maintaining their drug information themselves.
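As a rough illustration (not taken from the study prototypes), the sketch below contrasts the two structures under comparison: a traditional form exposes all drug-entry fields at once, while a wizard groups the same fields into ordered steps with forward/backward navigation. All field names, step titles, and functions here are hypothetical.

```typescript
// Illustrative sketch only: the same drug-entry fields presented either as
// one traditional form or as a wizard split into steps. Names are hypothetical.

interface DrugEntry {
  drugName: string;
  dosage: string;
  malaise: string;    // complaint selected from a list
  diagnosis: string;  // diagnosis selected from a list
  notes: string;
}

// Traditional web form: every field is shown and submitted at once.
const traditionalFormFields: Array<keyof DrugEntry> = [
  "drugName", "dosage", "malaise", "diagnosis", "notes",
];

// Wizard form: the same fields are grouped into ordered steps,
// and the user moves through them one step at a time.
const wizardSteps: Array<{ title: string; fields: Array<keyof DrugEntry> }> = [
  { title: "Drug",   fields: ["drugName", "dosage"] },
  { title: "Reason", fields: ["malaise", "diagnosis"] },
  { title: "Notes",  fields: ["notes"] },
];

// Minimal navigation state for the wizard variant.
function nextStep(current: number): number {
  return Math.min(current + 1, wizardSteps.length - 1);
}
function previousStep(current: number): number {
  return Math.max(current - 1, 0);
}

// Example: walking forward through the wizard.
let step = 0;
step = nextStep(step);
console.log(wizardSteps[step].title); // "Reason"
```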
Methods: An early evaluation was performed by health researchers working at FBK. Interaction design methods were then used to compare the two user interfaces. We adopted usability testing based on the think-aloud technique to observe users while navigating the system and filling in the forms, a post-task questionnaire based on a Likert-type scale to assess user satisfaction, and a debriefing semi-structured interview to explore the subjective user experience behind what was previously observed [2]. Following Nielsen [3], eight participants were recruited from the administrative personnel of FBK. The inclusion criterion was proficiency in using the web. The task scenarios were stories of common drug prescriptions.
Results: The first evaluation mainly identified screen-to-screen navigation problems. A brainstorming session with the health researchers allowed us to reorganize the internal navigation according to the critiques. The usability testing was then carried out on the refined user interfaces. Think-aloud reports were transcribed in a word processing file, and a content analysis was performed using coding categories described in the human-computer interaction literature [2]. The most frequent problems were related to the selection of malaises (complaints) and diagnoses from the related lists. These lists were difficult to use because they consisted of long medical terms organized by body apparatus. The other issues concerned navigation: the button for adding a new prescription on the main page needed to be more prominent, and the tab label for editing personal evaluations should be reworded for clarity.
Conclusions: Repeated design-test-measure-redesign cycles allowed us to point out wrong design assumptions that could otherwise cause usability problems later. Neither interface prevailed over the other, and the incidental preference that emerged from the debriefing interviews was based on subjective impressions. It would be interesting to repeat this test after people have used the wizard interface for a period of time.
References:
1. Preece J, Rogers Y, Sharp H. Interaction Design: Beyond Human-Computer Interaction. New York: John Wiley & Sons; 2002.
2. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. Journal of Biomedical Informatics 2004;37:56-76.
3. Nielsen J. Usability Engineering. San Diego, CA: Academic Press; 1993.