The human-computer interface (HCI), the point at which a human and a computer (or other intelligent device) meet, can be rife with misunderstandings. Computer jargon, even when written in English, can seem to be a foreign language. A human user, on typing a query, may not understand the computer response "error 404," which loosely translated means "I cannot find what you are looking for."
HCI is also described and strongly influenced by the manner in which human-computer interactions occur. For the human, the computer communicates predominantly through its display and its use of color and the size and location of fonts, icons, waveforms, and images, as well as through audible responses/alarms, synthesized speech, or tactile stimulation.
For the computer, a keystroke, mouse click, screen tap, or voice command from the human may suffice. Natural user interfaces (NUIs) that respond to human gestures and speech may eventually replace the currently standard graphical user interface (GUI),1 which revolutionized human-computer usability by allowing a user to interact with the computer via images such as pointers, icons, and folders instead of command-line text.
When these interaction techniques are mismatched or break down, mistakes are made, user frustration increases, and faith in the system declines. A software designer's job is to prevent or minimize such breakdowns and optimize the human-computer relationship. Doing so requires, at the very minimum, some understanding of human perception and cognition.
It is essential to take visual, auditory, and tactile communication aspects into account early in the design process, in order to produce a safe and usable medical device or software system. The focus of this paper is limited to the visual and cognitive factors that strongly influence user performance and satisfaction at the level of not only the HCI, but the human-medical device interface as well.
Classic Human Factors Model of the Human-Machine System
Human factors engineering (HFE), a highly interdisciplinary and systems science, has contributed greatly to our understanding of how humans not only interact with machines in general but with computers and embedded devices in particular. These contributions have come largely through empirical research, models, and accident investigations, especially where a disregard for the human user was found to be a contributing cause of the accident.
A classic model that can be applied directly to our HCI study (Figure 1) was originally described by Meister2 and further refined by Proctor, et al.3 This model captures the essence of the human, computer, and operating environment, and the almost mythical, ethereal HCI zone. It is clear from the model that in addition to human perception and cognition functions, computer display design and input interaction techniques are also crucial for a meaningful relationship between human and computer to occur and be maintained.
Perception
Computers communicate with their human users largely through their visual (and auditory) displays. In order to do so they must be able to send properly coded information across the interface in such a way, and with sufficient intensity, as to physiologically stimulate visual (or auditory or tactile) receptor cells.
The lowest level of intensity that activates such receptors is referred to as the threshold of perception.2
While there are different forms or dimensions to perception, in the context of Figure 1 they are limited to the detection and identification of transmitted information. At the level of the interface, and from the human's perspective, visual perception is about being able to detect and identify characters, symbols, and colors, all of which requires adequate visual acuity, contrast sensitivity, and an ability to discern colors. Visual accommodation, scanning, and tracking a moving target, although not discussed below, are also essential for a human to obtain information from the computer interface.
Visual Acuity
The classic Snellen chart (Figure 2) has been used to assess visual acuity for more than 150 years.4 It is based on a series of letters, or optotypes, of varying sizes that all subtend an angle of five minutes of arc (or 0.0833°) at a given viewing distance. Figure 2 also shows that a character (exaggerated scale) of 0.35 inches (25 pt.) in height will subtend an angle of five minutes of arc from 20 feet away. Such characters correspond to line 8 on the Snellen chart.
Individuals who can read such characters from this distance are said to have 20/20 vision. Here, the numerator of this Snellen fraction is the distance at which the test is administered, i.e., 20 feet, with the denominator corresponding to the distance at which the 0.35-inch characters subtend this standard five-minute angle. Should the height of the character have to be doubled before it can be read, such a person would have 20/40 vision, and so on. Alternatively stated, what a person with 20/40 vision can read at 20 feet, someone with 20/20 vision can read at 40 feet.
For comparison, the large "E" at the top of the Snellen chart is 3.5 inches in height and corresponds to an acuity of 20/200 (i.e., the arctangent of 3.5/2400 = 5 minutes of arc, or 0.083°). This relationship can be used to determine the minimum character height needed for a given viewing distance and acuity. For example, the text height, h, needed for someone with 20/100 vision viewing text from 24 inches away, d, is determined by:
h = d tan[(0.083°)(100/20)] = (24)(0.0073) ≈ 0.17 inch ≈ 13 pt.
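This relationship is straightforward to put into code. The sketch below computes the minimum character height for a given viewing distance and Snellen acuity; the function name and rounding conventions are mine, not from the text.

```python
import math

def min_char_height(viewing_distance_in, acuity_denominator, standard=20):
    """Minimum character height (inches) readable by a viewer with
    20/acuity_denominator vision at the given viewing distance.

    A 20/20 observer resolves optotypes subtending 5 minutes of arc
    (0.0833 degrees); weaker acuity scales that angle proportionally.
    """
    angle_deg = (5.0 / 60.0) * (acuity_denominator / standard)
    return viewing_distance_in * math.tan(math.radians(angle_deg))

# Worked example from the text: 20/100 vision at 24 inches.
h = min_char_height(24, 100)
print(round(h, 2), "inch")   # 0.17 inch
print(round(h * 72), "pt")   # 13 pt (72 points per inch)
```

The same function reproduces the Snellen baseline: at 20 feet (240 inches), a 20/20 viewer needs the 0.35-inch characters described above.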
Of course, the type of viewing screen, choice of font, and resolution will also affect readability. If the displayed characters on the computer interface are not large enough for the range of anticipated viewing distances, the human will not be able to read them. Accordingly, HCI designers should not assume that all users have 20/20 vision, or that acuity remains constant with age, under different levels of contrast, or under different lighting conditions.
Consequently, allowing users to adjust the size of characters or icons, when feasible, is also a desirable design feature. Generalized guidelines for computer-generated fonts and icons, however, are more difficult to create than those for printed text due to wide variations in how they are presented.
Contrast Sensitivity
In addition to the computer producing adequately sized characters and images, and the human having adequate visual acuity, a second human criterion for meaningful and accurate communication across the interface is sufficient contrast sensitivity: the ability to distinguish differences in luminance and/or color between an object or character and its background.
As the human visual system is more sensitive to contrast than to differences in brightness,5 high contrast in displays is particularly important, especially since it is possible for individuals to have good visual acuity but poor contrast sensitivity. General design guidelines recommend high-resolution displays with positive image polarity, e.g., dark characters on light backgrounds.6
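Display contrast can also be quantified. The sketch below uses the WCAG 2.x relative-luminance and contrast-ratio definitions, which are not cited in this article but are a common engineering proxy when checking character/background combinations:

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB color (0-255 channels), per the
    WCAG 2.x definition (gamma-linearized, weighted by the eye's
    sensitivity to red, green, and blue)."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, ranging from 1:1 (none) to 21:1 (black on white)."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Positive image polarity: dark characters on a light background.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A designer can run candidate palettes through such a check early, rather than discovering low-contrast text during usability testing.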
Use of Color in HCI Design
Because colors can convey information, such as red for danger, it is generally not a good idea to give users control of color assignments. In computer interface design, color can be a tremendously powerful information-coding tool: color-coded displays tend to increase recognition speed and offer valuable user feedback on system status, e.g., green for go. Color-coded controls, in addition to proper labeling, also provide a form of redundant feedback, telling users in multiple ways the function of a particular control.
In order for color to be effective, the user must be able to discriminate between different colors. Approximately 8% of males (and less than 1% of females) are red-green color-blind,5 a limitation that should be acknowledged during interface design. Additionally, the human eye is not equally sensitive to all colors; our eyes are most sensitive to yellow-green light (555 nm)3 and less sensitive to blues and reds (Figure 3).
This difference explains why yellow grabs attention, and why yellow on a dark background appears brighter than other colors on the same background. Once colors have been correctly perceived, how they are interpreted and processed by the brain is a function of cognition.
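One practical consequence for design: status colors should differ in brightness as well as hue, so that red-green color-blind users still receive the message. The heuristic below is an illustrative sketch, not a clinical test; the grayscale weights are the ITU-R BT.601 convention (note the heavy green weighting, reflecting the eye's peak sensitivity near 555 nm), and the threshold of 50 is an arbitrary assumption of mine.

```python
def grayscale(rgb):
    """Approximate perceived brightness using ITU-R BT.601 luma weights."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def distinguishable_without_hue(c1, c2, min_delta=50):
    """Heuristic check: do two status colors still differ once hue is
    discarded? min_delta (on a 0-255 scale) is an illustrative threshold."""
    return abs(grayscale(c1) - grayscale(c2)) >= min_delta

# Pure red vs. an equally bright green: hue is the only difference,
# so a red-green color-blind user may not tell the two states apart.
print(distinguishable_without_hue((255, 0, 0), (0, 130, 0)))  # False
# A deliberate luminance difference restores a redundant cue.
print(distinguishable_without_hue((255, 0, 0), (0, 255, 0)))  # True
```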
Cognition
Once the desired information has been meaningfully coded, transmitted across the interface, and perceived by the user, the next essential step, and arguably the most complex, is cognition: the accurate and timely interpretation and processing of this information. Cognition is a particularly critical stage in the stimulus-response loop that makes up the human-computer system, because people are fundamentally limited in how much information they can process at any given time, in what they can accurately retrieve from memory, and in the extent of the mental calculations they can perform.
Because it is such a crucial element in all areas of human-machine interaction, it is a topic that has been extensively studied by human factors and cognitive scientists alike. Fortunately, an aspect of human cognition that software designers have been able to take advantage of is our general reliance on the use of metaphors and analogies as an aid in both learning and thinking.6
Because humans tend to learn by analogy and association of a known concept with an unknown or new domain, the metaphor has become a well-established interface design tool. Windows' desktop, folders, recycle bin, and snipping tool (scissors) are classic examples. In this context, metaphors may convey considerable amounts of information and reinforce underlying or existing assumptions, e.g., clicking on a printer icon to print a document.
In general, any interaction technique that minimizes the short-term memory load on the user tends to be preferred. Menus have overtaken the original command-line interfaces in both popularity and efficacy because human recognition memory is much better than absolute recall.6
The Effect of Color on Cognition
Color is not only a powerful information-coding tool, but is also capable of affecting cognition by changing the way people feel. Color figures prominently in marketing and advertising because it sends messages or warnings, conveys intent, and influences behavior on an almost subliminal level.
To further complicate matters, the meaning and feeling we attach to colors are also to some extent influenced by culture, age, and gender. Table 1, for example, summarizes some of the popular symbols and emotions associated with color in predominantly Western societies.7 As other cultures may interpret these differently, medical device designers need to be particularly mindful if their products are to be used across differing cultures. Also, as colors exert such a profound influence on the human user, they should be used judiciously and somewhat sparingly.
Conclusions
Many basic perceptual and cognitive factors that influence device usability have been embodied in a number of heuristics, or HCI design guidelines, the use of which can be tremendously helpful both for designers and for users evaluating new systems. In particular, Shneiderman's eight golden rules of interface design8 are often cited:
1. Strive for consistency in the use of terminology, sequence of actions, colors, and menus.
2. Cater to universal usability by acknowledging the technical diversity among users, e.g., by giving more help and guidance to novice users while allowing experts to take shortcuts.
3. Offer informative feedback by providing an appropriate system response for every user action.
4. Design dialogs to yield closure by letting the user know what actions represent the beginning and the end of a needed sequence of events.
5. Prevent errors by discouraging or preventing the user from making mistakes, for example, by graying out inappropriate menu items, and prohibiting the use of numbers in text fields.
6. Permit easy reversal of actions, so users can back out of states they may have accidentally wandered into.
7. Support user control by showing that the user is in charge of the interface. Perceived lack of user control is often a source of considerable user frustration.
8. Reduce short-term memory load by limiting the number of items that the user must remember to the "7 ± 2" chunks of information rule-of-thumb.2
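Rules 5 and 6 in particular lend themselves to code. The toy text-field model below is a minimal sketch; the class and method names are illustrative, not drawn from any real toolkit or from the article.

```python
class TextField:
    """Toy model of a text field that prevents one class of error
    outright (rule 5) and supports undo of accepted actions (rule 6)."""

    def __init__(self):
        self.value = ""
        self._history = []          # prior states, enabling reversal

    def type_text(self, text):
        """Reject digits at entry time rather than reporting an error
        later: the mistake is prevented, not merely flagged."""
        if any(ch.isdigit() for ch in text):
            return False            # input refused, state unchanged
        self._history.append(self.value)
        self.value += text
        return True

    def undo(self):
        """Reverse the last accepted action."""
        if self._history:
            self.value = self._history.pop()

field = TextField()
field.type_text("patient name")
print(field.type_text("42"))   # False: digits blocked, no error dialog needed
field.undo()
print(repr(field.value))       # ''
```

Graying out inapplicable menu items follows the same principle: the forbidden action is made unselectable, so the user never reaches the error state.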
As users and designers in both the medical device and information technology communities know all too well, the HCI is tremendously complex, and it is where the greatest disconnects and mistakes in communication between human and computer occur. Designers of software and medical devices must consider the varied needs of users at the interface in order to make their designs accessible. The ultimate aim is that, with careful HCI design and usability testing, patients and providers can use medical devices and software intuitively and seamlessly.
References
1. Wigdor D, Wixon D. Brave NUI World: Designing Natural User Interfaces for Touch and Gesture. Amsterdam: Morgan Kaufmann, 2011; 16.
2. Meister D. Human Factors: Theory and Practice. New York: Wiley-Interscience, 1971; 10.
3. Proctor RW, Van Zandt T. Human Factors in Simple and Complex Systems. Boston: Allyn and Bacon, 1994; 4.
4. Standards for Visual Acuity. Available at: www.isd.mel.nist.gov/US&R_Robot_Standards/Visual_Acuity_Standards_1.pdf. Accessed July 17, 2013.
5. Johnson J. Designing with the Mind in Mind. Amsterdam: Morgan Kaufmann, 2010; 55-59.
6. Dillon A. User Interface Design. MacMillan Encyclopedia of Cognitive Science, Vol. 4. London: MacMillan, 453-458.
7. Psychological Properties of Colours. Available at: www.colour-affects.co.uk/psychological-properties-of-colours. Accessed July 18, 2013.
8. Shneiderman B, Plaisant C. Designing the User Interface: Strategies for Effective Human-Computer Interaction. 5th ed. Addison-Wesley, 2010; 70-71.
About the Author
Larry Fennigkoh, PhD, PE, CCE, is a professor of biomedical engineering at the Milwaukee School of Engineering, Milwaukee, WI. E-mail: fennigko@msoe.edu
Copyright Allen Press Publishing Services Fall 2013