1. Introduction
There is extensive research showing that decision-making depends on the context in which choices are presented. Including additional alternatives in a choice set can alter one's preference among the existing options; such contextual alternatives are called decoys. Decoy effects are prevalent in everyday decision-making situations, such as purchasing an electronic device or selecting a phone data or internet plan. Companies are inherently interested in guiding customers toward higher-margin products and often exploit the decoy effect to maximize their profits. When items are priced to create decoys that attract customers to certain options, it is difficult for customers to discern the spurious choices and make an optimal decision. In other words, individual consumers who do not have control over the information can easily be deceived.
For better understanding, we introduce the well-known example below. Suppose that you want to subscribe to the Economist magazine. If you have the two options shown in Figure 1a, which one would you choose? Then, if you have the three options shown in Figure 1b, which one would you choose? Did your preference change between the two figures? This is an interesting experiment [1] demonstrating that consumers' preferences can be manipulated by introducing a seemingly irrelevant option (a decoy; the second option, "Print subscription," in Figure 1b). The bar graphs on each side show the percentage of participants who selected the corresponding option. The "Print subscription" option is irrelevant because it is exactly as expensive as the "Print & web subscription" ($125) but does not provide online access; it is therefore clearly dominated by the "Print & web subscription" option and is called a decoy. With this decoy present, the "Print & web subscription" was much more attractive to participants in Figure 1b (84 out of 100 votes) than in Figure 1a (32 out of 100 votes). This is the "decoy effect," or "attraction effect": the phenomenon whereby consumers tend to show a specific change in preference between two options when presented with a third option that is asymmetrically dominated [2,3,4,5].
Many researchers in behavioral economics, marketing, and psychology have repeatedly studied the decoy effect since it was introduced by Huber et al. [2], and it consistently demonstrates flaws in normative decision theories, which claim that decision makers have consistent preferences toward the given options [6,7,8]. Interestingly, however, little research has examined how to break the decoy effect despite the clear benefits for consumers. Therefore, we conducted an experiment to test whether visualization can help debias the decision-making process, specifically with respect to the decoy effect. Our research was also partially motivated by Dimara et al.'s studies [9,10], in which the authors found that the attraction effect exists in 2D scatterplots with a small number of options, whereas 2D scatterplots could mitigate the attraction effect in a relatively complicated decision-making context (e.g., 10 options). Still, is it possible to counter this cognitive bias in a more realistic, conservative decision-making context (e.g., three options, as in Figure 1b)? And if we use different types of visualizations, how do they affect the decision-making process?
To address these questions, our first experiment was a quantitative study using a set of visualizations and a real-world dataset to see whether visualizations can help alleviate the decoy effect. It was conducted on a crowdsourcing platform because such platforms allow recruiting a larger and more diverse group of participants than lab experiments [11,12]. Although there are concerns about data quality, several studies have reported replicating the results of prior laboratory experiments using crowdsourcing [13,14]. The second study was a qualitative interview study with a smaller set of participants to better understand the decision-making process while making a choice with a decoy option. We replicated previous findings showing the decoy effect with a tabular representation and tested the effectiveness of four types of visualizations: one-sided bar chart, two-sided bar chart, scatterplot, and parallel-coordinate plot. After describing the results, we document key findings of the research and discuss several issues. The contributions of our research include:
It provides empirical evidence that visual interfaces could mitigate cognitive bias in everyday decision-making.
It provides a first step toward showing that different types of visualization can influence the decoy effect differently.
It provides insights on how different task types and decision-making styles affect the decoy effect.
2. Literature Review
2.1. Decision-Making and Visualization
Because information visualization techniques can help people comprehend data and transform them into information, several techniques have been used to support the decision-making process. Visual representations and interactions are especially well-known ways to help users perceive aspects of data, augmenting cognitive reasoning with perceptual reasoning and leading to efficient analytic reasoning [15,16].
Data visualization has been utilized in multi-attribute decision-making contexts wherein a decision maker has to select one option from a set of alternatives. The data that need to be considered can easily be organized in a tabular format [17]. Because comparing attribute values is important, parallel-coordinate plots have been used, as they can easily project high-dimensional data into a two-dimensional space [18]. The representation is effective because it provides an overview and helps compare values. However, parallel-coordinate plots lack the tabular view with which general users are familiar. In contrast, parallel bargrams keep the tabular representation while still supporting sorting of all attributes in parallel rows at the same time [19]; LineUp [20] and SimulSort [21] also visualize the value of a cell with stacked bar charts to help visually sum multiple items. Dimara et al. [22] examined how effective parallel-coordinate plots, scatterplot matrices, and tabular visualizations are for analytic tasks involving multi-attribute decision-making. These techniques help users browse through the data by making the data accessible rather than making a decision automatically.
Although the majority of research in the field has focused on how to augment human cognition through visualization techniques and tools, recent studies have raised issues about cognitive biases in information visualization [22,23].
2.2. Decision-Making Biases and Decoy Effect
H. A. Simon [24] introduced the concept of bounded rationality, which is the basis of behavioral decision research. It was an alternative to normative decision theory, which assumes that decision makers are fully rational. Simon viewed decision makers as satisficers who seek a satisfactory solution rather than an optimal one. Because human cognitive capacity is limited, decision makers try to keep information demands within their cognitive capacity. As a result, people rely on mental shortcuts known as heuristic strategies, which in some cases lead to cognitive biases, systematic errors in judgment that affect the decision-making process.
Among the several cognitive biases, such as the confirmation and anchoring effects [25], the decoy effect is a well-documented phenomenon that is used in many choice models and in industry marketing [26,27,28]. The decoy effect appears in decision-making tasks that involve a set of options characterized by attributes. The decision maker has to make a choice, weighing their preference for each attribute. Given a set of options, the attraction effect refers to an enhancement in the choice probability of an option through the introduction of a similar but inferior alternative called a decoy. The three commonly discussed decoys are symmetrically dominated, asymmetrically dominated, and phantom decoys [29]. Figure 2 graphically depicts these three types. The two dimensions are the attributes that the decision maker should consider, and options along the red dotted line are not comparable, meaning any option along the line can be optimal; the choice among them is a matter of personal preference.
Symmetrically dominated decoys lie in the SD region shown in Figure 2. One alternative dominates another when it is clearly superior on at least one attribute. A symmetric decoy is dominated on both aspects considered by the decision maker. Wedell [30] showed that the effect of a symmetrically dominated decoy is not statistically significant in terms of changing one's preference structure. For example, in the context of Figure 1, it would be an option that costs $59.00 but offers only a 6-month subscription. Asymmetrically dominated decoys lie in the ASD region shown in Figure 2. An ASD decoy is dominated by one alternative in a set of options but not by the others. For example, it is the "Print subscription" option in the Economist example (Figure 1b). The item dominating the decoy is called the target, and the other item is the competitor. Extensive research has shown that adding an ASD decoy dramatically increases the likelihood that the decision maker will select the target option [29]. Note that the target and competitor are comparable, each favoring one dimension over the other. These decoys are the most extensively studied (e.g., [2,30,31]). The choices in the non-dominated region (ND) are not dominated on either dimension and are alternatives that can be chosen based on one's preference.
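To make these dominance relationships concrete, the short Python sketch below classifies a candidate option against a target and a competitor. It is our own illustrative reading of Figure 2, not code from the cited studies, and it assumes both attributes are oriented so that higher values are preferred (e.g., price is negated).

```python
# Illustrative sketch (ours, not from the cited studies): classify a candidate option
# by the dominance regions in Figure 2. Both attributes are assumed to be oriented so
# that higher values are better (e.g., use the negated price).

def dominates(a, b):
    """True if option a is at least as good as b on both attributes and strictly better on one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def classify(candidate, target, competitor):
    by_target = dominates(target, candidate)
    by_competitor = dominates(competitor, candidate)
    if by_target and by_competitor:
        return "symmetrically dominated (SD)"
    if by_target != by_competitor:
        return "asymmetrically dominated (ASD)"
    return "non-dominated (ND)"

# Economist-style example, encoded as (negated price, issues per year):
competitor = (-59, 21)    # $59, 21 issues
target = (-125, 52)       # $125, 52 issues
decoy = (-125, 42)        # $125, 42 issues; dominated by the target only
print(classify(decoy, target, competitor))  # -> asymmetrically dominated (ASD)
```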
The phantom decoy was introduced relatively recently [32]. It is a highly attractive option included in the choice set but unavailable at the time of choice. Although the decoy appears to be a better choice because it dominates its target, it is an unattainable choice or is so dominant in an attribute that it becomes unattractive [29]. Although this seems counter-intuitive, it is best explained by the loss aversion principle used in the relative advantage model, which states that losses loom larger than gains [33].
Among the various decoy classes, asymmetrically dominated decoys show the strongest decoy effect; they have been studied in many contexts and with a wide range of alternative values, and the reported results are stable and strong [2,30,34,35,36]. In many cases, when people refer to the decoy effect, they mean the asymmetric dominance effect. Therefore, our research also focused on the asymmetrically dominated decoy effect.
There are several explanations for why preference reversals occur with a decoy. According to the weight-change model, adding a decoy changes the relative importance of each attribute; therefore, the preference between the target and the competitor can change [37]. However, several studies found evidence against this model [31,38]. According to the value-shift model [39], the changes do not result from a change in attribute weights but from a change in the decision maker's evaluation of the target's attribute values. Both models share the assumption that decoy effects result from evaluating the value of the target while considering both attributes.
The third explanation is the dominance-valuing model. It is based on modern theories of behavioral decision-making, which hold that decision makers use heuristic strategies within a cost–benefit framework. A decision maker can adopt strategies that minimize cognitively demanding activities or look for compelling or simple justifications. Decision makers easily detect the inferior decoy option, and once they have detected it, they may choose the target alternative because it is easier to justify. The dominance-valuing model differs from both the weight-change model and the value-shift model because the value of an option is perceived based on the dominance structure, which increases the attractiveness of the target, rather than being calculated from its attributes.
2.3. Decision-Making Styles
Decision-making styles, defined as "the learned, habitual response pattern exhibited by an individual when confronted with a decision situation" [40], are another well-known lens for analyzing individuals' decision-making behaviors. Decision-making styles are relatively stable and lasting cognitive factors that can be used to interpret people's characteristics while making choices [41].
Despite their wide use and strong impact on decision-making, decision-making styles have not yet been fully explored in the field of information visualization. One study [42] found a significant interaction effect between information visualization techniques and decision style on task completion time, but little research has analyzed the main effect of decision-making styles across various information visualization strategies.
There are several approaches to classifying decision-making styles. Herbert Simon [24] proposed two types of decision-making: a rational-analytical style and an intuitive style. According to this view, rational-style people tend to evaluate information with an explicit reasoning process, whereas intuitive-style people rely more on their prior expertise and experience. Another classification includes spontaneous, dependent, and avoidant styles [43].
In this research, we adopted Scott and Bruce's (1995) General Decision-Making Style (GDMS) instrument, which has been widely used in various fields of research [44], including marketing [45] and job decisions [46]. With its 25 items, the GDMS distinguishes decision makers among five categories: (a) rational: making decisions based on logical evaluations of alternatives and processing information sequentially; (b) intuitive: relying strongly on emotions and gut feelings; (c) dependent: relying on directions and advice from others; (d) avoidant: attempting to avoid making decisions because doing so feels uncomfortable; and (e) spontaneous: a tendency to "get through the decision-making process as soon as possible." According to prior research [47], these styles fall into two categories. The rational, intuitive, and spontaneous styles constitute the "core decision process," which concerns the cognitive way individuals make choices. By contrast, the other two, the dependent and avoidant styles, are related to the benefit–risk assessment identified as the "decision-regulatory process."
2.4. Debiasing the Decoy Effect
Although the decoy effect is a well-known cognitive bias, little research has explored how to minimize or decrease it. Current research has mainly focused on the mechanisms underlying the decoy effect and on how to utilize the effect from the marketer's point of view.
Early work on breaking the decoy effect by Teppan and Felfernig suggested decoy minimization methods [48]. Decoy minimization mainly involves two approaches: (1) excluding decoys from the alternative set; and (2) including counteracting decoys in the set. They suggested including counteracting decoys because eliminating decoys is hard to accomplish; it can be ambiguous whether an alternative belongs to the decoy set or not, especially when the decoy is not dominated [49]. Counteracting decoy methods introduce another decoy for the competitor to neutralize the effect of the existing decoy. Teppan and Felfernig tested the asymmetric dominance effect of decoys through an unsupervised online study [48]. Subjects in the study were asked to evaluate the results of a fighting tournament, and each avatar had different mobility, quickness, and punching-power ratings. For two given alternatives, as shown in Table 1, their respective asymmetrically dominated decoys were also generated and included in the choice set. The experiment featured 11 sets of choices. To neutralize the decoy effect of A- on A, they introduced B-, whose mobility rating was 5 and power rating was 2. In the user study, they found that this counteracting decoy reduced the decoy effect.
However, adding another option to the given set to decrease the decoy effect is not realistic because it is not a strategy a person would normally be able to employ in a real-world setting. Hence, recent work used visualization with the first approach, excluding decoys from the alternative set [10]. Dimara et al. demonstrated that interactive scatterplots helped remove locally dominated points. They assumed that interactive scatterplots would help execute elimination by aspects, a common decision-making strategy that removes salient, inappropriate data to minimize the amount of information to consider. While the study presented evidence that interactive visualization could mitigate the decoy effect, it considered a relatively large number of options (e.g., 10 options), which far exceeds the three options that most studies have used in their experiments. For a three-option setting, scatterplots still replicated the decoy effect [9].
We believe that various visualization techniques should be investigated to see whether they can help to alleviate the decoy effect with more realistic datasets.
3. Methods
Two experiments were conducted. The first, a crowdsourcing experiment, provided a quantitative analysis of the decoy effect and examined whether visual interfaces could mitigate cognitive biases. The second, a qualitative study, used semi-structured interviews to gain a deeper understanding of users' decision-making strategies. The same data sets and visualizations were used in both studies.
3.1. Design Rationale
3.1.1. Data Sets
To create the data sets, we considered both different scenarios and appropriate attribute values for each context. First, several scenarios have been used in decision-making studies. One of the criteria we used for the scenarios was that each needed to include price as an attribute. Because price is an attribute for which most people have a similar preference (e.g., the cheaper, the better), we assumed that it would minimize individual differences in preference. We also wanted to include different contexts and attribute values in the experiment to cover diverse decision-making situations. The resulting four scenarios capture decision-making contexts that commonly occur in daily life: subscribing to a magazine, subscribing to a video streaming service, purchasing coffee gift cards, and selecting a data plan for a cell phone. Each scenario consisted of two attribute dimensions for making a decision (e.g., price and number of cups of coffee) (see Table 2).
Second, we needed numerical values for each attribute. To replicate the scenario shown in Figure 1, as it is the best-known scenario in decoy effect experiments, we needed to change the subscription type (i.e., web and paper) to numerical values, as was done in several previous studies (e.g., [36,50]). Therefore, in addition to the price attribute, we selected the number of issues per year. The range for this attribute was determined by the unit price per issue. To control the level of difficulty, the unit prices per issue were set to be close. For example, for the Economist magazine, the unit prices for the competitor, target, and decoy were $2.80, $2.40, and $2.97, respectively (see Table 2). However, spending more money yielded a lower unit price per item, which indicates a greater benefit from spending more. The target is Option B, with a higher price and more issues. The competitor is Option A, with a lower price and fewer issues. Option C is the decoy for Option B, with the same price but fewer issues. We positioned the decoy at the end for two scenarios and in the middle for two scenarios to minimize order effects. For the other attribute values, we tried to incorporate real-world values so that the scenarios would feel realistic to the participants.
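As a rough illustration of the unit-price control described above, the sketch below computes the per-unit prices from the attribute values in Table 2; the grouping and rounding are ours, not part of the original study materials.

```python
# Minimal sketch (values taken from Table 2): within each scenario the unit prices of
# the competitor, target, and decoy are kept close, and the decoy always has a worse
# unit price than the target it is meant to promote.
scenarios = {
    "Economist":       {"Competitor": (59, 21),   "Target": (125, 52),   "Decoy": (125, 42)},
    "Video Streaming": {"Competitor": (47.99, 6), "Target": (78.99, 12), "Decoy": (78.99, 10)},
    "Gift Card":       {"Competitor": (65, 21),   "Target": (124, 52),   "Decoy": (124, 42)},
    "Phone Plan":      {"Competitor": (28.99, 5), "Target": (48.99, 10), "Decoy": (48.99, 8)},
}

for name, options in scenarios.items():
    unit_prices = {role: round(price / quantity, 2) for role, (price, quantity) in options.items()}
    print(name, unit_prices)
# Economist -> {'Competitor': 2.81, 'Target': 2.4, 'Decoy': 2.98}, close to the values quoted above.
```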
We also added a filtering question to maintain data quality, as we conducted a crowdsourcing experiment. The options were designed to be obvious so that, if a participant answered incorrectly, we could assume that they were not paying proper attention to the task. The task was to choose a gym membership between two options such as $290 for 24 months and $365 for 12 months. It only makes sense to select the first option, as it is cheaper and offers a longer duration.
3.1.2. Stimuli
To investigate whether visual representation styles may influence the attraction effect, we considered five types of representations: table, one-sided bar chart, two-sided bar chart, scatterplot, and parallel-coordinate plot (see Figure 3). For stimuli selection, we considered the familiarity of the visualization, based on education and exposure through media [51], and its effectiveness as shown in the visualization community. The table was selected as the baseline because most previous work has represented the data in a tabular form with text; we also wanted to determine whether we could replicate decoy effects with our data sets.
Scatterplots were used in a previous study to debias decoy effects [10]. The researchers selected 2D scatterplots because they are suitable for presenting data with two dimensions and are the most common representation in the attraction effect literature (e.g., [2,5]), which means that the representation could help consumers find the decoy easily. Although scatterplots have the advantage of visualizing quantitative values even for large amounts of data, they are not a common representation in everyday life; they are often used for statistical purposes. Therefore, we also selected more common representations.
Our criteria for selecting the visualizations were that a representation had to (1) handle different attributes and (2) make the alternatives comparable. The first common choice was a bar chart. Bar charts are familiar to the general public because they are taught in the K-12 curriculum, and they are well known for supporting comparison tasks [51,52,53]. Because there are two attributes for each alternative, we needed to layer two bars on the same side; to help distinguish the attributes visually, we color-coded each attribute. However, layering two attributes on the same side for an alternative has limitations because each attribute is measured in a different unit. It could be misleading and, if the numeric values differ significantly, the attribute with smaller values may not appear properly. To overcome this issue, we made a variation of the one-sided bar chart, creating a two-sided bar chart in which the bars are split in two directions, placing the attributes opposite each other across a central axis, as sketched below. This contrast allows the attributes to be compared easily while still showing the relative size of each alternative.
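The sketch below illustrates one way such a two-sided bar chart could be rendered with matplotlib, using the Economist values from Table 2; it is our own approximation of the idea, not the stimulus code used in the study (the actual stimuli appear in Figure 3).

```python
# Approximate sketch (ours) of the two-sided bar chart idea: one attribute extends left
# of a central axis and the other extends right, so the two differently scaled
# attributes do not overlap. Values are the Economist scenario from Table 2.
import matplotlib.pyplot as plt

options = ["Option A", "Option B", "Option C"]
price = [59, 125, 125]      # drawn to the left of the axis
issues = [21, 52, 42]       # drawn to the right of the axis

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(options, [-p for p in price], color="steelblue", label="Price ($)")
ax.barh(options, issues, color="darkorange", label="Issues per year")
ax.axvline(0, color="black", linewidth=1)       # central axis separating the two attributes
ax.set_xticks([-125, -59, 0, 21, 42, 52])
ax.set_xticklabels([125, 59, 0, 21, 42, 52])    # show magnitudes rather than signed values
ax.legend(loc="lower right")
plt.tight_layout()
plt.show()
```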
Last, we chose parallel-coordinate plots. Although it is not common to use parallel-coordinate plots for data sets with only two attributes and three alternatives, they suit the task of comparing attributes across alternatives. Additionally, because the decoy is dominated by the target, especially on one of the attributes, the slope (angle) of the lines connecting the attributes will differ. We tested whether this influences the comparison of alternatives.
To examine whether visualization types can mitigate the decoy effect, we conducted an experiment through the crowdsourcing platform Amazon Mechanical Turk (MTurk). The crowdsourcing approach has several advantages over conventional, controlled laboratory studies [11], including recruiting a large number of participants with diverse backgrounds, and it has been used in several information visualization experiments [14,54].
3.2. Crowdsourcing Experiment
3.2.1. Experimental Design
We used a between-subjects design with two factors: option type, with two levels (with and without a decoy), and visualization type, with five levels (table, one-sided bar, two-sided bar, scatterplot, and parallel-coordinate plot) (see Figure 4). To evaluate the attraction effect, participants in the control group were asked to choose the most attractive option between two alternatives (i.e., target and competitor). For the experimental group, three alternatives were presented (i.e., target, competitor, and decoy).
3.2.2. Participants
A total of 576 participants were originally recruited through MTurk. The requirements for workers to participate in our task on MTurk were an HIT approval rate greater than 95%, location in the United States, and more than 500 approved HITs. We found that 107 participants did not pass our filtering question, which asked them to make an obvious decision (see Section 3.1.1 for details). One participant in the scatterplot condition was also removed because the participant gave fewer than two points on all four tasks, which suggests that the person did not understand how to read the visualization. The remaining legitimate participants (n = 468) were divided among the conditions; each condition had at least 43 participants, and the condition with the most had 52. Among the 469 participants who passed the filtering question, 221 were female and 248 were male, with a self-reported age range of 20 to 77 and an average age of 37.8. None of them participated in more than one condition. The education levels of the participants were as follows: bachelor's degree, 42.0%; some college but no degree, 20.0%; high school graduate, 12.3%; and associate's degree, 11.5%. The baseline payment for participation was $0.30, and an additional bonus reward was $1.00.
3.2.3. Tasks
The task was to select the best choice among the given options. After selecting one, the participants rated their preference on a scale of 0 to 10. The experimental website is shown in Figure 5.
3.2.4. Measures
Four types of quantitative data were collected to answer the research questions. Since the attraction effect is evaluated based on the proportions of chosen alternatives [2], we first collected the choice proportions for each alternative. Next, decision confidence was measured with a seven-point Likert-type questionnaire to confirm that the participants took part in the experiment deliberately. In addition, a questionnaire asked about the readability of each visualization type, scored on a seven-point Likert scale. Furthermore, the GDMS instrument [44], comprising 25 items, was used to identify participants' decision-making styles. The GDMS scale, a reliable and valid scale for assessing decision-making [55], consists of seven-point Likert-scale questions with five items allocated to each style: (1) rational (e.g., "I double-check my information sources to be sure I have the right facts before making a decision"); (2) intuitive (e.g., "When making a decision, I rely upon my instincts"); (3) spontaneous (e.g., "I make quick decisions"); (4) dependent (e.g., "I use the advice of other people in making important decisions"); and (5) avoidant (e.g., "I generally make important decisions at the last minute"). Given that the experiment consisted of tasks in which participants were supposed to make decisions by themselves, only the three "core decision-making process" styles (rational, intuitive, and spontaneous) were analyzed in this research.
3.2.5. Procedure
The experiment began with a tutorial briefing participants on the study details, including the compensation policy. Then, they faced four scenarios with choice-making tasks, selecting the most attractive alternative among the options presented. The scenarios were shown to each participant in random order. After each decision-making task, they filled out two survey questionnaires evaluating their confidence in the decision and their reading competence with the assigned visualization. In addition, a screening question was included between the decision-making tasks to make sure that participants were paying attention. After finishing all decision-making tasks, the participants filled out a demographic survey and the 25 items of the General Decision-Making Style (GDMS) test. Finally, an open-ended question asked participants to describe their decision-making strategy, to gain a deeper understanding of their strategies and any difficulties during the experiment.
3.2.6. Hypotheses
Given the aforementioned results from the literature review, the research hypotheses are as follows:
H1. A larger proportion of participants will choose the target when the decoy is present.
H2. The decoy will have differing influences depending on the format of the visual presentation.
H3. The attraction effect will be influenced by people's decision-making characteristics.
3.3. Qualitative Experiment
To gain a deeper understanding of decision-making strategies and the reasons behind the attraction effect, we conducted semi-structured interviews using the same survey as in the experiment. We also analyzed the open-ended responses in which participants explained the strategies they used in the crowdsourcing experiment.
3.3.1. Participants
We recruited eight participants for the interview (mean age = 20.25; five were female). All were undergraduate students recruited from two classes at a university in South Korea, and they took part in the interview voluntarily.
3.3.2. Procedure
The interview began with a short description of the research purpose and procedure. Then, participants were asked to complete the decision tasks via the online survey website, to be consistent with the prior experiment. After that, the semi-structured interview was conducted. During the interview, we asked participants a few questions about their decision-making strategies and impressions of the decision tasks to check their familiarity with each task. We also asked participants which of the three core decision-making styles best described them, to investigate the influence of style on their decisions.
4. Results
4.1. Crowdsourcing Experiment
As previous studies suggested [56], a chi-square test was used to evaluate the attraction effect. A chi-square test indicates whether there is a statistically significant difference between segments, in our case the presence or absence of the decoy option, when the data are presented in cross-tabulation form. As a first step, we calculated descriptive statistics to characterize the collected choices. Among the 1876 choices, the average decoy selection rate was 4.5%, which is relatively low compared to previous studies [3]. Consistent with previous studies [9], choices selecting the decoy were removed from the analysis.
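For reference, this analysis can be reproduced in outline with a standard chi-square test of independence on the 2 x 2 cross-tabulation of condition by choice. The sketch below uses SciPy with hypothetical placeholder counts, since per-cell counts are not reported here.

```python
# Minimal sketch of the chi-square analysis described above. The counts are
# hypothetical placeholders (the study reports proportions, not raw cell counts);
# decoy choices are excluded, as in the analysis above.
from scipy.stats import chi2_contingency

#                  competitor  target
contingency = [[108, 126],   # without-decoy condition
               [ 76, 140]]   # with-decoy condition (decoy choices removed)

chi2, p, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p:.3f}")
```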
4.1.1. Presence of Decoy
Overall, the decoy appeared to have a strong effect on the decision-making process. There was a significant difference between option types, showing a strong attraction effect. More specifically, the choices changed, with 64.8% choosing the target in the decoy condition and 53.8% choosing it without the decoy. These results indicate that people are highly affected by the presence of a decoy when making decisions.
However, mixed results were found across the decision tasks. As shown in Table 3, the attraction effect appeared in three tasks (video streaming, magazine subscription, and coffee gift card), while there was no statistical difference in the phone data plan task. Among the three tasks with positive results, a strong attraction effect was shown in the coffee task: the proportion of participants selecting the target was more than 55%, an increase of 8.4 percentage points over the without-decoy condition. Meanwhile, even though there was no statistical change in the choice pattern of the data plan task, the target selection rate slightly increased from 57.3% (without decoy) to 58.5% (with decoy). Hence, H1 was partially supported.
4.1.2. Visualization Type
Mixed results were found across the five visualization types. A significant difference was found in three visualization conditions (i.e., table, scatterplots, and parallel-coordinate plots), but no statistical differences emerged in the other two conditions (i.e., one-sided and two-sided bar), meaning that there was no decoy effect. A particularly strong effect was found in the table condition (see Table 4). In addition, the target had a greater share in the with-decoy scatterplot and with-decoy parallel-coordinate plot conditions. Similarly, although the difference was not statistically significant, participants in the one-sided bar condition tended to select the target option rather than the competitor when the decoy was present. By contrast, there was no significant change in the choice pattern of the two-sided bar condition, regardless of the presence of the third option (decoy). The overall comparison of the selection of the three options is shown in Figure 6.
4.1.3. Decision-Making Style
We performed a logistic regression to evaluate the influence of individuals' decision-making styles (i.e., rational, intuitive, and spontaneous) on the attraction effect. The Nagelkerke R² was 0.013, indicating that the logistic model could not appropriately explain the effect of the independent variables (decision-making styles) on the dependent variable (the attraction effect). Thus, H3 was not supported.
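For readers unfamiliar with the pseudo-R² used here, the sketch below shows how a Nagelkerke R² can be computed from a fitted logistic regression; the data are randomly generated placeholders, not the study's data.

```python
# Minimal sketch (ours, with synthetic placeholder data): logistic regression of target
# choice on decision-making style scores, followed by the Nagelkerke pseudo-R^2.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
styles = rng.normal(size=(n, 3))        # rational, intuitive, spontaneous scores (placeholders)
X = sm.add_constant(styles)
y = rng.integers(0, 2, size=n)          # 1 = chose the target, 0 = chose the competitor

fit = sm.Logit(y, X).fit(disp=0)
cox_snell = 1 - np.exp((2 / n) * (fit.llnull - fit.llf))
nagelkerke = cox_snell / (1 - np.exp((2 / n) * fit.llnull))
print(f"Nagelkerke R^2 = {nagelkerke:.3f}")
```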
4.1.4. Confidence Level of Choice
The mean decision-confidence level of participants in their choices was 5.9 on a 7-point Likert scale, and there was little difference across the five visualization types (Min = 5.82, Max = 6.01) or the four scenarios (Min = 5.76, Max = 5.97). This indicates that participants were overall confident in their decisions.
4.1.5. Level of Visualization Comprehension
Lastly, we analyzed the self-reported mean reading-comprehension level for each visualization type. The mean level was 6.15 (Min = 6.00, Max = 6.34 among the five visualization types), indicating that most participants were able to read their assigned visual representation. There was no statistical difference among the mean ratings for the five visualizations (table, one-sided bar, two-sided bar, scatterplots, and parallel-coordinate plots).
4.2. Qualitative Experiment
Participants mainly took two factors into account when making decisions: unit price and their preference for a certain attribute. During the interview, three participants reported that they made decisions based on their preference for a certain attribute. For example, one participant responded that she selected the option with the shortest magazine subscription period, factoring in her usual buying habits. Consistent with this, some participants reported that they first considered the amount of data they usually used when making decisions on phone data plans; if they needed more than 5 GB, they would select the option with the largest amount without considering the price.
“When choosing a service or a product, I usually like to experience it briefly first. Therefore, in this case, I chose the option with the shortest time period over the price”—p5
In contrast, three of the eight interview participants indicated that the unit price was the most important factor when making decisions. They would calculate the unit price first and select the option that maximized their cost effectiveness.
“Usually I think of it as units. I compared the price of each unit first and choose the most beneficial one”—p2
A similar tendency was found in the responses to the open-ended question included in the crowdsourcing experiment. Figure 7 shows the two main decision-making strategies mentioned most often during the interview: unit price and individual preferences. We also report how frequently specific strategies were mentioned. As shown in the figure, calculating the unit price appeared to be the most common strategy people used while making decisions. Some participants quickly glanced at the prices, while others carried out an actual price comparison. In addition, several participants mentioned that they used a 'double-count' strategy, in which they doubled the price of the cheaper option and compared it with the other alternatives.
“I was first considering the price value of the package, such as dollar per unit or session”
“I tried to get the most for my money. I tried to see if it was cheaper for [option] 2 if [option] 1 were doubled.”
Aside from these numerical strategies, some participants explicitly used the visualization itself as part of their decision-making strategy. For example, one participant reported using the bar chart as a 'virtual ruler' to compare the options with the naked eye.
“I usually divided blue portion by the orange portion and went with which one was more.”
“I determined the monthly price based on PPY [price per year] and what was provided. The charts provided a pretty accurate representation of those numbers/values.”
On the other hand, there were some negative comments on scatterplots. Some participants responded that scatterplots were not helpful for making decisions.
“I pretty much ignored the diagram as it didn’t make too much sense.”
“I mostly wanted to get the best amount of service for my dollar. Having to refer to a chart made it a bit more difficult to decipher how much I was spending.”
5. Discussion
5.1. Effect of Visualization Types on Decoy Effect
Initially, we hypothesized that visual interfaces may be used to decrease the decoy effect, and our study showed that specific types of visualizations help people avoid the decoy. Among the five types of visualizations, one-sided and two-sided bar charts turned out to be effective in preventing the decoy effect; we suspect this is because these two visualizations supported people's decision-making strategy. From the qualitative analysis, we found that the most common strategy was to calculate the unit price and that participants were able to estimate the unit price more easily with the one-sided and two-sided bar charts, aided by a visual indicator (i.e., bar size).
Interestingly, we initially hypothesized that parallel-coordinate plots would be the most effective in minimizing the decoy effect because the slope could be an intuitive cue for estimating the unit price or comparing relative values. This was not statistically supported, possibly due to unfamiliarity with the visualization type. People are typically not familiar with parallel-coordinate plots and thus are not good at interpreting them, which might have negatively affected their decision-making process [57].
Scatterplots did not seem to help avoid the decoy effect in our experimental setting. They did not support people's decision-making strategy, and a few participants expressed a negative reaction to the visualization itself because they felt it did not make sense and made the values hard to decipher.
To better understand the underlying process of why certain visualizations help the decision-making process, we examined graph comprehension models [58,59,60,61]. According to these models, task performance while using a visualization is attributed to the match between the task and the visualization. If the information necessary to accomplish the task can be extracted directly from the visualization, only perceptual processing is required (e.g., retrieving a value from a bar chart). However, if the necessary information cannot be extracted directly, spatial processing may be required.
In general, spatial cognition tasks can be solved by a non-spatial strategy, usually mathematically, but people prefer spatial strategies if a proper visualization is given [60]. In our study, the majority of participants mentioned that the decision-making tasks required mathematical calculation. Since mathematical calculation is cognitively demanding, people may have fallen into a decision-making bias, adopting a heuristic within the cost–benefit framework (see Section 2.2) and justifying their decision by noting that the target option was at least better than the decoy. However, with proper visualizations, namely one-sided and two-sided bar charts, the task might have been accomplished with spatial processing. That is, with the bar charts, estimating proportions and comparing values visually might have offloaded the mental effort, helping participants make a better decision.
5.2. Effect of Scenarios on the Decoy Effect
It turned out that different task types, or scenarios, do affect the decoy effect. While relatively "light" scenarios such as video streaming, magazine subscriptions, and coffee gift cards showed the decoy effect, the phone data plan task did not exhibit the same pattern. In other words, people were not affected by the decoy when choosing a data plan. One possible explanation is that a relatively large number of people have a strong preference regarding data size, which might have affected their decision-making process. That is, those people would choose the option with the largest data size no matter how much it costs, which weakens the influence of the decoy.
5.3. Effect of Decision-Making Style on the Decoy Effect
Different decision-making styles may affect an individual’s degree of cognitive bias. While we examined the relationship between decision-making styles (e.g., rational, intuitive, and spontaneous) and the decoy effect, no statistical significance was found. In other words, even when a person has a rational decision-making style, he or she is still prone to cognitive bias such as the decoy effect in decision-making.
We also did not explore whether certain decision-making styles may be better supported by visual interfaces; it would be interesting to further examine how individual differences in decision-making influence cognitive bias and where in the process visualizations might help.
5.4. Limitations
Our study has several limitations. First, both between-subjects and within-subjects designs have been used in previous research to observe the decoy effect. To be more conservative in capturing the effectiveness of the visual representations, we conducted a between-subjects study, which has the advantage of avoiding carryover effects: each participant experienced only one type of visual representation and one decoy-presence condition. However, as the decoy effect concerns a change in preference over attributes, a within-subjects study is needed to measure directly whether a person changes his or her choice in the presence of a decoy. In that case, we could have had a better understanding of whether individuals' decision-making styles had an impact on the decoy effect.
While we tried to incorporate different visualization types in our experiment, testing with more diverse visual interfaces would yield other meaningful insights. We also only focused on the case in which we had three options. If the number of options increases, the effectiveness of visualization could change.
As mentioned previously, the decision-making scenarios affected the results, which makes it hard to generalize the findings to all decision-making contexts. However, this is an unavoidable problem in decision-making studies because so many different cases exist in everyday decision-making. For further research, we could therefore focus on a certain market segment and a limited context to avoid this problem. Additionally, even though we tried to create scenarios that reflected common decision-making situations in everyday life, some participants found them irrelevant. For example, some participants mentioned that they do not read magazines, so they would simply select the cheapest option. This might have decreased the ecological validity of the study.
6. Conclusions
In this study, we sought to examine whether the decoy effect, a well-known cognitive bias in decision-making, could be mitigated with the help of visual interfaces. After conducting a crowdsourcing experiment using MTurk, we analyzed quantitative data from 469 participants, as well as qualitative data from open-ended questions and follow-up interviews. The results showed that, while different decision-making scenarios may affect the decoy effect, certain types of visualizations, such as one-sided and two-sided bar charts, help alleviate the decoy effect in decision-making. We believe that this research is a first step toward uncovering the role of visualization in decreasing cognitive biases, eventually helping people make more informed decisions.
Beyond marketing, knowledge of the decoy effect and visualization can be used to help users make better decisions in everyday life. Consumers face many situations in which they must compare attributes and make a choice, such as selecting a meal plan or a workout plan, and these representations could help them make better choices. The use of visualizations is increasing on websites and mobile apps, and visualizations are known to lower cognitive load, even when more information must be processed, if properly presented [15]. Therefore, we believe that further research on cognitive biases and proper visualizations could help the decision-making process in various situations.
Author Contributions: Conceptualization, Y.J. and S.-H.K.; methodology, Y.J., Y.K., and S.-H.K.; software, S.O.; formal analysis, Y.J.; writing and editing, Y.J., S.O., Y.K., and S.-H.K.; funding acquisition, S.-H.K. All authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by the National Research Foundation of Korea Grant, Grant No. NRF-2019R1C1C1005508, and the Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT), Grant No. IITP-2020-0-01791.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Informed consent was obtained from all subjects involved in the study.
Data Availability Statement: The data presented in this study are available on request from the corresponding author.
Conflicts of Interest: The authors declare no conflict of interest.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Figure 1. Two subscription pages of the Economist magazine adopted from [1]: (a) a subscription page with two common options; (b) a subscription page with the two common options and a decoy.
Figure 3. Examples of experimental stimuli. Four visualizations and a table for a baseline: (a) Table, (b) One-Sided Bar Chart, (c) Two-Sided Bar Chart, (d) Scatterplots, and (e) Parallel-Coordinate Plot.
Figure 4. Between-subjects design with two factors: option type and visualization type.
Figure 5. Screenshot of the experimental website, showing a two-sided bar chart with two options (without a decoy).
Figure 6. Percentage of selection of each option across all scenarios and visualizations.
Figure 7. Frequency of each strategy mentioned by the participants in the interview.
Example of primary alternatives and a decoy [48].
| | Mobility | Power |
|---|---|---|
| Character A | 3 | 8 |
| Character A- | 1 | 6 |
| Character B | 7 | 4 |
| Character B- | 5 | 2 |
Data set for each scenario.
| Scenario | Option | Price ($) | Second Attribute | Type |
|---|---|---|---|---|
| Economist (Number of Issues) | A | 59 | 21 | Competitor |
| | B | 125 | 52 | Target |
| | C | 125 | 42 | Decoy |
| Video Streaming (Duration) | A | 47.99 | 6 | Competitor |
| | B | 78.99 | 12 | Target |
| | C | 78.99 | 10 | Decoy |
| Gift Card (Number of Cups) | A | 65 | 21 | Competitor |
| | B | 124 | 42 | Decoy |
| | C | 124 | 52 | Target |
| Phone (Data Plan, GB) | A | 28.99 | 5 | Competitor |
| | B | 48.99 | 8 | Decoy |
| | C | 48.99 | 10 | Target |
Summary of choice probabilities for the decoy placement in each scenario.
| Scenario | Decoy Option | Competitor | Target | Decoy | χ² | p-Value |
|---|---|---|---|---|---|---|
| Economist | Without | 62.3% | 37.7% | | 8.35 | 0.004 |
| | With | 45.4% | 47.2% | 7.4% | | |
| Video Streaming | Without | 28.9% | 71.1% | | 5.34 | 0.021 |
| | With | 18.3% | 75.1% | 6.6% | | |
| Gift Card | Without | 53.1% | 46.9% | | 11.303 | 0.001 |
| | With | 32.6% | 55.3% | 12.1% | | |
| Phone | Without | 42.7% | 57.3% | | 2.471 | 0.116 |
| | With | 32.3% | 58.5% | 9.2% | | |
Summary of choice probabilities for decoy placement for each visualization.
| Visualization | Decoy Option | Competitor | Target | Decoy | χ² | p-Value |
|---|---|---|---|---|---|---|
| Table | Without | 45.3% | 54.7% | | 17.611 | 0.000 |
| | With | 21.7% | 67.8% | 10.5% | | |
| One-sided Bar | Without | 44.3% | 55.7% | | 3.134 | 0.077 |
| | With | 32.5% | 60.1% | 7.4% | | |
| Two-sided Bar | Without | 42.3% | 57.7% | | 0.738 | 0.390 |
| | With | 34.3% | 55.9% | 9.8% | | |
| Scatterplots | Without | 51.6% | 48.4% | | 5.392 | 0.020 |
| | With | 33.7% | 51.7% | 14.6% | | |
| Parallel-coordinate Plot | Without | 50.5% | 49.5% | | 4.256 | 0.039 |
| | With | 38.0% | 58.0% | 4.0% | | |
References
1. Ariely, D. Predictably Irrational, Revised and Expanded Edition: The Hidden Forces That Shape Our Decisions; 1st ed. Harper Perennial: London, UK, 2010.
2. Huber, J.; Payne, J.W.; Puto, C. Adding Asymmetrically Dominated Alternatives: Violations of Regularity and the Similarity Hypothesis. J. Consum. Res.; 1982; 9, pp. 90-98. [DOI: https://dx.doi.org/10.1086/208899]
3. Frederick, S.; Lee, L.; Baskin, E. The limits of attraction. J. Mark. Res.; 2014; 51, pp. 487-507. [DOI: https://dx.doi.org/10.1509/jmr.12.0061]
4. Simonson, I. Vices and virtues of misguided replications: The case of asymmetric dominance. J. Mark. Res.; 2014; 51, pp. 514-519. [DOI: https://dx.doi.org/10.1509/jmr.14.0093]
5. Ratneshwar, S.; Shocker, A.D.; Stewart, D.W. Toward understanding the attraction effect: The implications of product stimulus meaningfulness and familiarity. J. Consum. Res.; 1987; 13, pp. 520-533. [DOI: https://dx.doi.org/10.1086/209085]
6. Schumpe, B.M.; Bélanger, J.J.; Nisa, C.F. The reactance decoy effect: How including an appeal before a target message increases persuasion. J. Personal. Soc. Psychol.; 2020; 119, 272. [DOI: https://dx.doi.org/10.1037/pspa0000192] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/32297770]
7. Wu, L.; Liu, P.; Chen, X.; Hu, W.; Fan, X.; Chen, Y. Decoy effect in food appearance, traceability, and price: Case of consumer preference for pork hindquarters. J. Behav. Exp. Econ.; 2020; 87, 101553. [DOI: https://dx.doi.org/10.1016/j.socec.2020.101553]
8. Wu, C.; Cosguner, K. Profiting from the decoy effect: A case study of an online diamond retailer. Mark. Sci.; 2020; 39, pp. 974-995. [DOI: https://dx.doi.org/10.1287/mksc.2020.1231]
9. Dimara, E.; Bezerianos, A.; Dragicevic, P. The attraction effect in information visualization. IEEE Trans. Vis. Comput. Graph.; 2017; 23, pp. 471-480. [DOI: https://dx.doi.org/10.1109/TVCG.2016.2598594] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/27875163]
10. Dimara, E.; Bailly, G.; Bezerianos, A.; Franconeri, S. Mitigating the attraction effect with visualizations. IEEE Trans. Vis. Comput. Graph.; 2019; 25, pp. 850-860. [DOI: https://dx.doi.org/10.1109/TVCG.2018.2865233] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/30137000]
11. Kittur, A.; Chi, E.H.; Suh, B. Crowdsourcing user studies with Mechanical Turk. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; Florence, Italy, 5–10 April 2008; pp. 453-456.
12. Ross, J.; Irani, L.; Silberman, M.S.; Zaldivar, A.; Tomlinson, B. Who are the crowdworkers? Shifting demographics in Mechanical Turk. Proceedings of the CHI’10 Extended Abstracts on Human Factors in Computing Systems; Atlanta, GA, USA, 10–15 April 2010; pp. 2863-2872.
13. Horton, J.J.; Rand, D.G.; Zeckhauser, R.J. The online laboratory: Conducting experiments in a real labor market. Exp. Econ.; 2011; 14, pp. 399-425. [DOI: https://dx.doi.org/10.1007/s10683-011-9273-9]
14. Heer, J.; Bostock, M. Crowdsourcing graphical perception: Using mechanical turk to assess visualization design. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems 2010; Atlanta, GA, USA, 10–15 April 2010; pp. 203-212.
15. Kim, S.H. Understanding the Role of Visualizations on Decision Making: A Study on Working Memory. Informatics; 2020; 7, 53. [DOI: https://dx.doi.org/10.3390/informatics7040053]
16. Thomas, J.; Cook, K. Illuminating the Path: The Research and Development Agenda for Visual Analytics; Technical Report Pacific Northwest National Laboratory (PNNL): Richland, WA, USA, 2005.
17. Yoon, K.; Hwang, C. Manufacturing plant location analysis by multiple attribute decision making: Part I—single-plant strategy. Int. J. Prod. Res.; 1985; 23, pp. 345-359. [DOI: https://dx.doi.org/10.1080/00207548508904712]
18. Inselberg, A.; Dimsdale, B. Parallel Coordinates: A Tool For Visualizing Multi-dimensional Geometry. Proceedings of the First IEEE Conference on Visualization: Visualization ′90; San Francisco, CA, USA, 23–26 October 1990; pp. 361-378.
19. Wittenburg, K.; Lanning, T.; Heinrichs, M.; Stanton, M. Parallel bargrams for consumer-based information exploration and choice. Proceedings of the 14th Annual ACM Symposium on User Interface Software and Technology, UIST ’01; Orlando, FL, USA, 11–14 November 2001; ACM: New York, NY, USA, 2001; pp. 51-60.
20. Gratzl, S.; Lex, A.; Gehlenborg, N.; Pfister, H.; Streit, M. Lineup: Visual analysis of multi-attribute rankings. IEEE Trans. Vis. Comput. Graph.; 2013; 19, pp. 2277-2286. [DOI: https://dx.doi.org/10.1109/TVCG.2013.173]
21. Hur, I.; Yi, J.S. SimulSort: Multivariate Data Exploration Through An Enhanced Sorting Technique. Human-Computer Interaction. Novel Interaction Methods and Techniques; Jacko, J. Lecture Notes in Computer Science Springer: Berlin/Heidelberg, Germany, 2009; Volume 5611, pp. 684-693.
22. Dimara, E.; Bezerianos, A.; Dragicevic, P. Conceptual and methodological issues in evaluating multidimensional visualizations for decision support. IEEE Trans. Vis. Comput. Graph.; 2018; 24, pp. 749-759. [DOI: https://dx.doi.org/10.1109/TVCG.2017.2745138] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/28866571]
23. Dimara, E.; Franconeri, S.; Plaisant, C.; Bezerianos, A.; Dragicevic, P. A task-based taxonomy of cognitive biases for information visualization. IEEE Trans. Vis. Comput. Graph.; 2018; 26, pp. 1413-1432. [DOI: https://dx.doi.org/10.1109/TVCG.2018.2872577] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/30281459]
24. Simon, H.A. A behavioral model of rational choice. Q. J. Econ.; 1955; 69, pp. 99-118. [DOI: https://dx.doi.org/10.2307/1884852]
25. Weinstein, I. Don’t believe everything you think: Cognitive bias in legal decision making. Clin. L. Rev.; 2002; 9, 783.
26. Boatswain, M.L. Decoy Effects in Brand Positioning. Ph.D. Thesis; Kingston University: Kingston upon Thames, UK, 2015.
27. Cui, Y.G.; Kim, S.S.; Kim, J. Impact of preciseness of price presentation on the magnitude of compromise and decoy effects. J. Bus. Res.; 2021; 132, pp. 641-652.
28. Zhang, T.; Zhang, D. Agent-based simulation of consumer purchase decision-making and the decoy effect. J. Bus. Res.; 2007; 60, pp. 912-922. [DOI: https://dx.doi.org/10.1016/j.jbusres.2007.02.006]
29. Pettibone, J.C.; Wedell, D.H. Testing alternative explanations of phantom decoy effects. J. Behav. Decis. Mak.; 2007; 20, pp. 323-341. [DOI: https://dx.doi.org/10.1002/bdm.557]
30. Wedell, D.H. Distinguishing among models of contextually induced preference reversals. J. Exp. Psychol. Learn. Mem. Cogn.; 1991; 17, 767. [DOI: https://dx.doi.org/10.1037/0278-7393.17.4.767]
31. Wedell, D.H.; Pettibone, J.C. Using Judgments to Understand Decoy Effects in Choice. Organ. Behav. Hum. Decis. Process.; 1996; 67, pp. 326-344. [DOI: https://dx.doi.org/10.1006/obhd.1996.0083]
32. Pratkanis, A.R.; Farquhar, P.H. A brief history of research on phantom alternatives: Evidence for seven empirical generalizations about phantoms. Basic Appl. Soc. Psychol.; 1992; 13, pp. 103-122. [DOI: https://dx.doi.org/10.1207/s15324834basp1301_9]
33. Tversky, A.; Kahneman, D. Availability: A heuristic for judging frequency and probability. Cogn. Psychol.; 1973; 5, pp. 207-232. [DOI: https://dx.doi.org/10.1016/0010-0285(73)90033-9]
34. Doyle, J.R.; O’Connor, D.J.; Reynolds, G.M.; Bottomley, P.A. The robustness of the asymmetrically dominated effect: Buying frames, phantom alternatives, and in-store purchases. Psychol. Mark.; 1999; 16, pp. 225-243. [DOI: https://dx.doi.org/10.1002/(SICI)1520-6793(199905)16:3<225::AID-MAR3>3.0.CO;2-X]
35. Heath, T.B.; Chatterjee, S. Asymmetric decoy effects on lower-quality versus higher-quality brands: Meta-analytic and experimental evidence. J. Consum. Res.; 1995; 22, pp. 268-284. [DOI: https://dx.doi.org/10.1086/209449]
36. Park, J.; Kim, J. The effects of decoys on preference shifts: The role of attractiveness and providing justification. J. Consum. Psychol.; 2005; 15, pp. 94-107. [DOI: https://dx.doi.org/10.1207/s15327663jcp1502_2]
37. Ariely, D.; Wallsten, T.S. Seeking Subjective Dominance in Multidimensional Space: An Explanation of the Asymmetric Dominance Effect. Organ. Behav. Hum. Decis. Process.; 1995; 63, pp. 223-232. [DOI: https://dx.doi.org/10.1006/obhd.1995.1075]
38. Mellers, B.A.; Cooke, A.D.J. Trade-offs depend on attribute range. J. Exp. Psychol. Hum. Percept. Perform.; 1994; 20, pp. 1055-1067. [DOI: https://dx.doi.org/10.1037/0096-1523.20.5.1055]
39. Simonson, I. Choice Based on Reasons: The Case of Attraction and Compromise Effects. J. Consum. Res.; 1989; 16, pp. 158-174. [DOI: https://dx.doi.org/10.1086/209205]
40. Driver, M.J.; Brousseau, K.R.; Hunsaker, P.L. The Dynamic Decision Maker: Five Decision Styles for Executive and Business Success; IUniverse: Lincoln, NE, USA, 1998.
41. Walsh, G.; Hennig-Thurau, T.; Wayne-Mitchell, V.; Wiedmann, K.P. Consumers’ decision-making style as a basis for market segmentation. J. Target. Meas. Anal. Mark.; 2001; 10, pp. 117-131. [DOI: https://dx.doi.org/10.1057/palgrave.jt.5740039]
42. Daud, N.G.N.; Adnan, W.A.W.; Noor, N.L.M. Information visualization techniques and decision style: The effects in decision support environments. Int. J. Digit. Content Technol. Its Appl.; 2008; 2, pp. 20-24.
43. Leykin, Y.; DeRubeis, R.J. Decision-making styles and depressive symptomatology: Development of the Decision Styles Questionnaire. Judgm. Decis. Mak.; 2010; 5, 506.
44. Scott, S.G.; Bruce, R.A. Decision-making style: The development and assessment of a new measure. Educ. Psychol. Meas.; 1995; 55, pp. 818-831. [DOI: https://dx.doi.org/10.1177/0013164495055005017]
45. Del Campo, C.; Pauser, S.; Steiner, E.; Vetschera, R. Decision making styles and the use of heuristics in decision-making. J. Bus. Econ.; 2016; 86, pp. 389-412. [DOI: https://dx.doi.org/10.1007/s11573-016-0811-y]
46. Crossley, C.D.; Highhouse, S. Relation of job search and choice process with subsequent satisfaction. J. Econ. Psychol.; 2005; 26, pp. 255-268. [DOI: https://dx.doi.org/10.1016/j.joep.2004.04.001]
47. Fischer, S.; Soyez, K.; Gurtner, S. Adapting Scott and Bruce’s general decision-making style inventory to patient decision-making in provider choice. Med. Decis. Mak.; 2015; 35, pp. 525-532. [DOI: https://dx.doi.org/10.1177/0272989X15575518] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/25810267]
48. Teppan, E.C.; Felfernig, A. Minimization of Product Utility Estimation Errors in Recommender Result Set Evaluations. Proceedings of the 2009 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology, WI-IAT ’09; Milan, Italy, 15–18 September 2009; IEEE Computer Society: Washington, DC, USA, 2009; Volume 1, pp. 20-27. [DOI: https://dx.doi.org/10.1109/WI-IAT.2009.11]
49. Teppan, E.; Friedrich, G.; Felfernig, A. Impacts of Decoy Effects on the Decision Making Ability. Proceedings of the 2010 IEEE 12th Conference on Commerce and Enterprise Computing (CEC); Shanghai, China, 10–12 November 2010; pp. 112-119. [DOI: https://dx.doi.org/10.1109/CEC.2010.30]
50. Malkoc, S.A.; Hedgcock, W.; Hoeffler, S. Between a rock and a hard place: The failure of the attraction effect among unattractive alternatives. J. Consum. Psychol.; 2013; 23, pp. 317-329. [DOI: https://dx.doi.org/10.1016/j.jcps.2012.10.008]
51. Lee, S.; Kim, S.H.; Kwon, B.C. VLAT: Development of a visualization literacy assessment test. IEEE Trans. Vis. Comput. Graph.; 2016; 23, pp. 551-560. [DOI: https://dx.doi.org/10.1109/TVCG.2016.2598920] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/27875171]
52. Pinker, S. A theory of graph comprehension. Artificial Intelligence and the Future of Testing; Psychology Press: New York, NY, USA, 1990; pp. 73-126.
53. Shah, P.; Hoeffner, J. Review of graph comprehension research: Implications for instruction. Educ. Psychol. Rev.; 2002; 14, pp. 47-69. [DOI: https://dx.doi.org/10.1023/A:1013180410169]
54. Paolacci, G.; Chandler, J.; Ipeirotis, P. Running Experiments on Amazon Mechanical Turk. Judgm. Decis. Mak.; 2010; 5, pp. 411-419.
55. Baiocco, R.; Laghi, F.; D’Alessio, M. Decision-making style among adolescents: Relationship with sensation seeking and locus of control. J. Adolesc.; 2009; 32, pp. 963-976. [DOI: https://dx.doi.org/10.1016/j.adolescence.2008.08.003] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/18848722]
56. Bateman, I.J.; Munro, A.; Poe, G.L. Decoy effects in choice experiments and contingent valuation: Asymmetric dominance. Land Econ.; 2008; 84, pp. 115-127. [DOI: https://dx.doi.org/10.3368/le.84.1.115]
57. Munzner, T. Visualization Analysis and Design; CRC Press: Boca Raton, FL, USA, 2014.
58. Cleveland, W.S.; McGill, R. Graphical perception: Theory, experimentation, and application to the development of graphical methods. J. Am. Stat. Assoc.; 1984; 79, pp. 531-554. [DOI: https://dx.doi.org/10.1080/01621459.1984.10478080]
59. Simkin, D.; Hastie, R. An information-processing analysis of graph perception. J. Am. Stat. Assoc.; 1987; 82, pp. 454-465. [DOI: https://dx.doi.org/10.1080/01621459.1987.10478448]
60. Trickett, S.B.; Trafton, J.G. Toward a comprehensive model of graph comprehension: Making the case for spatial cognition. International Conference on Theory and Application of Diagrams; Springer: Berlin/Heidelberg, Germany, 2006; pp. 286-300.
61. Carpenter, P.A.; Shah, P. A model of the perceptual and conceptual processes in graph comprehension. J. Exp. Psychol. Appl.; 1998; 4, 75. [DOI: https://dx.doi.org/10.1037/1076-898X.4.2.75]
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
The decoy effect is a well-known, intriguing decision-making bias that marketing practitioners often exploit to steer consumers toward a desired purchase outcome. It occurs when the inclusion of an additional alternative in a choice set alters one’s preference among the existing options. Although the decoy effect has been widely observed in the real world and studied extensively by economists and psychologists, little is known about how to mitigate it and help consumers make informed decisions. In this study, we conducted two experiments. The first was a quantitative crowdsourcing experiment examining whether visual interfaces can help alleviate this cognitive bias: four types of visualizations (one-sided bar charts, two-sided bar charts, scatterplots, and parallel-coordinate plots) were evaluated across four decision-making scenarios. The results showed that the two types of bar charts were effective in decreasing the decoy effect. The second was a semi-structured interview study conducted to gain a deeper understanding of participants’ decision-making strategies while making a choice. We believe these results illustrate how visualizations can influence the decision-making processes of everyday life.
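To make the visualization conditions named above concrete, the following is a minimal illustrative sketch (in Python with matplotlib) of how a three-option choice set with a decoy might be rendered as a one-sided bar chart and as a scatterplot, two of the four conditions. The option names and attribute values are hypothetical placeholders, not the stimuli or dataset used in the study.

```python
# Illustrative sketch only: option names and attribute values are hypothetical,
# not the paper's stimuli. Renders a three-option choice set (target,
# competitor, decoy) as a one-sided bar chart and as a scatterplot.
import matplotlib.pyplot as plt
import numpy as np

options = ["Target", "Competitor", "Decoy"]
# Two attributes on a common 0-100 "higher is better" scale (assumed).
quality = np.array([70, 90, 60])
affordability = np.array([90, 65, 80])

fig, (ax_bar, ax_scatter) = plt.subplots(1, 2, figsize=(9, 3.5))

# One-sided bar chart: grouped bars, one group per option.
x = np.arange(len(options))
width = 0.35
ax_bar.bar(x - width / 2, quality, width, label="Quality")
ax_bar.bar(x + width / 2, affordability, width, label="Affordability")
ax_bar.set_xticks(x)
ax_bar.set_xticklabels(options)
ax_bar.set_ylabel("Attribute value")
ax_bar.set_title("One-sided bar chart")
ax_bar.legend()

# Scatterplot: each option is a point in the two-attribute space, so the
# target's dominance over the decoy is visible as a point above and to the
# right of it.
ax_scatter.scatter(quality, affordability)
for name, qx, ay in zip(options, quality, affordability):
    ax_scatter.annotate(name, (qx, ay), textcoords="offset points", xytext=(5, 5))
ax_scatter.set_xlabel("Quality")
ax_scatter.set_ylabel("Affordability")
ax_scatter.set_title("Scatterplot")

fig.tight_layout()
plt.show()
```

The two-sided bar chart and parallel-coordinate plot conditions could be sketched analogously; the intent here is only to show how the same underlying option set is re-encoded across chart types.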
Author Affiliations
1 Management of Technology, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 03722, Korea;
2 Department of IT Convergence, Dong-eui University, 176 Eomgwang-ro, Busanjin-gu, Busan 47340, Korea;
3 Underwood International College, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 03722, Korea;
4 Department of Industrial ICT Engineering, Dong-eui University, 176 Eomgwang-ro, Busanjin-gu, Busan 47340, Korea