Cognitive Bias and the Design Process, Part 2

Reading time: 10 mins

At any given moment, we’re faced with countless bits of information to process so that we can form judgments, make decisions, and get through the day. We’ve developed mental shortcuts to avoid becoming overwhelmed, but these shortcuts have a downside: they often lead to fallacious reasoning and unintended results. Designers are just as vulnerable to the blind spots and fallacies that come with cognitive bias as the people who use the products and services we design. It’s impossible to eliminate these biases entirely, but we can seek to understand them and identify how they might impact our design decisions.

In this article, I continue my exploration of the most common cognitive biases that designers must deal with during the design process. It’s by no means a comprehensive list; rather, it’s a dive into the ones I’ve encountered regularly in my own work. If you haven’t read the first part of this series, you can check it out here. Let’s get started!


Space shuttle launch

You are not the user

A 1977 study¹ conducted at Stanford University presented participants with hypothetical situations and asked them to estimate what percentage of people would make one of two choices. For example, one situation asked participants to determine what percentage of their peers would vote for a referendum allocating large sums of money toward a revived space program, with the goal of manned and unmanned exploration of the moon and the planets nearest Earth. Some additional context was given: supporters would argue that the program would provide jobs, spur technology, and promote national pride and unity, while opponents would argue that it would increase taxes or drain money from important domestic priorities without achieving its goals.

Once the participants provided their estimates, they were asked to disclose what they themselves would do in the situation and to fill out two questionnaires about the personality traits of people who would make each of the two choices. The researchers concluded that participants not only expected most of their peers to make the same choices they did, but also attributed more extreme personality traits to those who chose differently.

People often overestimate the degree to which other people will agree, think, and behave the way they do.

The study highlights our tendency to assume that our personal qualities, characteristics, beliefs, and actions are relatively widespread while alternate points of view are rare, deviant, and more extreme. This bias, known in psychology as the false-consensus effect, can skew our thinking and negatively influence design decisions.

How it occurs

As the study described above indicates, people often overestimate the degree to which other people will agree, think, and behave the way they do. This applies to the designers, developers, and UX researchers tasked with creating digital interfaces as well: we overgeneralize when we assume that others will perceive and understand an interface the same way we do. The false-consensus effect takes hold in the design process when there’s a lack of data on how actual users respond to your designs.

Identify your assumptions

The first step is to identify the assumptions you or your team are making about the intended users. Whether it’s assumptions related to their needs, pain points, or how they accomplish tasks, we must begin by acknowledging the things we are assuming. Once assumptions have been identified, they should be prioritized by the amount of risk they carry and investigated via tests.

Test with real users

The next step is to conduct user interviews and usability tests with the intended audience. User interviews are critical for understanding people’s decision-making processes, needs and frustrations, opportunities, and the steps they take to complete tasks. Usability tests are also quite effective: watching actual users interact with your designs is one of the most direct ways to challenge your assumptions.

Availability bias

A mental shortcut that relies on immediate examples that come to a given person’s mind when evaluating a specific topic, concept, method, or decision.

Negativity bias

The notion that, even when of equal intensity, things of a more negative nature (e.g. unpleasant thoughts, emotions, or social interactions; harmful or traumatic events) have a greater effect on one’s psychological state and processes than neutral or positive things.

Loss aversion

The tendency to prefer avoiding losses to acquiring equivalent gains.


Flooding

Reason over recall

A study² conducted by Amos Tversky and Daniel Kahneman in 1982 asked a sample of 245 University of British Columbia undergraduates to evaluate the probability of several catastrophic events occurring in 1983. The events were presented in two versions: one that included only the basic outcome and another that included a more detailed scenario leading to the same outcome. For example, half of the participants evaluated the probability of a massive flood somewhere in North America in which more than 1,000 people drown. The other half evaluated the probability of an earthquake in California causing a flood in which more than 1,000 people drown. A 9-point scale was used for the estimates: less than 0.01%, 0.1%, 0.5%, 1%, 2%, 5%, 10%, 25%, and 50% or more.

What Tversky and Kahneman found was that estimates for the conjunction (earthquake and flood) were significantly higher than estimates for the flood alone (p < .01, by a Mann-Whitney test). Even though the chance of a flood in California is smaller than the chance of a flood anywhere in North America, participants judged the flood provoked by a California earthquake to be more likely. The researchers concluded that because an earthquake causing a flood in California is more specific and easier to imagine (versus a flood in an ambiguous area like all of North America), people judge it to be more probable. The same pattern was observed in other problems.
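
As a brief aside (this is a standard rule of probability, not something stated in the study summary above), the pattern the researchers observed is logically inconsistent: a conjunction of two events can never be more probable than either event on its own, because every earthquake-caused flood in California is also, by definition, a flood somewhere in North America.

\[
P(\text{flood caused by an earthquake}) = P(\text{flood} \cap \text{earthquake}) \le P(\text{flood})
\]

This is exactly what makes the conjunction fallacy a fallacy: the more detailed scenario feels more plausible even though it can only be less probable.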

When we make decisions based on what’s easy to recall, we fall into the trap of letting limited information guide us rather than reasoning through the situation.

The study highlights our tendency to believe that the easier something is to recall, the more frequently it must happen. This tendency, known as the availability heuristic, is a common bias: it lets our brains draw conclusions with little mental effort or strain, based on whatever evidence is instantly available to us. It can also heavily influence our decision-making and prevent us from seeing the bigger picture.

How it occurs

Mental shortcuts such as the availability heuristic most commonly occur when we need to make quick decisions and therefore rely on information that is easily recalled. This falls within the category of System 1 thinking, or the mental events that occur automatically and require little or no effort. When we let this bias drive our design decisions, it can easily lead to setting goals based on what’s easy to measure (versus what’s valuable to measure) or blindly mimicking competitors as a result of ‘competitor research’.

Invoke System 2 thinking

When we make decisions based on what’s easy to recall, we fall into the trap of letting limited information guide us rather than reasoning through the situation. One effective approach to avoiding the pitfalls of this bias is to intentionally invoke System 2 thinking, which requires deliberate processing and contemplation. The enhanced monitoring of System 2 thinking works to override the impulses of System 1 and give us the room to slow down, identify our biases, and more carefully consider the impact of our decisions.

Valuable metrics only

Relying on data that quickly comes to mind can easily shape how success is defined within a product or service. When project goals are set based on what’s easy to measure rather than what’s valuable to measure, the measure becomes a target, and we game the system to hit that target (e.g. clicks, DAU, MAU). This ultimately leads to losing track of the needs and goals of the people who use the product or service. The alternative is taking the time to understand what people’s needs and goals actually are and then defining the appropriate metrics that correspond to them.

Survivorship bias

The logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility.

Confirmation bias

The tendency to search for, interpret, favor, and recall information in a way that confirms or supports one’s prior beliefs or values.


Refraction and motility of the eye diagram

Know your blind spots

A study³ published in 2007 by Emily Pronin and Matthew Kugler explored how people make judgments about themselves and others. The study began with the researchers explaining cognitive bias to the participants, who were then asked how cognitive bias might affect their judgment relative to the other participants. The researchers found that participants rated themselves as less susceptible to bias than others in the experiment. When asked to assess the judgments of others, participants readily relied on what those people did; in contrast, when asked to assess their own judgments, they looked inward and searched their thoughts and feelings for biased motives. Because biases operate unconsciously, this introspection was uninformative, yet people mistakenly interpreted it as evidence that they were immune to bias.

The implicit biases that govern our decision-making and judgments are always present.

We believe that we are rational, that our actions and judgments are accurate, and that they are not influenced by unconscious bias. However, the study described above highlights our tendency not to see our own biases and to regard ourselves as less susceptible to bias than others. In psychology, this is known as the bias blind spot, and its potential to undermine the decision-making process has a profound impact on design teams.

How it occurs

The implicit biases that govern our decision-making and judgments are always present. They are part of our design as humans, enabling us to efficiently process information and get through the day without getting overwhelmed. They are useful mental shortcuts that can become a problem when we’re unaware of them and we base our decisions and judgments on the assumptions and perspectives we believe to be true.

Acknowledge your bias

The moment we acknowledge that we are operating under unconscious bias, we are empowered to change it. We must first seek to understand the biases that affect our judgment and decision-making ability. Once we recognize that our own biases can influence us, we can begin identifying them and determining the best approach for overcoming them.

Seek perspective

Feedback helps us identify our unconscious biases and how they affect our judgment and decision-making ability. Our work becomes more inclusive once we embrace a plurality of perspectives, so we must seek out a variety of viewpoints to broaden how we think about our work and the value it provides to people.

Cultivate diversity

Diverse teams bring diversity in life experience and increase a team’s ability to counter the unconscious bias present in homogeneous groups. We must cultivate diversity within our teams to bring a diversity of thinking to our work.

Egocentric bias

The tendency to rely too heavily on one’s perspective and/or have a higher opinion of oneself than reality.

False-consensus effect

The tendency to assume that our personal qualities, characteristics, beliefs, and actions are relatively widespread throughout the general population.


These biases enable us to efficiently process information and get through the day without getting overwhelmed, but can also become problematic when we’re unaware of them and we base our decisions and judgments on the assumptions and perspectives we believe to be true. Once we acknowledge that we are not the user, we break from the self-referential thinking that can limit our ability to design truly effective solutions. We can use reason over recall to avoid setting goals based on what’s easy to measure or blindly mimicking competitors as a result of ‘competitor research’. Finally, we must accept that we are susceptible to implicit biases that govern our decision-making and judgments.


  1. Ross, L., Greene, D., & House, P. (1977). “The ‘false consensus effect’: An egocentric bias in social perception and attribution processes”. Journal of Experimental Social Psychology, 13(3), 279–301. ↩︎

  2. Tversky, A., & Kahneman, D. (1983). “Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment”. Psychological Review, 90(4), 293–315. ↩︎

  3. Pronin, E., & Kugler, M. (2007). “Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot”. Journal of Experimental Social Psychology, 43(4), 565–578. doi:10.1016/j.jesp.2006.05.011. ↩︎