
4 Cognitive biases that are short-circuiting your decision making
Erik Zager, DVM, DACVECC, kicks off Fetch Long Beach with an evidence-based framework for everyday clinical decision-making.
Erik Zager, DVM, DACVECC, is a board-certified emergency and critical care specialist and head of Emergency and Critical Care at Philadelphia Animal Specialty & Emergency (PASE). He also serves as vice president of Global Instruction for Veterinary Empowerment (GIVE), a nonprofit focused on expanding specialty-level veterinary education worldwide.
Introduction
On day 1 of the Fetch dvm360 Conference in Long Beach, California, Erik Zager, DVM, DACVECC, opened the event with a keynote focused on the “why” behind clinical choices. Through the lens of a case study on a schnauzer named Panky, he examined common cognitive biases that may lead to suboptimal decision making in veterinary practice. By becoming aware of these biases, clinicians can take steps toward preventing or correcting for them, and busy practitioners can immediately benefit from clearer decisions that reduce unnecessary tests, lower client cost and anxiety, and focus clinic resources where they improve outcomes.1
Fast vs slow thinking
Zager drew on the work of Nobel laureate Daniel Kahneman, whose book Thinking, Fast and Slow posits that human thinking is governed by 2 distinct modes: System 1 (fast) and System 2 (slow). System 1 is the fast, automatic, and intuitive mode. It handles effortless tasks like recognizing a friend's face or making snap judgments. System 2 is the slow, analytical, and effortful mode, which is engaged for complex tasks like solving a math problem or learning a new skill.2
Cognitive biases are features of System 1 thinking. They are the shortcuts the brain uses to save energy and make rapid assessments. They are essential for navigating daily life, but they become problematic when people rely on them for high-stakes decisions that require the slower, more deliberate System 2. “Oftentimes our brain defaults to System 1 unless we are trying to dial up specifically System 2,” Zager said.2
4 Biases that influence decision making
1. Anchoring bias
Anchoring bias is the tendency to rely too heavily on the first piece of information offered when making decisions. All subsequent judgments and negotiations are then adjusted based on that initial reference point, which can dramatically skew the final result.
Panky the schnauzer case study: The first pieces of information the doctor received were "schnauzer" and "day after Thanksgiving." This immediately anchored the doctor's thinking to a common diagnosis for that breed after a holiday: pancreatitis. This anchor caused them to view all subsequent information through that specific lens.
2. Search satisfaction bias
Search satisfaction bias is the tendency to stop searching for answers once one finds an explanation that seems to fit the situation. Instead of treating the first plausible reason as one possibility among many, the brain accepts it as the answer, potentially causing the clinician to overlook more complete or accurate information.
Panky the schnauzer case study: The doctor ran a quick, easy diagnostic test. The result was ambiguous, but it was enough to support the anchored idea of pancreatitis. Satisfied that an answer had been found, the doctor stopped the diagnostic search, ignoring other possibilities and calling the case solved.
3. Confirmation bias
Confirmation bias is the powerful tendency to search for, interpret, favor, and recall information in a way that confirms or supports pre-existing beliefs or hypotheses. It is much easier for people to accept information that reinforces their worldview than to accept information that challenges it.
Panky the schnauzer case study: Believing Panky had pancreatitis, the doctor interpreted the ambiguous diagnostic test as “abnormal” because that result confirmed the pre-existing theory. Rather than weighing the result objectively, the doctor's brain cherry-picked the data that fit the desired narrative.
4. Outcome bias
Outcome bias is the tendency to judge a past decision by its eventual outcome rather than by the quality of the decision at the time it was made. It is often driven by favoring information that supports a hoped-for outcome; this differs from confirmation bias, which favors information that supports what one already believes.
Panky the schnauzer case study: The doctor and staff felt stressed by Panky's aggressive behavior and the owner's financial constraints. The hoped-for outcome was to resolve the case quickly and get the dog out of the clinic. This desire drove them to choose the fastest, easiest diagnostic path, rather than the most thorough one, to reach that desired end result as quickly as possible.
How to combat bias
Cognitive biases are a constant and natural part of how the human brain works, Zager said. The goal is to become aware of them so clinicians can actively engage their more deliberate System 2 thinking when it matters most. Awareness is the first step, but clinicians can also use active “debiasing” strategies to improve decision-making. “I want us to at least start developing some of the tools [with which] we can recognize certain automatic thinking and switch it over to a more purposeful thinking that is going to be less prone to these biases,” Zager said. Here are a few of the tips he offered.
First, slow down. When a decision is important, consciously engage your “slow” thinking. The simple act of writing down your thought process, listing pros and cons, or explaining your reasoning to someone else forces you to move from gut instinct to deliberate analysis. “I'm going to emphasize that effortful thinking that is also deliberate and is less prone to bias,” Zager said.
Second, crowdsource brainpower. Share the mental burden by asking colleagues, friends, or partners for their perspective, as they may see things you've missed. Be especially open to those who might disagree with you, as they are most likely to challenge your biases. “It is too much to be thinking this much for every single one of our patients, so ask your colleagues,” said Zager. “And [when] I say colleagues, not just the other doctors, but your technicians, your CSRs [as well]. They might have gotten information from the owner out front that you might not have had.”
Finally, anticipate risks. Actively consider how a decision could go wrong before committing to it. “Forcing yourself to think about potential negative outcomes helps counter the pull of confirmation bias and outcome bias, which make us focus only on information that supports our beliefs and hopes,” Zager said.
By knowing that these hidden shortcuts exist and actively working to question one’s automatic thoughts, clinicians can make clearer, more deliberate decisions in their work every day.
References
1. Zager E. Why and how we choose: Rethinking decisions in veterinary practice. Presented at: Fetch dvm360 Conference; December 5-6, 2025; Long Beach, CA.
2. Kahneman D. Thinking, Fast and Slow. Farrar, Straus and Giroux; 2011.
To read more news and view expert insights from Fetch Long Beach, visit dvm360’s dedicated site for conference coverage.