What Is Attribution Bias?
Attribution bias is a term used in the recruitment and staffing industry for the tendency of interviewers to explain the same candidate behaviour differently depending on who exhibits it, crediting internal traits (ability, intent) in some candidates and external circumstances (nerves, question difficulty) in others.
Why Attribution Bias Matters in Recruitment
Attribution bias skews how interviewers interpret identical behaviour depending on who exhibits it. A candidate who asks direct questions about authority and decision rights is seen as confident and leadership-ready when they match the interviewer's mental model of what a leader looks like, and as aggressive or overreaching when they do not. The same answer, same words, same delivery, different assessment. When this operates at scale across a hiring process, it produces a workforce that reflects the biases of the people doing the hiring rather than the requirements of the roles being filled.
For staffing agencies, attribution bias creates a specific liability: clients may reject strong candidates based on assessments that reflect interviewer bias rather than candidate quality, and the agency has no structured way to challenge those decisions without a framework for understanding what happened. Agencies that help clients build structured interview and scoring systems are partly in the business of making attribution bias visible and correctable.
How Attribution Bias Works
Attribution bias in recruitment is a subset of the broader psychological phenomenon documented by social psychologists including Jones and Davis, who showed that people explain the same behaviour differently depending on whether they attribute it to internal factors (personality and intent) or external factors (circumstances and context). In hiring, this plays out in two primary patterns.
The first is the fundamental attribution error: when a candidate performs poorly on an interview question, interviewers tend to attribute it to the candidate's inherent lack of ability rather than considering situational factors like nerves, an unfamiliar question format, or a poorly worded prompt. When the same candidate performs well, interviewers may attribute it to luck or an easy question rather than competence. This is particularly common in technical interviews where the interviewer has strong domain expertise and implicitly compares candidates against their own internal experience of solving similar problems.
The second pattern is in-group attribution bias: candidates who share background, education, or communication style with the interviewer receive more charitable, situational attributions for their mistakes and more internal, ability-based attributions for their successes. A candidate who attended the same university as the hiring manager gets more benefit of the doubt when they stumble on a case question. Research from economists at the University of Toronto and MIT has shown that callbacks for identical resumes vary significantly based on name alone, a finding consistent with attribution patterns where unfamiliarity triggers more negative internal attributions.
A practical example: two candidates interview for a senior finance analyst role. Both give incomplete answers to a financial modelling question. Candidate A went to a target school and has a confident presentation style. Candidate B attended a less prestigious institution and is quieter. The interviewer rates Candidate A's incomplete answer as showing strong conceptual thinking that just needs polish, and rates Candidate B's as indicating a gap in technical ability. The actual answers were structurally identical.
Attribution Bias vs Confirmation Bias
Confirmation bias causes interviewers to seek information that confirms a pre-existing impression. Attribution bias shapes how they interpret information they have already received. The two often compound each other: an interviewer who has formed a positive first impression through confirmation bias will then interpret ambiguous answers charitably through attribution bias. Structured scorecards with pre-defined behavioural anchors partially address both by requiring the interviewer to evaluate specific evidence rather than general impressions.
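One way to make "pre-defined behavioural anchors" concrete is to tie each numeric score to observable evidence and refuse to record a rating without it. The sketch below is purely illustrative: the anchor wordings, the competency name, and the minimum-evidence threshold are invented for this example, not taken from any particular scorecard system.

```python
from dataclasses import dataclass

# Hypothetical behavioural anchors for one competency. Each score maps to
# observable behaviour, not a general impression of the candidate.
ANCHORS = {
    1: "Could not structure the problem even with prompting",
    2: "Structured the problem only after interviewer hints",
    3: "Structured the problem independently, with minor gaps",
    4: "Structured the problem independently and surfaced trade-offs",
}

@dataclass
class CompetencyRating:
    competency: str
    score: int
    evidence: str  # written behavioural evidence is mandatory

    def __post_init__(self):
        # Reject scores outside the anchored scale.
        if self.score not in ANCHORS:
            raise ValueError(f"score must be one of {sorted(ANCHORS)}")
        # Reject ratings that lack specific written evidence; the 20-character
        # floor is an arbitrary placeholder for a real evidence requirement.
        if len(self.evidence.strip()) < 20:
            raise ValueError("a rating requires specific written evidence")

rating = CompetencyRating(
    competency="problem structuring",
    score=3,
    evidence="Broke the modelling question into revenue and cost drivers unprompted.",
)
```

The design choice doing the anti-bias work is the validation step: an interviewer cannot submit "seemed sharp" as a 4, because the rating object only exists once it carries evidence tied to an anchor.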
Attribution Bias in Practice
A talent acquisition manager at a professional services firm notices that female candidates are failing the final panel interview for client-facing senior roles at a rate 22 percentage points higher than male candidates, despite equivalent performance in competency-based assessment stages. Reviewing interviewer feedback reveals that female candidates' direct communication style is frequently annotated as lacking warmth or collaborative instinct, while the same style in male candidates is noted as demonstrating executive presence. The manager introduces calibrated scoring rubrics, requires written behavioural evidence for each competency rating, and runs a one-day structured interviewing workshop. The differential in pass rates drops to 6 percentage points within two hiring cycles.
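The kind of monitoring described above starts with a simple calculation: the gap in pass rates, in percentage points, between groups at a given stage. A minimal sketch, with invented group labels and counts chosen only to illustrate the arithmetic (not data from the case above):

```python
# Hypothetical per-group results for one interview stage: (passed, interviewed).
stage_results = {
    "group_a": (39, 50),
    "group_b": (28, 50),
}

def pass_rate_gap_pp(results):
    """Return the gap in percentage points between the highest- and
    lowest-passing groups at a stage."""
    rates = {g: passed / total * 100 for g, (passed, total) in results.items()}
    return max(rates.values()) - min(rates.values())

gap = pass_rate_gap_pp(stage_results)  # 78.0% vs 56.0% -> 22.0 pp
```

Tracking this number per stage, rather than only at final offer, is what lets a manager locate the stage where the differential appears, as in the panel-interview case above.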