Stop Asking “Who Did It?” and Start Asking “How Did This Make Sense at the Time?”
There is a question that feels responsible, decisive, and strong after a workplace incident:
"Who did it?"
It signals accountability. It reassures executives. It sounds like control.
It is also often the wrong first question.
Because when leaders start with who, they narrow the field of vision immediately. They move from understanding to attribution. From system to individual. From curiosity to judgment.
There is a different question that high-reliability industries use, and it sounds almost uncomfortable at first:
"How did this make sense at the time?"
That question changes everything.
It does not remove accountability. It expands understanding. And in complex work environments across North America, understanding is what prevents recurrence.
Why People Rarely Wake Up Intending to Get Hurt
In most serious workplace incidents, the individuals involved were not trying to break rules. They were trying to get work done.
They were clearing a jam to keep production moving. They were stepping into a restricted zone because "it only takes a second." They were bypassing a guard because it had been done that way for months without incident.
From the outside, those decisions look reckless. From inside the moment, they often felt reasonable.
Safety science has been studying this phenomenon for decades. Human factors researchers consistently find that accidents are rarely caused by "bad people." They are caused by ordinary people operating in imperfect systems under pressure.
When leaders ask how a decision made sense at the time, they surface those pressures.
When they ask who did it, they often miss them.
The Aviation Lesson
Commercial aviation did not achieve its extraordinary safety record by focusing solely on pilot error.
After crashes, the National Transportation Safety Board does not begin by asking who to blame. It reconstructs context. It examines equipment design, communication patterns, maintenance history, fatigue, weather, and decision-making chains.
Aviation investigators understand something powerful: if you only fix the last action in the chain, the system remains fragile.
Industrial workplaces operate with similar complexity. Equipment, scheduling, training, supervision, incentives, fatigue, and communication all interact.
If you isolate one worker and close the case, you may satisfy emotion. You do not strengthen the system.
A Real-World Enforcement Pattern
Review serious citations issued by the Occupational Safety and Health Administration and you will see a recurring theme. In many fatality investigations, OSHA identifies management system failures alongside individual actions.
Common findings include:
- Inadequate hazard assessments.
- Failure to enforce existing procedures.
- Lack of supervision or verification.
- Production pressures overriding safety controls.
- Training whose effectiveness was never verified.
Notice the pattern. Even when a worker made a critical mistake, regulators frequently identify broader contributors.
In repeat violation cases, penalties escalate significantly because the issue was not a single lapse. It was systemic tolerance.
The "who did it" question would not have revealed that.
The Psychology of Error
When something goes wrong, hindsight bias takes over. Once we know the outcome, it seems obvious that the decision was unsafe.
But in the moment, workers do not have hindsight. They operate with partial information, competing priorities, and routine normalization.
Dr. James Reason's Swiss Cheese Model illustrates this well. Hazards pass through multiple layers of defense when gaps align. The person closest to the incident is often the final slice of cheese, not the root cause.
When leaders ask how the action made sense at the time, they are effectively asking: what were the holes in the other slices?
This is not soft thinking. It is disciplined analysis.
The Warehouse Example
Consider a pedestrian struck by a forklift in a distribution center.
The "who" question identifies the operator.
The "how did this make sense" question reveals:
- Congested aisles during peak shifts.
- Unclear right-of-way markings.
- Production bonuses tied to speed.
- Supervisors who routinely stepped into traffic zones themselves.
- Previous near misses that were never escalated.
In that context, the operator's decision was not isolated. It was predictable.
Blaming the operator might satisfy immediate pressure. Redesigning traffic flow and incentives prevents recurrence.
Why Leaders Resist This Question
There are three common objections.
First: "We cannot appear weak."
Second: "We need accountability."
Third: "Legal risk requires decisive action."
Let's address them directly.
Asking how something made sense at the time is not weakness. It is intellectual honesty. It communicates that leadership is willing to examine its own systems.
Accountability remains essential. Reckless behavior must still be addressed. But reckless behavior is different from human error influenced by system pressures.
As for legal risk, regulators and courts increasingly expect systemic evaluation. Demonstrating thoughtful analysis and corrective action strengthens defensibility.
The Difference Between Human Error and Recklessness
To apply this thinking responsibly, leaders must distinguish between categories of behavior.
Human error involves slips, lapses, or misjudgments without intent to cause harm.
At-risk behavior often involves shortcuts influenced by habit, normalization, or subtle pressure.
Reckless behavior involves conscious disregard of clear and understood risk.
Only the third category typically warrants severe discipline.
The first two require system improvement.
When leaders collapse all categories into blame, they discourage transparency and learning.
How to Operationalize the "How Did This Make Sense" Approach
This question must be embedded into investigation protocols.
Instead of starting interviews with "Why did you break procedure?" consider asking:
- What were you trying to accomplish at that moment?
- What information did you have?
- What constraints were you operating under?
- Have you seen others perform this task the same way?
- How is this task normally completed under time pressure?
These questions uncover context.
They also signal fairness.
When workers sense fairness, they provide fuller accounts. When they sense accusation, they protect themselves.
The Reporting Multiplier Effect
Amy Edmondson's research on psychological safety demonstrates that teams that feel safe speaking up report more errors and risks, which in turn improves performance outcomes.
In safety environments, that translates to earlier detection of weak signals.
If workers believe that honest mistakes will automatically result in punishment, they hide them. If they believe investigations are fair and systemic, they surface them.
This multiplier effect determines whether near misses become serious injuries.
The Uncomfortable Mirror
Asking how something made sense at the time often reveals uncomfortable truths.
It may expose:
- Incentive structures that reward speed over safety.
- Supervisors modeling risky shortcuts.
- Understaffing during critical tasks.
- Training that emphasized policy over practice.
- Competency never actually verified.
These findings can feel threatening to leadership identity.
But they are also opportunities.
Organizations that confront these realities evolve faster.
A Safety Stand-Down Example
After a confined space near miss in a manufacturing plant, leadership chose to present the event transparently during a site-wide meeting.
They described the timeline. They acknowledged that permit paperwork had become routine and rushed. They admitted that supervision on heavy maintenance days was stretched thin. They implemented micro-verification checks before entry.
The tone of that meeting mattered.
Workers reported increased confidence in leadership afterward. Near-miss reports increased in the following quarter, then serious incidents declined over the year.
The learning effect was measurable.
What This Question Ultimately Builds
When leaders consistently ask how decisions made sense at the time, three cultural shifts occur.
First, investigations become richer. System weaknesses surface earlier.
Second, supervisors begin evaluating context proactively rather than reacting to violations.
Third, workers feel respected as contributors to safety improvement rather than suspects in waiting.
Over time, this produces a more intelligent organization.
The Leadership Discipline Required
This approach demands maturity.
It requires resisting immediate blame pressure from executives or media. It requires communicating nuance to legal teams. It requires clarity about when discipline is warranted and when systems are the issue.
But safety maturity does not grow in comfort. It grows in disciplined reflection.
The next time an incident occurs, pause before asking who.
Ask how it made sense.
The answers may be uncomfortable.
They may also be the most valuable data your organization ever uncovers.