
The term “generally accepted police practices” is often cited in legal proceedings to evaluate law enforcement actions, particularly in use of force incidents. However, experts Von Kliem, Jamie Borden, and Daniel King recently highlighted its frequent misuse, which can unfairly penalize officers and distort justice. This article, based on their in-depth discussion, explores the complexities of this issue, the pitfalls of vague standards, and practical strategies for officers, investigators, and agency leaders to navigate this challenging landscape. Designed for law enforcement professionals, legal stakeholders, and policymakers, it offers actionable steps to ensure fair, evidence-based evaluations.
The Problem with “Generally Accepted Police Practices” – No National Standard Exists
Von Kliem, Chief Consulting and Communications Officer at Force Science, emphasized that no single authority defines “generally accepted police practices.” Organizations like the International Association of Chiefs of Police (IACP), Lexipol, or the Department of Justice (DOJ) provide model policies, but these are not universally adopted. With over 18,000 law enforcement agencies in the U.S., policies vary based on local resources, laws, and community needs, making a national standard impractical.
Kliem noted that many guidelines come from small committees or activist groups, such as the Police Executive Research Forum (PERF), which may push agendas not grounded in operational reality. For example, DOJ consent decrees have forced agencies to adopt policies under legal pressure, not because they reflect widespread practice. This creates a disconnect between policy ideals and the street-level decisions officers make.
Misuse in Legal Proceedings
Jamie Borden, a use of force investigator, trainer, court-certified expert in the field, and best-selling author of “Anatomy of A Critical Incident – Navigating Controversy,” explained that academic experts often cite these vague standards or recommendations to criticize officers in legal proceedings. Lacking operational experience, these experts evaluate actions against self-defined “national standards” that don’t exist, ignoring the officer’s training and agency policies, not to mention the reality the officer faced in the moment. In one case, an expert suggested an officer should have de-escalated (without defining what that de-escalation should have looked like) while a suspect with a knife wrestled with another officer—a tactically unrealistic recommendation for multiple identifiable reasons. In the review and analysis of these events, de-escalation must be treated as a “goal,” not a “tactic.”
Daniel King, a retired officer, use of force expert, and published author, highlighted how prosecutors exploit this ambiguity. By charging officers with crimes like assault or manslaughter, they bypass constitutional standards like Graham v. Connor (1989), which judges use of force based on an officer’s reasonable perception of the situation. Polished academic witnesses sway jurors unfamiliar with policing, presenting speculative alternatives that sound logical but are operationally flawed at their core and unrealistic in terms of how events actually unfold.
Hindsight Bias and Counterfactual Reasoning: The Fog of Critical Incidents
Kliem described how officers make decisions in “System 1” thinking—fast, intuitive, and based on limited information—under the “fog of war” in high-stress situations. Post-incident reviews, however, use “System 2” thinking—slow and outcome-informed—leading to hindsight bias. Reviewers assume events were more predictable than they were, criticizing officers for not choosing idealized alternatives.
Borden, referencing human factors experts like Sidney Dekker, stressed the need to stay “in the pipe,” evaluating decisions based on what the officer knew at the time. For example, an officer facing a suspect with a hatchet may not know if others are nearby, but a review might fault them for not negotiating from cover, ignoring the immediate threat to the community. This falls squarely within the risk-versus-threat assessment that officers are constantly making.
Counterfactual Critiques
Borden called these critiques “counterfactual reasoning,” where experts speculate about what “should have” happened. In a case King cited, an expert suggested an officer should have slapped a suspect instead of using lethal force—an absurd idea given the context of the incident. Such arguments assume better outcomes without evidence, misleading jurors who lack policing experience.
Departmental Policies: The True Standard and Focus on Agency Directives
Kliem urged officers to prioritize their agency’s policies and training, which vary widely across jurisdictions. These directives, not vague national standards, are what officers are held accountable to. Borden warned that signing off on policies (e.g., via PowerDMS) doesn’t ensure understanding. Officers must actively study their use of force guidelines so they can explain their actions effectively and educate reviewers.
Prosecutorial Tactics
King shared cases where prosecutors disregarded agency policies, relying on academic experts to argue violations of “generally accepted practices.” In several cases, references to Graham v. Connor and state self-defense statutes were excluded from trial, leaving jurors to judge the officer against undefined standards. This tactic undermines due process, as officers lack notice of the expectations they’re held to.
Ethical and Legal Concerns: Prosecutorial Misconduct
As a former prosecutor, Kliem was troubled by prosecutors deviating from established standards for convenience. In one case, a prosecutor rejected their own prior Graham-based standards to secure a conviction, a move Kliem called ethically questionable. Kliem noted that changing jury instructions mid-trial to favor the prosecution further erodes fairness, prioritizing wins over justice.
Juror Confusion
Jurors, often unfamiliar with policing, are swayed by academic experts’ credentials and technical jargon. Kliem recounted a juror asking whether to follow local policy or “generally accepted practices,” highlighting how these terms confuse fact-finders. King emphasized that experts’ polished delivery masks their lack of operational insight, making their arguments persuasive to laypeople.
The Misnomer of “Best Practices” – Defining “Best Practices”
The term “best practices” is often misused in force reviews, treated as a catch-all for speculative critiques. Kliem and Borden argued that true best practices must be clearly defined, evidence-based, and formally adopted by the agency. Without this, the term invites arbitrary judgments, like an academic expert’s suggestion in one case to maintain 50 yards of distance from a bat-wielding suspect—impractical, disconnected from reality, and offered with no anchor for the recommendation.
Analysis vs. Critique
Borden distinguished between a critique (speculating on alternatives) and an analysis (reconstructing the incident’s context). Force reviews should prioritize analysis, using data like body camera footage, officer statements, and policies to understand what happened. Critiques belong in training, not legal or disciplinary decisions, as they assume predictable outcomes that don’t exist in dynamic encounters.
Human Performance Under Stress: The Science of Decision-Making
Kliem highlighted the importance of human performance science, a focus of Force Science from a scientific purview and of CIR from an investigative-principles perspective. Officers under stress experience physiological effects—tunnel vision, auditory exclusion, time distortion—that shape their decisions. Academic experts rarely account for these, evaluating actions as if officers had unlimited time and clarity. Borden and King argued that these phenomena do occur in critical incidents; the science does not excuse the behavior, it explains it. These principles are routinely applied in training and investigations when examining an officer’s behavior. Investigators and all stakeholders should be informed by the existing science so they can identify the issues where psychologists, biomechanical engineers, or even human factors researchers need to be involved.
Borden referenced “satisficing,” where officers choose the first viable solution under pressure, not the optimal one. This aligns with research by Dr. Gary Klein on naturalistic decision-making. Reviews that ignore these realities unfairly penalize officers for human limitations.
Video Evidence Limitations
Borden, King, and Kliem addressed the misuse of video evidence, often seen as definitive but flawed by technical issues like variable frame rates or distorted playback. In one case, a surveillance video’s unreliability was overlooked, leading to false conclusions about an officer’s actions. Investigators must understand these limitations to ensure fair evaluations. Video analysis, review, and examination have become a focus of CIR’s training; video is everywhere.
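To make the frame-rate issue concrete, the following is a minimal sketch (an illustration only, not a CIR or Force Science tool) that uses the open-source OpenCV library to compare a recording’s advertised frame rate with the intervals actually measured between frames; the file name is a hypothetical placeholder, and the timestamps are approximations drawn from the container’s metadata.

```python
import cv2

# Open a recording (hypothetical file name) and read the nominal frame rate
# reported by the container.
video = cv2.VideoCapture("incident_clip.mp4")
nominal_fps = video.get(cv2.CAP_PROP_FPS)

# Collect the timestamp (in milliseconds) associated with each decoded frame.
timestamps_ms = []
while True:
    ok, _frame = video.read()
    if not ok:
        break
    timestamps_ms.append(video.get(cv2.CAP_PROP_POS_MSEC))
video.release()

# Intervals between consecutive frames; on a truly constant-rate recording
# these would all be roughly equal.
intervals = [later - earlier for earlier, later in zip(timestamps_ms, timestamps_ms[1:])]

if intervals and nominal_fps > 0:
    print(f"Container reports {nominal_fps:.2f} fps "
          f"(about {1000.0 / nominal_fps:.1f} ms between frames).")
    print(f"Measured intervals: min {min(intervals):.1f} ms, "
          f"max {max(intervals):.1f} ms, "
          f"mean {sum(intervals) / len(intervals):.1f} ms.")
```

A wide spread between the minimum and maximum intervals suggests the recording does not capture events at a steady rate, which is one reason playback alone can distort judgments about timing, speed, or reaction.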
Implications for Policing: Officer Morale and Risk-Aversion
The misuse of vague standards increases legal and emotional risks for officers, potentially leading to risk-averse policing. Kliem noted that some experts advocate a “safest form of policing,” avoiding confrontation, but this neglects officers’ duty to protect communities. Such trends can erode morale and public safety.
Agency Challenges
Leaders must balance policy development with operational needs, resisting external standards that don’t fit local contexts. Borden warned that vague policy language or language in a report, like “best practices,” can be exploited in court, undermining defensibility.
Public Perception
Reliance on academic experts fuels narratives that policing is inherently flawed, even when actions are lawful. King argued that this oversimplifies use of force, ignoring the legal authority granted to officers to address threats like armed suspects in neighborhoods.
Actionable Strategies for Stakeholders
To counter the misuse of “generally accepted police practices” and promote fair evaluations, stakeholders can adopt these strategies:
For Officers
- Know Your Policies
Study your agency’s use of force policies thoroughly. Attend training and clarify ambiguities to ensure you can articulate their application in reports or testimony.
- Document Context Clearly
In reports, detail what you saw, heard, and believed, using simple terms to align with Graham v. Connor. Avoid vague descriptions that invite misinterpretation.
- Prepare for Testimony
Practice explaining your decisions in simple terms, anticipating questions about why alternatives were impractical. Mock cross-examinations can build confidence.
- Invest in Training
Attend courses from Critical Incident Review (criticalincidentreview.com) and Force Science (forcescience.com) to understand human performance and use of force principles. Read widely for deeper insight: CIR provides an extensive reading list as a reliable source of quality information for officers and investigators.
For Investigators
- Prioritize Analysis
Reconstruct incidents using officer statements, video, and policies, focusing on what the officer knew at the time. Use frameworks like Graham v. Connor’s Calculus for Objective Reasonableness. CIR has developed several analytical tools for use in the investigative and review process (see the Enhanced Force Investigations Course).
- Assess Video Critically
Recognize video limitations (e.g., frame rate issues) and consult experts to ensure reliability. Educate reviewers about these constraints to prevent overreliance.
- Incorporate Human Factors
Apply principles of stress, perception, and cognition to investigations. Reference Force Science or CIR resources to ground findings in science.
- Use Precise Language
Avoid terms like “best practices” unless tied to adopted standards. Cite specific sources (e.g., agency training manuals) to maintain clarity.
For Agency Leaders
- Craft Clear Policies
Develop specific, operationally feasible policies aligned with Graham v. Connor. Avoid vague references to “national standards” that invite misinterpretation.
- Promote Learning
Use tactical after-action reviews to improve performance, not punish. Encourage officers to attend external training and share knowledge.
- Evaluate External Standards
Critically assess guidelines from DOJ, PERF, or others before adoption. Document decisions to strengthen legal defensibility.
- Support Officers Legally
Provide access to qualified experts and counsel familiar with policing. Prepare officers for prosecution tactics, like bypassing Graham v. Connor.
Conclusion
The misuse of “generally accepted police practices” undermines fair evaluations of use of force, penalizing officers for adhering to their training and legal authority. By focusing on agency policies, grounding reviews in human performance science, and challenging vague standards, stakeholders can ensure justice reflects the realities of policing. Officers, investigators, leaders, and legal professionals must work together to educate, not excuse, fostering a system that balances accountability with operational truth. For further resources, contact Danny or Jamie at CriticalIncidentReview.com, contact Von Kliem at von.kliem@forcescience.com, or visit forcescience.com.
To equip law enforcement professionals and investigators with the tools to apply the Common Thread theory effectively, I invite you to visit criticalincidentreview.com and explore our Enhanced Force Investigations Course. This training program is designed to deepen your understanding of context and suspect behavior in use-of-force cases, providing practical strategies to conduct objective, evidence-based investigations. Join us to learn how to navigate the complexities of critical incidents with clarity and confidence, ensuring your reviews align with the latest scientific insights and established and documented best practices.