Making the black box more useful
Jeffrey A Kline

Correspondence to Dr Jeffrey A Kline, Department of Emergency Medicine, Cellular and Integrative Physiology, Indiana University School of Medicine, 720 Eskanazi Avenue, Indianapolis, IN 46202, USA; jefkline@iu.edu, jefkline@iupui.edu


Body et al1 explore the central controversy that arises when clinicians try to predict whether or not a patient currently has, or will soon develop, acute coronary syndrome (ACS). Here, clinician gestalt refers to predicting the future for a patient based upon all available pertinent evidence, using human thought. Gestalt reasoning is not entirely hidden, but it shares similarities with black-box methods such as artificial intelligence: it has an input and an output and is better described by what it does than by how it does it. Gestalt requires three steps in data processing: gathering, sorting and interpreting, which is the standard process of seeing each new patient in the emergency department (ED). I submit that gestalt reasoning is inevitably and undeniably intrinsic to every patient encounter in the ED. Gestalt differs from structured methods, such as decision rules, in its free form. Decision rules require clinicians to gather data in a specific manner, and the rule generally tells them how to interpret those data. For example, most rules provide specific age cut-offs or state that a particular dichotomous feature (such as the presence of prior coronary artery disease) quantifiably changes risk. …


Footnotes

  • Competing interests None.

  • Patient consent Obtained.

  • Provenance and peer review Commissioned; internally peer reviewed.
