OpenAI’s Whisper transcription tool shows significant inaccuracies in healthcare settings

The news: Researchers have found that OpenAI's Whisper transcription tool routinely hallucinates, fabricating text that was never spoken, in medical and healthcare applications.

  • Fabricated phrases, and in some cases entire sentences, appeared in as many as eight out of 10 transcriptions examined, per the Associated Press.
  • More than 45,000 clinicians across over 85 health systems worldwide use Nabla, a Whisper-based AI copilot, to transcribe appointments and draft clinical notes.
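
For context on where such fabrications can slip in: below is a minimal sketch, not the researchers' methodology, using the open-source openai-whisper Python package. It transcribes a recording and flags segments with low decoder confidence, a common but imperfect heuristic for catching possible hallucinations. The file name and thresholds are illustrative assumptions.

```python
# Minimal sketch (assumptions: file name "visit_recording.wav" and the two
# thresholds below are illustrative, not validated cutoffs).
import whisper

model = whisper.load_model("base")  # smaller checkpoints tend to hallucinate more
result = whisper.transcribe(model, "visit_recording.wav")

for seg in result["segments"]:
    # avg_logprob: mean token log-probability for the segment.
    # no_speech_prob: model's estimate that the segment contains no speech.
    # Fabricated text often shows up in low-confidence or near-silent spans.
    suspicious = seg["avg_logprob"] < -1.0 or seg["no_speech_prob"] > 0.6
    label = "REVIEW" if suspicious else "OK"
    print(f"{label} [{seg['start']:6.1f}s - {seg['end']:6.1f}s] {seg['text'].strip()}")
```

Note that a heuristic like this only surfaces candidates for human review; it does not make the transcript safe to use unedited in a clinical record.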