October 26, 2024: Whisper Faces Hallucination Issues in Transcriptions - Researchers have discovered significant hallucination issues in OpenAI's Whisper transcription tool, with fabricated content ranging from racial commentary to invented medical treatments. A University of Michigan study found hallucinations in 80% of the cases examined, while other studies reported error rates above 50%. OpenAI has acknowledged the problem and says it is actively working to improve accuracy, and its usage policies prohibit Whisper's use in high-stakes decision-making contexts. Despite these efforts, the recurring errors raise concerns about the model's reliability, especially in critical sectors such as healthcare.