OpenAI Whisper prone to hallucinations, researchers say
Researchers have found that Whisper, OpenAI's AI-powered transcription tool used across industries including healthcare, is prone to fabricating phrases or entire sentences, a phenomenon known as hallucination. According to Associated Press interviews with software engineers, developers, and academic researchers, these hallucinations can include problematic content such as racial commentary, violent rhetoric, and imagined medical treatments.