OpenAI’s AI audio transcription tool Whisper is producing frequent “AI hallucinations,” despite its rapid adoption in “high-risk industries” like healthcare, AP News reports. An AI hallucination occurs when a large language model (LLM) perceives patterns that don’t exist, producing outputs that can be nonsensical or downright false. Whisper has allegedly invented text that includes “racial […]