An audit by Ontario's auditor general found that AI medical scribes approved by the provincial government regularly produce incorrect, incomplete, or hallucinated information in patient notes, with all 20 tested vendors showing significant problems. The AI systems made serious errors, including inventing non-existent medical referrals, misrecording medication names, and omitting key mental health details — mistakes that could lead to harmful treatment plans. The findings raise serious concerns about the current reliability of AI-assisted medical documentation in clinical settings.
Background
AI medical scribes have been increasingly adopted in healthcare to help automate clinical documentation, with the promise of reducing physician burnout and improving efficiency. However, this audit highlights significant reliability and safety concerns with current AI documentation systems in medical settings.
- Source: Ars Technica
- Published: May 15, 2026 at 01:28 AM
- Score: 7.0 / 10