AI Can't Replace Language Access
with Carol Velandia
Carol Velandia, Founder of Equal Access Language Services, brings a deeply practical perspective on where AI falls short in healthcare: language access. She explores why machine translation and AI-driven interpretation tools cannot capture the cultural nuance, emotional context, and clinical precision that human interpreters provide, and why the stakes for patient safety and health equity are too high to get this wrong.
Machine translation in healthcare carries risks that most technology advocates underestimate, particularly in clinical settings where a mistranslation can alter a diagnosis or treatment plan. A word that has clinical meaning in English might not translate precisely into another language, creating dangerous ambiguity that algorithms cannot resolve.
Language access is fundamentally about equity, and AI solutions that reduce it to a technology problem miss the human dimension entirely. When healthcare systems adopt AI interpretation tools because they are cheaper, they are effectively rationing language access based on budget constraints rather than patient need.
The cultural context of patient communication is not something algorithms can replicate with current capabilities. Healthcare interpretation requires understanding not just words, but also cultural frameworks around health, illness, family dynamics, and how patients communicate symptoms and concerns.