Hallucination: What Makes GenAIs Lie and What We Can Do About It

by Doug McCord
November 28, 2023
Hallucination and Gen AI

Generative AI is already in use professionally, handling customer service inquiries, automating tasks, and empowering non-tech employees, as discussed in this PTP article. And the scale of use is only expected to rise exponentially in 2024 and beyond.  

AI's Swift Impact

But despite early successes and an almost unrivaled surge of enthusiasm, serious concerns remain. You've probably encountered stories of generative AIs appearing to blatantly lie, such as providing bogus legal cases and opinions (as discussed in this PTP piece), or giving ridiculous advice, like claiming a food bank is one of Ottawa's top tourist destinations.

Generative AI has also been accused of creating damaging content, like running a tasteless poll alongside an aggregated news story, or even convincing a college professor that his students had cheated when, in fact, they hadn't.

The tendency of generative AIs to periodically produce erratic or factually incorrect outputs is called hallucination. We'll take a look at what it is, why it happens, and what generative AI users can do to safeguard against it.


Why do generative AIs hallucinate? 

An innovation built more around probability than precision, it’s no surprise to