Why AI systems make things up and how that can lead to serious risks
By Binu Mathew
When someone sees something that isn't there, people often refer to the experience as a hallucination: a perception that does not correspond to any external stimulus.
Technologies that rely on artificial intelligence can have hallucinations, too.
When an algorithmic system generates information that seems plausible but is actually inaccurate or misleading, computer scientists call it an AI hallucination. Researchers have found these behaviors in different types of AI systems,