What is AI Hallucination?

An AI hallucination is a phenomenon in which an artificial intelligence model generates incorrect or entirely fabricated information and presents it as factual. This can range from subtle inaccuracies to completely nonsensical or misleading statements. The term is a metaphor borrowed from human hallucinations, though the underlying cause is not a perceptual error but a flaw in how the model processes its training data and generates output.

Examples of AI Hallucination:

  • Citing Fake Sources: An AI might invent non-existent articles, studies, or legal precedents to support its claims.
  • Generating Inaccurate Biographies or Historical Events: The model might create a plausible but false narrative about a person’s life or a historical event.
  • Providing Incorrect Factual Information: When asked a direct question, the AI might give a confident-sounding but incorrect answer. For instance, attributing a quote to the wrong person or stating an incorrect scientific fact.
  • Anatomical Impossibilities: In image generation, a model might render a person with three arms or six fingers on one hand, or a horse with five legs.