2.2. Hallucinations

So, does that mean the task is complete? Just slap AI on every white-collar job and voilà, a fresh product idea emerges, and product design becomes a solved discipline. Well, not exactly. Despite their skill, these models are not all-powerful.

Because they hallucinate.

Hallucinations in large language models are instances where the output is coherent and grammatically correct, yet factually inaccurate or nonsensical. In other words, the model produces false or misleading information and presents it as fact. Such errors can stem from several causes, including gaps in the training data, biases within the model, and the inherent ambiguity of language itself.

For example, in February 2023, Google’s chatbot Bard mistakenly stated that the James Webb Space Telescope had captured the first image of a planet outside our solar system. This was inaccurate: NASA confirmed that the first images of an exoplanet were actually captured in 2004, years before the James Webb Space Telescope even launched in late 2021.

And in June 2023, a New York attorney was sanctioned and fined after using ChatGPT to draft a motion that cited fabricated judicial opinions and legal authorities. He claimed he had been unaware that ChatGPT could invent fictitious legal cases.

The key issue with hallucinations is that they are not merely a glitch; they are inherent to the design. These models are generative by nature. Without the ability to deviate from their training data, they could only replicate what they have already encountered, which would sharply limit their usefulness: they would be reduced from reasoning tools to mere search algorithms, a problem search engines like Google solved long ago.

Fundamentally, these models don’t possess definitive answers to the questions posed to them. Their core function is to predict the next word based on probabilities. Given a prompt, they generate the sequence of words most likely to follow it, judged against patterns found in responses to similar inputs in their training data. Truth is not part of that calculation; plausibility is.
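To make that concrete, here is a minimal sketch of next-word (more precisely, next-token) prediction. It assumes the Hugging Face transformers library and the small GPT-2 model purely for illustration; the model and the prompt are arbitrary choices for this sketch, not what any of the chatbots mentioned above actually run. The point is simply that the model assigns a probability to every possible continuation and picks among the likely ones, with no notion of which continuation is true.

```python
# A minimal sketch of next-token prediction, assuming the Hugging Face
# "transformers" library and the small GPT-2 model purely for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

# An arbitrary prompt; the model will score every token in its vocabulary
# as a possible continuation, regardless of what is factually correct.
prompt = "The first image of an exoplanet was captured by"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(input_ids).logits  # shape: (1, sequence_length, vocab_size)

# Turn the scores for the final position into a probability distribution
# over the next token, then inspect the five most likely candidates.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top_probs, top_ids = torch.topk(next_token_probs, k=5)

for prob, token_id in zip(top_probs, top_ids):
    print(f"{tokenizer.decode(token_id.item())!r:>12}  p = {prob.item():.3f}")
```

Whatever continuation gets picked, the model has merely chosen something plausible-sounding; nothing in this pipeline checks the answer against reality, which is exactly why a fluent, confident, and wrong response is always a possibility.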