ChatGPT And Other LLMs Produce Bull Excrement, Not Hallucinations

In the communications surrounding LLMs and popular interfaces like ChatGPT, the term ‘hallucination’ is often used to refer to false statements in the output of these models. This implies some coherency, and that the LLM attempts to be cognizant of the truth while suffering occasional moments of (mild) insanity. The LLM is thus effectively treated like a young child or a person suffering from a disorder like Alzheimer’s, granting it agency in the process. That this is utter nonsense and patently incorrect is the subject of a treatise by [Michael Townsen Hicks] and colleagues, as published in Ethics and Information Technology.

Much of the distinction lies in the difference between a lie and bullshit, as eloquently described in [Harry G. Frankfurt]’s 1986 essay and 2005 book On Bullshit. Whereas a lie is intended to deceive and to cover up the truth, bullshitting is done with no regard for, or connection to, the truth. Bullshit is intended only to serve the immediate situation, reminiscent of the worst of sound-bite culture.

When we consider the way that LLMs work, with the input query run through the model’s learned weights to produce a probability distribution over the next token, we can see that the generated output is effectively that of an oversized word prediction algorithm. This precludes any possibility of intelligence, and thus of any cognitive awareness of ‘truth’. This means that even though there is no intent behind the LLM, it is still bullshitting, if only of the soft (unintentional) kind. When we take into account the agency and intentions of those who created the LLM, trained it, and built the interface (like ChatGPT), however, we enter hard, intentional bullshit territory.
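To make the mechanism concrete, here is a minimal sketch in Python of the same principle at a vastly smaller scale: a bigram model (the toy corpus and the helper are illustrative assumptions, not anything from the paper) that samples each next word purely from co-occurrence statistics. Whether a generated sentence happens to be true is entirely incidental to how it is produced.

```python
import random
from collections import defaultdict

# Toy next-word predictor: like an LLM, it picks each word purely from
# probabilities learned from training text. Nothing here models truth,
# only word co-occurrence.
corpus = ("the moon is made of cheese . the moon is a rock . "
          "the moon orbits the earth .").split()

# Count how often each word follows each other word.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word(prev: str) -> str:
    """Sample the next word in proportion to how often it followed
    `prev` in the training data, with no regard for factual accuracy."""
    candidates = counts[prev]
    words = list(candidates)
    weights = [candidates[w] for w in words]
    return random.choices(words, weights=weights)[0]

# Generate a "statement": a plausible-sounding word sequence, which may
# happen to be false ("the moon is made of cheese") just as easily as true.
word, output = "the", ["the"]
for _ in range(6):
    word = next_word(word)
    output.append(word)
print(" ".join(output))
```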

It is incidentally this same bullshitting that has already led to plain LLMs being partially phased out, with Retrieval-Augmented Generation (RAG) turning a word prediction algorithm into more of a fancy search machine by anchoring its output to retrieved documents. Even venture capitalists can only take so much bullshit, after all.
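As a rough sketch of the RAG pattern (the functions and the keyword-overlap scoring are illustrative assumptions, not any particular library’s API): retrieve the documents most relevant to the query, then have the model generate its answer conditioned on them.

```python
# Minimal sketch of the RAG pattern: retrieval here is naive keyword
# overlap; real systems typically use embedding similarity instead.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by how many query words they contain; return top k."""
    q_words = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Prepend retrieved text so the model's word prediction is anchored
    to source material rather than free-floating statistics."""
    return ("Answer using only the context below.\n\n"
            "Context:\n" + "\n".join(f"- {c}" for c in context) +
            f"\n\nQuestion: {query}\nAnswer:")

docs = [
    "The moon orbits the earth roughly every 27.3 days.",
    "Cheese is a dairy product made from milk.",
    "The moon's surface is rock and dust, not cheese.",
]
query = "What is the moon made of?"
prompt = build_prompt(query, retrieve(query, docs))
print(prompt)  # this grounded prompt is what gets sent to the LLM
```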
