A few months ago, I was invited to a business lunch to meet Sir Ed Davey MP. Who? You know, the leader of the Liberal Democrats.

On my way there, I thought that I would do some research on Sir Ed, so that I would only ask sensible questions, for I despise myself when I have asked a bad question. I turned to ChatGPT, which told me lots of useful info about Sir Ed, then claimed that he had been an active member of the Labour Party.

Sniffing the likely BS from my new AI buddy, I checked another fount of all things semi-accurate – Wikipedia – which confirmed that my suspicions were correct: Sir Ed was never in the Labour Party. Filled with courage, I asked ChatGPT whether it was sure of itself, to which it apologised for its error.

This reminded me of the Gell-Mann Amnesia effect, coined by Michael Crichton. Each time I tell someone about this effect, the listener has an ah-ha moment. I shall let Mr Crichton explain it:

“Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well….You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them.

In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.” 

I have noticed this effect most often when reading a journalist’s view of a legal case: they usually miss the key point of the case. I now assume that any article I read is only 80% accurate.

Establishing what is true and what is false was hard enough before the advent of mainstream AI. Today, with so much content written by AI that has “hallucinated”, fiction is being presented as fact. As the Nazi-era saying goes, if you repeat a lie often enough, it will appear to be the truth.

Although I do not know how to solve this problem, I am proud to have been appointed as a Non-Executive Director at the Quaker-inspired Just Algorithms Action Group, which campaigns for ethical AI.

(Thanks to Stable Diffusion for the AI-generated image, using the prompt “Sir Ed Davey Liberal Democrat hallucinating”)