

The real problem with ChatGPT is that it can never make a joke

When Andy Stanton commands the AI program to tell him a story about a blue whale with a tiny penis, the result, as it unfolds, drives him a bit insane

25 November 2023

9:00 AM

Benny the Blue Whale: A Descent into Story, Language and the Madness of ChatGPT
Andy Stanton

Oneworld, pp.320, £16.99

I have been reviewing books for nearly four decades – starting in this very magazine – and over the years I have encountered some real stinkers. But this is the first time I can recall being reluctant to pick up the book because of actual physical nausea. Intellectual nausea I’ve had plenty of times. Give me a 900-page book of magical realism and that’s what I’ll get. But this time it metastasised into real queasiness. I’ll explain why. (Well, that is my job.)

The odd thing is, Benny the Blue Whale starts amusingly enough. Andy Stanton, a writer of children’s books, had been both intrigued and alarmed by the rise of ChatGPT, the artificial intelligence tool that can be instructed, with an absolute minimum of technical knowledge, to produce prose – and poetry, if you can call it that – in whichever style you like. What the program behind it does is comb the entire internet for examples, and then, in a few seconds, regurgitate something loosely approximating to what you asked for.

Stanton had the idea of playing with ChatGPT to see what it came up with. At first he asked it to ‘generate 100 names for a very smelly animal’ or ‘write an essay about getting a haircut’; but the command he finally settled on, and which has resulted in the book under review, was: ‘Tell me a story about a blue whale with a tiny penis.’ So far, so puerile; and, indeed, so amusing. The two go hand in hand, especially when you want to make an unthinking technology look foolish, as when a child types the numbers 5318008 into a calculator, turns it upside down and informs the rest of the class that it has spelled out the word ‘Boobies’. I thought that was funny then and I still think it is, although these days I don’t exactly split my sides.

But Stanton was doing this not just to get a giggle but to explore the limits of AI and deduce something about the nature of storytelling. As he says at the very outset, this eventually drove him a bit insane. He describes the rabbit-hole he spirals down with great conviction and plausibility. As he makes ever more detailed and ludicrous demands of ChatGPT, he prints the results on the verso and his commentary on the recto, with long footnotes at the bottom of the page (where they belong; otherwise they’d be endnotes).

For the first 30 pages or so this is both amusing and useful. Then the story gets going. ‘Once upon a time, in the vast and deep ocean, there was a blue whale called Benny.’ Fair enough; and as Stanton notes, Benny is an excellent choice of name. He explains why naming characters is so important and wonders where ChatGPT got the name from, etc. ‘This isn’t going to be Proust,’ he says, as the story, with prompts generated by Stanton, progresses. Benny dies. An octopus delivers a eulogy at his funeral, saying how Benny’s tiny penis was an inspiration for other sea creatures (since ‘size doesn’t matter’). A religion, the Penitents of Benny, is founded, worshipping the whale and his tiny penis. A counter-religion is formed by squid; then another, called the Vagina Venerators… and suddenly, 40 pages into the proper story, something in my head went pop and I got tired of seeing the numerals 5318008. So, as you can imagine, the remaining 300-plus pages were a bit of a struggle.

The problem isn’t just that ChatGPT doesn’t have a mind or that it defaults to miserable cliché and boilerplate language. (It is in fact almost by definition boilerplate. My exposure to it may be scant because it makes me want to puke, but I have seen good people described as ‘gentle and wise’ and the words ‘a deity divine’ in its miserable attempts to write verse, which aren’t even good enough to be called doggerel.) No, the real problem is that, unlike Stanton himself, who is both witty and clever, it can never, ever make a joke. It can be unintentionally funny, as when it warns that ‘this content may violate our content policy’, but even that pleonasm deflates the joke. The only interesting bit was when a glitch in the system had it repeat the words ‘mermaids and mermen’ about 100 times, so the words bleed off the end of the page like a modernist experiment by Gertrude Stein. The other problem is that it uses existing writing – maybe even mine – to come up with this stuff. The difficulty is even worse with AI art. As for ChatGPT, I wouldn’t use it to write a note for the milkman – if we still have those.
