
The copywriting revolution may take longer than you think
There’s a new technology doing the rounds at the minute; perhaps you’ve heard the odd mention of it. It’s artificial intelligence (AI). Apparently it’ll take over the world, outpace human intelligence and put most working adults on the dole within a few years. But first of all, the clever people tell me, it’s going to make human copywriting obsolete.
Yes, one of the first things those pesky robots plan to do (because apparently, they’re sentient) is start a copywriting revolution that will relegate real people to the feet of their new robot overlords. Or something.
The thing is, AI won’t do any of these things. At least, not within my lifetime.
Lessons about AI from the metaverse
People are hyping the use of generative AI in copywriting even more than they hyped the metaverse. You must remember the metaverse – Meta’s own fairyland where everyone becomes an avatar and space/time ceases to exist thanks to a VR headset. Not ringing any bells? Well, can’t say I blame you. That’ll be because it all fizzled out due to a lack of feasibility (and interest). I have written a sceptical Medium post about it, if you’re interested.
Anyway, I suspect that many of the people currently hyping AI were big fans of the metaverse. Even as their VR headsets gather dust under the stairs, they oversell AI to an almost amusing degree.
I’m a freelance copywriter and I love AI – up to a point
I love AI. I send it out to gather data and scope the background to whatever I’m writing about, and it returns in seconds with information that would take hours to collect manually. It also, crucially, gives me full citations for all of that data (I use Perplexity, which is good at that, rather than other LLMs, which aren’t). So I can check my sources and the correctness of the AI’s interpretation, which is critical when you’re a technical copywriter.
However, I do NOT use AI to write anything. Not even first drafts. Partly because I don’t need it for that – I’m a freelance copywriter with years of experience, so I flipping well shouldn’t need it for that – but also because (i) my work is ultimately my clients’ data and I’m not putting that anywhere near public AI and (ii) I want to minimise the risks that LLMs bring.
Copywriting carries risk; AI can exacerbate it
Yep, using AI can be risky. Risky in ways that have been identified but are rarely mentioned on LinkedIn and elsewhere:
- AI uses others’ work, and the line between inspiration and plagiarism can be very fine. If you accidentally plagiarise copy because that’s what AI spat out and you didn’t check it, prepare for (legal) arguments.
- AI posts tend to be bland and derivative. And it’s hard for customers to respect or favour somebody who sounds bland and derivative.
- Sector leadership and respect are built on intelligent, original thought and comment. If you want to lead, you need original copywriting from the outset. You cannot staple original thought onto AI output and hope to remain credible.
- Some AI tools help cybercriminals. They might eat your data or direct your customers and staff to fake websites where they’ll be fleeced of their data and possibly more. If your business lets this happen, you may be liable: legally, financially and reputationally.
- If you use lots of AI and claim sustainability, be prepared for accusations of greenwashing. Data centres use massive amounts of energy and water. The BBC reports that AI could very soon be using as much energy as an entire nation. Which do you prefer: AI-generated copy or the planet? AI-generated copy or your own reputation and ethics?
- According to SEO company Semrush, emerging data suggests that AI search favours public-opinion data from forums like Quora and Reddit. But public forums have zero barrier to entry and threads can be peppered with drivel. Discussion of vaccine information/disinformation on Reddit has already been studied. If LLMs treat public opinion as fact, how accurate will their output be? And for those who publish it, what does that mean for corporate risk?
In the short and medium term, AI apps are a copywriting tool, not a copywriting killer
Another aspect of this issue, which in my view brings it close to the metaverse experience, is the assumption that AI will carry on developing at the rate we’ve seen over the last couple of years. But it won’t. It can’t.
There isn’t the infrastructure, for a start. Even OpenAI’s Sam Altman admits that. He has also compared the current noise around AI with the dot-com bubble. Meanwhile, reaction to GPT-5 has been muted. Some have noted performance gains and refinements, but I haven’t heard much about huge steps forward or innovation. Unless you count being extremely power-hungry as a gain: one study found GPT-5 uses more than eight times the power required by its predecessor.
AI is undoubtedly a useful tool in copywriting and likely to remain so. But if you dream of a free-to-access LLM that writes your copy like a human, produces original and innovative thought, automates your workload and saves you a fortune, you may have to dream of something else. For a while, at least. You still need a human copywriter. Don’t trust the robots – they’re already doing loads of unpredictable things, which is why one of my hobbies is watching robots mess things up. Meanwhile, I still offer copywriting services and expect to do so for a long time to come.