Can AI Content Writers Really Do It All? (Our Experiments Say No)
My relationship with technology is complicated. I didn’t use the internet until I finished high school, and I didn’t use it regularly until after my second child was born in 2003. I’ve been a writer all my life, and writing has been my primary source of income for over a decade. So when my boss tapped me for a series of experiments pitting human writers against AI content writing tools, I wasn’t sure what to think.
I had pitched a post exploring the use of AI content writers earlier, with the idea of establishing that human writers still had a significant edge over ChatGPT. (Thought leadership, meet self-preservation.) But as our experiments went on, I realized AI tools for content writing were gaining ground in a hurry.
So is my job in danger? All signs point to no. (Or at least, not yet.) We’ve tested a wide range of copy-generation tools, and while they’ve performed better than I initially expected, they came up short in a few vital areas.
1. ChatGPT Hasn’t Mastered Headlines
Turns out AI content writers haven’t mastered the art of composing a great headline yet. The headlines they generated during our first few experiments were hit-and-miss: some were solid, and others, not so much.
The scores above come from CoSchedule’s Headline Analyzer, which recommends a minimum score of 70 for headlines. AI content writing tools fared even worse with subheads. I didn’t realize how bad it was until a Slack chat with the colleague running the AI side of our experiments revealed that he consistently had to replace the first and last subheads of every blog post the tools created. (Every tool suggested using “Intro” and “Conclusion” at the top of those sections.)
While these subheads do clearly convey what topic each section will cover, they fail to add anything that engages the reader or piques curiosity. Effective headlines are unique, specific, and include interesting adjectives to attract attention; these subheads don’t make the grade.
2. Fact-Checking Isn’t AI’s Strong Suit
One of the first things I noticed as I dug into our AI-generated copy was the complete lack of links. The second thing I noticed was that hard facts (like dates or statistics) weren’t there either. Hmmm. Both of those are pretty valuable tools in the content marketer’s arsenal.
AI copywriting tools are masters at pulling information from all over the web (now that Microsoft has helped ChatGPT look at data from after 2021), but ask them where they got it and… *crickets.* This should concern any marketer worth their salt: a claim with no source can’t be verified, and it can’t be credited to the original research either.
3. Creativity is a Human Trait
Machines are many things: efficient, accurate (with some caveats), and, in many cases, faster than humans. But creativity is not among AI’s better qualities. AI tools for content writing have no imagination; they’re simply regurgitating what they’ve “learned” about a topic to produce the content we’ve asked for.
One of our experiments involved giving each team (human and robot) a single keyword around which to write a blog post. (We selected “AI in marketing,” because our sense of irony is intact.)
The human version explored the fact that no one in our industry can seem to agree on how many of us are using AI, or for what: “How Agencies Are Actually Using AI in Marketing — It’s Not What You Think.” It obviously drew on published stats, but it also squeezed in some hypothesizing, anecdotal evidence, and an original, personal perspective.
The AI-generated version, conversely, trotted out the use cases for AI in marketing we’ve all read (and written) about, followed by some high-level guidance on crafting an AI strategy. (Use your data, embrace automation, and keep learning.) Was it accurate? Sure. Were there some useful points? Possibly. But was it interesting to read? Heck no.
4. Plagiarism Is (Apparently) an AI Trait
This shouldn’t really come as a surprise, given that the generative AI model is entirely based on having machines learn from content that already exists. But the word “generative” suggests the copy your AI content writers will produce is original — and that’s not 100% true.
I ran the first three blog posts we created with AI through Grammarly’s plagiarism checker. While none of them had any large sections of copy directly ripped off from other published content, they were all flagged for duplicate copy — between 6% and 15% of the text the tools had produced was not original.
5. Your Robot Still Writes Like a Robot
When technology comes close to replicating the appearance of a human, the uncomfortable sensation it generates is known as the uncanny valley. While our AI content writing tools aren’t trying to look like us, they are trying to sound like us, and there are moments when that comes off as a bit odd.
Unnatural language can put off potential customers if it sounds even a bit creepy…
Authenticity is a vital component of your marketing blog. When your writing sounds like you, it feels more relatable to audiences and helps establish you as a person readers can trust. But when humans talk, we don’t just spew high-level facts in an uninterrupted stream of consciousness. We use slang. Tell stories. Demonstrate empathy. We have a conversation — and that’s something AI content writers just haven’t mastered yet.
Compare these two introductions. One is dry, formal, and fact-filled. The other engages the reader in a conversation, describing a plight that would feel familiar to the target audience:
There’s Still a Future for AI Marketing Tools
Despite how it looks, this blog post isn’t my roundabout way of saying that AI content writing tools are worthless. Far from it. But it’s vital to understand their strengths and limitations before you incorporate them into your strategy.
At TPM, we’ve launched an ongoing series of experiments to help us identify which AI tools could help us save time, improve quality, and generally better serve our clients — and where the human touch still matters.