AI Content Writers: What They Get Right & What They Muck Up


I had some AI content generators write my company’s bio. Here’s how they did.

Unless you’ve been living under a big rock (I bet it’s nice and cool down there), you know that artificial intelligence has been loosed upon the digital marketing world (and other industries).

This has led many of us to pen piece after piece after piece after piece discussing the ways in which AI is transforming our work now and will continue to transform it into the future.

One consistent theme amongst many of these thought pieces — ours and others’ — relates to AI’s potential for both disruptive and transformative effects on digital marketing.

Another consistent theme that I’ve tried to hammer home is the need to vet, edit, and fact-check content produced by AI. Whether it’s Jasper, ChatGPT, Bard, or any of the other platforms, AI is not ready for prime time without the input of human content marketing editors.

AI Content Writers: GIGO (Just Like The Old Days)

AI writing assistants are only as accurate as the data on which they feed.

AI needs content input for its content output — whether that’s ad copy, blog articles, long-form content, website copy, or service and landing pages. To create high-quality content, machine-learning content-creation tools need quality inputs.

In other words, it’s GIGO (garbage in, garbage out) all over again.

The platforms themselves make this clear.

For example:

“ChatGPT is not connected to the internet, and it can occasionally produce incorrect answers. It has limited knowledge of world events after 2021 and may also occasionally produce harmful instructions or biased content. We’d recommend checking whether responses from the model are accurate or not.” — OpenAI (ChatGPT)

And this:

“Jasper writes creatively based on patterns he has seen across the internet. Those ‘facts’ NEED to be replaced by you (as the editor) almost all the time. … As Jasper continues to write, he will likely make up some facts you’ll want to correct, but feeding him with correct information will significantly increase the chances of him writing accurate information.” — Jasper (emphasis mine; how does one make up facts, anyway?)

The news is out, too, with some outlets suggesting that this whole fact-inventing thing is actually getting worse rather than better.

Noam Chomsky has written about the “false promise of ChatGPT.”

And Antony Brydon, co-founder and CEO of Directly, thinks that “AI will always require human input and expertise — technical and otherwise — to operate at its full potential in a way that’s ethical, responsible and safe.”

For now, at least, the “promise” of artificial general intelligence (AGI) and of AI writing, copywriting, and content generation tools in general is limited by the data with which AI works.

But what does that mean for marketers? What are the real-world ramifications?

Well, mistakes, errors, omissions, and flat-out lies (made-up facts), to name a few — unless a human being is paying attention.

Personal Experience

When I first began experimenting with AI writing assistants, I was simultaneously impressed by their ability to string sentences together to make coherent — and, occasionally, dazzling — prose and amused by the factual errors that seemed to appear out of left field.

For example, I instructed an AI writing assistant to compose a blog post that would include information on the history of a local high school. The AI made up facts — e.g., it invented graduates of the school and their accomplishments. It even invented an alumnus who, according to the AI, went on to win an Olympic gold medal in track and field.

An AI writing assistant invented the name and accomplishments of a supposed alumnus of a local high school.

Errors in Action

As a way to further illustrate what I’m talking about, I had a few AI content writing assistants produce pieces on the history of my company, Webfor. Here’s what they came up with, and where they erred.

The prompt: Produce a blog on the history of Webfor, a digital marketing company based in Vancouver, Washington. Talk about the owner and founder of the company, why he started the company in the first place, and the services Webfor now provides for its clients. Who are its clients? How long has Webfor been in business? Has it won any awards? Who are some of its employees? Include any information that might be relevant for someone seeking digital marketing services.

Each AI writing assistant produced serviceable text with headlines that were OK but not all that engaging. I would still need to go through it line by line to make the tone conform to our brand voice and to add SEO-related keywords and other digital marketing touches.

In addition, here’s what they got right — and what they got wrong.

What they got right:

  • Webfor was founded in 2009 by Kevin Getch.
  • “In its early days, Webfor was a one-man show.” Yep.
  • Getch “aimed to build a company that not only offered top-notch digital marketing services but also prioritized transparency, collaboration, and long-term relationships with clients.”
  • Our services (SEO, Web Design and Development, PPC, Social Media Marketing, Content Marketing)
  • Our clients: Webfor services “a broad range of clients from various industries, including healthcare, technology, hospitality, real estate, and professional services. Their client portfolio includes both small local businesses and large corporations.”
  • Awards: “The company has been honored with several prestigious awards, including the 2021 Best in Business Award from the Vancouver Business Journal.”
  • Community involvement: “Webfor’s commitment to giving back to the community has been commendable, with its involvement in charitable initiatives and partnerships with local organizations.”

What they got wrong (or left out):

  • Webfor won the “2022 Top Advertising and Marketing Agency in Portland Award.” (We didn’t win this one, and we’re not based in Portland. We are, however, listed as one of the “Best Digital Marketing Agencies in Vancouver.”)
  • Services: The AI excluded most of our digital marketing services, including Marketing Strategy, Conversion Rate Optimization Services, Press Releases, Reputation Management, Website Accessibility, Brand Identity, Ecommerce Design, Ecommerce Development, and Logo Design.
  • “Before starting the company, Kevin was working as a full-time web developer and graphic designer.” Nope.
  • Webfor “made the Inc. 5000 list of the fastest-growing companies for 2017.” We did not.
  • “It has also earned the title of Best Web Design Firm in Southwest Washington (2019) by The Columbian’s Best of Clark County.” This is also inaccurate, although we’ve been a finalist for several years running.

The AI/human relationship is reciprocal. AI knows what it knows because it has ingested content created by humans. In turn, humans inform AI as we work to turn its preliminary outputs into content that works for our clients.

But AI also makes things up. So we content marketers must include fact-checking as part of our content creation processes. AI simply can’t be trusted to get it 100% right just yet. As AI’s capabilities continue to grow through technologies like AI assistants and retrieval-augmented generation (RAG), quality will continue to improve. Make sure you stay ahead of the industry by keeping on top of emerging technologies and capabilities in this space.
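To make the RAG idea concrete, here is a minimal sketch of the underlying pattern: before asking a model to write, retrieve human-vetted facts and put them in the prompt so the model has less room to invent details. The fact list, keyword-overlap scoring, and company details below are illustrative toys I made up for this example, not a production retriever or a real knowledge base.

```python
# Human-vetted facts about the company (hypothetical knowledge base).
FACTS = [
    "Webfor was founded in 2009 by Kevin Getch.",
    "Webfor is based in Vancouver, Washington.",
    "Webfor offers SEO, web design, PPC, social media, and content marketing.",
]


def retrieve(query: str, facts: list[str], top_k: int = 2) -> list[str]:
    """Rank facts by naive keyword overlap with the query."""
    query_words = set(query.lower().split())
    scored = sorted(
        facts,
        key=lambda fact: len(query_words & set(fact.lower().split())),
        reverse=True,
    )
    return scored[:top_k]


def build_prompt(question: str) -> str:
    """Prepend the retrieved facts so the AI writer works from vetted data."""
    context = "\n".join(f"- {fact}" for fact in retrieve(question, FACTS))
    return f"Use only these facts:\n{context}\n\nTask: {question}"


print(build_prompt("When was Webfor founded and by whom?"))
```

Real RAG systems swap the keyword overlap for embedding-based search, but the principle is the same: the quality of the output depends on the quality of the facts you feed in, which is exactly the GIGO point above.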