Coca-Cola Real Magic AI

AI in Storytelling: 5 key AI pitfalls to avoid when balancing technology with the human touch.


Marketers, STOP what you’re doing for a second and let me share what I’ve learned about Artificial Intelligence (AI) and story-based marketing.

Yeah, I know we’re apprehensive about AI, putting on a brave face as we try to embrace it.

We know AI is reshaping industries at pace, and storytelling—a traditionally human art—hasn’t escaped it. It’s critical to maintain the genuine human element for your brand, so that it truly resonates with your audience.

Let me share the 5 key AI pitfalls I’ve encountered in story-based marketing, and how to address them.

1. AI is always right, yeah? The first key AI pitfall in storytelling.

Like many of us, you might think: “It’s AI, so it must be true.”

The assumption is that AI replies to our prompts at warp speed, so we simply accept every response as true. But we know AI output is generated from what already exists online, right? So is everything you read online true?

Not always.

Having tested several LLMs and run thousands of prompts, I know AI hallucinates. AI can be inconsistent. AI can deliver untruths.

So imagine curating a piece of research or creating campaign copy containing statistics that underpin your brand integrity… and it’s fake.


Embrace AI, but cross-check the validity of its output before committing.
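If you want to make that cross-check routine, one lightweight option is to flag every numeric claim in AI-generated copy for human verification before sign-off. The sketch below is a minimal, hypothetical Python example (the pattern and function name are mine, not an established tool), intended only to show the idea.

```python
import re

# Illustrative sketch only: a crude pre-publication check that flags
# statistics and other numeric claims in AI-generated copy, so a human
# can verify each one against a primary source before sign-off.
CLAIM_PATTERN = re.compile(
    r"[^.]*\b\d+(?:\.\d+)?\s*(?:%|percent|million|billion)[^.]*\."
)

def flag_claims_for_review(copy_text: str) -> list[str]:
    """Return sentences containing numeric claims that need human verification."""
    return [claim.strip() for claim in CLAIM_PATTERN.findall(copy_text)]

draft = (
    "Our new campaign reaches 72% of Gen Z weekly. "
    "The brand was founded on a love of storytelling. "
    "Analysts estimate the market will hit 4.5 billion by 2027."
)

for claim in flag_claims_for_review(draft):
    print("VERIFY AGAINST A SOURCE:", claim)
```

A human still has to check each flagged sentence against a primary source; the script only makes sure nothing numeric slips through unreviewed.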

2. Can you afford to use a deepfake and compromise consumer trust?

The fundamental thinking for consumers must be: 

“If this is AI, I need to be told it’s AI”

Consumers marvel at AI-generated fantasy images and dystopian futuristic landscapes, because it’s obvious they’ve been artistically created.

When it comes to day-to-day news, and the products and services that influence their buying behaviours, customers don’t want to mistake AI for reality.

Key AI pitfalls in storytelling: The Getty Images Report: “Building Trust in the Age of AI”

According to a Getty Images report from 2024, “Building Trust in the Age of AI”, 90% of consumers prefer transparency when an image is AI-generated. 

This is especially true in healthcare, pharmaceuticals, financial services, and travel sectors. The sentiment is increasingly ‘Play around with fantasy, but don’t mess with the stuff that matters to us’. 

87% of respondents considered it important for an image to be authentic. Consumers are shouting loudly, telling us to be transparent and to show the source. So let them decide.

The same report concluded that 98% of consumers agree that ‘authentic’ images and videos are pivotal in establishing trust. The report advised “pre‑shot content may perform better in engaging consumers.”

"Building Trust in the Age of AI”
Credit: Getty Images
Coca-Cola & AI

Marketing leaders who are open about their use of AI have seen positive responses from consumers. Coca-Cola’s festive 2023 “Create Real Magic” campaign openly mixed (no pun intended) AI-generated visuals with Coca-Cola’s traditional seasonal imagery. It’s perfectly obvious it’s AI, and it’s designed to “invite customers to globally engage to enhance brand favorability.”

It feels as if Coca-Cola created the Real Magic campaign to tell the world it ‘gets’ AI. You worry for the marketing agencies. Just imagine every seasonal campaign created by AI. 

3. Marketers, the 3rd key AI pitfall in storytelling: don’t infringe copyright in your business story.

Just to repeat: the enormous amounts of content and data that LLMs curate and serve up are already on the internet. That content has been created and recycled to generate an intelligent response to your prompt, training and improving the quality of Gen AI tools.

As marketers, we police and protect our businesses from plagiarising online content that’s already been credited to another. 

So why is AI any different?

LLMs curate responses which may not necessarily be free from copyright protection. So be careful. 

When you’re crafting an organisation’s story, don’t be tempted to steal someone else’s content. Avoid marketing collateral that includes other parties’ trademarks or copyrighted content.

“What, we plagiarised that really expensive and successful campaign we’ve just launched!?”

At the very least you may be asked to take it down. 

At worst, you’ll be served a lawsuit and invited to answer to a judge with a corporate copyright infringement claim. This could expose your brand to both legal and reputational risk.

Simply check the provenance of the content you generate via AI, and review the output, to protect you, your brand and your sanity.

4. Don’t risk ethical bias in storytelling.

Number 4 of the key AI pitfalls in storytelling is ethical bias.

We know Generative AI, more accurately described as regenerative AI, is trained on colossal amounts of existing online content. Prompts produce outputs that reflect the data the model has been fed.

In addition to hallucinating and delivering untruths and inconsistencies, LLMs can unintentionally perpetuate biases and stereotypes found in their training data, potentially leading to harmful portrayals in storytelling.

Cornell University Study:

A March 2024 study from Cornell University analysed imagery generated by Midjourney, OpenAI’s DALL-E, and Stable Diffusion, all AI-driven tools with image creation in their lockers. 

The study discovered that these tools “inadvertently perpetuate and intensify societal biases related to gender, race, and emotional portrayals in their imagery.” In particular, the AI images tend to exclude women and people of colour from depictions of occupational scenes. This suggests a level of inequity that is worse than what’s currently reflected in labour statistics. 

However, the scope of inherent bias extends well beyond a single cohort or context.

Addressing this tool bias will require time, advanced data-science techniques, and a nuanced understanding of cultural context. In the meantime, users can mitigate the issue right away through careful use of prompts and review of the output.

We know the “prompt” is the driver that gets accurate LLM results. Gen AI users should combat bias by including phrases that promote diversity: a range of ages, body types, and socioeconomic representations. This ensures these elements are applied in the proper context.
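As a rough illustration of what that can look like in practice, here is a minimal Python sketch that folds a reusable diversity-guidance phrase into an image-generation prompt before it goes to whichever Gen AI tool you use. The guidance wording and the function name are hypothetical examples, not a quote from any playbook or tool documentation.

```python
# Illustrative sketch only: one way to fold a reusable diversity-guidance
# phrase into an image-generation prompt before it is sent to a Gen AI tool.
# The guidance wording and the function name are hypothetical examples.
DIVERSITY_GUIDANCE = (
    "Depict a realistic mix of genders, ethnicities, ages and body types, "
    "appropriate to the context of the scene."
)

def build_prompt(scene_description: str, brand_style: str) -> str:
    """Combine the creative brief, brand style notes and diversity guidance."""
    return f"{scene_description}. Style: {brand_style}. {DIVERSITY_GUIDANCE}"

prompt = build_prompt(
    scene_description="A team of engineers reviewing plans on a construction site",
    brand_style="natural light, documentary photography, candid",
)
print(prompt)  # The generated image still goes through human review.
```

Keeping the guidance in one place means every brief picks it up consistently, while the final image still goes through human review.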

Gemini’s Gen AI Tool Error

This approach isn’t a quick, one-size-fits-all solution. When Google released its Gemini Gen AI image tool in late February 2024, users were quickly shocked to find it producing images of America’s Founding Fathers, Nazi officers and others as cartoonish, racially mixed characters with no historical accuracy.

Presumably this was the result of “diversity” auto-populating into prompts regardless of context.

As leaders tell their own business stories and develop prompt policies for their organisations, they should carefully consider their company’s specific needs and values, potentially with custom sets of phrases and guidelines for each Gen AI tool used.
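In case it helps to picture it, a prompt policy doesn’t need to be complicated; kept as simple, shared data, it can be looked up per tool whenever a campaign brief is drafted. The sketch below is purely illustrative: the tool names, phrases and review steps are placeholders, not recommendations.

```python
# Illustrative sketch only: a per-tool prompt policy kept as simple data,
# so every campaign brief picks up the organisation's required phrases and
# review steps for whichever Gen AI tool is in use. The tool names and
# wording below are placeholders, not recommendations.
PROMPT_POLICY = {
    "image_tool_a": {
        "required_phrases": ["context-appropriate diversity", "no third-party trademarks"],
        "review_steps": ["check for stereotypes", "check historical accuracy"],
    },
    "copy_tool_b": {
        "required_phrases": ["cite sources for statistics"],
        "review_steps": ["fact-check all figures", "legal review of claims"],
    },
}

def policy_for(tool_name: str) -> dict:
    """Look up the policy for a tool; fail loudly if none has been defined."""
    if tool_name not in PROMPT_POLICY:
        raise KeyError(f"No prompt policy defined for {tool_name!r}")
    return PROMPT_POLICY[tool_name]

print(policy_for("image_tool_a")["required_phrases"])
```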

The Dove brand, while committing to not using generative imagery in its own marketing, has released a Real Beauty Prompt Playbook for brands that are using these tools, in an effort to “set new industry standards of digital representation.”

It’s important for leaders to regularly review and test AI-generated content for bias. By always applying human judgement, your teams will ensure your stories are inclusive, accurate, and respectful. 

These same teams should be involved from the outset, contributing to the development of campaigns and prompts. This ensures they remain engaged to review and check outputs for any inadvertent issues.

Don’t compromise your story.

5. Being authentic: The 5th key AI pitfall in storytelling

AI models draw from existing data, which can lead to repetitive or formulaic content. This reliance can hinder the originality that makes stories stand out. 

In conjunction with the surge in visual consumption, there’s an increasing emphasis on authenticity in advertising. 

How Dove & Patagonia have embraced authenticity in their stories:

Instagram and TikTok have popularised the immediate, spontaneous, and candid aesthetic. A recent survey from visual content marketing platform Stackla found that c.90% of consumers felt authenticity was essential to brand loyalty, preferring brands that are “real and organic” over those that are “perfect and well-packaged.” 

Campaigns like Dove’s Real Beauty and Patagonia’s Buy Less, Demand More have set new standards of vulnerability. Consumers now expect advertisements to reflect their own values and relatable, genuine experiences.

Where does that leave authenticity? How can brands tell an authentic story using a tool with the word “artificial” in its name? And what about the ongoing ethical concerns surrounding certain aspects of these tools?

Leaders should navigate these issues by being accountable, and identifying specific use cases when using generative AI for content creation. 

The first time I used AI, it blew my mind. Having been conditioned by the output of search engines for 20+ years, being served the answer directly is transformative.

And that’s the key to AI. 

We’re moving from being conditioned to search, to receiving a response that’s packaged up with a ‘bow on top’. We just have to be cognisant when we take the packaging off that what’s inside is authentic, is real, and reflects how we want our brand to be represented.   

So, while AI can generate sentences or suggest plot lines, the authenticity of human experiences—joy, loss, hope—must be crafted by real people.

Use AI for idea generation but rely on human creativity to craft unique narratives, adding personal touches, anecdotes, and cultural nuances that only humans can provide.

This approach preserves the authenticity of your brand voice to make your story memorable.

Conclusion on the 5 key AI pitfalls in storytelling to avoid, when balancing technology with the human touch.

Consumers are increasingly savvy about digital manipulation and crave real, relatable content. The brands that succeed will be those that use AI as a tool to enhance their storytelling, not as a replacement for authentic human creativity and emotion.

Embrace and blend the innovation that AI brings:

In this new era of brand storytelling, the goal is not to create perfect, polished narratives, but to craft stories that feel real and organic to the brand’s identity. By combining the power of AI with genuine human insight, brands can create stories that not only capture attention but also build lasting connections with their audiences.

Embrace and blend the innovation that AI brings with the irreplaceable emotional value of human insight, ensuring that your brand tells authentic stories that genuinely resonate with your audience.
