By Chuck Gallagher, Business Ethics Keynote Speaker & AI Speaker and Author
Opening Story: When the Credits Rolled—And Something Didn’t Feel Right
In 2024, a rising director from Austin, Texas, premiered her sci-fi film at a regional festival. The visual effects were stunning, the language localization seamless, and the buzz was electric. Yet backstage, she admitted something unsettling: “Half the film was produced with generative AI—and I have no idea if the model used someone else’s scripts, voices, or music.” She paused. “It feels like I stole something. But I don’t know from who.”
That comment stayed with me.
We are stepping into a world where artificial intelligence is not just a tool—it’s a co-creator. In the world of media and entertainment, that’s both exhilarating and ethically uncharted. We’ve opened the door to a new creative universe. But the question is: who gets to walk through it—and at what cost?
The Magic and Momentum of AI in Media
Morgan Stanley’s recent report suggests that generative AI could reduce production costs by 10–30% in TV and film. That’s not a marginal gain—it’s a revolution. Here’s what’s shifting:
- Scriptwriting Assistance: AI can analyze decades of storytelling, crafting scripts that follow winning formulas. It doesn’t tire. It doesn’t miss a deadline.
- Localized Content Creation: With voice-cloning and real-time translation, AI enables films and shows to reach global audiences in their native languages—without hiring dozens of actors.
- Video Editing & Special Effects: Platforms like Runway and Pika can generate VFX from simple prompts. Small teams now accomplish what only blockbuster budgets once could.
- Audience Personalization: Streaming services like Netflix and Spotify use AI to tailor recommendations. But now, we’re moving toward AI-generated content tailored to a viewer’s preferences. Think: custom TV episodes, not just custom playlists.
In one sense, AI democratizes Hollywood. It gives small creators a shot. But in another, it quietly rewrites the rules of originality, labor, and truth.
The Ethical Plot Twist: What Happens Behind the Curtain?
Every time we press “generate,” AI pulls from something—or someone—that came before. And that’s where the ethical stakes rise.
Intellectual Property in Jeopardy
The British Film Institute raised the alarm: more than 130,000 UK film and TV scripts may have been used to train large AI models—without the writers’ knowledge or consent.
That’s not innovation. That’s appropriation.
When AI replicates the tone of a well-known screenwriter or mimics an actor’s voice from past roles, we must ask: Who owns that voice? That style? That legacy?
Lawsuits That Are Rewriting the Industry
Universal and Disney have filed lawsuits against AI developers for scraping copyrighted content without permission. Universal even included anti-AI warnings in its latest films.
Why? Because the AI threat isn’t just theoretical. It’s producing usable content today—without proper licensing, compensation, or attribution.
Imagine spending your life building a voice, a style, a perspective—and seeing it reproduced without credit or pay.
That’s not the future. That’s right now.
Deepfakes and Digital Resurrection
From ABBA’s virtual concert avatars to AI-generated commercials of long-deceased actors “starring” in new roles—AI can now bring the past into the present.
But should it?
When an AI model creates new words using the voice of someone who’s died, it doesn’t just recreate sound. It conjures memory. It trespasses on grief.
What do we owe the dead? What boundaries do we set when emotion meets algorithm?
Displacement and Devaluation
AI doesn’t need lunch breaks or overtime. For studios, that’s a bottom-line dream. For editors, designers, and junior creatives—it’s a looming threat.
As studios adopt AI to cut costs, we must ask: Will we reinvest those savings into human talent? Or will we automate creativity into silence?
I’m not against progress. I’m for responsibility. The two must walk together.
A Framework for Ethical Innovation
Ethical AI in entertainment isn’t just a technical challenge—it’s a leadership mandate. Here’s the framework I propose when I speak with executives, producers, and studio boards:
- Transparent Training Data
Models must disclose where their training data comes from—and secure licenses or offer opt-out mechanisms. Creative work is not public domain just because it’s online.
- AI as a Partner, Not a Replacement
Use AI to enhance workflows—not erase people. Let it do the heavy lifting on tasks that don’t require nuance, emotion, or intuition. Keep humans in the loop.
- Consent and Likeness Protection
Voice, image, and style belong to the individual. Even with “public” figures, AI use must be governed by consent, contracts, and context. This matters especially in posthumous recreations.
- New Roles, New Training
As old jobs shift, new ones must emerge. AI model trainers, ethics officers, and creative supervisors are needed now. The next Spielberg might direct AI prompts—not actors.
- Cultural Guardrails
Don’t just ask if it’s legal. Ask if it’s right. Will it empower voices that haven’t been heard—or amplify the same ones over and over again? Will it deepen storytelling—or dilute it?
Final Scene: Truth Is Still Our Most Powerful Story
As an AI speaker and business ethics keynote speaker, I’ve spent decades helping companies navigate what’s possible versus what’s permissible. In media and entertainment, the stakes are particularly high. Why?
Because stories shape belief.
If we lose control over the source of those stories—if we let machines dictate narrative without accountability—we risk losing more than jobs. We risk losing trust.
Innovation is exhilarating. But let’s build it with intention. Let’s create stories that elevate—not exploit.
I’d love to hear your thoughts. As always, we welcome your comments and are happy to respond—feel free to share them below. Whether you’re a filmmaker, executive, educator, or simply someone who loves the power of story, this conversation matters.
