AI in Media: An Escalating Threat to Creativity
Remember when we owned our entertainment? My shelves once groaned under the weight of cassettes, CDs, VHS tapes, and DVDs. That content was ours; no one could take it away. Fast forward to today, and we're all just renting access through streaming services, where content can vanish with a few keystrokes.
This erosion of ownership isn’t just nostalgia talking—it’s a cautionary tale for how we approach AI in media. Just because we can adopt new technology doesn’t mean we should rush in without considering what we might lose.
AI’s Promise vs. AI’s Perils
AI vendors promise unprecedented efficiency and cost savings. Those gains are real, but they're not the full picture. Our industry thrives not on efficiency alone, but on the unpredictable, messy magic of human creativity.
The Oscar-winning “Everything Everywhere All at Once” wasn’t the product of an algorithm—it was gloriously absurd, wildly human, and impossible for AI to conceive. That creative spark drives our business, not computational efficiency.
The Fear in the Room
Let’s be honest: AI may be the first technology that has knowledge workers looking over their shoulders. Mention AI implementation in a meeting and you can feel the room temperature drop. Writers, editors, producers, and even executives fear replacement.
This isn’t irrational. When leadership frames AI primarily as a cost-cutting measure rather than a creative enhancement, we’re sending a clear message: Your job could be next. The smarter approach is integration that amplifies human talent rather than replacing it.
Homogenized Content: The Real Threat
AI doesn’t create—it regurgitates and remixes what it’s learned. Left unchecked, it will push us toward a bland median of content that offends no one but inspires no one either.
We’ve already seen what algorithm-driven content decisions produce: hyper-personalized content that favors predictability over originality. Add AI to this mix without human guidance, and we’ll accelerate toward creative bankruptcy.
The Ethical Minefield
AI systems inherit the biases of their training data. In media, where shaping public perception is our business, these biases become dangerous amplifiers of existing prejudices.
AI-curated news feeds already reinforce echo chambers. AI-generated content risks further eroding trust in an industry already battling credibility challenges. And deepfakes? They’re evolving faster than our verification tools.
Copyright Chaos Ahead
The legal landscape around AI-generated content resembles the Wild West. Who owns content created by AI trained on copyrighted works? The lawsuits against large language model developers are just the opening skirmish.
Media companies deploying these tools could find themselves facing unprecedented liability for content they didn’t technically create but did publish. Our legal frameworks weren’t built for this reality.
Data-Driven vs. Instinct-Driven
Data has always informed media decisions—from Nielsen ratings to box office numbers to focus groups. But AI’s data processing capabilities risk tipping the balance too far toward algorithm-driven decision making.
When every content choice is dictated by predictive analytics, we lose the human instinct that leads to breakthrough hits. No algorithm would have greenlit “Breaking Bad” or “The Sopranos” based on their premise alone.
Finding the Balance: Three Principles
At Think, we’ve developed three principles for evaluating any new technology in the media space:
- Augment, Don’t Replace: Technology should free humans to be more creative, not eliminate the human element.
- Preserve Ownership: Both for creators and consumers, maintain clear lines of ownership and control.
- Protect Trust: Your audience’s trust is your most valuable asset; any technology that undermines it costs too much, no matter the savings.
The Path Forward
AI in media isn’t inherently destructive—it’s a powerful tool waiting for thoughtful application. Used correctly, it can handle mundane tasks, surface insights from massive datasets, and even inspire new creative directions.
But leadership matters. Organizations must implement AI strategically, with careful oversight and clear boundaries. That means robust ethical guidelines, verification systems for AI-generated content, and transparent disclosure to audiences.
Most importantly, media organizations need to keep humans at the center of the creative process. AI should serve creativity, not dictate it. The culture that media organizations have developed over decades is the secret to their success. Preserving and enhancing that culture is paramount in a creative environment. That is one of the core pillars we measure in any transformation.
We at Think have 20 years of experience leading technology transformation across industries, including media and entertainment. We have the strategy, experience, and team to bring AI into even the most creative environments while preserving what makes you unique.
In media, as in art, knowing what to preserve is as important as knowing what to change. Let Think Consulting build an AI future that enhances human creativity rather than replacing it.
And… AI did not come up with the title. It was 100% human creativity.
About the Author
Todd Parkin (Vice President of Growth & Managing Director of Media) is a seasoned media executive with a remarkable track record of driving innovation and rethinking business models in the fast-moving media landscape.