Informed i’s Weekly Business Insights
FREE weekly newsletter, sharing knowledge briefs from TOP TEN BUSINESS MAGAZINES, as a social service to foster business acumen | Since 2017 | Week 447 | April 03-09, 2026 | Archive

Who owns ideas in the AI age?
By Francesca Cassidy | Fortune Magazine | April-May 2026
3 key takeaways from the article
- Can you ever really own an idea? The publishers, music producers, and film directors who make up the creative economy would say yes — as would many of the artists and writers they work with. But some in Big Tech are beginning to push back, arguing that ideas—like information—should be free, accessible, and repurposable for anyone. When it comes to ideas, they argue, even those which spring directly from our own heads are the product of every other idea, environment, and person we’ve come into contact with. As such, they are fair game for training the large language models (LLMs) behind the AI platforms many of us have become reliant upon.
- The argument has become increasingly urgent as generative AI companies build powerful models—and attract huge investment—by ingesting vast amounts of online text, images, and video, including books, journalism, and art created by humans.
- This is the existential issue facing, among others, the international publishing giant Hachette, publishing since 1826. David Shelley, the company’s U.K. chief who also became U.S. CEO in January 2024, is joining the fight on behalf of creatives everywhere.
(Copyright lies with the publisher)
Topics: Creative Class, Intellectual Property Rights, AI and Society
Extractive Summary of the Article
Can you ever really own an idea? The publishers, music producers, and film directors who make up the creative economy would say yes — as would many of the artists and writers they work with. But some in Big Tech are beginning to push back, arguing that ideas—like information—should be free, accessible, and repurposable for anyone. When it comes to ideas, they argue, even those which spring directly from our own heads are the product of every other idea, environment, and person we’ve come into contact with. As such, they are fair game for training the large language models (LLMs) behind the AI platforms many of us have become reliant upon.
The argument has become increasingly urgent as generative AI companies build powerful models—and attract huge investment—by ingesting vast amounts of online text, images, and video, including books, journalism, and art created by humans.
This is the existential issue facing, among others, the international publishing giant Hachette, publishing since 1826. David Shelley, the company’s U.K. chief who also became U.S. CEO in January 2024, is joining the fight on behalf of creatives everywhere.
Shelley is a publisher through and through. The son of antique booksellers, he grew up above a bookshop and got his first industry role fresh out of university. You would be hard-pressed to find someone more passionate about, and invested in, the future of publishing. “We’re at an absolutely pivotal moment,” he says. “We need to stand up for the rights of the authors we work with and for the whole of the creative industries.”
This is not mere lip service. This January, Hachette asked a U.S. federal court for permission to intervene in a proposed class action lawsuit against Google. Along with Cengage, an education technology provider, the publisher claims the tech giant copied content from Hachette books and Cengage textbooks to train its large language model, Gemini, without asking permission. Google argues that training LLMs on vast text-based datasets is a transformative process which analyzes patterns in language, rather than reproducing the original works and, as such, qualifies as fair use. Shelley isn’t buying it. “It’s just another form of theft,” he says. “We know these LLMs basically stole our authors’ work.”
This isn’t the first time Hachette has taken legal action against those looking to steal from it. The Google lawsuit is just one of many examples of creatives taking on Big Tech. Across the U.S. and Europe, dozens of lawsuits have now been filed by individuals and organizations seeking to stop AI companies from training their models on copyrighted material without permission.
And here is the crux of the issue. Someone is making money from the use of these ideas—but it’s not the author, it’s the LLM companies. The commercial stakes are enormous: the global generative AI market was valued at $103.58 billion in 2025 and is projected to be $161 billion in 2026, according to Fortune Business Insights.
One logical conclusion is a return to the early days of publishing, when only the super-wealthy (or those lucky enough to have a rich patron) could afford to write for a living. Whether it is writing or music or illustration, “the fact you can make a good living in all of these fields is a really strong incentive,” says Shelley. Without the economic model, “the talent pool shrinks.”
Worse still, we face a future where the only art available is an iteration of an iteration on an iteration. “LLMs are just predictive text,” says Shelley. “If you starve the supply, then there will be no new stories. As humans, we need new stories, we need new art, we need new ideas, and to get that, the economics need to work for the people who make those things.”
What is most frustrating for Shelley is that there already exists a robust mechanism for ensuring this doesn’t happen: copyright law. “Copyright essentially exists to ensure creators are able to earn a living,” he says. “I don’t think it needs to change, but it does need to evolve.”
Shelley is also realistic about the need to work with Big Tech in order to achieve Hachette’s mission (“to make it easy for everyone to discover new worlds of ideas, learning, entertainment, and opportunity”).
Neither can companies afford to shy away from the transformative potential of AI, however cynical they may be about the motives of the platform owners. For Shelley, the key is to have very clear boundaries from the start, about where the publisher will and will not use the technology.
Indeed, there is a growing trend on both sides of the Atlantic for using human creation as a badge of honor. In early 2025, the U.S.-based Authors Guild launched a “Human Authored” certification, with the U.K.’s Society of Authors following suit in March 2026. The certification allows for minor AI assistance—such as spell-checking or brainstorming—but the text itself must be human-written.
As with the hipster revival of the word “artisanal” in the mid-2000s, the AI age is ushering in new terms to connote value and desirability. Now, instead of coffee made from rare Southeast Asian beans or blankets knitted in little-known Nordic communities, the focus is on content. From books to marketing campaigns, experts suggest that, in a world flooded by AI-generated work, those who can afford it will pay for what some thought leaders are calling the “human premium.” Of course, business leaders must play their part in protecting the economic ecosystem that makes this possible.
Here, again, is an issue which appears, on the surface, to be unique to the publishing industry, but which could have severe consequences for businesses of all sectors. For Shelley, freedom of expression is no longer merely a cultural issue—it is a leadership and governance one.