
AI Threatens to Crush News Organizations. Lawmakers Signal Change Is Ahead

  • Nishadil
  • January 11, 2024

More than a decade ago, the normalization of tech companies carrying content created by news organizations without directly paying them — cannibalizing readership and ad revenue — precipitated the decline of the media industry. With the rise of generative artificial intelligence, those same firms threaten to further tilt the balance of power between Big Tech and news.

On Wednesday, lawmakers on the Senate Judiciary Committee, citing their failure to adopt legislation that would have barred Big Tech from exploiting news content, backed proposals that would require AI companies to strike licensing deals with news organizations. Richard Blumenthal, Democrat of Connecticut and chair of the committee, joined several other senators in supporting calls for a licensing regime and for a framework clarifying that intellectual property law does not protect AI companies that use copyrighted material to build their chatbots.

“We need to learn from the mistakes of our failure to oversee social media and adopt standards,” he said. The fight over the legality of AI firms ingesting content from news organizations without consent or compensation is split into two camps: those who believe the practice is protected under the “fair use” doctrine in intellectual property law, which allows creators to build upon copyrighted works, and those who argue that it constitutes copyright infringement.

Courts are currently wrestling with the issue, but an answer to the question is likely years away. In the meantime, AI companies continue to use copyrighted content as training materials, endangering the financial viability of media in a landscape in which readers can bypass direct sources in favor of search results generated by AI tools.

During the hearing centered on oversight of AI in journalism, Roger Lynch, chief executive of Condé Nast, urged Congress to “clarify that the use of our content and other publications’ content for training and output of AI models is not fair use.” With that issue out of the way, he explained that the “free market will take care of the rest” in reference to how licensing deals could be struck.

Josh Hawley, Republican of Missouri, called the proposal “eminently sensible.” Going one step further, he stressed, “Why shouldn’t we expand the regime outward to say anyone whose data is ingested and regurgitated by generative AI — whether in name, image or likeness — has the right to compensation?” A lawsuit from The New York Times, filed last month, pulled back the curtain on negotiations over the price and terms of licensing its content.

Before suing, it said that it had been talking for months with OpenAI and Microsoft about a deal, though the talks never produced an agreement. Against the backdrop of AI companies crawling the internet for high-quality written content, news organizations have been backed into a corner, forced to decide whether to accept lowball offers to license their content or spend the time and money to sue.

Some companies, like Axel Springer, took the money. A major subject of the hearing was whether new legislation is necessary to account for what Lynch characterized as AI companies building their business model on “stolen goods.” “I think it’s premature,” said Curtis LeGeyt, chief executive of the National Association of Broadcasters.

“If we have clarity that the current laws apply to generative AI, the market will work.” While he agreed the law is on his side, Lynch contended that a key area of concern is the time it takes for the courts to resolve the issue. “Between now and then, many media companies will go out of business,” he said.

Jeff Jarvis, professor at the Craig Newmark Graduate School of Journalism, pushed back against the adoption of “protectionist legislation for a struggling industry.” On the issue of fair use, he advocated for a broader interpretation of the doctrine and said that journalists take advantage of it every day when they “ingest information and put it out in a different way.” Under intellectual property laws, facts aren’t copyrightable.

This means that journalists are free to report common details without infringing on any copyrights, as long as they aren’t copying excerpts word for word. It’s among the reasons the Times may face an uphill battle in its suit against OpenAI, though evidence of ChatGPT generating verbatim reproductions of its articles may get it over the hump.

And while AI companies have yet to argue in court that they can claim immunity under Section 230 of the Communications Decency Act, which has historically afforded tech firms significant legal protection from liability for third-party content, the law has been floated as a potential shield for copyright issues surrounding generative AI. Chamber of Progress, a tech industry coalition whose members include Amazon, Apple and Meta, argued in filings to the Copyright Office that Big Tech’s favorite legal shield should be interpreted to immunize firms from infringement claims.

Blumenthal stressed that AI firms shouldn’t be protected under Section 230 if they’re sued for content produced by AI tools. “There’s a deeply offensive irony here, which is that all of you and your publications or your broadcast stations can be sued,” he said. On top of copyright issues around generative AI tools, lawmakers have signaled concern around the creation of deep fakes and voice clones.

On Wednesday, a bipartisan coalition of House lawmakers introduced a bill to prohibit the publication and distribution of unauthorized digital replicas. It’s intended to give individuals the exclusive right to approve the use of their image, voice and likeness by conferring intellectual property rights at the federal level.

Touching on the increasing prevalence of such deceptive content, Hawley said, “This seems to me like a situation we have to address and quickly.”