A new study from the Open Markets Institute highlights the fragile foundations of AI content licensing, exposing the dual role of dominant platforms in weakening publishers while shaping the terms of compensation, and prompting calls for stricter regulation.

A new report from the Open Markets Institute argues that the emerging market for AI content licensing is being built on unstable foundations, with the same dominant platforms that weaken publishers’ traffic also shaping the terms of compensation. The study, titled “Same Gatekeepers, New Tollbooths”, says the current mix of copyright claims, selective licensing deals and voluntary promises risks repeating the pattern seen in the search and social media eras, when platforms absorbed the value of news and creative work without sustaining the businesses that produced it.

The report by Dr Courtney Radsch and Karina Montoya says AI companies depend on a steady flow of high-quality human-made material, yet the commercial system developing around that content is leaving most publishers and creators outside the room. After more than 35 interviews and consultations, the authors describe a market split into a handful of large bilateral agreements, a growing layer of intermediaries and a vast uncompensated remainder. ProMarket has separately argued that licensing at internet scale is only practical for a small number of large rights-holders, underscoring the report’s claim that ad hoc deals cannot cover the breadth of online publishing.

Open Markets says the imbalance is already visible in publisher economics. The report says falling website traffic is costing the industry billions and contributing to newsroom cuts, while AI systems continue to draw on the same material that helped create their value. Digiday has reported that scraping of publisher sites is rising even where protections and licensing arrangements exist, reinforcing the report’s warning that access controls alone are not stopping machine use of human content.

To address what it sees as a structural problem, the report calls for market-wide licensing rules, collective bargaining models, stronger transparency requirements and technical systems that can identify when AI outputs depend on specific sources. It also urges that any compensation framework include local, independent and non-English publishers, rather than concentrating benefits among the largest media groups. Rightswise, in an overview of the sector, says the market has been taking shape since 2023 and 2024 as AI developers moved from using scraped data to striking commercial agreements, but Open Markets argues that without enforceable rules the result will be a deeper erosion of journalism and a further degradation of AI systems themselves.

Source: Noah Wire Services

Noah Fact Check Pro

The draft above was created using the information available at the time the story first emerged. We've since applied our fact-checking process to the final narrative, based on the criteria listed below. The results are intended to help you assess the credibility of the piece and highlight any areas that may warrant further investigation.

Freshness check

Score:
8

Notes:
The report titled ‘Same Gatekeepers, New Tollbooths: Mapping the AI Content Licensing Market’ was published by the Open Markets Institute’s Center for Journalism and Liberty. The publication date is not specified in the available information, but the report appears to be recent. No evidence suggests that the content has been recycled or republished across low-quality sites or clickbait networks. The narrative does not appear to be based on a press release, and there are no discrepancies in figures, dates, or quotes. The content seems original and timely.

Quotes check

Score:
7

Notes:
The report includes direct quotes from Dr. Courtney Radsch, such as: ‘The quality of AI outputs depends on an ongoing supply of quality human content. Destroy the economic foundation of that content, and you degrade the intelligence of AI itself. It’s a house of cards built upon quality content as the base.’ A search for this quote reveals no earlier usage, indicating originality. However, without access to the full report, it’s challenging to verify the context and accuracy of the quotes. The absence of earlier matches suggests the quotes are original, but independent verification is limited.

Source reliability

Score:
8

Notes:
The Open Markets Institute is a reputable organisation focused on promoting competition and consumer welfare. ([openmarketsinstitute.org](https://www.openmarketsinstitute.org/technology-power?utm_source=openai)) The report is authored by Dr. Courtney Radsch and Karina Montoya, both associated with the Institute. The content is hosted on the Institute’s official website, indicating a direct source. There is no evidence of derivative content or aggregation from other publications. The source appears reliable and independent.

Plausibility check

Score:
7

Notes:
The report discusses the challenges in the AI content licensing market, highlighting issues such as copyright claims, selective licensing deals, and the risk of repeating patterns seen in previous digital eras. These concerns align with ongoing discussions about AI’s impact on content creation and distribution. However, without access to the full report, it’s difficult to assess the depth of analysis and the robustness of the claims. The plausibility of the claims is reasonable, but further verification is needed.

Overall assessment

Verdict (FAIL, OPEN, PASS): OPEN

Confidence (LOW, MEDIUM, HIGH): MEDIUM

Summary:
The report from the Open Markets Institute addresses concerns about the AI content licensing market, highlighting issues such as copyright claims and the risk of repeating past patterns. While the source is reputable and the content appears original, the lack of independent verification and limited access to the full report raise concerns about the overall reliability. Further external verification is recommended before publishing.
