AI has been a hot topic the past week, with a proposal made to force news sites to label artificially generated content and a complaint filed against Google’s AI Overviews.
We’re also seeing more content thrust behind paywalls, a big debate on whether live content can thwart AI scraping, and over 200 journalists giving their predictions on how their world is likely to change in 2026 – with one writer using ChatGPT to work out what her peers are really saying.
Calls for AI content to be labelled and reviewed by humans
A new legislative proposal from lawmakers in New York calls for news organisations to label any content that used AI extensively in its creation, and for all AI-generated content to be reviewed by a human editor before it’s published.
Called the New York Fundamental Artificial Intelligence Requirements in News Act (NY FAIR News Act for short), the proposal targets content that is “significantly composed, written or created with the use of generative artificial intelligence”.
The bill would require anything on the page – whether prose, video or images – to be labelled as created by AI. While there are multiple stages to pass before it can be signed into law, this is a significant effort to bring regulation and trust to a key facet of content creation.
the better web co. insight:
There’s a long way to go before a proposal like this becomes law, but for content producers it’s a positive first step to increasing transparency on the web.
There’s no getting away from the fact that AI is playing a bigger role in the news creation process, and when it comes to research, pitching and structuring it can save a lot of time. However, AI can’t be relied on for accuracy and truth, so human intervention is required to ensure content is correct.
Whether this gets passed into law or not, it’s clear that there’s a strong desire to maintain the integrity of the newsroom, not just because of the need to have confidence in the facts but also so readers know what’s been created by trusted journalists, rather than aggregated digital interpretations.
This applies to any kind of content creation, and starting at one of the world’s most trusted centres of news production will hopefully encourage others to follow suit.
Read more
- New York State Senate: Senate Bill S8451
European publishers file complaint against AI Overviews
The European Publishers Council has issued an antitrust complaint against Google for its AI Overviews. The group claims Google uses their content, without permission or payment, to generate its AI overviews.
Google responded by saying the claims are “inaccurate” and “an attempt to hold back helpful new AI features that Europeans want”.
the better web co. insight:
The saga around Google’s AI Overviews continues, and it’s a battle which looks set to continue for some time as all sides state their cases.
No matter the outcome, it highlights the benefits of diversifying content discovery channels away from Google where possible, and finding additional places to build an audience. Google is still a significant source, but with the search ranking pages being more congested and links being pushed down by AI Overviews, it’s worth thinking about other plans.
Content strategy needs to adapt accordingly – not only writing content that can’t be easily answered by AI Overviews (in other words, avoiding anything that can be answered simply) but also diversifying traffic sources to newsletters and direct readership, and encouraging readers to subscribe, whether behind paywalls or on Substack.
Read more
- Reuters: Google hit by European publishers’ complaint to EU over AI Overviews
- Cryptopolitan: Publishers battle Google’s devastating AI threat to independent journalism
Reach extends premium paywall to more titles
UK publisher Reach has confirmed it’s extending its paywalled ‘premium’ offering to three more of its titles: the Daily Record, Wales Online and Leicestershire Live.
The premium paywall was already active on the Manchester Evening News and Liverpool Echo, with subscriptions costing £4.99 per month, or £39.99 per year.
Subscribers will get access to cleaner, faster, ad-lite pages (but not ad-free), and unlimited article views. The majority of content will remain free to access, albeit with ad-heavy pages and a limit on the number of visits readers can make.
the better web co. insight:
The proliferation of paywalled content is an interesting trend to watch as publishers look for new revenue streams. Any attempt to move away from the ad-heavy models that encourage clickbait should be welcomed: those models only bring in enough revenue at high view counts, which doesn’t incentivise quality.
The hybrid model employed here – where most content is still freely available, but readers are limited by how much they can view – provides an opportunity for publishers to showcase their content and give readers a try-before-you-buy view of what they’ll be getting if they subscribe.
We’ve seen it work successfully for local news publisher The Mill, which has over 10,000 paying members, but the key thing is creating the content that people will pay for. It either needs to be hyper local, servicing readers who want to know about their community, or reporting that offers something nobody else can.
Read more
- Hold the Front Page: Reach part-paywall extended to three more websites
Human journalists will be a necessary luxury
Recently NiemanLab sought feedback from over 200 journalists to get their predictions for the direction content will be taking in 2026. The results were hardly surprising: AI was the hot topic.
However, the responses were diverse, so investigative journalist Lauren Wolfe dissected the findings… with the unnerving help of ChatGPT.
She wrote the article with ChatGPT’s help (adding her own human insight), and found that most journalists think local news will be redefined, the value of the human writer will be more important than ever, and it won’t be clear where the best place to get news is.
the better web co. insight:
We keep returning to the same points. AI can create content quickly and cheaply, but it lacks the accuracy, authority, trust and real emotion of an article written in the traditional way.
There’s an acceptance here that AI isn’t going away and will become a bigger part of many journalists’ work. But it also highlights that the need for experienced writers – to tie content together, use gut instinct, and understand the gravity and nuance of a situation – is still essential.
However, the way this piece was created was an interesting point in itself. Using AI to sift through the data is a great way to find the themes that can make a story. It would have been good to hear more of what the journalist herself thought, but it’s a compelling way to present the findings.
Read more
- Chills, by Lauren Wolfe: Can we save journalism in the age of AI?
- NiemanLab: Predictions for journalism 2026
Is the live experience the ‘antidote’ to AI?
Sports publication The Athletic is focusing more on live blogs and videos, in a move designed to help insulate it from AI scraping.
Sarah Goldstein, editorial director at The Athletic, said: “The one thing that AI isn’t as good at is the live experience. And so we can be that expert for you in the moment. Humans will always be faster.”
It’s currently not clear whether this strategy actually makes things harder for AI, as some tools can scrape in real time. The use of video, however, is more of a defence against LLMs taking the content, thanks to the additional cost and resources needed to properly process and scrape visual material.
the better web co. insight:
I’m a huge advocate for live blogs. Used in the right way, they’re powerful tools for keeping audiences engaged and connected during real-time events, and this move shows publishers investing more heavily in that kind of connective content.
Providing human insight and analysis on events as they happen is something AI can’t do well, and these formats are fantastic for personality-driven updates and entertaining posts from the humans running them. They’re not the cheapest to maintain, though, so they won’t be an option for many small-to-medium publishers.
However, this does light the way for anyone creating content: putting effort into the pieces that really showcase the voice and power of your brand is where the big names in media are going, and will lead to a richer, more engaged audience in the long term.
Read more
ICYMI: is sharing still caring?
Analysis published last month revealed that the volume of open-source activity by newsrooms declined by 80% between 2016 and 2025.
Where news organisations were once actively sharing code on the likes of GitHub, as the online journalism industry worked together to define digital reporting, that’s happening far less now.
It’s partly down to once well-established newsrooms (such as BuzzFeed News and FiveThirtyEight) closing down or struggling to survive, as well as successful publications – like The New York Times – no longer sharing their work.
the better web co. insight:
These findings are interesting, as are the interpretations from the interviewees within the publishing industry. Yes, the lack of sharing of tools can impact the ease with which people find news – and discovery is key.
But equally, the tools newsrooms use have become more powerful, and the need for the innovative solutions that used to power these repositories has diminished somewhat.
That’s not to say it’s fine that sharing has diminished this far: it makes it much harder to develop a content strategy from a standing start, without people who have experienced the changing landscape over the past decade.
Writing news (or any kind of content) isn’t just about revenue; it’s about making things easy for readers to discover. So sharing your knowledge, even on small wins, should be the default.
Read more
With additional reporting from Gareth Beavis
Image credit: photo by Marvin Meyer on Unsplash

