Skateboard with ‘Google’ written on it, lying on the floor

5 critical questions Google’s core Discover update instantly poses

So, here we are. Another core update from Google, but this one isn’t about search… it’s hopefully going to fix the messy world of Discover.

I’ve been studying Discover, Google’s part-social network, part-news feed platform, for a long time, and recently it’s become more volatile than ever. 

So even though there were only a few hundred words from Google when announcing this update, scratch beneath the overly-positive surface-level messaging (‘we’re going to make everything better and more original and good!’) and you’ll see that there are lots of things that don’t quite make sense.

(Before we get into the points, I’m not even going to talk about the proliferation of fake websites, where new domains are registered, flooded with AI-created clickbait drivel, and are somehow doing really well in Discover.

If this update doesn’t fix that, then I have no idea what Google is up to, especially as it promised to fix the problem last year.)

Can a local gardener really get more traffic than a national newspaper?

I can’t remember how many times I’ve read about how ‘niche, expert content is going to do better’ in some Google update. But what’s interesting here is how the search giant is presenting its promise to surface expertise better.

It’s saying that sites can be expert in many things (and we’ve known that for a while – I’ve worked on lots of titles that have quite disparate topics, but still succeed on Discover) and get the traction their content deserves.

But how good does it need to be? Google highlights the idea that a local newspaper could still rank for gardening content if it covers that among other topics. But would it appear in Discover more frequently than a really expert, niche blogger? And how is Google deciding what’s actually ‘better’ with this update?

I suspect that we’ll still see the sites with good domain authority be more visible overall, but a few smaller, local news outlets will see a sudden surge in popularity (if they’ve been putting in the hard work to show expertise).

The only question is how many publishers will still even have a team of experts – not many news sites can support one just because it’s the ‘right thing to do by the reader’.

Are ‘local’ writers about to get a windfall of traffic?

In a similar vein to the gardening example above, Google has promised that localised content will be pushed to readers, to help improve the relevance of what they’re seeing.

This will be interesting – will it be content that’s about the area, or is Google going to prioritise sites that are headquartered in the region?

Google highlighted country-based content when talking about what was ‘local’, but we can only hope it’s more targeted than that.

If it’s more about the local towns (and I’ve often seen that really local content that focuses on small-town issues can do well on Discover, often at the expense of that which deals with larger cities) that would be a brilliant thing.

Readers have been starved of content that’s truly relevant to their lives since local newspapers began to decline – being more locationally aware and offering traffic to writers with a local specialism (i.e. the ones that can truly call themselves expert) would be great to see.

We’ve already seen the desire for local content return (look at the success of Mill Media, a subscription-based organisation that focuses on the stories that matter to its local readers).

Is first-person content about to become even more important?

Google often talks about ‘original’ content, and it’s promising to bring that to the fore again. But it’s also pushing AI summaries of many articles in Google Discover, so that leads to a tricky conundrum:

On the one hand, if you’re writing something quickly about a current event, it’s likely the piece will be similar to your rivals’, as speed is of the essence – and that means you could just be sucked into the AI summary, even though Google prioritises timely content in its Discover feed.

On the other hand, choose to write about less timely or lower-audience areas to stand out, and you might struggle too, as the bigger traffic spikes come from writing about more popular topics.

So it seems that journalists drawing on their experiential learning (something we wrote about recently) and using a first-person narrative to describe their experience will potentially see more cut-through, as that’s much harder to replicate in an AI summary.

Where will the new ‘line’ be for clickbait?

Ah, here we are again. Google is promising to reduce the number of sensationalised headlines in its feeds.

The issue, in my eyes, has always been: how does Google categorise a headline as ‘sensational’? I mean, it should already know, given that it’s likely analysed millions of headlines, and would have been able to track the bounce rate and engagement time as a sign.

If an article overpromises and underdelivers, readers will leave quickly – what more is needed as a signal for overly-sensationalised headlines?

But I will cut Google some slack here, as there’s still a lot of nuance around reader tastes. When I first began writing many, many years ago, I was told that writing a good headline is all about encouraging a click without straying too far into ‘clickbait’… but one person’s clickbait is another’s gold standard. 

Tabloids have existed for decades and thrive on creating emotionally-charged headlines that offer intrigue, curiosity and a promise of amazing information if you read on – whatever your opinion of these publications, the fact is that people will always read more negative stories. (Take the 2023 study that found that, for a headline of average length, each additional negative word increased the click-through rate by 2.3%.)

There’s a difference between sensationalised headlines (i.e. clickbait) and well-written titles that engage the user. The question is whether Google is going to get better at telling the difference and finally make Discover feeds feel genuinely useful, rather than every scroll becoming an exercise in trying to spot the click-trap headlines.

What is the US rollout going to tell us?

Google is taking the unusual step of making these changes in the US first, before moving to the wider world. It doesn’t usually do a core update regionally but, then again, it doesn’t usually do changes related only to Discover.

I can see why this is happening, as the way Google structures Discover feeds around the world varies. Different headline styles are more prevalent in certain countries’ feeds than others. And rich media is more visible in some regions than others too.

For instance, in the US most people’s feeds will have more YouTube content at the top and far more social media posts in the mix too.

If you’re in the UK, you’ll have also noticed far more social media posts in your Discover feed lately, and our analysis shows that YouTube content is being shown 300% more frequently in the popular ‘news’ category in the second half of the year than the first.

So by making changes to US users’ Discover feeds first, Google can prove the concept in its home region, testing click-through rate and engagement before widening the strategy.

As with many things (such as newsletters or shifts in the popularity of certain article styles), the US often leads the way in how audience behaviour alters.

Therefore it’s always a good idea to see what’s doing well for US sites and be ready to adapt your content strategy – whether that’s being more focused on social, video or specific styles of headline. When things alter there, there’s an opportunity to get more eyeballs on excellent content.