Appointing Matt Brittin, a former Google executive, as BBC director general is smarter than critics admit. Although he was on the board of the Guardian’s publisher, Mr Brittin was no journalist. He does understand platforms, scale and digital audiences.
Directors general come under scrutiny when crises hit, like this week’s sacking of Scott Mills over his “personal conduct”. It then emerged that police had previously questioned the Radio 2 DJ over separate allegations of serious sexual offences, before closing the case for lack of evidence. But the role’s underlying challenge is confronting future threats to the corporation’s audience.
On one measure, YouTube reaches more Britons than the BBC’s channels combined. But heaving into view is AI, which has facilitated misinformation, error and ignorance. It is already beginning to mediate the news – and how it is understood. Ofcom says about 30% of searches display AI summaries, seen regularly by more than half of adults. The BBC has tried, for good reasons, to stop its journalism being extracted by AI without payment. But it risks excluding itself from a technology through which many now get their information. The Reuters Institute found only about 6% of users turn to AI for news. But as summaries become embedded in search, journalism becomes the raw material, not the finished product.
A 2025 paper by Kai-Cheng Yang of Binghamton University reveals the implications. It shows that AI-generated answers draw on a narrow band of sources: OpenAI’s models rest on wire services; Google’s on search-driven global media; Perplexity’s on respected brands such as the BBC. The same question produces a different response depending on the system used. Despite the BBC being the UK’s most trusted news source, only two of four AI tools drew on its content, according to a study by the IPPR thinktank. The UK’s most popular AI tool – OpenAI’s ChatGPT – cited GB News more often. ChatGPT’s top citations often align with OpenAI’s publisher deals (including one with the Guardian). The lack of transparency over how AI systems select and weight their sources is problematic.
Audiences once chose between narratives. Social media made them navigate – or trapped them in filter bubbles. Now AI distils a single response. Nuance and plurality are at risk. Journalists have traditionally judged what information to use and which sources to prioritise. Their mental models were built up through reporting. AI systems perform those functions through hidden algorithms, privileging what is most common, not what is most true.
Control lies not just in owning information, but in how it is structured, modelled and understood. The IPPR rightly argues that the UK must combine transparency over how AI answers are generated, fair licensing frameworks to ensure publishers are paid and intervention to curb platform dominance over information. Public service media – especially the BBC – should anchor this strategy. Impartial, accurate news is essential for democratic stability.
The BBC’s charter review must secure funding and end the cycle of “existential” resets with a permanent settlement protecting its independence. The BBC has the scale, data and mandate to underpin a trustworthy “orchestration” layer for news. Its journalism must be machine-readable, queryable and interpretable on its own terms. Letting companies like Palantir, co-founded by the Trump-backing billionaire Peter Thiel, do this would be a mistake. The BBC has traditionally fused innovation with public purpose. It must do so again – and ensure news remains contestable, transparent and accountable.