It is also clear from the disclosure that at least some attendees were wary of their efforts to tackle misinformation tipping over into censorship.
There should be an “emphasis on importance of freedom of expression”, the document states, and on “transparency”. Delegates were clear that there should be “support and scope for greater efforts towards transparency and publicity”.
Somewhat ironically, about a third of the six-page disclosure is so heavily redacted that it amounts to pages of solid black.
A section marked “Key points” is entirely blacked out. So are the names of the individual attendees, other than three civil servants.
We know that they included Loughborough University’s Andrew Chadwick and Will Moy, the then chief executive of Full Fact – but only because of statements they have made elsewhere.
At the Policy Forum’s next meeting, in March 2021, ethical principles were no longer the forum’s priority. Another memo says the DCMS proposed prioritising other topics for the remaining Policy Forum sessions.
When shown the heavily redacted text by the Telegraph, Mr Moy was aghast. On behalf of Full Fact, he said: “We do not believe that it was necessary or helpful to black out the notes of the meetings in this way.
“We recognise that not all of the work defending against these threats can be public but the government can and should be more open.”
There weren’t many more meetings of the Policy Forum. The group, which despite its professed commitment to transparency kept a low public profile, was wound up in June 2021 after a mere six months.
Mission creep and content removal
But according to parliamentary disclosures, the Government had other measures in place to tackle the problem of disinformation.
Chief among them was the Counter-Disinformation Unit (CDU), a secretive organisation run out of the DCMS.
The CDU started life in 2019, when its job was to tackle “disinformation” – that is, deliberately misleading information – related to the European and general elections.
It was “stood up” for the third time in March 2020 as it became clear that the virus that had spread quickly in Wuhan was threatening to do the same in the UK. Its remit was expanded to include “identifying and responding to harmful misinformation relating to Covid-19” – that is, false information that is “inadvertently spread”, according to Caroline Dinenage, the former digital minister.
It was the first example of mission creep, but others would follow.
What the unit meant by “responding” was varied. In some cases it involved an online rebuttal. But in others, the DCMS used its position with the social media companies as what it called a “trusted flagger” to fast-track a request for content to be removed.
Ms Dinenage told a committee of MPs in 2020 that “where potentially harmful content is identified”, the CDU will “flag that content to the platform to ensure it can be swiftly reviewed and acted on”. She added that the Government does not “mandate the removal of any content”. But it is arguable that the mere fact that the requests came from a ministerial office put pressure on the social media companies to heed them.
Even Meta’s own “oversight board” – an independent group that reviews moderation decisions on Facebook and Instagram – acknowledges that there is a lack of transparency around government requests for action.
The CDU and social media firms
Remarkably little is known about the CDU’s activities beyond its function. It has not revealed how many removal requests it has made. Meanwhile, the DCMS has refused to disclose how many staff the unit has, or how much money is spent on it.
However, emails obtained by Big Brother Watch show that the unit is frequently in touch with social media giants.
A Twitter executive told Matt Hancock’s special adviser in March 2020: “We’re also speaking regularly with the DCMS disinformation unit.”
Leaked WhatsApp messages show that over the months that followed, the health secretary discussed the problem of anti-vaccine misinformation with Sir Nick Clegg, the former deputy prime minister who was then vice-president of global affairs at Meta, Facebook’s owner.
In November 2020, Mr Hancock wrote to Sir Nick in America, mid-way through a “roundtable” meeting that the Government was holding with UK executives from Facebook and other technology companies.
“I’m just on a Zoom about tackling anti-vax with [Culture Secretary] Oliver Dowden – obviously vital,” he wrote. “Your team have been working really well with the department and the advertising ban is great – but we need to have a timeframe for removal of antivax material and how do [sic] to demonetise.”
Sir Nick promised: “I’ll look into this.” A month later he sent Mr Hancock another direct message: “Matt – we’re announcing further changes today (basically we’ll now remove false claims – debunked by public health experts – made about authorised/licences vaccines).”
In the background, the CDU was working alongside the Cabinet Office’s now defunct Rapid Response Unit, which monitored social media and tracked how information was being shared online so that it could issue rebuttals if needed. That unit also had so-called “trusted flagger” powers with tech companies and requested the removal of six posts on social media sites in April 2020. It is not clear which platforms they appeared on, but all the posts disappeared, whether removed by the platforms or deleted by the people who posted them.
The specific details of what was taken down have only been made public in one example. The Government requested urgent attention on a Facebook post purporting to come from a Randox delivery driver dropping off boxes of Covid tests to NHS hospitals. The driver posted a picture of boxes of the test kits and their delivery schedule in an update only visible to his friends. Somehow the Government saw it and told Facebook “we would like this removed urgently”.
In the end, the person who posted it deleted the account before Facebook was required to take action.
Although this particular example may have been of little consequence, critics of the Government’s covert monitoring activities are concerned about a bigger issue at stake.
And it is one that looms larger when considering the kinds of content these little-known units are monitoring.
AI firms trawling internet
Government contracts suggest that much of the CDU’s work is carried out with the help of artificial intelligence firms, scraping the internet for statements that may count as mis- or disinformation.
The DCMS spent £114,000 with a firm called Disinformation Index at the start of the pandemic and has a contract worth more than £1.2 million with Logically, a firm headquartered in Yorkshire, which claims to use AI to “uncover and address” misinformation and disinformation online.
Publicly available contract information suggests that the CDU’s monitoring programme continued until at least April 2023, and that it included helping to “build a comprehensive picture of potentially harmful misinformation and disinformation”.
Comprehensive is an apt word. Logically’s literature boasts that it “ingests material from over 300,000 media sources and all public posts on major social media platforms”. Documents obtained under data laws paint a disturbing picture of the kinds of material that it has monitored for the Government’s CDU.
In regular reports entitled “Covid-19 Mis/Disinformation Platform Terms of Service”, Logically scooped up posts by respected scientists questioning lockdown or arguing against the mass vaccination of children against Covid-19.
They also logged comments made by Silkie Carlo, the director of Big Brother Watch, on Talk TV at the end of 2021, objecting to vaccine passports and branding the proposals “a vision of checkpoint Britain”.
Other reports received by the CDU logged information about the Conservative MP David Davis, noting him as “highly critical of the Government, with the majority of comments criticising Imperial College and blaming [redacted] personally for lockdown”. The disclosure does not link to his specific comment, but it came five days after Mr Davis had co-written a piece for the Telegraph criticising the Imperial College London scientist Neil Ferguson’s modelling.
These examples are quite removed from the original aim set out by the Policy Forum on that drizzly January day: to address the “threat posed by Covid-19 mis- and disinformation”.
According to Ms Carlo, there has been huge “mission creep”, and we have arrived at a situation where the Government is effectively policing opinions it disagrees with as “false” information.
“Whilst everyone would expect the Government and tech giants to act against foreign hostile disinformation campaigns, we should be incredibly cautious about these powers being turned inwards to scan, suppress and censor the lawful speech of Brits for wrongthink, as is shockingly the case right now.
“The very concept of ‘wrong information’ dictated by a central authority is open to abuse and should be considered far more critically, lest we mirror Chinese-style censorship.”
A spokesman for Mr Hancock said the information was in the public domain and directed readers to buy a copy of his book.
A BBC spokesperson said the broadcaster attended the Counter-Disinformation Policy Forum in an observer-only capacity.