Bots, AI memes and Reform councillors: How foreign Facebook pages are hijacking UK patriotism
Fake British “patriot” pages run from abroad are fooling thousands – including elected councillors.
“Born and Bred in Britain. It’s in my DNA,” declares an image shared on the UK Glory Facebook page, featuring a white man sitting in front of the Union Flag. Clearly AI generated, the image has 295 likes and numerous positive comments from the group’s 90,000 followers. “I am white English a woman and C of E and so proud of it”, declared one person, while another suggested the British should “meet [immigrants] in international waters and drag them back to French waters.”
Other posts on the page are similar, featuring AI-generated images declaring them to be British and Proud, labelling immigrants as “parasites”, and stating “No surrender, No Compromise, Keep Britain Ours!” With each post drawing thousands of positive reactions, it is apparent their followers believe they are in good company. The truth is that this page and dozens of others are part of a scheme run by individuals outside the UK, determined to capitalise on and monetise anti-immigrant hate, and in the process, they are sucking in thousands of genuine British people, including democratically elected representatives.
With deepfakes making headlines in the recent Irish elections, the impact of AI on democracy is becoming a serious concern. But to understand what’s really going on in the world of social media patriot groups, we have to look at who is behind their political manipulation.
Operating since 2015, the UK Glory Facebook page has only held its current name since last year. Prior to this, the page was titled Parroquia San Judas Tadeo Predio 4 Oficial (Parish of Saint Jude Thaddeus, Property 4, Official). The page is named after a tiny shop in Veracruz which sells religious artefacts, and all six of its admins are Mexican; they appear to have begun sharing pro-British content to grow their audience. Seemingly relying on the fact that followers will share this content without checking the source, the page banner still bears the name of the original group.
While some of their followers occasionally point this out, the majority appear unaware their political beliefs are being monetised and manipulated by a non-UK group. And they are not the only ones.
One of twenty-nine accounts uncovered by The Lead, UK Glory is part of an interlinked network of groups and pages with more than 467,000 followers between them. All sharing the same anti-immigrant, pseudo-patriotic images, the pages repost each other’s content to boost visibility and clicks. Clearly keen to convince followers that they are local, the page owners often list themselves as “gaming video creators” and include UK-based ‘business’ addresses and telephone numbers. It did not take much investigation to reveal all of these to be either fake or misappropriated from legitimate sources.
One page, British Legacy (run from Sri Lanka), likes to share British nostalgia memes that rely heavily on poor AI imagery. One of their key posters (and a member of multiple groups in the network) is an account run by someone named Demi Savage, who claims to be from Dublin, now living in Hounslow. With only one public profile picture, an image search revealed ‘Demi’ appearing on multiple porn sites and a babysitting profile in Medellin, Colombia. Her posts are anti-migrant memes interspersed with AI-generated videos, and nothing on the page suggests she is anything she claims to be, or even a real person.
Despite this, her follower list includes Reform UK councillors Simon Mabbott, Kieran Mishchuk, Paul Webb, Ryan Kidby and Gordon Dunsmuir, along with eight others from towns and cities across the UK. Why they are Facebook friends with a porn bot who hates immigrants is unknown, but their willingness to follow obviously fake accounts is incredibly concerning.
As amusing as it may be, Demi’s page highlights how easily British users – even those in positions of political influence – can be drawn into a web of AI-driven, foreign-run content designed for clicks and profit.
Who is behind the network?
Few of the groups have links to the UK and none of them are connected to the far right. Instead, they have utilised rising political tensions to rebrand, promote and monetise pre-existing Facebook accounts. While few are as brazen as UK Glory, most were easy to identify as fake with a simple search of the ‘About’ tab.
Nine of the groups are being run from Sri Lanka, three have admins in Nigeria, and the admins of six other groups appear to be located in Mexico, the US, Australia, Canada, Norway, Sweden and Kosovo. The remaining eleven have hidden their locations, but conform to the same pattern (fake address, AI memes, “gaming video creator”), suggesting they are similarly moderated.
Half the groups listed have operated under at least one previous name, sharing content with no relation to British politics. On top of this, twelve of the groups are or have been running ads to promote their content, including one asking followers to ‘Like’ if they wanted to “Ban the Burkha”, and another claiming they were “British Born & Bred”.
On the British Spirit Page, a post about Tommy Robinson’s “free speech march” received nearly 10,000 likes and shares. Run by a Sri Lankan teenager, the page was originally called ‘තනිකඩ කොල්ලා’ (a Sinhalese phrase which translates as Single Boy), but its owner appears to have realised that trolling British “patriots” is more lucrative than trying to date on Facebook, and changed the name in May this year.
While there is an amusing irony in knowing foreign groups have effectively monetised the clicks of UK xenophobes, it is also indicative of the worrying lack of accountability shown by social media platforms.
Democracy for sale
The regulation of monetised pages by Facebook is relatively simple and based on “foundational rules against unsafe content, such as graphic violence, nudity and hateful conduct”, but Facebook’s historic inability to monitor and moderate this type of content is well-known.
Its effect on democracy is equally well-known and – speaking to Parliament earlier this year – MP Sorcha Eastwood claimed that the UK was facing “a national emergency of misinformation and digital violence.” She added that, by failing to tackle it, the government was “failing democracy itself, as misinformation and intimidation silence voices and distort political participation.”
All of the groups exposed by our investigation promote Reform while criticising Labour and suggesting that Nigel Farage is the “way forward”. The ‘My UK News’ page is a key player, posting multiple pro-Reform memes and content while claiming they “love the UK from the sound of Big Ben to the warmth of a local pub”. Their love of all things British does not extend to actually living in the country, however, as they are based in Sri Lanka.
A recent study on social media platforms “hijacking democracy” noted that numbers of followers “can be manipulated, bought, or faked to create the impression that a particular issue represents the opinions of the majority.” This was, the study concluded, doing “irreversible” harm because, by the time information is fact-checked, the “damage is already done.”
Likewise, a paper from last year by the Carnegie Endowment for International Peace noted that “AI models enable malicious actors to manipulate information and disrupt electoral processes, threatening democracies.”
The dangers of normalising far-right rhetoric
A report published in Frontiers in Psychology revealed how far-right groups were using live-stream gaming platforms to “target and radicalise teenage players”. While concern over what children see online has been a topic of much discussion, it is clear adult users are equally at risk.
As with Facebook’s supposed commitment to eradicating hate, TikTok’s guidelines state the platform is “enriched by the diversity of our community” and that they do not allow “hate speech, hateful behavior, or promotion of hateful ideologies”. There are, however, no guidelines on posting inflammatory content disguised as ‘people asking questions’; a loophole that has been exploited time and again when creators have driven provably false narratives in the name of engagement.
This online activity isn’t harmless – research links exposure to far-right content with offline harassment, radicalisation, and political disengagement. According to Ofcom’s 2024 Online Nation report, one in three UK adults has encountered hate speech online in the past year, while the Center for Countering Digital Hate estimated that in 2023, 86 per cent of flagged hateful posts remained online. In 2018, researchers from the Massachusetts Institute of Technology found fake news travels faster than real news, concluding misinformation “diffuses significantly farther, faster, deeper, and more broadly than the truth, in all categories of information, and in many cases by an order of magnitude.”
With the Online Safety Act not able to challenge misinformation, and further regulation of our internet use bringing as many problems as it potentially solves, another solution must be reached and accountability placed squarely with those who run the platforms.■
About the author: Katherine Denkinson is a freelance investigative journalist with an extensive body of work on conspiracies and the Far Right, published in multiple UK newspapers. Her reporting is informed by traditional journalistic methods and OSINT techniques, focusing on deep-dives into the murky waters of online communities. She is also the co-host of the chart-topping podcast, Carrie Jade Does Not Exist.
Thanks for reading your Saturday edition of The Lead. It’s great to have you with us. If you liked this story from Katherine, make sure you check out the exclusive extract from her new book that we published earlier this week. Her book, INCEL: The Weaponisation of Misogyny, traces how a fringe online subculture of loneliness and resentment evolved into one of the most disturbing movements of the last decade, and it will be available to buy on 14 November.
Paid subscribers to The Lead can also listen to an in-depth conversation between Katherine and our Westminster Editor Zoë Grünewald, diving into how alienation, austerity and the collapse of community spaces have left young men without the support or purpose they need, and how political neglect has allowed online hate to fill the vacuum. Become a paid subscriber now to listen in full.