Does this video show footage from a “No Kings” protest in the United Kingdom?

A video on X with over 157,000 views has been shared with the claim that it shows a massive nighttime “No Kings” protest in the United Kingdom. In the clip, hundreds of people appear to form an illuminated human banner spelling out “No Kings”.

A key giveaway is the watermark visible in the bottom right corner of the video — the word “Veo.” Veo is a generative video model developed by Google DeepMind, capable of producing realistic-looking scenes from text prompts. This watermark strongly suggests that the footage was created with artificial intelligence rather than filmed in the real world. Another notable clue is the length of the clip. Videos generated by Veo are currently limited to roughly eight seconds, which matches the duration of the viral clip once the short TikTok watermark animation at the end is excluded.

There are also several visual inconsistencies that point to AI generation. The people forming the illuminated letters move in near-perfect synchronization, and the shapes of the letters appear overly crisp and uniform — traits that are common in AI-generated crowd scenes, where algorithms often struggle to replicate the natural variation and randomness of human movement.

While “No Kings” protests did take place on 18 October in several countries, including England, Canada, Germany, and Portugal, there is no evidence of a large nighttime event in the UK resembling the viral video. A Google reverse image search and keyword searches yielded no matches or credible reports of such a gathering, and no media outlets or participants have shared corresponding footage.

Taken together, these details indicate that the viral video is a fabrication created with generative AI. The clip does not show a real protest in the United Kingdom. Instead, it demonstrates how tools like Google’s Veo can be used to produce highly realistic but entirely fictional imagery.

This case highlights how easily AI-generated content can spread before being debunked. As synthetic media becomes more accessible, it is increasingly important for audiences to watch for telltale signs: unusual watermarks, overly symmetrical visuals, and other artifacts that may point to fabricated content.