The Only Thing Worse Than Skynet Is Skynet With Known Zero-Day Vulnerabilities

Ah, the sweet, sweet scent of progress! Just when you thought your digital life couldn’t get any more thrillingly precarious, along comes the Model Context Protocol (MCP). Developers, bless their cotton-socked, caffeine-fueled souls, adore it because it lets Large Language Models (LLMs) finally stop staring blankly at the wall and actually do stuff—connecting to tools and data like a toddler who’s discovered the cutlery drawer. It’s supposed to be the seamless digital future. But, naturally, a dystopian shadow has fallen, and it tastes vaguely of betrayal.

This isn’t just about code; it’s about control. With MCP, we have handed the LLMs the keys to the digital armoury. It’s the very mechanism that makes them ‘agentic’, allowing them to self-execute complex tasks. In 1984, the machines got smart. In 2025, they got a flexible, modular, and dynamically exploitable API. It’s the Genesis of Skynet, only this time, we paid for the early access program.


The Great Server Stack: A Recipe for Digital Disaster

The whole idea behind MCP is flexibility. Modular! Dynamic! It’s like digital Lego, allowing these ‘agentic’ interactions where models pass data and instructions faster than a political scandal on X. And, as any good dystopia requires, this glorious freedom is the very thing that’s going to facilitate our downfall. A new security study has dropped, confirming what we all secretly suspected: more servers equals more tears.

The research looked at over 280 popular MCP servers and asked two chillingly simple questions:

  1. Does it process input from unsafe sources? (Think: that weird email, a Slack message from someone you don’t trust, or a scraped webpage that looks too clean).
  2. Does it allow powerful actions? (We’re talking code execution, file access, calling APIs—the digital equivalent of handing a monkey a grenade).

If an MCP server ticked both boxes? High-Risk. Translation: it’s a perfectly polished, automated trap, ready to execute an attacker’s nefarious instructions without a soul (or a user) ever approving the warrant. This is how the T-800 gets its marching orders.


The Numbers That Will Make You Stop Stacking

Remember when you were told to “scale up” and “embrace complexity”? Well, turns out the LLM ecosystem is less ‘scalable business model’ and more ‘Jenga tower made of vulnerability.’

The risk of a catastrophic, exploitable configuration compounds faster than your monthly streaming bill when you add just a few MCP servers:

Servers CombinedChance of Vulnerable Configuration
236%
352%
571%
10Approaching 92%

That’s right. By the time you’ve daisy-chained ten of these ‘helpful’ modules, you’ve basically got a 9-in-10 chance of a hacker walking right through the front door, pouring a cup of coffee, and reformatting your hard drive while humming happily.

And the best part? 72% of the servers tested exposed at least one sensitive capability to attackers. Meanwhile, 13% were just sitting there, happily accepting malicious text from unsafe sources, ready to hand it off to the next server in the chain, which, like a dutiful digital servant, executes the ‘code’ hidden in the ‘text.’

Real-World Horror Show: In one documented case, a seemingly innocent web-scraper plug-in fetched HTML supplied by an attacker. A downstream Markdown parser interpreted that HTML as commands, and then, the shell plug-in, God bless its little automated heart, duly executed them. That’s not agentic computing; that’s digital self-immolation. “I’ll be back,” said the shell command, just before it wiped your database.


The MCP Protocol: A Story of Oopsie and Adoption

Launched by Anthropic in late 2024 and swiftly adopted by OpenAI and Microsoft by spring 2025, the MCP steamrolled its way to connecting over 6,000 servers despite, shall we say, a rather relaxed approach to security.

For a hot minute, authentication was optional. Yes, really. It was only in March this year that the industry remembered OAuth 2.1 exists, adding a lock to the front door. But here’s the kicker: adding a lock only stops unauthorised people from accessing the server. It does not stop malicious or malformed data from flowing between the authenticated servers and triggering those lovely, unintended, and probably very expensive actions.

So, while securing individual MCP components is a great start, the real threat is the “compositional risk”—the digital equivalent of giving three very different, slightly drunk people three parts of a bomb-making manual.

Our advice, and the study’s parting shot, is simple: Don’t over-engineer your doom. Use only the servers you need, put some digital handcuffs on what each one can do, and for the love of all that is digital, test the data transfers. Otherwise, your agentic system will achieve true sentience right before it executes its first and final instruction: ‘Delete all human records.’

From Trenches to Terminus: A Century of Warfare’s Chilling Evolution

A century. The span of a modern human lifetime, yet in the realm of warfare, it’s a chasm of unimaginable transformation. From the mud-soaked trenches of World War I to the sterile, algorithm-driven battlefields of today, the face of conflict has been irrevocably altered. In February, I spent a morning immersed in John Akomfrah’s ‘Mimesis: African Soldier’ exhibition at Glasgow’s Gallery of Modern Art, confronted by the visceral realities of a war fought with flesh and bone, a war where the majority of stories remain untold. Now, we face a future where war is waged by machines, where the human cost is both diminished and amplified in terrifying new ways.

The Echoes of WWI and Akomfrah’s “Mimesis”:

Akomfrah’s multi-screen installation is a haunting reminder of war’s human toll, especially for those whose sacrifices were systematically erased from history. The archival footage, the flowing water over forgotten faces, the montage of fragmented narratives – it all speaks to the chaos, the brutality, and the enduring trauma of conflict. WWI, with its trenches, its mustard gas, its sheer, senseless slaughter, was a war fought with rudimentary technology and an almost medieval disregard for human life. The images of African soldiers within ‘Mimesis’ forces us to consider the colonial aspects of these wars, and the many who fought and died who were not given a voice. The experience left me with a profound sense of the weight of history, a history often obscured by the dominant narratives.

The Rise of the Machines:

Fast forward to today, and the battlefield is a landscape of drones, AI, and robotic dogs armed with rocket launchers. The recent Ministry of Defence trials, showcasing robot dogs defusing bombs and drones autonomously detecting threats, paint a starkly different picture. We’re told these advancements ‘minimise human exposure to danger,’ that they ‘enhance Explosive Ordnance Disposal capability.’ But what about the ethical implications? What about the dehumanisation of conflict?

These robotic dogs, these AI-driven drones, they’re not just tools; they’re symbols of a profound shift in how we wage war. China’s deployment of advanced robotic dogs, designed to ‘change the approach to military operations,’ underscores this reality. The ‘precision movements’ and ‘remote classification of threats’ touted by defence officials mask a chilling truth: we’re entering an era where machines make life-or-death decisions.

Juxtaposition and Reflection:

The stark contrast between the human-centric horrors of WWI, as depicted in Akomfrah’s work, and the cold, calculated efficiency of modern robotic warfare is deeply unsettling. Where once soldiers faced each other across no man’s land, now machines engage in silent, unseen battles. The human element, once the defining feature of war, is being systematically removed.

This isn’t just about technological advancement; it’s about a fundamental, unsettling shift in our relationship with conflict. The distance created by these technologies—the drones, the remote-controlled robots, the AI-driven targeting systems—allows us to detach, to view war as a series of data points and algorithms, almost like a high-stakes video game. In fact, some of the footage we see now, with its crisp, digital clarity and detached perspective, bears an uncanny resemblance to scenes from ‘Call of Duty.’ But while the on-screen action might feel like entertainment, the consequences – the lives lost, the communities destroyed – remain as devastatingly real as ever. The danger lies in this blurred line, where the visceral horror of war is replaced by the sterile, almost gamified experience, potentially desensitizing us to the true cost of human conflict.

As we stand on the precipice of this new era, with growing global tensions, escalating trade conflicts, and the chilling specter of nuclear weapons being openly discussed, the threat of a third world war looms larger than ever. Yet, amidst this existential dread, we seem more preoccupied with petty snipes at Trump and the fleeting triumphs of social media one-upmanship. It’s a surreal disconnect. We must ask ourselves: what does it truly mean to wage war in the age of AI, when the very fabric of our reality is being reshaped by algorithms and automation? Are we genuinely safer, or are we merely constructing new and more insidious forms of peril, where the line between virtual and real becomes dangerously blurred? Akomfrah’s art compels us to confront the ghosts of past conflicts, the human stories buried beneath the rubble of history. The robotic dogs, with their cold, mechanical efficiency, force us to confront a future where human agency is increasingly questioned. Both past and future demand that we grapple with the human cost of conflict, in all its evolving forms, while simultaneously challenging our collective capacity for distraction and denial.

From the mud-soaked trenches of World War I to the sterile, digital battlefields of today, warfare has undergone a radical transformation, a transformation that now feels less like a distant future and more like a chilling present. For forty years, we’ve joked about the Terminator, about Skynet, about the rise of the machines, dismissing it as mere science fiction. But as we witness the deployment of AI-driven robotic dogs and the increasing gamification of conflict, that once-fantastical vision suddenly feels disturbingly real. The human capacity for both creation and destruction remains a constant, but the tools at our disposal have changed dramatically. As we embrace the technological advancements that promise to reshape our world, we can no longer afford to be detached observers, scrolling through social media while global tensions escalate. We must confront the ethical dilemmas that haunt us, the stories that have been silenced, and the very real possibility that the future we once laughed about is now upon us. The future of warfare is not just about machines; it’s about the choices we make as humans, choices that will determine whether we become the masters of our technology or its victims.