
Call for inputs: The use of technology in the operations and activities of mercenaries, mercenary-related actors and private military and security companies – Small Wars Journal

The modern battlefield is undergoing a seismic transformation, one driven not only by state-level military advancements but by the increasingly sophisticated technological prowess of private actors. The archetypal image of the soldier of fortune—a rugged individual with a rifle—is being rapidly superseded by a new reality: the tech-savvy operative, the cyber mercenary, and the corporate security contractor armed with algorithms, drones, and vast datasets. Recognizing this paradigm shift, the United Nations has issued a critical and far-reaching call for information, signaling a new international focus on the profound challenges posed by high-tech private warfare.

The UN’s Working Group on the use of mercenaries has launched a formal inquiry into how mercenaries, mercenary-related actors, and Private Military and Security Companies (PMSCs) are leveraging technology in their operations. This call for inputs, directed at states, academics, civil society, and private corporations, is more than a bureaucratic exercise; it is a clear acknowledgment that the fusion of private force and advanced technology has created a “grey zone” in international law and security that can no longer be ignored. The inquiry seeks to peel back the layers of secrecy surrounding this burgeoning industry, exploring a landscape where cyberattacks can be bought, surveillance is a service, and autonomous systems may soon be deployed by non-state actors with little to no public accountability. This investigation marks a crucial moment, as the international community begins to grapple with a future where conflict can be outsourced, automated, and waged with unprecedented precision and deniability.

A New Mandate for a New Battlefield: The UN Sounds the Alarm

The decision by the UN to formally investigate this nexus of technology and private military activity stems from a growing unease in diplomatic and human rights circles. The proliferation of advanced, often dual-use, technologies has democratized capabilities once exclusive to the most powerful nations, making them accessible to corporate entities and other non-state groups operating in the shadows of international law.

Who is the Working Group on the Use of Mercenaries?

Established in 2005 by the then-UN Commission on Human Rights, the Working Group on the use of mercenaries is a body of independent experts tasked with monitoring and reporting on the activities of private military actors worldwide. Its mandate is to study the effects of these groups on human rights, particularly the right of peoples to self-determination. Over the years, its scope has evolved from focusing on traditional mercenaries involved in coups and civil wars to encompass the complex and corporatized world of modern PMSCs, like those heavily utilized in Iraq and Afghanistan. The Group reports to both the Human Rights Council and the UN General Assembly, and its findings often shape international debate and policy recommendations. This latest call for inputs demonstrates the Group’s proactive stance in addressing emergent threats before they become entrenched and uncontrollable.

Deconstructing the “Call for Inputs”

The specific questions posed by the Working Group reveal the depth and breadth of its concerns. The inquiry is not a vague fishing expedition; it is a targeted investigation into the key technological domains reshaping private warfare. The Group has requested detailed information on several fronts:

  • Cyber Operations: The call explicitly asks for information on the use of cyber capabilities by PMSCs for both offensive and defensive purposes. This includes everything from hacking and data theft to disinformation campaigns and the disruption of critical infrastructure. The UN is seeking to understand if and how states are outsourcing their cyber warfare activities to private entities, creating a dangerous layer of plausible deniability.
  • Surveillance and Intelligence: A significant portion of the inquiry focuses on advanced surveillance technologies. This includes the use of Unmanned Aerial Vehicles (UAVs, or drones), satellite imagery, facial recognition software, social media monitoring (a form of open-source intelligence, or OSINT), and biometric data collection. The Working Group is concerned about how these tools are being deployed in conflict zones for targeting, as well as in non-conflict settings for monitoring populations, activists, and journalists on behalf of state or corporate clients.
  • Autonomous Weapon Systems: Perhaps most chillingly, the inquiry delves into the realm of autonomy. It questions the development, marketing, and deployment of robotic and AI-powered systems by private actors, including Lethal Autonomous Weapon Systems (LAWS). This touches upon the ultimate fear: that unaccountable private groups could soon wield weapons that can independently select and engage targets without meaningful human control.
  • Regulation and Accountability: The UN is also probing the legal and regulatory vacuum. It asks what measures states have in place to oversee the export of these technologies to PMSCs and to ensure their use complies with international humanitarian law (IHL) and human rights law. This points directly to the central challenge—how to hold a multinational corporation, whose employees may be of various nationalities and operating in a third country, accountable for a cyberattack or a drone strike ordered remotely.

The Arsenal Transformed: From Rifles to Algorithms

The modern PMSC’s arsenal is increasingly digital. While physical security remains a core business, the most significant growth and the greatest concern lie in the technological services that augment and, in some cases, replace traditional military force. This transformation is occurring across multiple domains, creating a “full-spectrum” private military capability.

The Eye in the Sky: Drones and AI-Powered Surveillance

The proliferation of drones has been one of the most defining features of 21st-century conflict. Initially the preserve of powerful state militaries, sophisticated UAVs are now commercially available and widely used by PMSCs. These platforms are invaluable for Intelligence, Surveillance, and Reconnaissance (ISR) missions, providing real-time situational awareness for convoy protection, facility security, and tactical operations. In conflict zones like Libya and Syria, various factions and their private backers have deployed drones for everything from spotting artillery to directing attacks.

What elevates this capability is the integration of Artificial Intelligence. PMSCs are increasingly offering services that use AI to analyze the vast amounts of data collected by drones and other sensors. Machine learning algorithms can automatically detect and track vehicles, identify suspicious patterns of activity, and even perform facial recognition on individuals in a crowd from aerial footage. This “AI-powered ISR” provides clients with a level of intelligence that was unimaginable for a non-state actor just a decade ago, but it also raises profound ethical questions about automated targeting and the potential for algorithmic bias leading to civilian casualties.
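The core idea behind such automated detection can be illustrated with a toy sketch. This is frame differencing, a far simpler technique than the trained neural networks used in production ISR pipelines, and every name and data value here is invented for illustration; it shows only the basic concept of flagging regions that change between sensor passes.

```python
# Illustrative frame differencing: compare two grayscale "frames"
# (2-D lists of pixel intensities, 0-255) and flag pixels whose
# intensity changed by more than a threshold. Production systems
# use trained object detectors on live video; this sketch only
# demonstrates the underlying change-detection concept.

def detect_changes(frame_a, frame_b, threshold=30):
    """Return (row, col) positions where intensity changed by more than threshold."""
    changed = []
    for r, (row_a, row_b) in enumerate(zip(frame_a, frame_b)):
        for c, (pa, pb) in enumerate(zip(row_a, row_b)):
            if abs(pa - pb) > threshold:
                changed.append((r, c))
    return changed

# Two toy 3x3 "aerial frames": a bright object appears at (1, 1).
before = [[10, 10, 10],
          [10, 10, 10],
          [10, 10, 10]]
after = [[10, 10, 10],
         [10, 200, 10],
         [10, 10, 10]]

print(detect_changes(before, after))  # -> [(1, 1)]
```

The gap between this sketch and a deployed system, which must classify what changed, track it across frames, and do so at scale, is exactly where the machine learning described above enters, and where the risks of misclassification and algorithmic bias arise.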

Cyber Warfare as a Service: The Digital Dogs of War

The concept of “mercenary hackers” is no longer science fiction. A shadowy ecosystem of private firms and freelance cyber-specialists now offers offensive cyber capabilities for hire. These services, sometimes referred to as “Cyber Warfare-as-a-Service,” can be contracted by states seeking to conduct disruptive operations with deniability, or by corporations looking to gain an edge over competitors. The activities of firms such as NSO Group, whose Pegasus spyware was used to target journalists, activists, and political opponents, illustrate the power and reach of these private cyber-actors, who operate in a space analogous to traditional mercenary work.

For PMSCs, cyber capabilities are a natural extension of their security portfolio. They can offer clients protection against digital threats (defensive cyber) or engage in offensive operations to support kinetic missions. This could involve hacking into an adversary’s command-and-control network, spreading disinformation to shape public opinion, or launching denial-of-service attacks to cripple enemy infrastructure. The infamous Russian Wagner Group, for example, is widely believed to integrate cyber and information operations into its campaigns in Africa and Ukraine, blurring the lines between private military action and state-sponsored hybrid warfare.

The Specter of Autonomous Systems: Lethal Algorithms for Hire?

The most forward-looking aspect of the UN’s inquiry concerns autonomous weapons. While fully autonomous “killer robots” are not yet widely deployed, the building blocks are already in place. AI-powered loitering munitions (often called “kamikaze drones”) that can be programmed to independently search for, identify, and attack specific types of targets are a reality. According to a 2021 UN Panel of Experts report, the Turkish-made Kargu-2 drone was reportedly used in an autonomous mode in Libya in 2020, marking a potential watershed moment in warfare.

The danger of such systems falling into the hands of PMSCs is manifold. These groups operate with a profit motive and under varying degrees of oversight, far removed from the strict rules of engagement and ethical command structures that (in theory) govern state militaries. The deployment of LAWS by a PMSC could create a nightmarish accountability vacuum. Who is responsible if an autonomous weapon makes a mistake and kills civilians? The programmer who wrote the code? The commander who deployed the system? The company that owns it? Or the client state that hired them? This regulatory black hole is precisely why the UN is so concerned about private sector involvement in the development and proliferation of military AI.

Blurring the Lines: Technology, Accountability, and Plausible Deniability

The integration of advanced technology into the operations of PMSCs is not merely an upgrade of tools; it fundamentally alters the nature of conflict, challenging established legal norms and creating new avenues for states to wage war by proxy.

The Accountability Gap in the Digital Age

International Humanitarian Law (IHL), also known as the laws of war, is built on principles like distinction (distinguishing between combatants and civilians), proportionality (ensuring an attack is not excessive relative to the military advantage gained), and precaution (taking all feasible steps to avoid civilian harm). Technology complicates the application of these principles, especially when wielded by private actors.

A remote drone operator working for a PMSC from a control center thousands of miles away may have a skewed perception of the battlefield, making nuanced judgments about proportionality difficult. An AI targeting algorithm, trained on biased data, might misidentify a civilian object as a military target. Furthermore, cyberattacks make attribution incredibly difficult. A disruptive attack on a nation’s power grid could be the work of a state intelligence agency, a patriotic hacker collective, a criminal gang, or a PMSC working on behalf of an undisclosed client. This jurisdictional nightmare makes it nearly impossible to assign legal responsibility and hold perpetrators accountable, eroding the very foundation of international law.

“Warfare-as-a-Service”: States Outsourcing Modern Conflict

For some states, the rise of the high-tech PMSC is not a threat but an opportunity. It allows them to project power and pursue foreign policy objectives with minimal domestic political risk and a high degree of plausible deniability. By hiring a PMSC to conduct surveillance, run a disinformation campaign, or even carry out targeted strikes, a state can achieve its goals without putting its own soldiers in harm’s way or leaving its direct fingerprints on the operation.

The Wagner Group, while intricately linked to the Russian state, has long served as a prime example of this model. Its operations in Syria, Libya, and across the Sahel have allowed Moscow to secure geopolitical and economic interests without the political and diplomatic costs of a formal military intervention. Similarly, proposals from figures like Erik Prince, founder of Blackwater, for private air forces and global intelligence agencies demonstrate the ambition within the industry to offer states a turnkey solution for warfare—a model that could be described as “Warfare-as-a-Service.”

Human Rights in the Crosshairs

Beyond the battlefield, the technologies wielded by PMSCs pose a grave threat to human rights. Authoritarian regimes can hire these companies to deploy sophisticated surveillance systems against their own populations. Facial recognition, social media monitoring, and location tracking can be used to identify and suppress dissent, monitor journalists, and persecute minority groups. The sale of these powerful tools by private companies to governments with poor human rights records creates a global market for the technologies of repression. The UN Working Group’s inquiry rightfully links the use of these technologies in armed conflict to their use in internal security contexts, as the same companies and the same systems are often involved in both.

The Road Ahead: Regulation in an Unregulated Space

The UN’s call for inputs is the first step in a long and arduous process of attempting to regulate a rapidly evolving and deliberately opaque industry. Existing legal and voluntary frameworks are proving woefully inadequate to address the challenges of 21st-century private warfare.

The Limits of Existing Frameworks

Past efforts to regulate the PMSC industry, such as the Montreux Document and the International Code of Conduct for Private Security Service Providers (ICoC), overseen by its association, the ICoCA, have been valuable but are now showing their age. These frameworks were designed primarily to address the challenges of the post-9/11 era, focusing on issues like vetting personnel, rules for the use of physical force, and accountability for abuses committed by “boots on the ground” contractors in places like Iraq.

They are ill-equipped to handle the complexities of cyber warfare, AI-driven surveillance, and autonomous weapons. These voluntary codes of conduct rely on self-regulation by member companies and oversight by a handful of states. They lack the teeth to enforce compliance on a global scale and can be easily circumvented by shell corporations and actors operating from permissive jurisdictions. The speed of technological change has far outpaced the slow, consensus-based process of developing international norms.

What the UN Hopes to Achieve

The report that will eventually emerge from the Working Group’s inquiry could serve as a crucial catalyst for action. By systematically documenting the landscape of technology use by PMSCs, the report will provide an authoritative basis for international discussion. Potential outcomes could include:

  • Recommendations for a New Treaty: The Group might recommend the development of a new international treaty, or an additional protocol to existing conventions, that specifically addresses the use of certain technologies (like cyber weapons or LAWS) by non-state actors.
  • Guidelines for States: The report could provide clear guidelines for states on their due diligence obligations when hiring PMSCs, as well as on licensing and export controls for military and surveillance technologies.
  • Increased Transparency: A key recommendation will likely be a call for greater transparency in the contracting process between states and PMSCs, making public the types of services and technologies being procured.

A Call to Action for the International Community

Ultimately, reining in the high-tech private military industry will require a concerted effort from multiple stakeholders. States must take responsibility for regulating the companies operating from their territory and be held accountable for the actions of those they hire. Technology companies have an ethical obligation to consider how their products are being used and to prevent them from becoming tools of war and repression in the hands of unaccountable actors. Civil society and investigative journalists have a vital role to play in exposing the activities of these secretive groups and advocating for stronger oversight.

The UN’s inquiry is a wake-up call. The character of conflict is changing, and the lines between public and private, peace and war, soldier and contractor are becoming dangerously blurred. As algorithms and autonomous systems enter the battlefield, the international community stands at a crossroads. The path taken in response to this investigation will determine whether the future of warfare is governed by law, ethics, and human accountability, or whether it is ceded to a shadowy marketplace where the highest bidder can purchase the latest tools of automated violence.
