Coding Peace: How Technology and the WPS Agenda Are Redefining Security

Technology / December 11, 2025

Wars today do not always start with soldiers on a battlefield. More often, they begin in silence through cyberattacks, surveillance tools, and software that makes choices before humans even know a threat exists. Artificial intelligence (AI), once imagined as science fiction, now directs drones, filters truth from disinformation, and quietly influences global power.

Photo Courtesy of Christina/Unsplash

The real issue lies in how new technologies are being designed and deployed. As the Our Secure Future (OSF) report “Women, Peace and Security and Technology Futures,” written by OSF Vice President Sahana Dharmapuri and OSF Fellow Jolynn Shoemaker, warns, technology is never neutral. Every algorithm carries the biases of its creators, and when women are excluded from technology decision-making, the entire digital ecosystem is built on partial truths. The result: critical blind spots in the development of technology policy that disproportionately impact women and girls. As the relationship between the technology industry and defense sector grows increasingly interconnected, these blind spots also carry broader ramifications for the future of peace and security.

The report poses a simple question: What kind of future are we building? The answer is often pushed aside in the race for AI supremacy. Governments are increasingly investing in defense innovation, start-ups compete to deploy tools faster, and the push to “win” the next technological advantage is relentless. Meanwhile, those most impacted by these decisions – women, civilians, and marginalized communities – are often absent from the process.

Missing Voices in the Tech Race

Between 2021 and 2023, more than $100 billion in venture funding poured into defense-technology start-ups. In the United States, the Defense Department rolled out programs like the Replicator Initiative to bring autonomous systems into operation quickly. Speed became a goal in itself.


But speed comes with a cost. Insufficient or imperfect training data and AI “hallucinations” have already produced real-world harm. When AI systems trained on skewed datasets are deployed in policing or warfare, marginalized communities bear the brunt. In conflicts from Ukraine to Gaza, facial-recognition software and algorithm-driven targeting systems are being tested to identify targets. Our Secure Future’s report warns that these unchecked uses can lead to abuses and misidentifications that could be fatal.

The Women, Peace and Security (WPS) framework offers a way to correct course. The WPS agenda is grounded in UN resolutions and is codified in U.S. law through the 2017 Women, Peace and Security Act. WPS rests on four pillars: participation, protection, prevention, and relief and recovery. It argues that women’s participation in peace and security decisions leads to better, longer-lasting outcomes.

When applied to technology, WPS compels governments to examine who benefits and who is endangered by each technological choice. Without it, digital progress risks deepening existing divisions, exposing the inequities of who holds power in the digital ecosystem and whose safety is being traded away for strategic gain.

Technology as a Mirror of Inequality

The digital world reflects society’s deepest biases. Automated hiring systems used by Fortune 500 companies have, for example, penalized résumés containing the word “women’s.” Facial-recognition models misidentify darker-skinned women up to 34 percent more often than lighter-skinned men.

Deepfakes – AI-generated images and videos – likewise target women disproportionately. These technologies are regularly weaponized to spread disinformation about female journalists and politicians in an effort to silence their participation in public life.

Such digital abuse is a precursor to wider insecurity: the Economist Intelligence Unit reports that 85 percent of women globally have witnessed or experienced some form of online violence.


Technology-facilitated gender-based violence, or TFGBV, is both a human rights violation and an early warning signal of broader instability. Research from the Georgetown Institute for Women, Peace and Security has furthermore identified TFGBV as a key driver of radicalization and political extremism globally. In Myanmar, for example, misogynistic online propaganda during the military coup directly preceded physical attacks against women.

However, policymakers continue to treat TFGBV as an isolated digital-safety problem. OSF’s analysis connects the dots: when online spaces become hostile to women, societies lose half their early-warning system for peace. Ignoring this connection blinds security institutions to the social fractures that technology can amplify.

Recognizing and addressing TFGBV is essential for safeguarding societal stability. By treating digital abuse as an isolated problem, policymakers risk missing the early signals of conflict and unrest. To build resilient, peaceful societies, online safety must be treated as inseparable from security, inclusion, and justice, and built into security planning from the start.

Building Technology That Serves People

Our Secure Future works to integrate WPS principles into modern policy, from U.S. national security strategies to global discussions on AI governance. Its 2025 report reveals how the merging of the defense and technology industries has sidelined human rights, humanitarian law, and civil society.

Through interviews with more than 40 experts, OSF found that women remain drastically underrepresented in technology decision-making. It warns that the growing alliance between governments and tech companies, often framed as “national security partnerships,” is consolidating power in ways that erode accountability.

OSF’s research resonates with new UN efforts such as the Governing AI for Humanity report and the creation of the Office for Digital and Emerging Technologies. Both initiatives aim to make AI governance more inclusive, yet, as OSF points out, Women, Peace and Security is still missing from their frameworks. Without it, international cooperation on AI risks reproducing the same exclusions that plague national policy.


OSF thus calls for diverse data collection, Women, Peace and Security benchmarks for AI systems, and decision-making processes that bring women’s civil-society voices back into the room.

Building a Peace-Centric Digital World

Artificial intelligence is redefining what security means, but it can still be guided toward peace. The WPS framework reminds policymakers that violence against women is an early signal of wider instability and that true security depends on equality.

To move forward, technology governance must include WPS principles from the ground up, integrating WPS analysis into data standards, civilian-protection metrics into AI design, and women’s participation into every strategic decision. This is pragmatism grounded in evidence: states where women participate equally in governance are more stable and less likely to wage war.

The UN’s Global Dialogue on AI Governance offers a new opportunity. If organizations like OSF and allies within civil society succeed in embedding WPS thinking there, the digital future could become more humane and more secure.

Peace today depends as much on ethical algorithms as on diplomacy. The OSF report makes it clear that the digital future we are building depends on who shapes it. Technology is never neutral, and without Women, Peace and Security principles guiding how AI is built and used, we risk deepening inequality and insecurity. Bringing women’s perspectives into every stage of design, governance, and policy remains essential for building peace and security.

About The Author

Educator. Writer. Editor. Proofreader. Lauren Carpenter's vast career and academic experiences have strengthened her conviction in the power of words. She has developed content for a globally recognized real estate corporation, as well as respected magazines like Virginia Living Magazine and Southern Review of Books.
