I'm a business and human rights leader focused on identifying and mitigating potential adverse human rights impacts in the tech sector in line with the UN Guiding Principles.

I'm a human rights defender, currently at Meta as Human Rights Manager and Emerging Risks Lead. I lead work to assess and mitigate the human rights impacts of new products and technologies—including generative AI and AR/VR—and to ensure respect for human rights in crisis and conflict response.

I've led pioneering human rights work in the sector. Notably, I was responsible for the publication of a human rights impact assessment (HRIA) on end-to-end encryption, and for efforts to address Facebook's links to human rights violations in Myanmar, including through the full public disclosure of an independent HRIA on Facebook's role in the country, an industry first.

I know that meaningful human rights work on tech requires not just robust, globally inclusive stakeholder engagement, but also breaking out of legal and policy bubbles to inspire and empower engineers, designers, PMs, and other members of product teams.

I hold a Master of Public Policy from the University of Chicago and a BA in Economics from Case Western Reserve University.

Featured Impact

End-to-End Encryption HRIA

End-to-end encryption (E2EE) has recognized benefits for privacy, freedom of expression, digital security, and the protection of human rights defenders, but it has also been criticized over potential impacts on child safety and law enforcement.

Following Facebook's announcement that it would expand default E2EE to Messenger and Instagram, I led a two-year process to commission and respond to a comprehensive independent HRIA to identify relevant human rights benefits, risks, and mitigations.

Grappling with fast-moving product and policy developments, as well as underlying technical complexity, the HRIA concluded that expanding E2EE would protect a diverse range of rights, and that potential adverse impacts should be addressed through a range of pragmatic mitigations that do not undermine encryption.

This groundbreaking HRIA on E2EE was published in full and has influenced both Meta's work and the global policy conversation on encryption.


Myanmar

In the aftermath of the Rohingya genocide in 2017 and allegations that content on Facebook had contributed to atrocities, I played a central role in the company's efforts to identify and mitigate ongoing adverse human rights impacts and to improve integrity defenses. My work included extensive on-the-ground stakeholder engagement; spearheading the unprecedented ban of senior military officials and military-tied media entities from the platform; and the conduct and full publication of an independent HRIA examining Facebook's role in the country, an industry first.

I also led Facebook's human rights work on Myanmar in the run-up to the country's 2020 elections and as part of crisis response efforts in the aftermath of the February 2021 coup.

Conflict and Crisis Response

I specialize in understanding and mitigating potential adverse human rights impacts where online platforms and offline conflict intersect in ways that implicate product design, policies, resourcing, and business decisions.

Impact of the war in Tigray. Credit: VoA

... and some other things.

I also lead Meta's work on protection of human rights defenders; co-lead escalation channels for civil society partners; lead the company's participation in the multi-stakeholder Global Network Initiative; support work to improve transparency around government demands to censor content in the face of rising digital authoritarianism; and have managed a range of complex human rights due diligence projects on countries and products.

In my free time, I dabble in photography, seek out off-the-beaten-path destinations, and spend my weekends hiking in the mountains.

Want to learn more?

Get in touch.