Like Timnit Gebru, the former co-lead of Google's Ethical AI team, and Tristan Harris, a former Google design ethicist, Meredith Whittaker embodies the paradox of an insider turned outspoken critic of the system she once helped build. Now the head of Signal, a secure messaging app focused on privacy, she previously held various roles at Google.
Over a decade, she rose from customer service representative to founder of her own research team. This journey gave her a unique perspective on the tech giant’s inner workings.
And that firsthand knowledge became the fuel for her transformation.
In 2018, Whittaker led an unprecedented protest at Alphabet. She mobilized 20,000 Google employees to demonstrate against the company’s handling of sexual harassment cases and certain military contracts. This act of rebellion marked a turning point in her career and changed public perceptions of big tech companies. She left the company shortly afterward.
Beyond Classic Activism
If anything sets Whittaker apart from other critics, it’s her deep technical expertise and social awareness. She co-founded the AI Now Institute and has led research on AI’s ethical and social implications.
Whittaker’s approach goes beyond whistleblowing. She aims to unravel the mechanisms behind what she describes as a “fundamentally toxic” business model.
Her perspective on AI is especially sharp. She views it not as a neutral technological advancement but as the culmination of “surveillance capitalism”—a term coined by scholar Shoshana Zuboff to describe how big tech mines user data to predict and shape people’s behavior for profit.
For Whittaker, the real paradigm shift isn’t the technology itself but the unprecedented concentration of data and infrastructure in the hands of a few powerful companies.
Signal, the Alternative in Action
As president of Signal and a member of its board of directors, Whittaker is working to prove her ideas are more than theories. For her, Signal—with its uncompromising focus on privacy and nonprofit model—demonstrates a viable alternative to Silicon Valley’s prevailing paradigm.
Under Whittaker’s leadership, Signal’s commitment to privacy stands out:
- It doesn’t collect user data.
- Its end-to-end encrypted architecture makes that kind of collection impossible by design.
This approach contrasts with companies like Meta and Google, whose business models rely on monetizing personal data.
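The structural point behind that contrast can be illustrated with a toy sketch. This is not Signal’s actual protocol (Signal uses the far more sophisticated Signal Protocol, with double-ratchet key exchange); it is a minimal one-time-pad example, assuming two parties who already share a random key, showing why an end-to-end design leaves the relay server with no plaintext to collect in the first place:

```python
import secrets

# Toy end-to-end encryption sketch (NOT Signal's real protocol).
# Two users share a random key out of band; the relay server only
# ever handles ciphertext, so it has no plaintext to log or monetize.

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """One-time-pad XOR: encryption and decryption are the same operation."""
    return bytes(d ^ k for d, k in zip(data, key))

class RelayServer:
    """Stands in for the service provider: it forwards blobs
    and records everything it observes."""
    def __init__(self):
        self.observed = []

    def forward(self, blob: bytes) -> bytes:
        self.observed.append(blob)
        return blob

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # shared out of band, never sent

server = RelayServer()
ciphertext = xor_bytes(message, key)    # sender encrypts
delivered = server.forward(ciphertext)  # server relays ciphertext only
plaintext = xor_bytes(delivered, key)   # recipient decrypts

assert plaintext == message
assert message not in server.observed   # the server never saw plaintext
```

In this design, “not collecting data” isn’t a policy promise the server operator could quietly revoke; the server simply never possesses anything readable.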
Reinventing the Technology Industry
Unlike other activists, Whittaker doesn’t just point out problems but aims to reinvent how technology is developed and used. She advocates for a more diverse tech ecosystem where nonprofit alternatives can compete on an equal footing with traditional corporations and startups.
Her vision includes radical regulatory changes, such as separating technology infrastructure from the applications running on it. This would mean breaking up companies that control both layers, such as Amazon (which owns both AWS and Prime Video), Google, and Microsoft.
According to Whittaker, this approach would dismantle monopolies—Google, for instance, is currently embroiled in a legal battle over monopoly claims—and create a more egalitarian playing field.
Making Privacy the Norm, Not the Exception
Whittaker envisions a future where privacy-focused apps like Signal are the standard across the tech industry. She advocates for funding models that allow these types of platforms to thrive without what she calls “mass surveillance.”
This future isn’t solely about technology; it’s about power and empowerment. Whittaker views encryption and privacy not as technical features, but as essential tools to address the “information asymmetries” underlying current power structures.
Her message is clear: The technological status quo is neither inevitable nor desirable. She critiques the existing system while building and promoting alternatives.
Whittaker’s path is neither easily replicable nor a general model for others, but it underscores how insider knowledge, coupled with ethical conviction and determination, can be a catalyst for change.
In a landscape shaped by technological determinism, Whittaker stands out as a reminder that alternatives are possible—and that we have the choice to pursue them.
Image | AI Now Institute edited by Xataka On