Founded in May 2023 by Dutch software entrepreneur Joep Meindertsma, PauseAI has grown from a one-person initiative into an international movement active in more than 14 countries. The organization operates as Stichting PauseAI, a registered Dutch nonprofit foundation, with a small paid executive team supported by hundreds of volunteers organized into national chapters. PauseAI pursues its goals through two complementary strategies: building a visible grassroots movement that demonstrates public demand for AI safety measures, and direct lobbying of policymakers through meetings, coordinated protests, and policy interventions. It has organized notable protests at the Bletchley Park AI Safety Summit, OpenAI headquarters, Google DeepMind, and coordinated demonstrations across 13 countries.
Funding Details
- Annual Budget: not recorded
- Monthly Burn Rate: not recorded
- Current Runway: not recorded
- Funding Goal: not recorded
- Funding Raised to Date: $715,000
- Fiscal Sponsor: not recorded
Theory of Change
PauseAI believes that the development of increasingly powerful AI systems poses an existential risk if allowed to continue without adequate safety guarantees. Their theory of change is that a sufficiently large and visible grassroots movement, combined with targeted lobbying, can shift political will and create pressure for an international agreement — modeled loosely on nuclear nonproliferation regimes — that pauses or restricts the training of the most powerful frontier AI models until safety can be demonstrated. By making AI risk legible to the public, demonstrating that demand for a pause exists, and directly engaging legislators and institutions, they aim to create the political conditions for binding international coordination before transformative AI systems are developed.
Grants Received: no grants recorded
Projects: no linked projects
People: no linked people
Discussion
Key risk: A visible "pause now" campaign could polarize and backfire by alienating key policymakers and allies, entrenching opposition, and failing to translate public awareness into binding international constraints. If so, its counterfactual impact relative to more incremental capacity-building may be limited.
Details
- Last Updated: Apr 2, 2026, 10:09 PM UTC
- Created: Mar 19, 2026, 10:30 PM UTC
Case for funding: PauseAI is the only sizable international grassroots network explicitly pushing to pause frontier AI training. It leverages hundreds of volunteers, coordinated protests (including the largest AI safety protest held in the UK, at Google DeepMind), media coverage, and targeted lobbying to shift the Overton window and create option value for a future emergency moratorium.