Legal Advocates for Safe Science and Technology (LASST)
Legal Advocates for Safe Science and Technology (LASST) is a 501(c)(3) nonprofit that uses the legal system to make advances in science and technology safer for people and the planet. Founded in 2024 by former commercial litigator Tyler Whitmer, LASST focuses primarily on mitigating catastrophic risks from advanced artificial intelligence and biotechnology. The organization files amicus briefs in strategic cases, pursues impact litigation, operates an AI Safety Whistleblower Legal Defense Fund, conducts litigation surveillance, and advocates for stronger governance and oversight of emerging technologies. LASST also mobilizes pro bono legal expertise from law firms and volunteers to expand the legal profession's engagement with global catastrophic risk reduction.
Theory of Change
LASST believes that the legal system is an underutilized lever for reducing catastrophic risks from advanced technologies. Strategic amicus briefs can educate courts about the risks posed by AI and dangerous pathogen research before precedent-setting decisions are made. Impact litigation can establish legal standards, such as strict liability for unreasonably risky research, that create deterrent effects. Providing legal defense to AI safety whistleblowers ensures that insiders can expose dangerous practices without career-ending retaliation, improving transparency at frontier AI labs. Policy advocacy directed at attorneys general and regulators can preserve nonprofit governance structures at AI companies, maintaining public-interest oversight of powerful technologies. By mobilizing the legal profession's pro bono capacity toward catastrophic risk reduction, LASST aims to build a broader coalition of legally trained advocates working to ensure emerging technologies are developed safely.
Grants Received
- from Survival and Flourishing Fund
- from Survival and Flourishing Fund
Projects: no linked projects
People: no linked people
Discussion
Key risk: As a small, new organization with low current funding need, LASST's impact hinges on winning rare, high-stakes legal actions and persuading courts to adopt novel standards. Misfires (e.g., losing cases on strict liability for pathogen research or on AI governance) could set adverse precedent and make marginal dollars low-leverage.
Details
- Last Updated: Apr 2, 2026, 9:58 PM UTC
- Created: Mar 18, 2026, 11:18 PM UTC
Case for funding: LASST fills a neglected, high-leverage niche by rapidly deploying strategic litigation and amicus briefs on AI and biotechnology risks. It has already demonstrated traction by mobilizing attorneys general to preserve OpenAI's nonprofit control and by standing up an AI Safety Whistleblower Defense Fund, so that courts and regulators can impose safety guardrails (e.g., strict liability) when legislative routes stall.