AI Safety Foundations

Get Involved in AI Safety

Discover ways to contribute to the field of AI safety, from education and research to policy and community engagement.

How to Get Involved in AI Safety

There are multiple pathways for individuals interested in contributing to AI safety:

  • Education and Training: Many organizations offer introductory courses and fellowships in AI safety, such as the Center for AI Safety’s summer research programs and philosophy fellowships, which provide foundational knowledge and hands-on experience in the field.
  • Research: Opportunities exist for both technical and interdisciplinary research, including access to specialized compute clusters for large-scale experiments. Key research areas include robustness, interpretability, and reward learning.
  • Policy and Advocacy: Engaging with policy institutes, regulatory bodies, and advocacy organizations to help shape standards, best practices, and governance frameworks for safe AI.
  • Industry Roles: Companies across many sectors are hiring for roles focused on AI safety, compliance, and risk management, particularly in industries where AI is deployed in safety-critical applications.
  • Community Engagement: Joining professional networks, conferences, and online forums dedicated to AI safety can provide networking opportunities, mentorship, and access to the latest research and developments in the field.

By combining technical expertise, policy insight, and cross-sector collaboration, individuals can help ensure that AI technologies are developed and deployed safely for the benefit of all.

Specific Organizations and How to Get Involved

Major Organizations and Initiatives

U.S. Artificial Intelligence Safety Institute (US AISI)

A government-backed institute focused on developing testing, evaluation, and guidelines for advanced AI systems. It operates within NIST and collaborates with industry, academia, and nonprofits via the AI Safety Institute Consortium (AISIC), which includes over 280 organizations. Interested parties can participate in joint research, contribute to standards development, or join consortium activities.

Center for AI Safety (CAIS)

A leading nonprofit advancing technical research, field-building, and advocacy for AI safety. CAIS offers research opportunities, fellowships, educational resources, and community-building initiatives. They welcome volunteers, researchers, and donors, and maintain an active presence in San Francisco and online.

Cloud Security Alliance (CSA) – AI Safety Initiative

A global coalition focused on AI safety, compliance, and best practices. CSA invites professionals to join working groups, participate in research, attend webinars, and pursue certifications. They also offer ambassador roles for those interested in leading safety initiatives within their organizations.

Grassroots and Advocacy Groups

  • ControlAI: Offers grassroots activism, policy engagement, and public education campaigns. Volunteers can join their Discord, participate in outreach, or use their action guides to contact policymakers.
  • EncodeAI: A youth-led group for high school and college students, offering local chapters and advocacy projects, including legislative campaigns. New members can join or start chapters.
  • PauseAI: Focuses on activism, public awareness, and policy advocacy. Volunteers can join local groups, participate in events, or attend PauseCon, their annual conference.

Catalyze AI Safety Incubation Program

An incubator supporting new AI safety organizations. Recent cohorts include groups working on AI control, hardware security, legal advocacy, and safety research infrastructure. Interested individuals can subscribe to their newsletter or express interest in joining or supporting these startups.

AI NGOs and Research Organizations

Examples include the Ada Lovelace Institute (data and AI for society), Access Now (digital rights), and the ACM (a professional computing society). Many have volunteer, research, or advocacy roles open for public involvement.

Ways to Get Involved

  • Volunteer with grassroots groups (e.g., PauseAI, ControlAI, EncodeAI) through online communities, local chapters, or direct action.
  • Apply for research, fellowship, or ambassador programs at organizations like CAIS or CSA.
  • Join professional consortia or working groups (e.g., NIST’s AISIC) to contribute to standards and policy.
  • Attend conferences, webinars, or workshops to network and learn.
  • Support organizations financially or by amplifying their work through advocacy and outreach.

For a comprehensive directory and personalized guidance, platforms like aisafety.com/communities and aisafety.quest offer resources to help newcomers find the best fit for their skills and interests.