Keep Apart Research Going

Global AI Safety Research Accelerator & Talent Pipeline

Donate Now

Our Funding Ends June 2025

Apart Research is at a pivotal moment. In the past 2.5 years, we've built a global pipeline for AI safety research and talent that has produced 22 peer-reviewed papers, engaged 3,500+ participants in 42 research sprints across 50+ global locations, and helped launch top talent into AI safety careers at leading institutions like METR, Oxford, Far.ai and the UK AI Security Institute. Without immediate funding, this momentum will stop in June 2025.

We need your support now to continue this high-impact work.

Jason Hoelscher-Obermaier, Co-Director

Esben Kran, Co-Director

Why Apart Makes a Difference

Unique Talent Pipeline: Our Sprint → Studio → Fellowship model has engaged 3,500+ participants, developed 450+ pilot experiments, and supported 100+ research fellows from diverse backgrounds who might otherwise never enter AI safety

Research Excellence: Our pipeline has produced 22 peer-reviewed publications at top conferences like NeurIPS, ICLR, ICML, ACL (incl. two oral spotlights at ICLR 2025), has been cited by top labs, and has received significant media attention

Policy Engagement: Beyond producing research, we actively engage in AI policy and governance. This includes presenting key findings at prominent forums like IASEAI, sharing our research in major media, serving as expert consultants for the EU AI Act Code of Practice, and participating as panelists in EU AI Office workshops

Global Impact: With 50+ event locations and participants from 26+ countries, we're building AI safety expertise across the globe

Spotlight Achievements

Our "DarkBench" research, the first comprehensive benchmark for dark design patterns in large language models, received an Oral Spotlight at ICLR 2025 (top 1.8% of accepted papers), was presented at the IASEAI (International Association for Safe and Ethical AI) conference, and has been featured in major tech publications. 

One of the participants in our hackathon with METR, a physics graduate and computational neuroscience expert with experience in ML and data science, joined METR as a full-time member of technical staff as a direct result of participating in our March 2024 event. Within months, he was contributing to groundbreaking work that he eventually presented at ICLR 2025, demonstrating our unique ability to identify cross-disciplinary talent and rapidly transition them into impactful AI safety research.

The Clock Is Ticking

Without funding, Apart will need to severely scale back the global research pipeline we've spent 2.5 years building, a system that has successfully identified exceptional talent from across 26+ countries and transformed promising ideas into published research. Your support today preserves this proven talent discovery engine and ensures groundbreaking safety research continues to shape both technical advances and governance frameworks at this critical inflection point for AI. By supporting Apart, you're enabling top tech talent worldwide to become immediate, high-impact contributors to AI safety.

Named Supporter

$10

Donate $10 (or more) and have your name featured on our website

Donate $10

Support a Hackathon Participant

$67

Enable one person to participate in one of our Hackathons

Donate $67

Support three Hackathon Participants

$200

Enable three people to participate in one of our Hackathons

Donate $200

Support 15 Hackathon Participants

$1,000

Enable fifteen people to participate in one of our Hackathons

Donate $1,000

Newsletter Shoutout

$5,000

Supports a Lab Fellow producing publication-quality research

We'll also shout you out in our newsletter (businesses or individuals)

Donate $5,000

Sprint Shoutout

$50,000

Enables three global research sprints identifying new safety approaches

You'll be named in three of our sprints (businesses or individuals)

Donate $50,000

Paper Shoutout

$100,000

Enables the publication of four peer-reviewed AI safety papers

You'll be featured in the acknowledgements of four papers

Donate $100,000

Shape the future of AI Safety

By making an institutional donation, you'll help us accelerate AI safety research

Institutional Donation

If you can't donate, consider sharing our campaign instead:

Testimonials

We enjoyed working with Apart, and we look forward to organising future events together. I think there were learnings on both sides that we can take on board to improve our operational flow next time.

We would love to see Apart grow and build a bigger team to support the event organisation (website design, content, outreach).

Collaborating with Apart for the Code Red Hackathon has been useful to METR’s work on developing a set of evaluation tasks. Participants created specifications and implementations spanning a wide range of domains, many of which will contribute to our work assessing risks from autonomous AI systems.

While our main priority was to receive high-quality task implementations, the hackathon was also a great chance to engage with the research community, improve our documentation, and spread awareness about our ongoing task bounty.

Zainab Majid

Co-Founder at Stealth

Apart's fellowship gave me deep technical experience and connected me to an amazing network.

Apart profoundly impacted my career trajectory and supported me throughout. I am extremely grateful for the opportunity, the experience, and the team.

All of my work in AI safety traces back to my involvement with Apart.

Benjamin Sturgeon

Founder of AI Safety Cape Town

I'm sure that being able to produce the work we did, thanks to Apart, had a significant impact in demonstrating competence to the mentors [at MATS 8.0].

I believe their contributions in producing tangible results in the field of AI safety are likely significantly underrated.

Cameron Tice

AI Safety Technical Governance Research Lead at ERA Cambridge

My life changed when I decided to apply for a deception detection hackathon run by Apart Research.

I am genuinely grateful for the work Apart has done, and have donated a portion of my own salary to Apart so that more researchers like me can be supported and emboldened to make the leap to working on humanity's greatest problems.

Jacy Reese Anthis

Co-Founder of the Sentience Institute

Apart is a research institute and not just short-involvement field-building.

Jacob has his ear to the ground in AI safety, helping us prioritize the most impactful research directions, take advantage of diverse methodologies, and ensure we stay on top of the extremely fast pace of ML research.

Fedor Ryzhenkov

Research Engineer at Palisade Research

I believe being part of the Apart fellowship helped me secure a place at ARENA, which has been transformational to my career.

Amirali Abdullah

Lead AI Researcher at Thoughtworks

The collaborative environment and early encouragement from the Apart team were instrumental in pursuing this research direction. Apart creates a unique space where seed ideas can grow into significant research contributions, well beyond even their own acclaimed and successful direct publications.

Parker

AI safety researcher at METR

I think the brainstorming sessions, followed by the opportunity to explore and comment on others' project ideas, made it much easier to find a project where I could help.

Luhan Mikaelson

CS Student, AI Safety Researcher

For me, the hackathon was a test of my skills; it helped me gain insight into my skillset and how I should refine it further. I also got to work with a team of passionate, flexible, and committed individuals. I really enjoyed the intense intervals of brainstorming and coming up with ways to make the experiment work. It made me realize that I can't wait to work on more complex projects!

Marcel Mir

AI Policy & Governance Researcher at the Centre for Democracy & Technology Europe

I really liked the ideas session in the hackathon and then working async with my team.

Prithvi Shahani

Auto Alignment Eval Builder and Research Trainer at Anthropic

Apart Research provided a combination of practical experience, professional networking, and consistent mentorship that shaped my research capabilities and career trajectory.

This project helped me join the Harvard AI Safety Team, which later connected me to my position at Anthropic and contacts at Redwood. Having concrete work to show made a real difference in these applications.

Teun

Independent AI Safety Researcher

I would recommend mentoring to others.

I think most of the impact you have is by upskilling newer folks and the impact of the research itself. [The Sandbag Detection] project was interesting, and it has been used by e.g. Anthropic and UK AISI.

Jasmina Urdshals

Independent AI Safety Researcher

The best part was the good feeling of having worked really hard on a project (and gotten valuable support when needed), having some acceptable results and finally being done with them in time to submit.

Hoang-Long Tran

AI Safety Research Fellow at the AI Safety Global Society

The hackathon was an amazing opportunity for me to collaborate with brilliant people from around the world. Working across multiple time zones, my team tackled the challenge of researching how to detect and stop prompt jailbreaking attacks on LLMs. The experience of brainstorming, learning, and solving problems together was truly inspiring. Winning the 2nd prize was the icing on the cake and a testament to the hard work and creativity of our team.

Read Our Impact Report

At Apart Research, our mission is to improve humanity's preparedness for catastrophic AI risk. We build research communities to push the limits of AI safety.
