Dear Apart Community,
Welcome to our newsletter - Apart News!
At Apart Research, there is always brilliant research, great events, and community news to share. This week, Esben gave a talk in Paris, and our inaugural Studio Progress Report arrives soon.
Esben at IASEAI
Esben gave his talk, "Engineering a World Designed for Safe Superintelligence," at the Inaugural Conference of the International Association for Safe and Ethical AI (IASEAI), where it was very well received.


Meanwhile, Connor Axiotes attended the launch of the Hiroshima AI Process (HAIP) Reporting Framework with the OECD in Paris.
Upcoming Studio Progress Report

Coming next week: our collection of AI safety research from the Studio's inaugural cohort, showing how fast promising ideas can move from hackathon to real-world impact. The projects grew primarily out of our research sprints.
In just two months, six teams of researchers produced compelling findings across critical areas of AI safety - from detecting AI-powered cyberattacks to probing whether LLMs make moral decisions.
One team deployed "honeypot" servers to detect AI hacking agents in the wild, finding that while they exist, they represent only 0.0001% of current attacks.
Another team used Minecraft to study AI goal drift, discovering that agents gradually deviate from their objectives over time.
Some teams tackled medical AI hallucinations using sophisticated neural feature analysis, while others developed new ways to visualize the rapidly growing field of AI safety research through interactive knowledge graphs.
Our Studio accelerates the path from initial concept to meaningful research. By providing structured support and resources to promising hackathon projects, the program helps early-stage researchers make rapid progress on crucial AI safety challenges.
The full collection of research blogs will be released next week, offering detailed insights into each team's methodology and discoveries!
For those interested in joining future cohorts, sign up for an upcoming hackathon for your chance to join the Studio. Stay tuned for next week's full write-up!
Opportunities
- Never miss a hackathon: keep up to date here!
- If you're a founder starting an AI safety company, consider applying to seldonaccelerator.com.
- OpenPhil has issued a call for AI safety research.
- The ILINA Program is dedicated to providing an outstanding platform for Africans to learn, conduct research, and take action on AI safety. Apply here.
Have a great week and let’s keep working towards safe and beneficial AI.