Apart News: Hackathons in 2025 PREVIEW

In this week's Apart News we preview some of the Hackathons we are most excited for in 2025.

December 20, 2024

Dear Apart Community,

Welcome to our newsletter - Apart News!

At Apart Research there is so much brilliant research, great events, and countless community updates to share.

In this week's Apart News we preview some of the Hackathons we are most excited for in 2025.

AI Safety & Assurance Startup Hackathon

AI safety requires ambition. We face critical technological problems in AGI deployment over the next three years: alignment, multi-agent risk, compute security, and model exfiltration, among many others. Each of these problems deserves a competent team scaling science-informed solutions. This is where you come in!

​Join us​ on 17th-20th January 2025 to kick off an ambitious journey into AI safety and security alongside aligned, talented people from both science and business. We aim to bring solution-oriented deep tech to AI safety. This is your chance to change the world.

The impact of real-world startups is immense and can be felt almost immediately. We need to push AI safety innovations toward real-world applications rapidly to ensure they are implemented and make a real difference in how AI technologies are deployed.

This hackathon is not just about ideation; it's about taking that crucial first step from concept to a plan for action, setting the stage for future development. Sign up ​here​!

Hardware Hackathon

On 7th-10th March 2025 we will hold our ​hardware security hackathon​, focused on building new methods to detect and monitor covert training of AI models! Work with cutting-edge side-channel monitoring equipment, collaborate with experts in hardware security, and help develop crucial tools for AI governance. $2,000 in prizes is available across the builder and breaker tracks.

This hackathon brings together hardware experts, security researchers, and AI safety enthusiasts to develop and test verification systems for detecting AI training activities through side-channel analysis. Participants will work with state-of-the-art hardware setups to create innovative solutions for monitoring compute usage.

In the Builder Track, you will create new methods to detect stealth AI training runs and new verification protocols to ensure responsible use of compute.

  • Develop side-channel monitoring systems;
  • Create classification algorithms for training detection;
  • Build verification protocols for hardware governance;
  • Implement real-time monitoring solutions.
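To give a flavour of what "classification algorithms for training detection" could mean, here is a deliberately simplified, hypothetical sketch: a threshold rule that flags a GPU power trace as a sustained training run when average draw is high and variance is low. The function name, thresholds, and synthetic traces are all illustrative assumptions, not part of the hackathon brief; real side-channel detection would use far richer features and calibrated models.

```python
# Toy illustration (hypothetical): classify whether a power trace looks
# like sustained training or bursty inference, using a simple
# mean-power + variance threshold. Thresholds and data are made up.

def detect_training(power_watts, mean_threshold=250.0, var_threshold=500.0):
    """Return True if the trace resembles a sustained training run."""
    n = len(power_watts)
    mean = sum(power_watts) / n
    var = sum((p - mean) ** 2 for p in power_watts) / n
    # Training: sustained high draw with low variance;
    # inference: spiky, lower average draw.
    return mean > mean_threshold and var < var_threshold

# Synthetic traces (watts, sampled once per second)
training_trace = [300, 305, 298, 302, 301, 299, 303, 300]
inference_trace = [80, 320, 75, 310, 90, 60, 305, 70]

print(detect_training(training_trace))   # sustained high draw
print(detect_training(inference_trace))  # bursty, lower average
```

A builder-track project would replace this toy rule with real measurements and a learned classifier, but the core question is the same: which observable signatures separate training from everything else?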

In the Breaker Track, you'll be tasked with evading classification algorithms using new types of training runs, making the Builders' job harder (and, at the same time, showing us how easy it might be to evade existing verification methods).

  • Test evasion techniques against monitoring systems;
  • Develop methods to mask training signatures;
  • Analyze system vulnerabilities;
  • Create adversarial examples.
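As a companion sketch for the breaker side, here is one hypothetical masking tactic: interleaving idle samples into a training trace so its average power slips under a naive detector's threshold. Again, every name and number here is an assumption for illustration; real evasion and detection are far subtler.

```python
# Toy illustration (hypothetical): a "breaker" tactic that masks a
# training signature by interleaving idle samples, dragging the mean
# power below a naive mean-threshold detector.

def naive_detector(power_watts, mean_threshold=250.0):
    """Flag a trace whose average draw exceeds the threshold."""
    return sum(power_watts) / len(power_watts) > mean_threshold

def mask_trace(power_watts, idle_watts=50.0):
    """Interleave an idle sample after every real sample."""
    masked = []
    for p in power_watts:
        masked.extend([p, idle_watts])
    return masked

training_trace = [300, 305, 298, 302, 301, 299, 303, 300]
print(naive_detector(training_trace))              # flagged
print(naive_detector(mask_trace(training_trace)))  # slips under the threshold
```

The evasion only works because the detector looks at a single coarse statistic; a detector that also checked duty cycle or trace duration would catch it, which is exactly the builder-versus-breaker dynamic the hackathon is designed to explore.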

Sign up ​here​.

And more...

...this is a non-exhaustive list of our research sprints, and 2025 is shaping up to be Apart Research's biggest year yet.

Opportunities

  • Never miss a Hackathon by keeping up to date ​here​!
  • The AI Safety Fund (AISF) awards grants to accelerate research efforts that identify potential safety threats that arise from the development and use of frontier AI models. Apply ​here​.
  • ​Engineering Residency​ researching with the Autonomous Systems Team at the UK AI Safety Institute.

Have a great week and let’s keep working towards safe and beneficial AI.
