Apart News: Ale, Cash Prizes & the UK’s AISI


Apart News is our newsletter to keep you up-to-date.

November 8, 2024

This week's edition of Apart News features our newest Researcher Spotlight, funding successes, and a new bounty program from the UK's AISI.

Dear Apart Community,

Welcome to our newsletter - Apart News!

At Apart Research there is so much brilliant research, great events, and countless community updates to share.

This week's edition of Apart News features our newest Researcher Spotlight, shares some of our recent funding success, and highlights a new cash bounty program from the UK's AI Safety Institute.

Funding Success

We are very happy to announce that Apart Research has secured funding in our recent fundraising round. This means we can continue our mission of helping people into AI safety research and making the world a safer place.

Our Co-Founder, Esben, said:

"Since we started Apart two years ago, it has been an inspiring journey to support and conduct research with so many talented people from all over the world. Our global community is crucial to our mission of making AI systems safe and beneficial."

"The funding we have received is a vote of confidence and a strong signal that what we're doing is important and necessary as we take the next steps to grow our impact and research quality. Technical research in AI safety has never been more important."

Bounty Hunters

The UK's AI Safety Institute has just announced its new bounty program for novel evaluations and agent scaffolding. Please apply through the application form. Applications must be submitted by November 30, 2024, and each submission will be reviewed by a member of AISI's technical staff.

Our Research Manager, Natalia, tweeted that it is a "great opportunity for anyone interested in evals or agent scaffolding - get feedback from skilled researchers at [the AISI] and potentially win some money for your efforts!"

Researcher Spotlight: Alexandra Abbas

Alexandra Abbas is currently a Fellow with us here at Apart, where she is studying the robustness of adversarial fine-tuning techniques against ablation of the refusal feature.

Hey Alexandra! Tell us a bit about you.

"I’m a mix! My mom is Hungarian and my dad is Syrian. I was born and raised in Hungary. I lived in a Hungarian town called Szeged in the south of the country until age 18. I then moved to Budapest to pursue my Bachelor's degree in Economics. I moved to Glasgow, Scotland for my Master's in Big Data and then to London for work. I’ve been living in London ever since and I consider myself a true East Londoner! Recently I embarked on a journey as a digital nomad in Southeast Asia, exploring countries like Vietnam and Indonesia."

Hobbies outside of AI safety work?

"Sports, I’m big on sports! Tennis, bouldering, running, yoga, surfing, I enjoy them all. Since I’ve been travelling, sightseeing, visiting museums, learning about different cultures, languages and history have become my new hobbies."

When did you first hear about Apart Research?

"I heard about it through the AI Alignment Course by BlueDot Impact. It must have been a post about a hackathon in one of their Slack channels."

Did you join any Hackathons?

"Yes, I joined a hackathon with my friend Nora, where we contributed tasks to the METR task suite. I designed and developed a task where an agent is required to use Terraform to provision compute resources in the cloud. Nora and I ended up joining the Lab program shortly after and we’ve been working together on a research project since then!"

When did you get interested in AI?

"I’ve been interested in ML since my university studies; that’s why I decided to pursue a Master's in Big Data, which taught me a combination of data engineering and data science. I’ve been closely following developments in the field since then. In my previous industry roles I was more involved in the applied ML/AI side, but I’ve been following the major research advances nonetheless."

When did you get interested in AI safety?

"I don’t quite remember what made me join the AI Alignment Course from BlueDot Impact in February this year, but it was definitely the course itself that strengthened my conviction that I should invest more in this field."

Can you see yourself pursuing technical research on AI safety after this Lab experience with Apart?

"Yes, this is in fact my goal! I’m involved in AI safety projects more in a software engineering capacity, but I’d like to transition into research engineering over time. Currently I’m working as a Technical Project Manager at the AI Safety Engineering Taskforce (ASET), a remote-based programme that supports software engineers in upskilling for a career transition into AI safety while meaningfully contributing to the work of partner organisations like the UK’s AI Safety Institute and METR."

Would you tell others to join Apart's programmes?

"Absolutely. The Lab experience has given me the foundations and confidence to pursue a more research-oriented career in AI safety, even though I’m not coming from an academic background."

Opportunities

  • Want to work with us on mechanistic interpretability and feature manipulation? By joining our Hackathon on Sparse Autoencoders with Goodfire AI, you could become one of our many Lab Fellows who are working on and publishing AI safety research after participating in our sprints. Sign up here.

Have a great week and let’s keep working towards safe AI.

‘Apart Research is an AI safety lab - our mission is to ensure AI systems are safe and beneficial.’
