ARENA Operations Coordinator
Help run ARENA’s technical AI safety bootcamps
About ARENA:
ARENA is a technical AI safety upskilling bootcamp based at the London Initiative for Safe AI (LISA). Our 5-week residential programmes provide talented individuals with the ML engineering skills, community, and confidence to contribute to technical AI safety work. Since 2023, we’ve trained hundreds of alumni now working across the AI safety ecosystem. Our curriculum is hosted at learn.arena.education.
ARENA’s bootcamps are designed to bridge the gap between general technical talent and full-time work in AI safety. Our alumni have gone on to work at Anthropic, DeepMind, the UK AI Security Institute, Apollo Research, and across the wider AI safety research community.
Why this role matters:
We’re scaling up to run 11 in-person cohorts over the next two years: 8 in London and 3 in the US, as we expand into the Bay Area. Each cohort will take around 40 participants through a 5-week residential programme covering ML fundamentals, mechanistic interpretability, RL, evals, topics in alignment science, and a capstone project. This role exists to keep the operational side of ARENA running well as we grow. You’ll work directly with the Operations Director, with ownership of how participants experience the programme – from when they first apply, to when they arrive at LISA and the day-to-day running of in-person weeks.
Logistics:
Based at LISA in central London (25 Holywell Row, EC2A 4XE), with occasional travel to support US cohorts.
Salary: £45,000-£55,000 per year
[Other benefits to be added]
What you’ll do:
The core function of the Operations Coordinator will be to support the Operations Director in running ARENA’s programmes smoothly, from selection through to the final week at LISA. As ARENA scales over the next two years, we need someone who can take meaningful ownership of how participants and external users experience ARENA, while building the systems that let us grow without losing the quality that sets us apart.
In practice, this could involve:
Running selection rounds:
Make informed decisions about applicants and manage external communications across our selection process. We typically receive around 600 applications per round and make 30 offers, so this is high-volume work that needs to be done carefully; over time, you’ll become efficient at judging what makes someone a strong candidate for ARENA.
Owning participants’ programme experience:
During their time with us, participants have plenty of hours when they won’t be actively working on the ARENA curriculum. You’ll ensure that this time is well spent: organising suitable accommodation, speakers, socials, and other provisions as appropriate to keep our participants happy.
Building strong relationships:
ARENA thrives on healthy connections across the AI safety ecosystem. Beyond our current participants, we care about building close and lasting relationships with our alumni, TAs, speakers, other AI safety organisations, the LISA team and all others who support our work. You’ll be an approachable, dependable and proactive representative of ARENA across diverse groups.
Improving how things run:
As our operations scale, our systems need to evolve to keep pace with our plans: our in-person cohorts, content development, and provisions for self-studiers of the ARENA curriculum all contribute to our mission in distinct ways. You’ll have autonomy to suggest and make improvements to the systems that run ARENA.
About you:
We’re looking for someone who combines strategic operational thinking with care for the people moving through ARENA.
You might be a good fit if you:
Have experience running user-focused projects in a way that holds up under pressure. You’re comfortable owning multiple aspects of delivery and managing them proactively.
Take pride in the quality of your work and genuinely care about how people experience ARENA.
Spot inefficiencies in your own work and proactively build processes to remove them.
Make clear recommendations under uncertainty. You consider options, decide, and explain your reasoning.
Notice when people around you aren’t doing well, and act on it. You’re comfortable communicating transparently and having a difficult conversation if needed.
Are interested in AI safety and ARENA’s role within it, and willing to develop a deeper understanding over time.
Are mission-driven and able to think proactively about how ARENA could be more impactful.
Many strong candidates won’t meet all of these – we’d rather you apply than self-select out.
If this sounds like you, we’d love to have your application.