Announcing AI Safety Support

Contents:

  1. What is AI Safety Support

  2. Discussion Days

  3. Other Online Events

  4. AI Safety Resources

  5. Mentorship Program

  6. AI Alignment Slack

  7. Consider Donating

What is AI Safety Support

AI Safety Support has existed as an Incorporated Association in Australia since October 2020, and as an initiative with a web page since May 2020.

Our aim is to fill gaps in the AI Safety career pipeline. We provide operational support to early-career and transitioning researchers, so they can engage with the community and test their fit for the field. Broadly, we just want to reduce friction and help people do the things they are already trying to achieve.

If you are new to AI Safety, we would be happy to talk to you, and to help you figure out what steps to take next. We don’t have all the answers, but we can probably provide you with a better map of the career landscape.

Feel free to reach out or book a call with either one of us.

Discussion Days

Our longest-running project (since June) is our regular online AI Safety Discussion Days. Each of these events has a talk, an icebreaker session, and breakout discussions covering whatever is on your mind at the time.

The Discussion Days focus on ongoing research. If you have some ideas you want feedback on, you are welcome to present them as a talk or bring them up during the breakout discussions. However, you are also welcome to just listen and learn, or maybe give feedback on other people’s ideas.

Schedule:

  • Second Monday every month, UTC 18:00-21:30 (Europe/Africa and Americas friendly time)

  • Fourth Monday every month (except December), UTC 08:00-11:30 (Asia/Pacific and Europe/Africa friendly time)

The next one is on Monday, November 23rd.

Other Online Events

We are not the only ones running online AI Safety events these days. We collect all the online AI Safety and adjacent events we know about in this shared calendar. Note that some events require registration or an application.

If you know of other online AI Safety events, please let us know.

AI Safety Resources

To stay up to date with new information, I highly recommend signing up for the Alignment Newsletter and 80k’s AI Safety group.

However, there are also plenty of more static resources scattered around the internet, such as study guides and research agendas. I’ve tried to list them all here (though I’m still adding things). Let us know if we forgot or miscategorized something.

Mentorship Program

We have spoken to a number of people who want some mentorship, and to others who are interested in mentoring. So, we are experimenting with a new mentorship program to try to bring these people together. Mentors can offer, and mentees can ask for, anything they like, so the matching works best if we have more people to choose from.

As a rule of thumb, if you are at least a 3rd-year PhD student, or an independent researcher with a few publications, then you can be a mentor. You might not think of yourself as very senior, but remember that AI Safety is a very new field, so you are already more senior than most.

We have no rules for who can be a mentee. The more experience you have, the harder it will be for us to find a mentor who knows more than you, but whoever you are, we’ll try to find someone to help you. However, try to be as specific as you can about what you want to get out of the mentorship. For example, maybe you have a research project in mind that you want to do?

You can sign up for both!

Signup deadline for the first round is November 30th. We will contact everyone and pair you up during December, so expect the actual mentoring to start no earlier than January.

Whether and when we do a second round depends entirely on the interest we get in the first round, from both mentors and mentees.

AI Alignment Slack

We can’t take credit for the existence of this Slack group, since it was created by Andrei Alexandru. But we are helping to grow it, and making use of it as a place to communicate, ask questions, discuss research, and generally help each other out.

In this Slack you’ll find two channels dedicated to our Discussion Days, one for general follow-up discussions, and one for asking questions of the last speaker.

We have one channel dedicated to grad school applications. One idea behind this is that it would be beneficial for AI Safety-interested students to end up in the same programs, which is more likely to happen with some communication and coordination.

There are also channels for finding study buddies, personal introductions, several sub-field-specific discussions, and more.

Consider Donating

Since April and June respectively, we have both spent most of our working time on AI Safety Support. We cannot do this work while also holding down separate jobs, and the people we are helping are mostly students with little to no income. We are therefore relying on donations to continue this work. On top of that, a large-ish donation we were expecting has been delayed for an unknown amount of time, which means donations we receive now would be extra helpful.

There are a few ways you can donate. We have accounts on both Patreon and Ko-fi, for convenient regular donations. Both these platforms have some transaction costs, however. If you prefer sending us money more directly, just let us know.

A third way you can donate to us is through Rethink Charity. This option makes your donation tax deductible if you live in Canada or the US. If you want to donate more than $1,000 through Rethink Charity, please email them first, so they can find the best donation method for you. Regardless of the amount, you will need to let Rethink Charity know that your donation is meant for AI Safety Support. (Please only use this option if you can take advantage of the tax deduction.)

If you are donating from somewhere else and want tax benefits, let us know, and we will try to arrange something.

Donors looking to support our project through a US-based Donor-Advised Fund can do so by sending their donation to Rethink Charity (EIN 82-5325150). If you need any assistance, donation support is available by emailing Siobhan Brenton.

Thank you!
