Currently doing local AI Safety movement building in Australia and NZ.
Chris Leong
You could also add:
"Negotiate safety conditions as part of a settlement"
Thank you for all your hard work.
Moderating when the whole FTX thing went down must have been incredibly stressful!
Best of luck in your role with Foresight, hopefully you find that kind of work to be a good fit!
Interesting. I still think it could be valuable even with relatively few clicks. You might even only need someone to click on it once.
Sadly not. Would love to see someone start a project to fix this though!
I’d love to see the EA forum add a section titled “Get Involved” or something similar.
There is the groups directory, but that's only one of many ways that folks can get more involved, from EAGx conferences to Virtual Programs, 80,000 Hours content/courses, and donating.
I’d provide an option for figures to opt out of being included on this site.
I suspect that many people in the community would be both happier and more impactful if they decided to only work four days a week and spent the fifth day doing high-impact volunteering. One nice advantage of this system is that if you’re feeling particularly run down or overwhelmed, it’s easier for you to take a day off that week.
My claim about being both happier and more impactful might not apply to folks right at the top of the income distribution, but I expect it would apply to most folks here.
Props for running an experiment. I’ll be interested to see what the results are.
We seem to be boiling the frog. However, I’m optimistic (perhaps naively) that GPT voice mode may wake some people up. “Just a chatbot” doesn’t carry quite the same weight when it’s actually speaking to you.
I think she adds a useful perspective, but maybe it could undermine her reporting?
The one attendee who seems a bit strange is Kelsey Piper. She’s doing great work at Future Perfect, but something feels a bit off about involving a current journalist in the key decision-making. I guess I feel that the relationship should be slightly more arm's-length?
Strangely enough, I’d feel differently about a blogger, which may seem inconsistent, but society’s expectations about the responsibilities of a blogger are quite different.
Someone needs to be doing mass outreach about AI Safety to techies in the Bay Area.
I’m generally more of a fan of niche outreach over mass outreach, but Bay Area tech culture influences how AI is developed. If SB 1047 is defeated, I wouldn’t be surprised if the lack of such outreach ended up being a decisive factor.
There are now enough prominent supporters of AI Safety, and AI is hot enough, that public lectures or debates could draw a big crowd. Even though a lot of people have been exposed to these ideas before, there’s something about in-person events that makes ideas seem real.
I expect that they were hoping for this referral program to receive many more referrals than it’s receiving?
I suspect it varies by cause area. In AI Safety, the pool of people who can do useful research is smaller than the pool of people who could do good ops work (which is more likely to involve EAs who prefer a different cause area but are happy to just have an EA ops job).
There are only so many good short names, though.
The PA idea is interesting.
My point was that working four days and volunteering one day, instead of donating, may be more effective for most people.
I guess I want CEA to focus very heavily on figuring out their overall strategy, including community engagement, and then communicating those decisions.
Conference cost breakdowns feel like an unnecessary distraction at this point, so long as they satisfy the auditor.
In CEA’s case in particular, it doesn’t seem like they deal with biohazards or AI safety at a level necessitating high security.
Agreed.
Regarding some of the specific points you’ve made:
• I agree that it would be great to get the community more involved in thinking through what the forum should look like.
• Wytham Abbey was an independently run project that they just fiscally sponsored.
• I agree that funding sources should be public (although perhaps not individual donations below a certain amount).
• I’m unsurprised that PELTIV backfired.
• I would love to see regular community office hours, though if these end up seeing low demand, or it’s just the same folks over and over, I think it would be reasonable for them to decide to discontinue this.

Regarding some of the other things, I honestly don’t see them as the highest priority, especially right now.
EA needs more communications projects.
Unfortunately, the EA Communications Fellowship and the EA Blog prize shut down[1]. Any new project needs to be adapted to the new funding environment.
If someone wanted to start something in this vein, what I’d suggest would be something along the lines of AI Safety Camp: people would apply with a project to be project leads, and then folks could apply to those projects. Projects would likely run over a few months, part-time and remote[2].
Something like this would be relatively cheap, as it would be possible for someone to run it on a volunteer basis, but it might also make sense to have a paid organiser at a certain point.
Likely due to the collapse of FTX.
Despite the name, AI Safety Camp is now remote.