All but 3 bullet points were about AI. I know that AI is the number one catastrophic risk but I’m dyin’ for variety (news on other fronts).
Here is the non-AI content:
- Allocation in the landscape seems more efficient than in the past – it’s harder to identify especially neglected interventions, causes, money, or skill-sets. That means it’s become more important to choose based on your motivations.
- Post-FTX, funding has become even more dramatically concentrated under Open Philanthropy (OP), so finding new donors seems like a much bigger priority than in the past. (It seems plausible to me that $1bn in a foundation independent from OP could be worth several times that amount added to OP.)
- Both points mean that efforts to start new foundations, to fundraise, and to earn to give all seem more valuable than they did a couple of years ago.
(My bad if there were indications that this was going to be AI-centric from the outset, I could have easily missed some linguistic signals because I’m not the most savvy forum-goer.)
My impression is that 60%+ of EA resources focused on catastrophic risk are now directed at AI safety or issues downstream of AI (e.g. even the biorisk people are pretty focused on the AI/bio intersection).
AI has also seen dramatic changes in the landscape over the last ~2 years, and my update focused on how things have changed recently.
So for both reasons most of the updates that seemed salient to me concerned AI in some way.
That said, I’m especially interested in AI myself, so I focused more on questions there. It would be ideal to hear from more bio people.
I also briefly mention nuclear security, where I think the main update is the point about lack of funding.
I think there is more value in separating out AI vs bio vs nuclear vs meta GCR than in having posts/events marketed as GCR that are mainly about one topic. This holds both from the perspective of the minor causes and of the main cause, which would get more relevant attention.
Also, the strategy/marketing of those causes will often differ, so it doesn’t make as much sense to lump them together unless the topic is GCR prioritisation or cross-cause support.