It’s hard to say, yes, but I think the risk vectors are (note: these are different scenarios, not things that follow in chronological order, though they could):
Open Philanthropy comes under increasing scrutiny due to its political influence
OP comes to be viewed as a fully politicised EA propaganda operation, and people stop associating with it or accepting its money, or call for legal or political investigations into it, etc.
GiveWell etc. dissociate themselves from EA because EA provokes a strongly negative social reaction from potential collaborators or donors
OP/GiveWell dissociate from and stop funding the EA community for reasons similar to the above, and the EA community does not survive
Basically, I think ideas are more important than funding. If society, or those in positions of power, put the ideas of EA in the bin, money isn’t going to fix that.
This is all speculative, but I can’t help feeling that, regardless of how the OpenAI crisis resolves, a lot of people now consider EA to be their enemy :(