Scott has sent me the following email (reproduced here with his approval). Scott wants to highlight that he doesn’t know anything more than reading the public posts on this issue.
I’d encourage people to email Scott, it’s probably good for someone to have a list of interested donors.
------------------------------------
Scott’s email:
SHORT VERSION
If you want to donate blindly and you can afford more than $250K, read here for details, then consider emailing Open Philanthropy at inquiries@openphilanthropy.org. If less than $250K, read here for details, then consider emailing Nonlinear at katwoods@nonlinear.org. You might want to check the longer section below for caveats first.
If you want to look over the charities available first, you can use the same contact info, or wait for them to email you. I did send them the names and emails of those of you who said you wanted to focus on charities in specific areas or who had other conditions. I hope they’ll get back to you soon, but they might not; I’m sure they appreciate your generosity but they’re also pretty swamped.
LONG VERSION (no need to read this if you’re busy, it just expands on the information above)
Two teams have come together to work on this problem—one from Open Philanthropy Project, and one from Nonlinear.
I know Open Philanthropy Project well, and they’re a good and professional organization. They’re also getting advice from the former FTX Future Fund team (who were foundation staff not in close contact with FTX the company; I still trust them, and they’re the experts on formerly FTX-funded charities). For logistical reasons they’re limiting themselves to donors potentially willing to contribute over $250,000.
I don’t know Nonlinear well, although a former ACX Grants recipient works there and says good things about it. Some people in the EA Forum have expressed concerns about them—see https://forum.effectivealtruism.org/posts/L4S2NCysoJxgCBuB6/announcing-nonlinear-emergency-funding; I have no context for this beyond the comments there. They don’t seem to have a minimum donation. I’m trying to get in touch with them to learn more.
Important consideration: these groups are trying to balance two imperatives. First, the usual effective altruism do-as-much-good-as-possible imperative. But second, an imperative to protect the reputation of the EA ecosystem as a safe and trustworthy place to do charity work, where your money won’t suddenly disappear, or at least somebody will try to help if it does. I think this means they will be unusually willing to help charities victimized by the FTX situation even if these would seem marginal by their usual quality standards. I think this is honorable, but if you’re not personally invested in the reputation of the EA ecosystem you might want to donate non-blindly or look elsewhere.
Also, FTX Future Fund focused disproportionately on biosecurity, pandemic prevention, forecasting, AI alignment, and other speculative causes, so most of the charities these teams are trying to rescue will probably be in those categories. If you don’t want to be mostly funding those, donate non-blindly or look elsewhere.
I’ve given (or will shortly give) both groups your details; they’ve promised to keep everything confidential and not abuse your emails. If they approach you in any way that seems pushy or makes you regret interacting with them, please let me know so I can avoid working with them in the future.
I can’t give you great answers on ACX Grants now, but I’ll hopefully know more soon, and if things don’t work out with this opportunity I’d be happy to work with you further then.
Thanks again for your generosity, and please let me know if you have any questions.
Yours,
Scott