You wrote,
“…Even if it was devastating criticism aimed at a key audience, it might have bad reach and we’d only amplify it by responding.”
If it is devastating criticism reaching a key audience, wouldn’t it be helpful to address its claims directly and create discussion around them? Your post makes me want to subscribe to the Journal just to find out for myself.
Yes, I agree. I was just trying to explore briefly why people might think this was a bad use of time/money, and thought ‘don’t give this stuff oxygen’ might be one of those arguments. But it’s not one I agree with.
OK, so I read the argument. It basically associates EA with vegans, crypto, animal rights activists, and pro-tax policies. The OP believes investment does more than charity to lift the poor out of poverty. There’s not a lot to address, really, since many of the premises are true.
It would be nice to know:
how investment compares to charity in alleviating poverty
whether EA advocated for pandemic preparedness before 2020
why vitamin A supplements are supplied in places where golden rice might be commonly consumed
It would be easy to take the OP’s statements of what is shameful about EA and turn them into arguments for what to be proud of about EA. An op-ed response to the original op-ed might be helpful. I would avoid all the EA jargon and stick with plain English, stark facts, and obvious pride in EA’s accomplishments.
“whether EA advocated for pandemic preparedness before 2020”

The biggest funding organization for EA-prioritized causes is Good Ventures (GV). Most of its philanthropic giving is made on the recommendation of Open Philanthropy (OP), one of the most significant organizations of its kind in EA, if not the most. Since 2015, “biosecurity and pandemic preparedness” has been OP’s main longtermism/x-risk focus area after risks from advanced AI. On OP’s recommendations, GV has given almost $130 million USD to date to efforts focused on biosecurity and pandemic preparedness.
GV/OP donated large amounts to pandemic preparedness before 2020, but the rest of the EA community did not. When I looked at LTFF grants in December 2020, I found 5 grants totaling $114k for pandemics (by comparison, AI had received ~19x more money). If you download GWWC’s list of reported donations (which includes post-2020 data), the only dedicated biosecurity organizations in the top 100 are the Johns Hopkins Center for Health Security (#58, 17 donors giving $257k) and Telis Bioscience (1 donor giving $250k, and FWIW I’m pretty sure that isn’t a nonprofit). That’s $507k combined, less than half a percent of the total given to the top 100 organizations.
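To make those proportions concrete, here is a minimal back-of-envelope sketch in Python; the GWWC top-100 total isn’t quoted above, so the ~$101M figure is only the lower bound implied by “less than half a percent,” not reported data.

```python
# Back-of-envelope check of the figures above (a sketch only; the GWWC
# top-100 total is inferred from "less than half a percent", not a
# number quoted in this thread).

ltff_pandemic_grants = 114_000                      # 5 LTFF grants for pandemics, Dec 2020 snapshot
ltff_ai_grants_approx = 19 * ltff_pandemic_grants   # "~19x more money" => roughly $2.17M

jh_center_for_health_security = 257_000             # 17 donors, GWWC top-100 entry #58
telis_bioscience = 250_000                          # 1 donor
biosecurity_in_top_100 = jh_center_for_health_security + telis_bioscience  # $507k combined

# "Less than half a percent" of the top-100 total implies that total
# exceeds roughly $101M (an inferred lower bound, for illustration only).
implied_min_top_100_total = biosecurity_in_top_100 / 0.005

print(f"Approx. LTFF AI funding:        ${ltff_ai_grants_approx:,.0f}")
print(f"Biosecurity orgs in top 100:    ${biosecurity_in_top_100:,.0f}")
print(f"Implied minimum top-100 total:  ${implied_min_top_100_total:,.0f}")
```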
My impression is that the EA community talked a lot about pandemics prior to Covid and made them a relatively high priority in career advice. But aside from GV/OP, I haven’t seen evidence of sizeable donations. So my general sense is that (non-GV/OP) EAs are too quick to take credit for being ahead of the curve in this area, at least with respect to money (which I think is a good indicator of actual priorities).
You make some extremely good points. I’m not sure off the top of my head when I’ll have time to respond with the care your comment deserves.
$130 million? Nice. I don’t know how that compares to the totals spent by other NGOs, the US government, or an organization like the WHO, but the number shows EA commitment to pandemic preparedness over time, well before SARS-CoV-2 became a concern.
Frankly, the author of the Journal piece (whom I called the OP, sorry if that was confusing) did do a bad job, because:
he attacked an organization that funds charitable giving. To do that well, you’ve got to claim there’s loads of corruption or that the causes are unpopular or obscure (like subsidizing tofu or saving endangered wild tree mammals in Borneo or whatever). The most he managed to insinuate is that you EA folks like to encourage public-sector efforts and that your cause choices are redundant.
he downplayed efforts to help pandemic preparedness. Correct me if I’m wrong, but that’s a popular issue right now, isn’t it? You just need to mention that EA folks had the foresight to begin work on pandemic preparedness in 2015.
he made an obvious attempt to align himself with folks who hate taxes. Unfortunately for him, those same folks like charitable giving, and the majority of EA involves philanthropy, not special-interest work.
he went for unpopular (“liberal”) associations. You folks are a bit weird, but that’s because you’re super-smart, and his focus on liberal, leftie associations doesn’t really matter anyway. It should be easy to either ignore in your response or tackle directly.
The only way to screw up a response to the guy would be to miss how he’s put himself in a corner and then write as weird a response as possible. For example: “with medium epistemic confidence we predict, with a 5% confidence interval based on scientific studies (see the footnotes), that the majority of our s- and x-risk causes conform to preference utilitarianism, meaning that both our longtermist and short-term outcomes are consistent with our aforementioned values of…” Don’t do that.
But other than that, whether you all decide to hire a marketing firm or put a formal plan together or not, this isn’t a big deal. I think a great way to go is just a heated back-and-forth in the Journal’s op-ed section. I’ve read enough arguments to see that you can win this one easily.
Good luck with it all.