I’m arguing that more spending on psychedelic research & some advocacy work (in particular, helping the rollout of FDA approval of MDMA & psilocybin go smoothly) would be a leveraged use of resources.
I guess what it boils down to is this: how much EA money do you think would need to go into accomplishing this, and for what expected outcome? I’d like to draw a distinction here: if you can recruit some talent from the EA community to use money provided by, say, Clarity Health Fund (which is earmarked for psychedelics anyway) to further psychedelics-related causes more effectively, then I am absolutely all for it and in full support. But we ultimately want high impact from fairly small amounts of EA money, or via free EA talent, or EA talent that is paid in ways other than EA money, because of the high counterfactual price tag on EA money. Calculating the expected outcome of this is tough, but possible, and I would change my mind if I saw a plausible estimate that came out as being impactful.
I think that I understand why you think this will work, and hopefully the next few paragraphs demonstrate that understanding. And I think it’s important to acknowledge that GiveWell (by their own admission) did not account for leverage in their early evaluations, and this may have created some undesirable anchoring / lock-in effects with respect to Effective Altruism recommended activities.
And I agree that “leverage” can mean that causes that seem “less efficient” on a strict “direct impact” / “resources spent” metric may have been unjustifiably ignored by the EA community, especially when the form of “leverage” involved is more complex than a simple fundraiser. Moderately efficient causes could benefit from an EA mindset, so long as resources are being redirected from less efficient to more efficient uses and not the other way around. Most of the world’s resources either can’t (due to logistics) or won’t (due to the priorities of power structures and individual donors) be directed towards the highest-impact causes. If you can recruit those resources, with a realistic assessment that your impact is greater than what those resources would otherwise have achieved, it would still be worthy of the name effective altruism, and you would still have more direct impact at the end of the day, using resources that would otherwise have gone somewhere with less direct impact.
In fact, when you put it that way, there’s a whole host of cause areas you might consider. While trying to “End Homelessness in America” doesn’t beat distributing mosquito nets in low-income countries on a “direct impact” / “resources spent” metric, there is plenty of money that you might think of as effectively “earmarked” for USA purposes only, or earmarked for a certain type of intervention. If you redirect resources that would otherwise be spent on something less impactful, a large difference in impact means that you have done a good job, because of leverage. I think many in the EA community recognize this to some extent, and GiveWell is currently investigating opportunities to influence government policy and improve government spending. The concept of leverage really broadens the scope of what “EA” could mean, and potentially does open the door to sometimes helping people in high-income countries or furthering causes that don’t boast efficiency per dollar, although I would guess generally not by helping financially but rather via skills or spreading a message (e.g. influencing donors with a less global mindset to donate to more effective causes within the local parameters they care about, or helping organizations that aren’t necessarily focused on doing the absolute maximum good per dollar become more effective within the narrower scope of their goals, etc.). One could consider psychedelics legalization to be potentially a part of such activities.
Now that I’ve (hopefully) shown that I understand where you’re coming from here, let me explain why I still don’t think this will work, and what it would take to change my mind.
From the perspective of an individual, the act of recruiting EA money to your cause is also a form of “leverage”. This applies to everyone and everything, not just psychedelics: if you believe that EA is generally on the right track, then the less “EA resources” you leverage to your cause, and the more otherwise inefficiently allocated resources you leverage to your cause, the better your (counterfactually informed) impact will be. Even people working on global poverty should preferentially recruit non-EA funds, if they believe that EA funds are otherwise well allocated.
I would (from my currently naive perspective) agree with you that investing in key research goals probably would have “leveraged” impact, in the sense that directing some EA money toward this might lead to other resources being redirected to it down the line. But if we’re talking about potentially diverting funding from other EA causes, we’ll need to be super stringent about impact per dollar. We can and should include “leverage” in those calculations, but said calculations must occur.
From what I understand, you’re essentially suggesting just a little bit of research and advocacy, on a reasonable expectation that it will catalyze some sort of tipping point, redirecting funds from various non-EA sources towards the problem. But as long as you’re working within an EA framework, it’s important to quantify your estimate of the impact of that investment.
To estimate the (counterfactual-blind?) impact of your actions (research, advocacy, whatever), you’d have to estimate the expected effect on policy outcomes (how much earlier we estimate the relevant FDA approvals, policy changes, etc. happen as a result of the diverted funds) and the expected value of those policy outcomes (how many people will get better treatment as a result, relative to the treatment they otherwise would have gotten). In other words: how many people benefited?
And then, you have to introduce the counterfactual question of what those resources could be spent on instead. You have to first calculate the counterfactual impact of any EA resources, which (unless EAs are misguided) carry a particularly heavy counterfactual price tag (at least when it comes to asking for money; judging from what I’ve seen posted about the EA job market, recruiting EA talent could still be a good move). After that, you’d have to calculate the counterfactual impact of all the other resources you leveraged (though I think it would be okay to just set that to zero for now, to keep the models simple enough to use).
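To make the structure of this estimate concrete, here is a toy sketch of the calculation: gross impact from accelerating the policy timeline, minus the counterfactual value of the EA money (and, optionally, of the leveraged resources). Every number and parameter name below is a placeholder assumption for illustration, not a real figure.

```python
def net_counterfactual_impact(
    beneficiaries_per_year,   # people who get better treatment per year of acceleration
    value_per_beneficiary,    # benefit units per beneficiary per year (placeholder scale)
    years_accelerated,        # how much earlier approval / policy change happens
    ea_dollars_spent,         # EA money diverted to this cause
    ea_value_per_dollar,      # counterfactual benefit units per EA dollar spent elsewhere
    leveraged_value=0.0,      # counterfactual value of non-EA resources (set to ~0 for simplicity)
):
    """Gross impact of the policy acceleration minus what the resources would have done instead."""
    gross = beneficiaries_per_year * value_per_beneficiary * years_accelerated
    counterfactual_cost = ea_dollars_spent * ea_value_per_dollar + leveraged_value
    return gross - counterfactual_cost

# Illustrative placeholder inputs; chosen here to land exactly at break-even (net = 0.0):
net = net_counterfactual_impact(
    beneficiaries_per_year=10_000,
    value_per_beneficiary=0.1,
    years_accelerated=1.0,
    ea_dollars_spent=50_000,
    ea_value_per_dollar=0.02,
)
print(net)  # 0.0 under these made-up inputs
```

The point is the shape of the model, not the inputs: a positive result requires the gross term to beat the EA counterfactual term, which is exactly the hurdle discussed below.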
And, despite non-EA leverage, I just don’t think these numbers will come out that way, for all the reasons described in the previous comment. Even if you make brilliant use of leverage to mostly set aside the fact that the countries in a position to benefit from this are expensive to operate in, you’d still have to deal with the fact that the lack of research and attendant policy change has little to do with global bottlenecks to access, which means that the numerator in the “beneficiaries / EA resources spent” equation is going to be pretty low. I don’t mean all the resources you gain through leverage: you can make your own calculations of their counterfactual price tag, and depending on who you leverage, maybe you could even make a case for it being zero. I mean specifically the EA money. A person using EA money for this cause would have to operate on a shoestring budget to beat the counterfactual cost. If you agree with my earlier statement that, per individual, a year of clean water is, let’s say, 10x as good as reaping the unrealistically best-case scenario of psychedelics research a decade earlier than otherwise (which seems like really lowballing it to me), you’d have to honestly believe that every EA-derived $50k you spend (ignoring further leverage) pushes the timeline forward by a year just to “break even”. I admit I don’t fully understand this issue or the plan, but that seems really optimistic when I compare it to the aforementioned $19m figure required to push un-stigmatized drugs through FDA approval.
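As a back-of-envelope version of that break-even claim: the 10x ratio and the $50k are from the argument above, but the clean-water cost-effectiveness figure below is a placeholder assumption, not a real GiveWell number.

```python
CLEAN_WATER_MULTIPLIER = 10      # clean water ~10x as good per person-year (stated above)
EA_BUDGET = 50_000               # dollars of EA money per year of timeline acceleration
PERSON_YEARS_PER_DOLLAR = 0.02   # placeholder: person-years of clean water per EA dollar

# Value forgone by not spending the $50k on clean water, measured in
# psychedelic-benefit units (each clean-water person-year counts 10x):
forgone = EA_BUDGET * PERSON_YEARS_PER_DOLLAR * CLEAN_WATER_MULTIPLIER

# Each beneficiary of one year's earlier access contributes 1 unit of benefit,
# so this is also the number of people who must benefit per $50k to break even:
required_beneficiaries = forgone
print(required_beneficiaries)  # 10000.0 under these placeholder inputs
```

Under these made-up inputs, every $50k of EA money would need to deliver a year of earlier access to about ten thousand people just to match the counterfactual; plugging in real cost-effectiveness figures would obviously move that threshold.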
Anyway, if someone were to do those calculations, it would be a good use of time, because developing methods to evaluate the impact of research/advocacy on policy change in general is something we need. (Stay tuned! I may be posting more on that later.)
In fact, it’s worth just assuming psychedelics are as useful as any drug currently in use when doing your calculations, because even if psychedelics aren’t it, there are many other items in this general class, and we can try to estimate the expected value of adding funds to promising research in general as a reference point for all of them. If you were to demonstrate that psychedelic research/advocacy might have that level of impact by these metrics, it would be a pretty big deal even if this particular class of under-researched psychoactive compounds ended up being a flop, because a lot of other things would potentially also become high impact by the same arguments.
In these discussions of impact, I think it’s worth pointing out that, unlike, say, x-risk, something like psychedelics research/advocacy is sufficiently concrete that we can reasonably attempt to quantify the impact of our activities, at least to within one or two orders of magnitude, and compare numbers... at least against research/advocacy for other policy interventions (which happen in low-income countries, which have more people and are cheaper to lobby in).
This hopefully goes without saying, but I don’t mean to claim that psychedelics are irrelevant and that EAs should pay no attention to this at all. If you or anyone else has done the research and feels that this is a low-hanging fruit, even if the aforementioned impact evaluation doesn’t come back as highly efficient, I would encourage that person to find a way to pluck it... and if some of the under-utilized EA talent were leveraged towards the problem, it could be a good thing. I just wouldn’t support redirecting global-poverty or x-risk focused funding to this (unless some very surprising and convincing impact evaluations along the lines of what I described came out and changed my mind).
(Oh, also: I think my use of the word “legalizing” in the previous comment might have been misleading. I just meant the general situation where our interventions allow psychedelics to be used in more and more contexts without breaking the law, not legalizing recreational use specifically.)