Cash prizes for the best arguments against psychedelics being an EA cause area
tl;dr – Comment below with arguments for why psychedelic research & advocacy should not be an EA cause area. On June 3rd, the most upvoted argument will win $400. The second- and third-most upvoted arguments will win $200 & $100, respectively.
1. Background on why I’m making this post & offering a prize
This effort was inspired by Grue_Slinky’s prize for the best argument against the EA Hotel.
I’m offering a $400 USD prize for the best argument against psychedelic research & advocacy being an EA cause area. I’m also offering $200 for the second-best argument & $100 for the third-best.
Uncovering another high-impact cause area seems very valuable, as EA’s impact is closely correlated with the quantity & quality of the cause areas it supports. This prize is a small effort towards assessing whether psychedelics should be taken up as an EA cause area.
The primary motivation here is to surface previously unconsidered arguments for why psychedelics should not be an EA cause area. My current view is that psychedelics are an extremely promising altruistic cause area – I’d like to stress-test that view to see what I’ve missed.
Below, I offer some arguments for why psychedelics should be an EA cause area.[1] These arguments are provided as a jumping-off point for arguments against.
2. Specification of the prize
To be considered for the prize, add your argument against psychedelics being an EA cause area as a top-level comment on this post. (Only top-level comments on this post will be considered as valid submissions.)
On Monday, June 3rd, I’ll award $400 USD to the most-upvoted argument against, $200 USD to the second-most-upvoted argument against, and $100 USD to the third-most-upvoted argument against. I will break any ties by choosing my favorite of the tied arguments.
I’ll contact the prize winners via a Forum direct message to coordinate the remittance of their prizes. Once payouts are complete, I’ll announce the winners in a comment on this post.
Feel free to upvote other people’s submissions, even if you submit an argument yourself. (I suppose you could also downvote other submissions, though that doesn’t feel very sporting.)
To avoid potential bias, I will not upvote or downvote any prize submissions.
You can also comment with things that aren’t arguments against psychedelics being an EA cause area, though note that such comments will not be considered as prize submissions.
3. Arguments for the importance of psychedelics
To seed discussion, here are some arguments for the importance of psychedelics as an altruistic cause area.
3(a). Psychedelics are appealing to both long-termist & short-termist views
Almost everyone in EA holds either a long-termist view (everyone who will exist over the entire course of the future deserves moral consideration) or a short-termist view (only people who exist before some time horizon deserve moral consideration).
If you hold a long-termist view (i.e. all future people deserve moral consideration) & you’re a consequentialist (as far as I know, almost every long-termist is consequentialist), then it’s very difficult to do long-term cause prioritization, because you have limited visibility into the outcomes occurring 10,000+ years into the future – and these are outcomes you want to know about when assessing today’s actions. Practically, we’re clueless about many long-term consequences of our present actions. You can read more about consequentialist cluelessness here.
Given this theoretical problem, long-termist cause prioritization should include “robustness to cluelessness” as a major factor.
Some x-risk interventions seem pretty robust to cluelessness, e.g. Eliezer Yudkowsky’s work raising awareness of the AI alignment problem. Eliezer’s advocacy work seems robust because in almost every scenario you can imagine, it’s good for more researchers to be aware of the AI alignment issue.[2]
Interventions that increase the set of well-intentioned + capable people also seem quite robust to cluelessness, because they allow for more error correction at each timestep on the way to the far future.
Rationality training programs like CFAR & Paradigm Academy are aimed at increasing the number of well-intentioned + capable people.
The psychedelic experience also seems like a plausible lever on increasing capability (via reducing negative self-talk & other mental blocks) and improving intentions (via ego dissolution changing one’s metaphysical assumptions).
I compare the mechanisms of impact of the psychedelic experience and of programs like CFAR & Paradigm below.
By “changing one’s metaphysical assumptions,” I mean that the psychedelic state can change views about what the self is, and what actions constitute acting in one’s “self-interest.” Consider Michael Pollan’s account of one of his psilocybin trips, in How to Change Your Mind:
“I” now turned into a sheaf of little papers, no bigger than Post-its, and they were being scattered to the wind. But the “I” taking in this seeming catastrophe had no desire to chase after the slips and pile my old self back together. No desires of any kind, in fact. Whoever I now was was fine with whatever happened. No more ego? That was okay, in fact the most natural thing in the world.
…
For what was observing the scene was a vantage and mode of awareness entirely distinct from my accustomed self; in fact I hesitate to use the “I” to denote the presiding awareness, it was so different from my usual first person.
Where that self had always been a subject encapsulated in this body, this one seemed unbounded by any body, even though I now had access to its perspective… Everything I once was and called me, this self six decades in the making, had been liquefied and dispersed over the scene. What had always been a thinking, feeling, perceiving subject based in here was now an object out there.
As Pollan describes, people often experience extremely altered self-awareness during psychedelic trips. This is part of the “mystical experience,” which appears to be correlated with the therapeutic benefits of psychedelics (see Roseman et al. 2017, Griffiths et al. 2008).
To sum up, under a long-termist view, psychedelic interventions are plausibly in the same ballpark of effectiveness as other interventions that increase the set of well-intentioned + capable people. This is because these interventions seem quite robust to consequentialist cluelessness – the extreme difficulty of confidently assessing far-future consequences of today’s actions.
If you hold a short-termist view (i.e. only people who exist before some time horizon deserve moral consideration), mental health appears to be a cause area on par with global poverty, i.e. if global poverty interventions meet the criteria for being an EA cause area, so too should mental health interventions.
Mental illness appears to cause more suffering than poverty in developed countries, and it seems to cause roughly as much suffering worldwide as poverty does. Unlike poverty, the mental health burden isn’t shrinking. (See Michael Plant’s cause profile on mental health for an analysis of how mental illness causes roughly as much suffering as global poverty.)
Psychedelics are showing a ton of promise as treatment for a battery of chronic mental health issues: anxiety (see Gasser et al. 2015, Griffiths et al. 2016), depression (see Carhart-Harris et al. 2018, Palhano-Fontes et al. 2019), OCD (see Moreno et al. 2006), and addictive disorders including smoking (see Johnson et al. 2017) & alcoholism (see Bogenschutz et al. 2015, Krebs & Johansen 2012).
For a summary of some of these findings, see dos Santos et al. 2016, a systematic review of the mental health effects of psychedelic therapy.
So, under a short-termist view, psychedelic interventions are plausibly in the same ballpark of effectiveness as global poverty interventions.
In summary, psychedelic research & advocacy seems particularly compelling as a cause area, because it performs well under both short-termist and long-termist worldviews. Given our uncertainty about morality, this appears to make psychedelics more robust than cause areas that rely on a single sequence of reasoning to justify their impact.
3(b). Comparison of rationality training to the psychedelic experience
As above, both rationality training and the psychedelic experience seem to be levers that increase the number of well-intentioned + capable people. This is plausibly robust to cluelessness (the difficulty of assessing outcomes that occur 10,000s of years into the future), because well-intentioned + capable people can course-correct as we head into the future.
Comparing rationality training programs (e.g. CFAR, Paradigm Academy) to psychedelic experiences is tricky. It’s hard to make an apples-to-apples comparison, because the interventions are operating on very different levels of abstraction.
The rationality training programs I know of operate almost entirely on the conceptual level (though I believe Paradigm uses some bodywork modalities also). The basic structure of conceptual rationality training is something like:
Instructor says some words about a rationality topic
Trainee hears these words & tries to internalize the topic
Trainee practices their internalized version of the rationality topic (by themselves, with other trainees, or with the instructor)
Instructor provides feedback to trainee to improve the trainee’s internalized model of the topic
I think this structure can work really well for information & technique transfer, especially when the trainee is engaged & the instructor is skillful.
The basic structure of a psychedelic trip is very different:
Participant thinks about and articulates the intentions & expectations they have for their upcoming psychedelic experience (to themselves, or to a facilitator)
Participant ingests a psychedelic (by themselves, or with a sober facilitator present)
Participant has a psychedelic experience. This experience can include a wide range of subjective elements:
Old memories can come up and/or become salient
New perspectives on relationships with friends, family, one’s immediate environment can be adopted
Emotions can be felt very intensely, especially emotions about salient people & topics in one’s life
Insights can be had about the participant’s psychology, social assumptions, epistemic assumptions, and metaphysical assumptions
New personal narratives (“this is the story of my life; this is what my life’s about”) can be adopted
Once sober, participant integrates the experience (by themselves, or in dialogue with a facilitator)
How did the actual trip match up to your expectations about the trip?
What came up? What was interesting? What was trivial, or silly?
Did anything come up that’s worth incorporating into your everyday life?
(See the Psychedelic Explorer’s Guide for more on how to structure an effective, safe trip.)
I think this structure can be really helpful for surfacing emotional blocks (e.g. things that generate akrasia, i.e. acting against one’s better judgment), as well as for resolving known emotional blocks.
The psychedelic experience can also help change one’s assumptions, internal monologue, and personal narrative. (See the above Michael Pollan quote for an example of this. Also note that the psychedelic experience doesn’t do this automatically; it can just help “loosen you up.” The participant still has to intentionally work towards changing these things.)
So, to the extent that the EA community is limited by information & technique transfer, I’d expect conceptual rationality training to be more leveraged.
To the extent that the EA community is limited by emotional blocks & unhelpful personal narratives, I’d expect the psychedelic experience to be more leveraged.
My current view is that the EA community is more limited by emotional blocks & unhelpful personal narratives. The 2019 Slate Star Codex reader survey offers some data here: 17.4% of survey respondents have a formal diagnosis of depression (another 16.7% suspect they are depressed but haven’t been diagnosed); 12.6% of respondents have a formal diagnosis of anxiety (another 18.7% suspect they have anxiety but haven’t been diagnosed).
3(c). Interlude – on quantitative comparison
The above is merely a theoretical comparison – a quantitative comparative analysis would be helpful for thinking through the potential of psychedelics relative to other causes. Here’s one attempt at quantifying the benefits of a psychedelic intervention using DALYs, concluding that the intervention has a cost-per-DALY-averted of $472.
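To make the structure of such an estimate concrete, here is a minimal sketch of a cost-per-DALY calculation. All of the inputs are placeholder assumptions for illustration, not the figures from the linked analysis:

```python
# Minimal sketch of a cost-per-DALY-averted estimate.
# Every input is a placeholder assumption, not a figure from the linked analysis.
cost_per_course = 1000        # hypothetical cost of one course of psychedelic therapy ($)
remission_rate = 0.3          # hypothetical fraction of patients with lasting remission
disability_weight = 0.4       # hypothetical disability weight for the condition treated
years_of_benefit = 2.0        # hypothetical duration of the benefit (years)

dalys_averted = remission_rate * disability_weight * years_of_benefit  # per course of treatment
cost_per_daly = cost_per_course / dalys_averted
print(f"${cost_per_daly:,.0f} per DALY averted")  # ~$4,167 with these placeholder inputs
```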
Note that DALY & QALY frameworks probably underweight mental health interventions due to the retrospective survey methodology used to construct their weightings.
Also note that there aren’t many quantitative analyses that compare across EA cause areas (e.g. comparing animal welfare interventions to global poverty interventions, or x-risk interventions to animal welfare interventions). Michael Dickens’ cause prioritization app attempts this, though as far as I know it hasn’t been used to drive much decision-making.
3(d). Trauma alleviation
Childhood trauma is plausibly upstream of several burdensome problems. See this excerpt from The Body Keeps the Score, a pop-sci review of academic trauma research (on p. 150):
The first time I heard Robert Anda present the results of the ACE study, he could not hold back his tears. In his career at the CDC he had previously worked in several major risk areas, including tobacco research and cardiovascular health.
But when the ACE study data started to appear on his computer screen, he realized that they had stumbled upon the gravest and most costly public health issue in the United States: child abuse.
[Anda] had calculated that its overall costs exceeded those of cancer or heart disease and that eradicating child abuse in America would reduce the overall rate of depression by more than half, alcoholism by two-thirds, and suicide, IV drug use, and domestic violence by three-quarters. It would also have a dramatic effect on workplace performance and vastly decrease the need for incarceration.
Psychedelic therapy seems very promising for resolving PTSD, which could plausibly break the cycle of abuse that creates new traumatic experiences. (Trauma appears to transfer from generation to generation via multiple pathways.)
In particular, MDMA-assisted psychotherapy for PTSD is yielding extremely promising results in recent randomized controlled trials (see Mithoefer et al. 2012, Mithoefer et al. 2018, Ot’alora et al. 2018). From the abstract of Mithoefer et al. 2018:
At the primary endpoint, the 75 mg and 125 mg groups had significantly greater decreases in PTSD symptom severity (mean change CAPS-IV total scores of −58·3 [SD 9·8] and −44·3 [28·7]; p=0·001) than the 30 mg group (−11·4 [12·7]). Compared with the 30 mg group, Cohen’s d effect sizes were large: 2·8 (95% CI 1·19–4·39) for the 75 mg group and 1·1 (0·04–2·08) for the 125 mg group.
…
PTSD symptoms were significantly reduced at the 12-month follow-up compared with baseline after all groups had full-dose MDMA (mean CAPS-IV total score of 38·8 [SD 28·1] vs 87·1 [16·1]; p<0·0001).
A Cohen’s d of 2.8 is extremely large (“Cohen suggested that d = 0.2 be considered a ‘small’ effect size, 0.5 represents a ‘medium’ effect size and 0.8 a ‘large’ effect size” source). Here’s a good resource for interpreting Cohen’s d.
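For reference, Cohen’s d is simply the difference between two group means scaled by a pooled standard deviation; the exact figure a paper reports also depends on the group sizes and which variance estimate it uses:

$$d = \frac{\bar{x}_1 - \bar{x}_2}{s_{\text{pooled}}}, \qquad s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}$$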
In this study, 30 mg of MDMA was used as an active placebo, and the intervention groups were given 75 mg or 125 mg of MDMA.
From Mithoefer et al. 2012, a long-term follow-up of the first MDMA RCT:
We found the majority of these subjects with previously severe PTSD who were unresponsive to existing treatments had symptomatic relief provided by MDMA-assisted psychotherapy that persisted over time...
MDMA helped resolve severe PTSD symptoms in patients who had not responded to other treatment regimens. For 86% of patients, this benefit persisted 17+ months after the MDMA session.
3(e). Other (speculative) arguments for impact
Below are a few other mechanisms by which psychedelics could be highly impactful. These are speculative and I suggest holding them weakly – most of my own interest in psychedelic research & advocacy as an EA cause area is driven by the arguments given above.
Psychedelics may boost creativity & problem-solving. See Harman et al. 1966 – 27 white-collar professionals took mescaline and worked on a problem they had been stuck on.[3]
Outcomes for the problems participants had been stuck on:
A commercial building design, accepted by the client
Design of a linear electron accelerator beam-steering device
Engineering improvement to a magnetic tape recorder
A chair design, modeled and accepted by the manufacturer
A mathematical theorem regarding NOR gate circuits
Design of a private dwelling, approved by the client
Anecdotally, Steve Jobs said that taking LSD was “one of the most important things in my life.”
Eric Weinstein (mathematician, venture capitalist), Sam Harris (public intellectual), Aldous Huxley (writer), Kary Mullis (inventor of the polymerase chain reaction), and Douglas Engelbart (computer scientist, inventor of the computer mouse) have all spoken highly of psychedelics and/or are known to use them.
Psychedelics may expand one’s circle of moral concern. As far as I know, there hasn’t yet been a randomized study of the relationship between psychedelic use & altruism. (Though I’d love to see one!)
Lerner & Lyvers 2006 found that psychedelic users scored higher on empathy scales than non-drug users, though non-psychedelic drug users also had higher empathy scores than non-drug users, so it’s not clear if there was an effect driven by psychedelic use.
Anecdotally, see this essay by anonymous philanthropist Pine, who attributes their decision to set up the Pineapple Fund to a psychedelic experience: “I’ve had the idea to donate a large portion of my bitcoins for a while, and it is through this journey of discovering myself [with the help of psychedelics] that made me actually commit to it.”
Info hazard. Finally, there’s a pathway to impact that I give some weight to, but can’t discuss publicly due to concern that talking about the pathway would be an information hazard. If you’re interested in learning more about this consideration, shoot me an email.
(This consideration is only a small part of why I’m excited about psychedelics, and I’d still be bullish about psychedelics as an EA cause area in its absence.)
4. Neglectedness and tractability
How do psychedelics perform under the importance, neglectedness, tractability framework? The “importance” consideration is discussed above. Neglectedness and tractability are considered below.
4(a). Neglectedness
In short, psychedelic research is neglected. As discussed here, scientific research is very expensive, and current psychedelic research is happening only because a handful of private donors are funding it. (For more background, see this list of Open Phil analyses of the scientific research funding landscape.)
Governments are a major driver of scientific funding (NIH intends to grant $39.1 billion in 2019), and governments have not funded psychedelic research at all (presumably due to concern about optics, as well as bureaucracies being generally slow to update their stance).
From conversations I’ve had over the past two years, psychedelic research seems highly funding constrained, such that more money would convert into more research.
Roughly $40 million has been committed to psychedelic research since 2000.[4]
In comparison, the Open Philanthropy Project has granted $104 million to scientific research since 2011.
Another comparison point: Open Phil’s grant to establish CSET ($55 million) is larger than the total amount spent on psychedelic research in the last 20 years, worldwide.
4(b). Tractability
I’ve spent the last two years learning about the various projects going on in the psychedelic community. My general conclusion is that there are many tractable projects that could be happening but aren’t, due to lack of funding & capacity.
Here are the most important lines of psychedelic work currently happening:
MDMA is being shepherded through the FDA approval process for PTSD treatment by MAPS.
Psilocybin is being shepherded through the FDA approval process as a treatment for depression by both Compass Pathways and the Usona Institute (independently).
State-level political campaigns to liberalize psychedelics are underway in Oregon & California. The city of Denver recently decriminalized psilocybin, by a narrow margin.
Both MDMA for PTSD & psilocybin for depression have received breakthrough therapy designation from the FDA, which indicates that the FDA thinks that these treatments “may demonstrate substantial improvement over available therapy.”
MDMA therapy is on track to be made available as a prescription medicine in 2021 or 2022. Psilocybin therapy is on track to be made available as a prescription medicine sometime between 2022 and 2025.
There’s a huge amount of work that needs to be done to make these rescheduling processes go well. There are insurance considerations, marketing considerations, regulatory considerations, and legal considerations that haven’t been figured out yet but need to be.
There isn’t an obvious place for psychedelic therapy in the current US healthcare infrastructure, so new infrastructure will need to be built (e.g. clinics for administering psychedelic therapy). There are opportunities to address this from both non-profit and for-profit approaches.
Furthermore, we still don’t granularly understand how psychedelics work neurologically, and we haven’t explored much of the space of possible applications of these substances. More research is needed – research groups at Johns Hopkins, NYU, Yale, the Heffter Research Institute, and Imperial College London all have deep research agendas they’d like to pursue, and they are all funding-limited.
Finally, as rescheduling approaches & psychedelics gain popularity, there’s a massive amount of public education work that needs to be done. The psychedelic experience is not without risk. As a greater number of inexperienced people have psychedelic trips, it’s important to propagate good guidelines for how to approach psychedelics in a safe & respectful way.
5. Conclusion
I’m offering three prizes for the best arguments against psychedelic research & advocacy being an EA cause area. Any argument submitted as a comment on this post will be considered for these prizes.
I’ll award $400 to the most-upvoted argument, $200 to the second-most-upvoted argument, and $100 to the third-most-upvoted argument. Winners will be assessed on Monday, June 3rd.
The driving motivation for this post is to surface arguments against psychedelic research & advocacy being an EA cause area. My current view is that psychedelics are an extremely promising altruistic cause area (on par with global health & x-risk reduction) – I’d like to learn more about how this might be mistaken.
If my view about psychedelics being a very promising cause area isn’t mistaken, I’d like to see more attention paid to psychedelic research & advocacy by the EA community.
Cross-posted to LessWrong. Thanks to the Clarity Health Fund for a microgrant that enabled me to offer these prizes. This post was inspired by Grue_Slinky’s prize for the best argument against the EA Hotel.
[1]: More arguments in support of psychedelics being an EA cause area in these posts: Legal psychedelic retreats launching in Jamaica, Psychedelics Normalization, “A Psychedelic Renaissance” (Chronicle of Philanthropy)
[2]: Even for this case, you could imagine some scenarios where the intervention doesn’t result in good outcomes.
To the extent there are not-good scenarios you could say it’s not robust. It’s probably best to model “robustness to cluelessness” as a continuum, wherein some interventions are more robust than others.
[3]: In correspondence with one of the study authors, it was revealed that some of the study participants were also given methamphetamine. The author said that back in the 1960s, they were considering methamphetamine to be similar to caffeine, so did not think much about administering it. Nevertheless, the study results are seriously confounded because of this.
[4]: Estimate confirmed via private correspondence.
[Own views]
0) I don’t know what the bar for calling something a ‘cause area’ or ‘EA interest’ should be, but I think this bar should be above (e.g.) ‘promising new drug treatment for bipolar disorder’, even though this is unequivocally a good thing. Wherever exactly this bar falls (I don’t think it needs to be ‘as promising as global health’), I don’t think psychedelics meet it.
1) My scepticism on the mental health benefits of psychedelics mainly relies on second-order causes for concern, namely:
1.1) There’s some weak wisdom of nature prior that blasting one of your neurotransmitter pathways for a short period is unlikely to be helpful. This objection is pretty weak, given existing psychiatric drugs are similarly crude (although one of their advantages by the lights of this consideration is they generally didn’t come to human attention by previous recreational use).
1.2) I get more sceptical as the number of (fairly independent) ‘upsides’ of a proposed intervention increases. The OP notes psychedelics could help with anxiety and depression and OCD and addiction and PTSD, which looks remarkably wide-ranging and gives suspicion of a ‘cure looking for a disease’. (That they are often mooted as having still other benefits on people without mental health issues such as improving creativity and empathy deepens my suspicion). Likewise, a cause that is proposed to be promising on long-termism and its negation pings suspicious convergence worries.
1.3) (Owed to Scott Alexander’s recent post). The psychedelic literature mainly comprises small studies generally conducted by ‘true believers’ in psychedelics and often (but not always) on self-selected and motivated participants. This seems well within the territory of scientific work vulnerable to replication crises.
1.4) Thus my impression is that although I wouldn’t be shocked if psychedelics are somewhat beneficial, I’d expect them to regress at least as far down as the efficacies observed in existing psychopharmacology, probably worse, and plausibly to zero. Adding to the armamentarium of therapy for mental illness (in expectation) is worthwhile, but not enough for a big slice of EA opinion: it being a promising candidate for further exploration relies on ‘neartermism’ and (conditional on this) the belief that mental health is similarly promising to standard global health interventions on NTDs etc.
2) On the ‘longtermism’ side of the argument, I agree it would be good—and good enough to be an important ‘cause’ - if there were ways of further enhancing human capital. (I bracket here the proposed mental health benefits, as my scepticism above applies even more strongly to the case that psychedelics are promising based on their benefits to EA community members’ mental health alone).
My impression is most of the story for ‘how do some people perform so well?’ will be a mix of traits/‘unmodifiable’ factors (e.g. intelligence, personality dispositions, propitious upbringing); very boring advice (e.g. ‘Sleep enough’, ‘exercise regularly’); and happenstance/good fortune. I’d guess there will be some residual variance left on the table after these have taken the lion’s share, and these scraps would be important to take. Yet I suspect a lot of this will be pretty idiographic/reducible to boring advice (e.g. anecdotally, novelists have their own peculiar habits for writing: IIRC Nabokov used index cards, Pullman has a writing shed, Gaiman a ‘novel writing pen’ - maybe ‘having a ritual for dedicated work’ matters, but which one is a matter of taste).
The evidence for psychedelic ‘enhancement’ is even thinner than psychedelic therapy, and labours under a more adverse prior. I agree the case for psychedelics here is comparable to CFAR/Paradigm/rationality training, but I would rule both out, not in.
3) I agree with agdfoster that psychedelics have reputational costs. This ‘bad rap’ looks unfair to me (notwithstanding the above, I’m confident that an ‘MDMA habit’ is much better for you than an alcohol, smoking, extreme sports, or social media one, none of which attract similar opprobrium), but it is decision-relevant all the same. If the upside was big enough, these costs would be worth paying, but I don’t think they are.
The data doesn’t support this, and generally suggests that 1-3 psychedelic experiences can have beneficial effects lasting 6 months or longer. See for example Carhart-Harris et al. 2018:
“Although limited conclusions can be drawn about treatment efficacy from open-label trials, tolerability was good, effect sizes large and symptom improvements appeared rapidly after just two psilocybin treatment sessions and remained significant 6 months post-treatment in a treatment-resistant cohort.”
Griffiths et al. 2016:
“High-dose psilocybin produced large decreases in clinician- and self-rated measures of depressed mood and anxiety, along with increases in quality of life, life meaning, and optimism, and decreases in death anxiety. At 6-month follow-up, these changes were sustained, with about 80% of participants continuing to show clinically significant decreases in depressed mood and anxiety.”
Johnson et al. 2017:
“All 15 participants completed a 12-month follow-up, and 12 (80%) returned for a long-term (≥16 months) follow-up, with a mean interval of 30 months (range = 16 – 57 months) between target-quit date (i.e., first psilocybin session) and long-term follow-up. At 12-month follow-up, 10 participants (67%) were confirmed as smoking abstinent. At long-term follow-up, nine participants (60%) were confirmed as smoking abstinent.”
I would push back against the idea that these upsides are as independent as they may seem. Depression and anxiety are often comorbid (Hirschfeld 2001) and often comorbid with addiction (Quello 2005), OCD (Tukel 2002) and eating disorders (Marucci 2018). It seems that similar neurological states and cognitive processes underlie these mental disorders, which is why psychedelics can effectively treat them all.
Carhart-Harris et al. 2017, for example, suggest “connectedness” as the mechanism:
“A sense of disconnection is a feature of many major psychiatric disorders, particularly depression, and a sense of connection or connectedness is considered a key mediator of psychological well-being, as well as a factor underlying recovery of mental health. One of the most curious aspects of the growing literature on the therapeutic potential of psychedelics is the seeming general nature of their therapeutic applicability, i.e. they have shown promise not just for the treatment of depression but for addictions, anxiety and obsessive-compulsive disorder. This raises the question of whether psychedelic therapy targets a core factor underlying mental health. We believe that it does, and that connectedness is the key.”
A secondary point here is that substances with different pharmacological and phenomenological effects are all grouped under the term “psychedelic”. MDMA, for example, works and feels differently from ketamine, which works and feels differently from “classical” psychedelics like LSD, psilocybin, and DMT. So while it may seem unlikely that psychedelics (understood as one uniform thing) could have a range of benefits, it makes more sense when psychedelics are understood as a category that includes different substances.
I think that the wisdom of nature prior would say that we shouldn’t expect blasting a neurotransmitter pathway to be evolutionarily adaptive on average. If we know why something wouldn’t be adaptive, then it seems like it doesn’t apply. This prior would argue against claims like “X increases human capital”, but not claims like “X increases altruism”, since there’s a clear mechanism whereby being much more altruistic than normal is bad for inclusive genetic fitness.
I would worry about this more if the OP were referring to a specific intervention rather than a class of interventions. I think that the concern about being good on longterm and shortterm perspectives is reasonable, though there is a proposed mechanism (healing emotional blocks) that is related to both.
Normal drug discovery seems to be based off of coming up with hypotheses, then testing many chemicals to find statistically significant effects. In contrast, these trials are investigating chemicals that people are already taking for their effects. Running many trials then continuing the investigations that find significance is a good way to generate false positives, but that doesn’t seem to be the case here, and I would be surprised to find zero effect (as opposed to shorter or different effects) if it were investigated more thoroughly.
I also think that improving human capital is important, and am not convinced that this is a clear and unambiguous winner for that goal. I’m curious about what evidence would make you more optimistic about the possibility of large improvements to human capital.
I think small studies are also more vulnerable to publication bias.
On the flip side, it may be possible that the “true believers” actually are on to something, but they have a hard time formalizing their procedure into something that can be replicated on a massive scale. So if larger studies fail to replicate the results from the small studies, this may be the reason why.
Do you have any examples of this actually happening? I have seen it as an excuse for things that never pan out many times, but I don’t recall an instance of it actually delivering. E.g. in Many Labs 2 and other mass reproducibility efforts, you don’t find a minority of experimenters with a ‘knack’ who get the effect but can’t pass it on to others.
I don’t have data either way, but “knacks” for psychotherapy feel more plausible to me than “knacks” for producing the effects in Many Labs 2 (just skimming over the list of effects here). Like, the strongest version of this claim is that no one is more skilled than anyone else at anything, which seems obviously false.
Suppose we conduct a study of the Feynman problem-solving algorithm: “1. Write down the problem. 2. Think real hard. 3. Write down the solution.” A n=1 study of Richard Feynman finds the algorithm works great, but it fails to replicate on a larger sample. What is your conclusion: that the n=1 result was spurious, or that Feynman has useful things to teach us but the 3-step algorithm didn’t capture them?
I haven’t read enough studies on psychedelics to know how much room there is in the typical procedure for a skilled therapist to make a difference though.
Wouldn’t 1.2), 1.3), and 1.4) point towards funding more psychedelic research?
(To prove or disprove the benefits found in the early-stage trials?)
It does, but while that’s enough to make it worthwhile on the margin of existing medical research, it’s not enough to make it a priority for the EA community.
Are you saying that EA shouldn’t fund confirmatory research, in general?
Or are you saying that there’s something in particular about this research, such that EA shouldn’t fund confirmatory research in this case?
The latter. EA shouldn’t fund most research, but whether it is confirmatory or not is irrelevant. Psychedelics shouldn’t make the cut if we expect (as I argue above) a lot of failure to replicate and regression, and the true effect to be unexceptional in the context of existing mental health treatment.
Got it, thanks!
I feel confused about why you think psychedelics shouldn’t make the cut. The present state of research (several small-n studies finding very large effect sizes) seems consistent with both:
The world in which psychedelics are in fact a promising intervention
The world in which the current promise of psychedelics is an artifact of our academic knowledge-generating process
It seems like the only way to know which world we’re in is to do confirmatory research.
That sounds a bit like the argument ‘either this claim is right, or it’s wrong, so there’s a 50% chance it’s true.’
One needs to attend to base rates. Our bad academic knowledge-generating process throws up many, many illusory interventions with purported massive effects for each amazing intervention we find, and the amazing interventions that we do find disproportionately were easier to show (with the naked eye, visible macro-correlations, consistent effects with well-powered studies, etc).
People are making similar arguments about cold fusion, psychic powers (of many different varieties), many environmental and nutritional contaminants, brain training, carbon dioxide levels, diets, polyphasic sleep, assorted purported nootropics, many psychological/parenting/educational interventions, etc.
Testing how your prior applies across a spectrum of other cases (past and present) is helpful for model checking. If psychedelics are a promising EA cause how many of those others qualify? If many do, then any one isn’t so individually special, although one might want to have a systematic program of systematically doing rigorous testing of all the wacky claims of large impact that can be tested cheaply.
If not, then it would be good to explain what exactly makes psychedelics different from the rest.
I think the case for psychedelics the OP has made doesn’t pass this standard yet, so doesn’t meet the standard for an EA cause area.
From what I understand, effect size is one of the better ways to predict whether a study will replicate. For example, this paper found that 77% of replication effect sizes reported were within a 95% prediction interval based on the original effect size.
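(For context, a common way to form such a prediction interval is to center it on the original estimate and widen it by the combined uncertainty of the original and replication studies; roughly:

$$\text{95\% PI} \approx d_{\text{orig}} \pm 1.96\,\sqrt{SE_{\text{orig}}^{2} + SE_{\text{rep}}^{2}}$$

The exact formulation used in that paper may differ.)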
As a spot check, you say that brain training has massive purported effects. I looked at the research page of Lumosity, a company which sells brain training software. I expect their estimates of the effectiveness of brain training to be among the most optimistic, but their highlighted effect size is only d = 0.255.
A caveat is that if an effect size seems implausibly large, it might have arisen due to methodological error. (The one brain training study I found with a large effect size has been subject to methodological criticism.) Here is a blog post by Daniel Lakens where he discusses a study which found that judges hand out much harsher sentences before lunch:
However, I think psychedelic drugs arguably do pass this test. During the 60s, before they became illegal, a lot of people kind of were talking about how society would reorganize itself around them. And forget about performing surgery or driving while you are tripping.
The way I see it, if you want to argue that an effect isn’t real, there are two ways to do it. You can argue that the supposed effect arose through random chance/p-hacking/etc., or you can argue that it arose through methodological error.
The random chance argument is harder to make if the studies have large effect sizes. If the true effect is 0, it’s unlikely we’ll observe a large effect by chance. If researchers are trying to publish papers based on noise, you’d expect p-values to cluster just below the p < 0.05 threshold (see p-curve analysis)… they’re essentially going to publish the smallest effect size they can get away with.
The methodological error argument could be valid for a large effect size, but if this is the case, confirmatory research is not necessarily going to help, because confirmatory research could have the same issue. So at that point your time is best spent trying to pinpoint the actual methodological flaw.
This is exactly what p-values are designed for, so you are probably better off looking at p-values rather than effect size if that’s the scenario you’re trying to avoid.
I suppose you could imagine that p-values are always going to be just around 0.05, and that for a real and large effect size people use a smaller sample because that’s all that’s necessary to get p < 0.05, but this feels less likely to me. I would expect that with a real, large effect you very quickly get p < 0.01, and researchers would in fact do that.
(I don’t necessarily disagree with the rest of your comment, I’m more unsure on the other points.)
Yes, this is a better idea.
This comment is a wonderful crystallisation of the ‘defensive statistics’ of Andrew Gelman, James Heathers and other great epistemic policemen. Thanks!
I’m not claiming this. I’m claiming that given the research to date, more psychedelic research would be very impactful in expectation. (I’m at like 30-40% that the beneficial effects are real.)
I haven’t read the literatures for all the examples you gave. For psychic powers & cold fusion, my impression is that confirmatory research was done and the initial results didn’t replicate.
So one difference is that the main benefits of psychedelic therapy haven’t yet failed to replicate.
> I’m at like 30-40% that the beneficial effects are real.
Right, so you would want to show that 30-40% of interventions with similar literatures pan out. I think the figure is less.
Scott referred to [edit: one] failure to replicate in his post.
Scott referred to one failure to replicate, for a finding that a psychedelic experience increased trait openness. This isn’t one of the benefits cited by the OP.
More on psychedelics & Openness:
Also:
I think we have a disagreement about what the appropriate reference class here is.
The reference class I’m using is something like “results which are supported by 2-3 small-n studies with large effect sizes.”
I’d expect roughly 30-40% of such results to hold up after confirmatory research.
Somewhat related: 62% of results assessed by Camerer et al. 2018 replicated.
It’s a bit complicated to think about replication re: psychedelics because the intervention is showing promise as a treatment for multiple indications (there are a couple studies showing large effect sizes for depression, a couple studies showing large effect sizes for anxiety, a couple studies showing large effect sizes for addictive disorders).
Could you say a little more about what reference class you’re using here?
The real goal you seem to be advancing, Milan, is spirituality, not psychedelics per se. Based on testimony from people I trust and some slightly dubious research, I think psychedelics can likely be helpful in that, but they shouldn’t be our frontline tool. I think meditation is a much better candidate for that.
Sam Harris and Michael Pollan argue that psychedelics are useful for convincing people there’s a there there, and that makes sense to me. You have to put a lot of time and blind effort into meditation to get that same assurance. But the struggle, and particularly “asking” for deeper wisdom through your faithful efforts, is a really important part of spiritual realization according to most traditions (and in my personal experience). Based on what I’ve read (haven’t taken them), I don’t think taking psychedelics often does the trick on its own.
And there are many downsides to psychedelics. People who don’t know how mentally unstable they are may take them and be thrown badly off-kilter. Bad trips are harrowing and can reach unimaginable heights of terror. I don’t think most people have the slightest clue how deeply and completely their minds could torture them. Even if people one day are grateful for what they’ve been through (as I am now with my mental illness), I would not knowingly inflict that risk on people when there are gentler ways. Even intense meditation can have these destabilizing effects, but psychedelics are much more potent, can’t be stopped on demand, and can be wielded by totally unskilled people. My guess is that the most common harm comes from tripping habitually out of sensation-seeking rather than humbly to gain self-insight or wisdom. Again, this can happen in meditation, too, but it’s a lot less likely. When you add in all the infrastructure necessary to mitigate these risks, like comprehensive mental health screenings and guides and practice sessions, doing psychedelics right doesn’t seem that much easier than a meditation retreat, and it doesn’t teach you any skills. The advantage of psychedelics at that point is speed and the guarantee that some experience of altered consciousness will take place, which is not nothing, but all this safety equipment undercuts the elegance of “just taking a little pill” that proponents have harped on.
Psychedelics could be a more EA-style intervention than meditation (if either of them qualify) because pills are scalable, but creating a safe environment with skilled guides is a lot less so. Meditation can be taught by one teacher to many people in parallel with much less equipment. It can even be taught pretty well through apps. Meditation takes longer to reach the experiences/insights psychedelics throw up in your face, but they are more digestible through meditation and insight alone is insufficient for most people to transform their lives—the vast majority also need skills like equanimity acquired through practice.
Psychedelics probably have a role to play, but I do not think they are the magic bullet proponents claim they are. They come with serious dangers, and mitigating those dangers undercuts their scalability, which was imo their biggest EA selling point. Safer alternatives, the vast array of meditative schools and techniques, exist. Psychedelics have some advantages over traditional meditation—speed and guaranteed action—but they are no panacea. My best guess is that they should be a targeted prescription for certain roadblocks on the spiritual path.
Slate Star Codex just published on this. His argument is basically “lots of things look very promising and then fail, and LSD is especially prone to this because it stimulates the insight part of your brain,” although I encourage everyone to read the full post because obviously there’s more to it.
If this comment wins a prize I’ll pass it on to Scott.
Full disclosure: I won a prize and attempted to pass the winnings on to Scott, but he turned me down.
Easy money :-)
I read Scott as mainly arguing that:
(a) the promising results found in psychedelic research so far may not replicate, and
(b) even if psychedelics are effective in certain settings, US healthcare infrastructure isn’t configured in a way that will promote those settings
(a) seems to be an argument for doing confirmatory research of the initial results (more discussion of that in this thread).
(b) seems like a valid concern (and is currently a live debate amongst psychedelic advocates).
Psychedelic therapy involves both a psychotherapeutic component & a pharmacological component (and a much bigger one than just “here’s a prescription for some pills, take one pill a day”), so it sits at the intersection of our pharmacology institutions and our psychology institutions.
I think meditation retreat centers & psychotherapy clinics are interesting comparables for how psychedelic therapy could be structured as it enters the US mainstream.
Boring answer warning!
The best argument against most things being ‘an EA cause area’† is simply that there is insufficient evidence in favour of the thing being a top priority.
I think future generations probably matter morally, so the information in sections 3(a), 3(b) and 4 matter most to me. I don’t see the information in 3(a) or 3(b) telling me much about how leveraged any particular intervention is. There is info about what a causal mechanism might be, but analysis of the strength is also needed. (For example, you say that psychedelic interventions are plausibly in the same ballpark of effectiveness of other interventions that increase the set of well-intentioned + capable people. I only agree with this because you use the word ‘plausibly’, and plausibly...in the same ballpark isn’t enough to make something an EA cause area.) I think similarly about previous discussion I’ve seen about the sign and magnitude of psychedelic interventions on the long-term future. (I’m also pretty sceptical of some of the narrower claims about psychedelics causing self-improvement.††)
I did appreciate your coverage in section 4 of the currently small amount of funding and what is getting done as a result, which seems like it could form part of a more thorough analysis.†††
My amateur impression is that Michael Plant has made a decent start on quantifying near-term effects, though I don’t think anyone should take my opinion on that very seriously. Regardless of that start looking good, I would be unsurprised if most people who put less weight on future generations than me still wanted a more thorough analysis before directing their careers towards the cause.
As I said, it’s a boring answer, but it’s still my true objection to prioritising this area. I also think negative PR is a material consideration, but I figured someone else will cover that.
-----
† Here I’m assuming that ‘psychedelics being an EA cause area’ would eventually involve effort on a similar scale to the areas you’re directly comparing it to, such as global health (say ~100 EAs contributing to it, ~$10m in annual donations by EA-aligned people). If you weaken ‘EA cause area’ to mean ‘someone should explore this’, then my argument doesn’t work, but the question would then be much less interesting.
†† I think mostly this comes from me being pretty sceptical of claims of self-improvement which don’t have fairly solid scientific backing. (e.g. I do deep breathing because I believe that the evidence base is good, but I think most self-improvement stuff is random noise.) I think that the most important drivers of my intuitions for how to handle weakly-evidenced claims have been my general mathematical background, a few week-equivalents trying to understand GiveWell’s work, this article on the optimiser’s curse, and an attempt to simulate the curse to get a sense of its power. Weirdness aversion and social stuff may be incorrectly biasing me, but e.g. I bought into a lot of the weirdest arguments around transformative AI before my friends at the time did, so I’m not too worried about that.
††† I also appreciated the prize incentive, without which I might not have written this comment.
Curious for your take on this part of the OP:
I believe you when you say that psychedelic experiences have an effect of some (unknown) size on emotional blocks & unhelpful personal narratives, and that this would change workers’ effectiveness by some (unknown) amount. However, even assuming that the unknown quantities are probably positive, this doesn’t tell me whether to prioritise it any more than my priors suggest, or whether it beats rationality training.
Nonetheless, I think your arguments should be either compelling or something of a wake-up call for some readers. For example, if a reader does not require careful, quantified arguments to justify their favoured cause area†, they should also not require careful, quantified arguments about other things (including psychedelics).
† For example, but by no means exclusively, rationality training.
[Edited for kindness while keeping the meaning the same.]
Got it. (And thanks for factoring in kindness!)
There hasn’t been very much research on psychedelics for “well” people yet, largely because under our current academic research regime, it’s hard to organize academic RCTs for drug effects that don’t address pathologies.
The below isn’t quite apples-to-apples, but perhaps it’s helpful as a jumping-off point.
CFAR’s 2015 longitudinal study found:
Carhart-Harris et al. 2018, a study of psilocybin therapy for treatment-resistant depression, found:
Not apples-to-apples, because a population of people with treatment-resistant depression is clearly different than a population of CFAR workshop participants. But both address a question something like “how happy are you with your life?”
Even if you add a steep discount to the Carhart-Harris 2018 effect, the effect size would still be comparable to the CFAR effect size – let’s assume that 90% of the treatment effect is an artifact of the study due to selection effects, small study size, and factors specific to having treatment-resistant depression.
Assuming a 90% discount, psilocybin would still have an adjusted Cohen’s d = 0.14 (6 months after treatment), roughly in the ballpark of the CFAR workshop effect (d = 0.17).
To explicitly separate out two issues that seem to be getting conflated:
Long-term-focused EAs should make use of the best mental health care available, which would make them more effective.
Some long-term-focused EAs should invest in making mental health care better, so that other long-term-focused EAs can have better mental health care and be more effective.
The former seems very likely true.
The latter seems very likely false. You would need the additional cost of researching, advocating for and implementing a specific new treatment (here, psilocybin) across some entire geography to be justified by the expected improvement in mental health care (above what already exists) for specifically long-term-focused EAs in that geography (<0.001% of the population). The math for that seems really unlikely to work out.
I continue to focus on the claims about this being a good long-term-focused intervention because that’s what is most relevant to me.
-----
Non-central notes:
We’ve jumped from emotional blocks & unhelpful personal narratives to life satisfaction & treatment-resistant depression, which are very different.
As you note, the two effects you’re now comparing (life satisfaction & treatment-resistant depression) aren’t really the same at all.
I don’t think that straightforwardly comparing two Cohen’s d measurements is particularly meaningful when comparing across effect types.
fwiw I think negative self-talk (a kind of emotional block) & unhelpful personal narratives are big parts of the subjective experience of depression.
Comparing dissimilar effects is a core part of EA-style analysis, right?
I’m not arguing against trying to compare things. I was saying that the comparison wasn’t informative. Comparing dissimilar effects is valuable when done well, but comparing d-values of different effects from different interventions tells you very little.
Probably the crux here is that I think rationality training & the psychedelic experience can achieve similar kinds of behavior change (e.g. less energy spent on negative self-talk & unhelpful personal narratives) such that their effect sizes can be compared.
Whereas you think that rationality training & the psychedelic experience are different enough that believable comparison isn’t possible.
Does that sound right to you?
Does this mean you think that projects like CFAR & Paradigm Academy shouldn’t be associated with the EA plank?
Psychedelic interventions seem promising because they can plausibly increase the number of capable people focused on long-termist work, in addition to plausibly boosting the efficacy of those already involved. (See section 3(a) of the OP.)
The marginal value of each additional value-aligned + capable long-termist is probably quite high.
Pointing out that there are two upsides is helpful, but I had just made this claim:
It would be helpful if you could agree with or contest that claim before we move on to the other upside.
-
Rationality projects: I don’t care to arbitrate what counts as EA. I’m going to steer clear of present-day statements about specific orgs, but you can see my donation record from when I was a trader on my LinkedIn profile.
Isn’t much of the present discussion about “what counts as EA?”
Maybe I’m getting hung up on semantics. The question I most care about here is: “what topics should EAs dedicate research capacity & capital to?”
Does that seem like a worthwhile question?
Right. I’m saying that the math we should care about is:
effect from boosting efficacy of current long-termist labor + effect from increasing the amount of long-termist labor + effect from short-termist benefits
I think that math is likely to work out.
Given your priors, we’ve been discounting “effect from short-termist benefits” to 0.
So the math is then:
effect from boosting efficacy of current long-termist labor + effect from increasing the amount of long-termist labor
And I think that is also likely to work out, though the case is somewhat weaker when we discount short-termist benefits to 0.
(I also disagree with discounting short-termist benefits to 0, but that doesn’t feel like the crux of our present disagreement.)
Let’s go. Upside 1:
Adding optimistic numbers to what I already said:
Let’s say EAs contribute $50m† of resources per successful drug being rolled out across most of the US (mainly contributing to research and advocacy). We ignore costs paid by everyone else.
This somehow causes rollout about 3 years earlier than it would otherwise have happened, and doesn’t trade off against the rollout of any other important drug.
At any one time, about 100 EAs†† use the now-well-understood, legal drug, and their baseline productivity is average for long-term-focused EAs.
This improves their productivity by an expected 5%††† vs alternative mental health treatment.
Bottom line: your $50m buys you about 100 x 5% x 3 = 15 extra EA-years via this mechanism, at a price of $3.3m per person-year.
Suppose we would trade off $300k for the average person-year††††. This gives a return on investment of about $300k/$3.3m = 0.09x. Even with optimistic numbers, upside 1 justifies a small fraction of the cost, and with midline estimates and model errors I’d expect more like a ~0.001x multiplier. Thus, this part of the argument is insignificant.
-----
Also, I’ve decided to just reply to this thread, because it’s the only one that seems decision-relevant.
† Various estimates of the cost of introducing a drug here, with a 2014 estimate being $2.4bn. I guess EAs could only cover the early stages, with much of the rest being picked up by drug companies or something.
†† Very, very optimistically, 1,000 long-term-focused EAs in the US, 10% of the population suffer from relevant mental health issues, and all of them use the new drug.
††† This looks really high but what do I know.
†††† Pretty made up but don’t think it’s too low. Yes, sometimes years are worth more, but we’re looking at the whole population, not just senior staff.
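For ease of tweaking, here is the same estimate as a short script. Every input is one of the optimistic, dagger-flagged assumptions above rather than an established figure:

```python
# Back-of-the-envelope version of the "upside 1" estimate above.
# All inputs are the optimistic assumptions flagged with daggers, not established figures.
ea_spend = 50e6            # † EA resources per successful US rollout, USD
years_earlier = 3          # rollout brought forward by ~3 years
ea_users = 100             # †† long-term-focused EAs using the drug at any one time
productivity_gain = 0.05   # ††† expected productivity boost vs. alternative treatment
value_per_ea_year = 300e3  # †††† value we'd trade for an average EA person-year, USD

extra_ea_years = ea_users * productivity_gain * years_earlier   # = 15
cost_per_ea_year = ea_spend / extra_ea_years                    # ≈ $3.3m
roi = value_per_ea_year / cost_per_ea_year                      # ≈ 0.09x

print(f"extra EA-years: {extra_ea_years:.0f}")
print(f"cost per EA-year: ${cost_per_ea_year / 1e6:.1f}m")
print(f"return on investment: {roi:.2f}x")
```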
An EA contribution of far less than $50m would be leveraged.
The $2.4bn estimate doesn’t apply well to psychedelics, because there’s no cost of drug discovery here (the drugs in question have already been discovered).
As a data point, MAPS has shepherded MDMA through the three phases of the FDA approval process with a total spend of ~$30m.
The current most important question for legal MDMA & psilocybin rollout in the US is not when, but at what quality. We’re at a point where the FDA is likely (>50% chance) going to reschedule these drugs within the next 5 years (both have received breakthrough therapy designation from the FDA).
Many aspects of how FDA rescheduling goes are currently undetermined (insurance, price, off-label prescription, setting in which the drugs can be used). A savvy research agenda + advocacy work could tip these factors in a substantially more favorable direction than would happen counterfactually.
Doing research & advocacy in this area scales fairly linearly (most study designs I’ve seen cost between $50k and $1m, and advocates can be funded for a year for $60k–$90k).
From the OP:
So somewhere between 34.1% and 65.4% of SSC readers report having a relevant mental health issue (depending on how much overlap there is between the reports of anxiety & reports of depression).
I think SSC readers are an appropriate comparison class for long-term-focused EAs.
That said, I agree with the thrust of this part of your argument. There just aren’t very many people working on long-termist stuff at present. Once all of these people are supported by a comfortable salary, it’s not clear that further spend on them is leveraged (i.e. not clear that there’s a mechanism for converting more money to more research product for the present set of researchers, once they’re receiving a comfortable salary).
So perhaps the argument collapses to:
effect from increasing the amount of long-termist labor + effect from short-termist benefits
And because of your priors, we’re discounting “effect from short-termist benefits” to 0.
I still propose that:
effect from increasing the amount of long-termist labor
is probably worth it.
Doesn’t feel like a stretch, given that this mechanism underpins the case for most of the public-facing work EA does (e.g. 80,000 Hours, CFAR, Paradigm Academy, Will MacAskill’s book).
This was a really interesting and well-written thread! To clarify, Milan, is your argument that psychedelics would make people more altruistic, and therefore they’d start working on protecting the long term future? I didn’t quite understand your argument from the OP.
:-)
Yes, from the OP:
I was using “improving intentions” to gesture towards “start working on EA-aligned projects (including long-termist projects).”
(There’s a lot of inferential distance to bridge here, so it’s not surprising that it’s non-trivial to make my views legible. Thanks for asking for clarification.)
In general, I’m not sure people who have tried psychedelics are overrepresented in far future work, if you control for relevant factors like income and religious affiliation. What makes you think increasing the number of people who experience a change in their metaphysical assumptions due to psychedelic drugs will increase the number of people working on the far future?
I think psychedelics can make people more altruistic.
Unfortunately, at present I largely have to argue from anecdote, as there are only a few studies of psychedelics in healthy people (our medical research system is configured to focus predominantly on interventions that address pathologies).
Lyons & Carhart-Harris 2018 found some results tangential to increased altruism – increased nature-relatedness & decreased authoritarianism in healthy participants:
A study of whether psychedelics make people more altruistic is one of the studies I most want to see.
---
I don’t think the psychedelic experience per se will make people more altruistic and more focused on the longterm.
I think a psychedelic experience, paired with exposure to EA-style arguments & philosophy (or paired with alternative frameworks that heavily emphasize the longterm, e.g. the Long Now) can plausibly increase altruistic concern for the far future.
---
fwiw, controlling for religious affiliation may not be appropriate, because psychedelics may increase religiosity. (Another study I want to see!)
Arguments have been assessed & prizes awarded!
The winners –
Most upvoted: Gregory_Lewis
2nd-most upvoted: Elizabeth, cross-posting Scott Alexander’s argument
3rd-most upvoted: Kit
And I also paid out a prize to Holly_Elmore, who made what seemed to me to be the best counterargument, though it wasn’t one of the top three most upvoted.
No further comments or votes on this thread will be considered for the assessment of the prize.
Quick Summary: Despite presumed benefits, legalization would likely not increase psychedelic use in low/middle income countries, while legalization in high income countries is way too expensive to be worth funging against conventional global poverty interventions, even with extreme optimism about the budget needed to cause legalization and the resulting increase in usage.
Long Summary: For the purposes of this, I’m going to boldly just assume that psychedelics are a very good thing, and treat “giving people who want psychedelics access to psychedelics” as an endpoint. I don’t have any strong opinions about whether they actually are good or not, but I see that plenty of other people are trying to answer that question.
Hypothesis 1 (high confidence): For most of the world, the bottlenecks to accessing psychedelics have nothing to do with the law, so fighting to “decriminalize” them is premature. Populations whose bottlenecks are primarily legal live in high income countries, so everything to do with helping them is rather more expensive than other interventions. It’s therefore highly unlikely that legalizing psychedelics would effectively impact global health. Considering the scale of lack of access to mental health resources, I don’t even think that it’s a particularly effective way to improve mental health specifically.
Hypothesis 2: If you do work on this, consider focusing on lobbying, which may be surprisingly cheap compared to research and clinical trials (low confidence, worth further research). Even if lobbying for legalization succeeds, there are additional, possibly more expensive barriers that you have to hope the market sorts out on its own before psychedelics become like an ordinary drug.
...all of which hinges on psychedelics themselves being at all good or effective medically, which isn’t particularly certain. So, as far as being an “EA cause area”...well, I don’t think this is something we should be diverting fungible funds to, at least. If you had activities in mind that don’t divert significant funds, or don’t boil down to changing laws, this analysis may not apply.
Evidence
Part 1: Exactly how illegal are psychedelics, really?
Please review these maps summarizing the legality of shrooms, MDMA, and ibogaine.
Note that most of the world is GREY (no data) in all these maps. What does that mean?
To quote Wikipedia’s page on India, which I picked because it is a country with high population: “Psilocybin mushrooms are officially illegal but the police is largely unaware of their prohibition and are poorly enforced in India.”
Okay, so… based on all this, I’m going to tentatively conclude that psychedelics are basically an unenforced non-issue for the major population centers of the world, to the point where it takes a lot of work to even figure out their obscure legal status. Maybe if you actually started distributing psychedelics it would become an issue, but as of now it’s not even on the map.
Part 2: But what if it was available by prescription?
You know what else is available by prescription and probably helps mental health? Antidepressants! Here are two articles about the global picture for antidepressant use. 1) Business Insider 2) The Guardian
Eyeballing these charts, it seems pretty clear that prescription antidepressant use drops off pretty sharply outside high-income countries. It ranges from 1.1% down to 0.1% for the relatively well-off countries on this chart, and looking at the trend, I think it would be even lower than 0.1% in low-income countries. I doubt that this is because the populations of those countries just don’t need antidepressants—it’s probably an issue of access to medicine. So I’m pretty sure legalizing psychedelics is not actually going to help most people acquire psychedelics.
Part 3: Okay, but what about people for whom laws ARE a limiting factor? They matter equally!
Okay, let’s just consider the United States as the case study, since it’s easy to get data about the USA.
Americans may be the best-case scenario in terms of post-legalization uptake, because, as per the previous articles, they use the most drugs. 1 in 6 Americans use prescription psychoactives of some kind.
This is promising! Maybe Americans would benefit from using more psychedelics, and they would do so if it were legal. Marijuana legalization did increase marijuana use...although ehhhh I suggest clicking on that link and eyeballing that chart, or going into the original paper and looking at the stats if you want to quantify this. Colorado shows the most dramatic increase at ~11% to ~16% (eyeballing the chart), but e.g. Washington didn’t really see huge increases. The grayed-out lines are states that didn’t legalize, and some of them saw increases too. Still, there are probably important differences between marijuana and various psychedelics in terms of the ease of producing and acquiring them.
Anyway, how much would it cost to make it legal and available?
For the USA, full R&D to “marketing approval” of a random compound is estimated to cost $1.4b, rising to $2.9b (2013 dollars) once post-approval R&D costs are included. “Marketing approval” is a high standard: it means you can openly advertise the drug. For the USA, FDA approval for things (that have not been made illegal) costs $19m (2013 dollars).
But maybe the market would take care of that itself. What would be the lobbying costs?
Here’s a linkdump of preliminary research on the landscape of current lobbying spending to legalize marijuana and end the War on Drugs in general. The numbers I’m seeing in these articles are inconsistent with each other (e.g. some of OpenSecrets’ figures are lower than $1 million/year nationwide, yet this random article, which is just about New Jersey, cites more than $1 million?), but I admit I have not read this super carefully. I’m not going to summarize these further because I’ve set aside 1.5 hours to write this post, and that time is now up, but I will leave these sources for anyone who wishes to carry this forward. It’s possible that lobbying might actually be sort of affordable?
https://www.opensecrets.org/lobby/indusclient.php?id=N09 https://www.opensecrets.org/news/issues/marijuana/
https://www.mcall.com/news/nation-world/mc-nws-new-jersey-marijuana-offshore-wind-lobby-20190304-story.html
We’ve taken USA as a case study, but it is probably cheaper to lobby other countries. (But, most of the countries for which legalization is a likely barrier are also high income countries...with smaller populations...and less prescription drug use)
But even if you succeeded at all of this in the United States...how many people would you really help? The US population is 327.2 million, and even if we really optimistically assume that the number of psychedelic users will grow to match the number of antidepressant users, that’s only 1.1% of them, so… around 3.6 million people, and let’s forget about those who would use psychedelics anyway. How much are you really willing to spend on accomplishing that? I mean, pick a realistic budget, and put it in here to compare it to some other stuff: https://www.thelifeyoucansave.org/impact-calculator
Let’s suppose you happen to spend a probably unrealistically low 5 million dollars on this problem, you have an unrealistically high 100% chance of success, and as a result an unrealistically high 3.6 million people who weren’t motivated enough to buy illegal psychedelics are now free to use psychedelics a decade earlier than they otherwise could have without your intervention. For the same money, you could have given 3.9 million people access to a year of clean water. (Sort of: I don’t think this impact calculator takes room for more funding and scaling into account, but it’s in the ballpark.) Does that really seem at all equivalent? To me, it feels extremely not equivalent, by orders of magnitude. (Not to put too fine a point on it, but I bet having access to clean running water also boosts creativity and problem solving, reduces stress, aids in healing from trauma, and delivers all the other benefits listed.)
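To make that comparison easy to re-run with your own numbers, here’s the arithmetic as a quick script. All inputs are the deliberately generous assumptions above, not vetted figures:

```python
# Restating the comparison above in per-beneficiary terms.
# All inputs are the commenter's deliberately generous assumptions, not vetted estimates.
us_population = 327.2e6
uptake = 0.011                       # optimistic: uptake matches antidepressant use (1.1%)
new_users = us_population * uptake   # ≈ 3.6 million people

budget = 5e6                         # unrealistically low legalization budget, USD
water_person_years = 3.9e6           # person-years of clean water the impact calculator gives for $5m

cost_per_new_user = budget / new_users            # ≈ $1.39 per person gaining earlier access
cost_per_water_year = budget / water_person_years # ≈ $1.28 per person-year of clean water

print(f"new users reached: {new_users / 1e6:.1f} million")
print(f"cost per person gaining earlier access: ${cost_per_new_user:.2f}")
print(f"cost per person-year of clean water: ${cost_per_water_year:.2f}")
```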
Epistemic status: all my knowledge on this topic was acquired during the 1.5 hours it took to research and write this post, and I admit that the research is extremely haphazard—aimed at establishing very rough “order of magnitude” type estimates. I have no prior stake or knowledge about this, I just quickly wrote it up because I felt like taking a shot at the prize. Nevertheless, I do feel fairly confident in the opinions expressed here.
>My current view is that psychedelics are an extremely promising altruistic cause area (on par with global health & x-risk reduction) – I’d like to learn more about how this might be mistaken.
A point of clarification: Your “on par with” phrasing perhaps inadvertently suggests that psychedelics are not a global health intervention. My analysis views psychedelic legalization AS a global health intervention, and therefore subject to the same metrics (e.g. the way GiveWell considers laws restricting and taxing tobacco a global health intervention).
To clarify, I’m not arguing that recreational legalization of psychedelics would be a good idea.
I’m arguing that more spending on psychedelic research & some advocacy work (in particular, helping the rollout of FDA approval of MDMA & psilocybin go smoothly) would be leveraged.
I guess what it boils down to, is how much EA money do you think would need to go into accomplishing this, and for what expected outcome? I’d like to make the distinction that if you can recruit some talent from the EA community to use money provided by, say, Clarity Health Fund (which is earmarked for psychedelics anyway) to further psychedelics related causes in a more effective way, then I am absolutely all for it and in full support. But we ultimately want high impact with fairly small amounts of EA money, or via the use of free EA talent, or EA talent that is paid in ways other than EA money, because of the high counterfactual price tag on EA money. Calculating the expected outcome of this is tough, but possible, and I would change my mind if I saw a plausible estimate that came out as being impactful.
I think that I understand why you think this will work, and hopefully the next few paragraphs demonstrate that understanding. And I think it’s important to acknowledge that GiveWell (by their own admission) did not account for leverage in their early evaluations, and this may have created some undesirable anchoring / lock-in effects with respect to Effective Altruism recommended activities.
And I agree that “leverage” can mean that causes that seem “less efficient” in terms of a strict “direct impact” / “resources spent” metric may have been unjustifiably ignored by the EA community, especially if the form of “leverage” involved is more complex than a simple fundraiser. Moderately efficient causes could benefit from an EA mindset, so long as resources are being redirected from less efficient to more efficient areas and not the other way around. Most of the world’s resources either can’t (due to logistics) or won’t (due to the priorities of power structures and individual donors) be directed towards the most high impact causes. If you can recruit those resources, with a realistic assessment of your impact being greater than what those resources would otherwise have gone to, it would still be worthy of the name effective altruism, and you would still have more direct impact at the end of the day, using resources that would otherwise have gone somewhere with less direct impact.
In fact, when you put it that way, there’s a whole host of cause areas you might consider. While trying to “End Homelessness in America” doesn’t beat distributing mosquito nets to low income countries on a “direct impact” / “resources spent” metric, there is plenty of money that you might think of as effectively “earmarked” for USA purposes only, or earmarked for a certain type of intervention. If you redirect resources that would be otherwise spent on something less impactful, a high difference in impact means that you have done a good job, because of leverage. I think many in the EA community recognize this to some extent, and Givewell is currently investigating opportunities to influence government policy and improve government spending. The concept of leverage really broadens the scope of what “EA” could mean, and potentially does open the door to sometimes helping people in high income countries or furthering causes that don’t boast efficiency per dollar, although I would guess generally not financially helping but rather via skills or spreading a message (e.g. influence donors who are of a less global mindset to donate to more effective cause within the local parameters they care about, or help organizations that aren’t necessarily focused on doing the absolute maximum good per dollar still become more effective within the narrower scope of their goals, etc...). One could consider psychedelics legalization to be potentially a part of such activities.
Now that I’ve (hopefully) shown that I understand where you’re coming from here, let me explain why I still don’t think this will work, and what it would take to change my mind.
From the perspective of an individual, the act of recruiting EA money to your cause is also a form of “leverage”. This applies to everyone and everything, not just psychedelics: if you believe that EA is generally on the right track, then the less “EA resources” you leverage to your cause, and the more otherwise inefficient resources you leverage to your cause, the better your (counterfactually informed) impact will be. Even people doing global poverty should preferentially recruit non-EA funds, if they believe that EA funds are otherwise well allocated.
I would (from my currently naive perspective) agree with you that investing in key research goals probably would be “leveraged” impact, in the sense that directing some EA money to this might lead to other resources being redirected to it down the line. If we’re talking about potentially diverting funding from other EA causes, we’ll need to be super stringent about impact-per-dollar. We can and should include “leverage” in those calculations, but said calculations must occur.
From what I understand, you’re essentially suggesting just a little bit of research and advocacy, on a reasonable expectation that it will catalyze some sort of tipping point, redirecting funds from various non-EA sources towards the problem. But as long as you’re working within an EA framework, it’s important to quantify your estimate of the impact of that investment.
To estimate the...counterfactual-blind?… impact of your (research, advocacy, whatever) actions, you’d have to estimate the expected impact on policy outcome (how much earlier do we estimate the relevant FDA approvals, policy changes, etc happen as a result of the diverted funds) and the expected value of those policy outcomes (how many people will get better treatment as a result of those outcomes, relative to the treatment they otherwise would have gotten). In other words, how many people benefited?
And then, you have to introduce the counterfactual question of what those resources could be spent on instead. You have to first calculate the counterfactual impact of any EA resources, which (unless EAs are misguided) have a particularly heavy counterfactual impact price tag (at least when it comes to asking for money? Judging from what I’ve seen posted about the EA job market, recruiting EA talent could still be a good move). After that, you’d have to calculate the counterfactual impact of all the other resources you leveraged (though I think it would be okay to just place that at zero for now, to keep the models simple enough to use).
And…despite non-EA leverage, I just don’t think these numbers will come out that way, for all the reasons described in the previous comment. Even if you make brilliant use of leverage to mostly set aside the fact that the countries in a position to benefit from this are expensive to operate in, you’d still have to deal with the fact that the lack of research and attendant policy changes has little to do with global bottlenecks to access…which means that the numerator in the “beneficiaries/EA-resources spent” equation is going to be pretty low. I don’t mean all the resources you gain through leverage (you can make your own calculations of what the counterfactual impact price tag of those is, and depending on who you leverage, maybe you could even make a case for that being zero). I mean specifically the EA money. A person using EA money for this cause would have to operate on a shoestring budget to beat the counterfactual cost. If you agree with my earlier statement that, per individual, a year of clean water is, let’s say, 10x as good as reaping the unrealistically-best-case scenario of psychedelics research a decade earlier than otherwise (which seems like really lowballing it to me), you’d have to honestly believe that every EA-derived $50k (ignoring further leverage) you spend pushes the timeline forward by a year just to “break even”. I admit I don’t fully understand this issue or the plan, but that seems really optimistic when I compare it to the aforementioned $19m figure required to push un-stigmatized drugs through FDA approval.
Anyway, if someone were to do those calculations, it would be a good use of time, because developing methods to evaluate the impact of research/advocacy on policy change in general is something we need. (stay tuned! I may be posting more on that later).
In fact it’s worth just assuming psychedelics are as useful as any drug currently in use when doing your calculations, because even if psychedelics aren’t it, there would be many other items in this general class, and we can try to estimate the expected value of adding funds to promising research in general, which would cover them all. If you were to demonstrate that psychedelic research/advocacy might have that level of impact by these metrics, it would be a pretty big deal even if this particular class of under-researched psychoactive compounds ended up being a flop, because there are a lot of other things that would potentially also become high impact by the same arguments.
In these discussions of impact, I think it’s worth pointing out that unlike, say, x-risk, something like psychedelics research/advocacy is sufficiently concrete that we can reasonably attempt to quantify the impact of our activities, at least to within one or two orders of magnitude, and compare numbers...at least against research/advocacy for other policy interventions (which could happen in low-income countries, which have more people and are cheaper to lobby in).
This hopefully goes without saying, but I don’t mean to claim that psychedelics is irrelevant and EAs should not pay any attention to this at all: If you or anyone else has done the research and feel that this is a low hanging fruit, even if the aforementioned impact evaluation doesn’t come back as highly efficient, I would encourage that person to find a way to pluck it...and if some of the under-utilized EA-talent was leveraged towards the problem, it could be a good thing. I just wouldn’t support redirecting global poverty or x-risk focused funding to this (unless some very surprising and convincing impact evaluations along the lines of what I described came out and changed my mind).
(Oh also, I think my use of the word “legalizing” in the previous comment might have been misleading. I just meant the general situation where our interventions allow psychedelics to be used in more and more contexts, without breaking the law. Not legalizing recreational use specifically.)
For anyone worried their comment won’t get attention vs the existing ones, I’m enjoying this thread and am watching for and voting where relevant on new ones, FYI.
I’d guess the best argument is the obvious one:
most of the professional world and voting populace have a very negative view of psychedelics
whilst the potential upsides might be sizeable, they likely don’t compare to the damage to EA that EA orgs publicly supporting such work would do.
if done in secret that’s a) a secret (generally bad) and b) inevitably going to get out.
and a fair number of non-EAs are working on it anyway, as it’s quite a popular idea in California. I’m guessing anyone super passionate about it could get funding and hire without having to be associated with EA at all.
A common misconception is that if something is being talked about publicly there is probably funding available for it somewhere. But the number of weirdness dollars actually available in the wild for anything not passing muster with Ra can still be safely rounded to zero for most purposes. Even people who have had past success in more conventional areas often have trouble getting funding for weirder ideas, and if they do wind up spending a lot of time fundraising.
(Essay introducing Ra, for reference.)
I would push back against this somewhat. It’s historically been the case that the general view of psychedelics is negative, but I think a case can be made that this is changing fairly quickly. Media coverage of psychedelics over the past ~5 years has been positive, e.g. The Guardian, The Wall Street Journal, Rolling Stone, Vox, CBC Radio, The New Yorker. Michael Pollan’s latest book How to Change Your Mind was pretty pro-psychedelic and was a New York Times #1 bestseller. Denver also recently decriminalized psilocybin mushrooms, and there are decriminalization ballot initiatives planned for Oregon and California in 2020.
1. In the 70s a lot of people in the West were taking psychedelic drugs; however, no major public health and/or civilization-level changes happened.
2. More compassion (operational definition = empathy + desire to reduce another person’s suffering) does not necessarily equal more prosocial behavior.
2.1. Most modern tribes have been taking psychedelics regularly for generations. First, they are still living in tribes (see point 3). Second, with their increased moral concern they, nevertheless, go to war, kill, rape, pillage, torture, and enslave.
2.2. SJWs and radical vegans score really high on compassion and are willing to engage in violent behavior to defend their ideological beliefs. Most people already have a decent physiological capacity for compassion. Battling ideological possession may be more important.
2.3. You need people who score low in empathy and compassion for certain professions like trauma surgery, burns units, firefighters, judges, warfare and so on. Being high in empathy and compassion AND working in such fields leads to burnout, compassion fatigue, depression, PTSD and so on ⇒ 1. you increase suffering in the population + 2. you decrease the effectiveness of these professions.
More compassion does not equal less antisocial behavior and less suffering.
3. Creativity and openness are not always good. Creativity evolved in certain contexts and comes at the expense of other cognitive functions and with risks of certain mental conditions. “Messy creative ADHD neurotic” people are fun to be around and have genius ideas but you need greater numbers of “close-minded boring doers” to develop, execute and regulate those. Shifting the balance between creatives and non-creatives in a population may have unexpected consequences.
4. Having an effective intervention to treat not just severe but also mild and moderate depression may shift attention and resources away from battling the causes. We should aim at implementing measures that make depression less likely to develop in the first place (not counting the rare genetic cases). Such measures might include trace lithium in the water supply, population-level screening for immune reactions to food antigens, improving indoor lighting, fighting light pollution, having different work schedules for different chronotypes, a custom school curriculum and career development based on your Big Five strengths, “dorms for adults” to reduce loneliness, and so on.
Conclusion: psychedelics = promising, amazing area of research, great potential to improve certain things BUT:
1. maybe they will work, maybe.
2. we already have proven strategies for solving the problems psychedelics may solve, but we need ways to implement them at scale.
=> psychedelics:
promising = yes
priority = no
Could you provide some evidence for this claim?
The thing to do here would be to compare the effect sizes, the size & probability of potential harms, and the cost-to-treat for each alternative we want to consider.
Here’s an analysis that attempts this for a psychedelic intervention, in terms of DALYs (so that the result can be compared to alternatives).
Can you point me to analyses like this for the alternative strategies you have in mind? Or at least back-of-the-envelope calculations that roughly size the effect, potential for harm, and cost-to-treat for the alternative strategies?
Although psychedelics are plausibly good from a short-termist view, I think the argument from the long-termist view is quite weak. Insofar as I understand it, psychedelics would improve the long term by
1. Making EAs or other well-intentioned people more capable.
2. Making people more well-intentioned. I interpret this as either causing them to join/stay in the EA community, or causing capable people to become altruistically motivated (in a consequentialist fashion) without the EA community.
Regarding (1), I could see a case for privately encouraging well-intentioned people to use psychedelics, if you believe that psychedelics generally make people more capable. However, pushing for new legislation seems like an exceedingly inefficient way to go about this. Rationality interventions are unique in that they are quite targeted—they identify well-intentioned people and give them the techniques that they need. Pushing for new psychedelic legislation, however, could only help by making the entire population more capable, including the much smaller population of well-intentioned people. I don’t know exactly how hard it is to change legislation, but I’d be surprised if it was worth doing solely due to the effect on EAs and other aligned people. New research suffers from a similar problem: good medical research is expensive, so you probably want to have a pretty specific idea about how it benefits EAs before you invest a lot in it.
Regarding (2), I’d be similarly surprised if
campaigning for new legislation → more people use psychedelics → more people become altruistically motivated → more people join the EA community
was a better way to get people into EA than just directly investing in community building.
For both (1) and (2), these conclusions might change if you cared less about EAs in particular, and thought that the future would be significantly better if the average person was somewhat more altruistic or somewhat more capable. I could be interested in hearing such a case. This doesn’t seem very robust to cluelessness, though, given the uncertainty of how psychedelics affect people, and the uncertainty about how increasing general capabilities affects the long term.
Kit & I worked through the long-termist argument somewhat in this thread.
Tangentially related and perhaps of interest to some readers of this thread, though not a prize submission comment:
My nomination for the “three books” for psychedelic therapy would be
The “Why” Book: Pollan’s How to Change Your Mind
The “What” Book: MAPS’s A Manual for MDMA-Assisted Psychotherapy in the Treatment of PTSD (alternatives would be this treatment protocol from Phase 2 MDMA trials and Grof’s LSD Psychotherapy)
The “How” Book: R. Coleman’s Psychedelic Psychotherapy: A User-friendly Guide for Psychedelic Drug-assisted Psychotherapy (the runner up would be Fadiman’s The Psychedelic Explorer’s Guide)
If psychedelics are a low hanging fruit, for-profits are gonna take the first step and grab it
Epistemic status: >50%
(I hope SSC is wrong and Griffe is right, and I’d like to see more research, too—but I think it’s way more likely that psychedelics end up being provided by big companies than by startups or non-profits)
I feel tempted to invoke epistemic (and financial) modesty: depression (and mental health) is not a very neglected disease which only affects a small or poor population; there’s a lot of money to be made in this area by pharmaceutical research, and I see no coordination problem or similar obstacle. If big companies such as Bayer or Pfizer (more capable of providing adequate funding, research and lobby) are not willing to bet on that, why should we?
P.S.: I didn’t read every other comment, but I searched a little bit and concluded that only GnomeGnostic mentioned big pharma. His argument is sound.
What makes you conclude that there’s a lot of money to be made in it? My prior is the opposite. MDMA and psilocybin themselves aren’t patentable at this point. Yes, delivery mechanisms could be and new or related unpatented compounds could be. But any for profit company will likely be competing against at least a non profit or two. And my research on pricing is that having a single competitor massively reduces margins and profitability. Also dosing will be highly infrequent which should also reduce the profits for any psychedelic pharma companies.
Given the risk in pharma r&d, potential profits presumably need to be very large to justify investment. My sense is that the expected rate of return may be lower than other similarly risky projects and therefore it won’t be particularly suited to for profits. But maybe it’ll be somewhat profitable and the reward of positive impact will make up the difference. Or maybe thanks to many years of use, these compounds have much lower risk and therefore make sense from a risk reward perspective to be pursued by pharma investors/companies.
But I’m skeptical of that and I expect that we’ll need altruistically motivated people to make progress, and if it’s left to for profits the industry would stagnate. (One piece of evidence is that the for profit pharma world has seemingly made no progress in the psychedelic field since about the 1970s with the exception of compass pathways a few years ago)
You have a good point: if a big pharma company can’t have IP over a psychedelic product, at least in our current system, it has no incentive to invest in risky R&D. However, we do observe increasing private funding for psychedelic research and a lot of recent exposure; and the war on drugs explains enough of the halt in psychedelics research in the ’70s. So, despite updating my priors, I still don’t think that donating for this cause would result, in the margin, in more QALY than donating to GD, in general.
What’s your ballpark dollars-per-QALY estimate for GiveDirectly donations, and your ballpark dollars-per-QALY estimate for the psychedelic intervention you have in mind?
This analysis could be helpful as a jumping-off point for the latter.
Also note that the QALY framework likely underweights mental health interventions.
First, I’m not referring to GD as our best charity, but just as a minimal standard for EA causes. Second, last time I checked (please, update if I’m wrong):
GD was considered to be saving 1 life per US$7,000 as of Nov 2016 by GiveWell: https://docs.google.com/spreadsheets/d/1KiWfiAGX_QZhRbC9xkzf3I8IqsXC5kkr-nwY_feVlcM/edit#gid=1034883018
GiveWell considered 1 life = 35 QALYs, so I estimate GD results in about US$200/QALY. (Actually, there are huge uncertainties over this estimate, and GiveWell is not conclusive about GD’s effectiveness in terms of lives and QALYs, but one could pick AMF or SCI instead as a standard.)
I’m assuming 1 DALY averted is roughly equivalent to 1 QALY gained.
Enthea’s estimate for psychedelic liberalization is $472/DALY.
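Putting those rough figures together (and treating 1 DALY averted as roughly 1 QALY gained, per the assumption above), as a quick script:

```python
# Quick check of the cost-effectiveness figures above. These are the rough numbers
# cited in this comment, not vetted estimates.
cost_per_life_gd = 7000        # GiveDirectly, GiveWell's Nov 2016 figure, USD
qalys_per_life = 35            # GiveWell's assumed QALYs per life saved
cost_per_qaly_gd = cost_per_life_gd / qalys_per_life   # = $200/QALY

cost_per_daly_psychedelics = 472                       # Enthea's liberalization estimate

print(f"GiveDirectly: ${cost_per_qaly_gd:.0f}/QALY")
print(f"Psychedelic liberalization (Enthea): ${cost_per_daly_psychedelics}/DALY")
print(f"ratio: {cost_per_daly_psychedelics / cost_per_qaly_gd:.1f}x")
```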
I do agree that the QALY framework is biased towards some interventions, and that mental health is usually underestimated by healthy people (I suspect they are unduly led by the lack of physical, apparent symptoms). I do think we should find out how to treat depression properly (maybe some neglected, cheap, and scalable solution ends up becoming an EA-like charity). However, I don’t believe Enthea’s poll is free of biases, either; in particular, it seems to me that people in developed countries consistently underestimate the burden of disease and poverty in the third world, skewing the comparison in the opposite direction. Notwithstanding, my main point is not so much about impact as about neglectedness: 32 million people had experimented with psychedelics in the US alone by 2010 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3917651/). If each of them donated an average of US$1 to this cause, they would match all of GD’s transfers in 2017. I do believe we should liberalize psychedelics—and probably we will, eventually, since many people with considerable purchasing power are interested in it.
Got it, thanks.
As far as I know, GiveWell considers cost-effectiveness estimates informative only when efficacy differences are orders of magnitude apart.
For two interventions that are on the same order of magnitude, the analyses aren’t granular enough to believably inform which is more effective.
Agree. I kind of regret mentioning QALY in my argument, but do notice that I was trying to be healthily skeptical when I mentioned “I still don’t think that donating for this cause would result, in the margin, in more QALY than donating to GD, in general”. I never said I was confident that GD would result in more QALYs than supporting psychedelics.
Okay. I’m not sure if there’s a crux here, in that case.
fwiw I think it’s very hard to get people to donate to things.
From section 4(b) of the OP: “Roughly $40 million has been committed to psychedelic research since 2000.”
True, but people are already competing to invest in THC providers. Why wouldn’t they do it for psychedelics?
I think there will be psychedelic for-profit ventures & investment. (That’s a different claim than the claim that there’s already enough donor dollars in the space.)
My current view is that almost all of the for-profit investment that comes into the space will flow through highly suboptimal structures (e.g. pharma companies trying to achieve frivolous patents that give them a monopoly – this is already happening with esketamine).
Savvy philanthropic work could help push us out of that regime & towards a better one.
The best way to increase long-term aggregate wisdom in our time is probably to push for better governance. If the government had better policies, voting methods, and higher research funding, that would likely lead not only to more psychedelic and other pharma research but also to numerous other benefits.
From a longtermist perspective, other technologies and trends might promise a better cure for mental health problems. Genetic engineering, AI based therapy, nanotechnology, declining levels of global trauma as war and material hardship diminish.
This contest itself is evidence that global priorities research is neglected. Why devote yourself to one particular trendy medication when we have such a limited wisdom base for making such decisions? Better to focus on running contests like this, or finding ways to build a career in developing better frameworks or technologies for evaluating impact generally.
Psychedelic therapy isn’t that neglected—it’s in a phase 3 clinical trial and has had a major book published on it already, and plenty of mental health professionals have been covertly working on it as underground psychedelic therapists. There are so many biology PhDs already that the bigger bottleneck appears to be general research funding and FDA regulations. Working on these problems is broader scale, and will still support psychedelic research without being limited to it.
In general, pharmaceuticals are only one tool in the box for working on mental health issues, so even if they are a majorly impactful drug, it’s never going to be more than a partial solution to a specific and still hazily defined problem.
As a side note, I wonder whether the winner of contests decided by upvotes will be determined not by the strongest argument but by who posts first. In the future, perhaps a panel of judges casting votes after the deadline would be a better method?
+1 for the prediction that earlier posts will get more votes.
I’m sympathetic to this sentiment, though none of the examples you give seem to be at all tractable / anywhere close to being rolled out within the next 10 years.
Also I think the 20th century has good examples of increasing material wealth not correlating with decreasing trauma. (Following Pinker here in thinking that violence is becoming more power law distributed, i.e. fewer episodes but each episode has a more extreme magnitude.)
Doesn’t the second sentence here cut against the first?
i.e. doesn’t “the bigger bottleneck appears to be general research funding” speak against “Psychedelic therapy isn’t that neglected”?
That’s a good catch—I was thinking of EAs pursuing positions as psychedelic therapy researchers/practitioners, but clearly you could advocate for more research funding or donate toward it as an EA project.
I think it might prove quite difficult to scale effective use of psychedelics out to a large population, due to bottlenecks on facilitation. I’d guess that becoming an effective facilitator requires quite a bit of in-person training with an established facilitator and contact with a mature psychedelic culture, in much the same way that becoming an effective meditation teacher seems to require quite a bit (years and years) of in-person contact with an already-established meditation teacher and culture.
I expect that this cannot easily be worked around via more or better written instruction. I’d expect that simply reading The Psychedelic Explorer’s Guide (or a future improved version thereof) and then facilitating trips with no other exposure to teachers or culture would produce mediocre or ineffective results for trippers, in much the same way that teaching meditation under the same circumstances would, or teaching entrepreneurship having mostly just read business books would.
I particularly expect that non-facilitated trips will be ineffective on average if scaled out to a large population.
Perhaps a person can become a mature facilitator by extensive solo tripping experience, in combination with consuming written material. But this would also be a bottleneck on scaling out psychedelics.
Perhaps we can just scale psychedelics out slowly and maybe that would still be extremely worthwhile. But I expect this to proceed on timescales not much faster than the rate at which meditation is currently being “scaled out” in the western world.
Yeah, I expect the rollout of psychedelic facilitation to take roughly as long as the rollout of psychotherapy did. Maybe faster, because psychedelic facilitation could leverage the existing training infrastructure of the mental health establishment.
Perhaps the third wave of CBT is a good comparison case. I’m a little fuzzy on the specifics, but it looks like third-wave CBT got started in the 1980s, and was considered the standard best-in-class modality for psychotherapy by the 2000s.
So that would imply a rollout of 1-2 decades from starting point to “standard modality.”
I’ve contributed small amounts of money to MAPS, but I haven’t been thinking of those as EA donations.
My doubts overlap a fair amount with those of Scott Alexander, but I’ll focus on somewhat different reasoning which led me there.
It sounds like MAPS has been getting impressive results, and MAPS would likely qualify as an EA charity if FDA approval were the main obstacle to extending those results to the typical person who seeks help with PTSD. However, I suspect there are other important obstacles.
I know a couple of people, who I think consider themselves EAs, who have been trying to promote an NLP-based approach to treating PTSD, which reportedly has a higher success rate than MAPS has reported. The basic idea behind it has been around for years, without spreading very widely, and without much interest from mainstream science.
Maybe the reports I hear involve an improved version of the basic technique, and it will take off as soon as the studies based on the new version are published.
Or maybe the glowing reports are based on studies that attracted both therapists and patients who were unusually well suited for NLP, and don’t generalize to random therapists and random PTSD patients. And maybe the MAPS study has similar problems.
Whatever the case is there, the ease with which I was able to stumble across an alternative to psychedelics that sounds about equally promising is some sort of evidence against the hypothesis that there’s a shortage of promising techniques to treat PTSD.
I suspect there are important institutional problems in getting mental health professionals to adopt techniques that provide quick fixes. I doubt it’s a complete coincidence that the number of visits required for successful therapy happens to resemble a number that maximizes revenue per patient.
If that were simply a conspiracy of medical professionals, and patients were eager to work around them, I’d be vaguely hopeful of finding a way to do so. But I’m under the impression that patients have a weak tendency to contribute to the problem, by being more likely to recommend to their friends a therapist who they see for a long time than they would be to recommend a therapist who they stop seeing after a month because they were cured that fast. And I don’t see lots of demand for alternative routes to finding therapists that have good track records.
None of these reasons for doubt is quite sufficient by itself to decide that MAPS isn’t an EA charity, but they outline at least half of my intuitions for feeling somewhat pessimistic about this cause area.
From the report you linked to, in the Key Findings section: “No clinical evidence on NLP for the treatment of adults with PTSD, GAD, or depression was identified.”
Could you point me to a citation for NLP having a higher success rate than MDMA for treating PTSD?
I don’t know whether it has been published. I heard it from Rick Schwall (http://shfhs.org/aboutus.html).
Got it, thanks!
Curious whether “No clinical evidence on NLP for the treatment of adults with PTSD, GAD, or depression was identified” is an update for you re: NLP’s efficacy.
No, I expected that no rigorous research had been done on NLP as of 2014, and I don’t know how rigorous the more recent research has been.
Reminder: one week left before the prizes are assessed! (Deadline is June 3rd.)
Now’s a good time to read the submissions & upvote the ones that seem best.
Because the competition ends tomorrow, I’m curious: Did any of these arguments change your views? Did people say what you expected them to say? Did you get what you wanted out of this exercise? What do you think the next steps are?
Thanks, I intend to write a follow-up post that goes into some detail on questions like these. (Probably will publish in a few weeks as I’m booked up until then.)
Briefly: it looks like only Gregory_Lewis & Carl_Shulman made arguments specifically against EA funding more psychedelic research. (Many people made arguments against psychedelics being an EA cause area in general, but not about funding more research in particular.)
Gregory didn’t close out his argument except to say that he thinks EA shouldn’t fund most kinds of research, including confirmatory research about psychedelics. (In his initial post, he pointed to some reasons why he thinks the results of the initial studies won’t hold up under further scrutiny, but he doesn’t think funding more scrutiny should be an EA priority, and I don’t follow why not.)
Carl pattern-matched psychedelic research to interventions like cold fusion, psychic powers, some parenting interventions, some nutritional / diet interventions, and a few other things. (Interventions which have initial promising results that fail to hold up under more scrutiny.)
Our crux here seems to be that the only way to figure out whether a promising early-stage result is real or not is to do confirmatory research. My prior suggests that funding confirmatory research for psychedelics would be a good use of EA funds, and Carl’s prior is probably that funding research like this wouldn’t be. We haven’t yet sorted out our difference here.
So I still hold the view that funding more psychedelic research would be a good use of EA funds.
My views are the same as Carl’s, hence I didn’t make a further reply. (i.e. Low enough base rates imply the yield on chasing replications does not reach the—high—bar for EA).
Got it. This seems like our crux, in that case.
I think it’s about 30%–40% that the psychedelic results found to date are real (i.e. that they replicate).
What’s your estimate of how likely the results are to replicate?
Not sure how helpful percentages are given effect sizes and interventions are varied.
Per my OP, I’d benchmark getting results similar or better to SSRIs (i.e. modestly effective for a few mental illnesses) to be the top 3% ish of what I’d expect research to confirm. I’d give 25% for essentially nothing replicating and it going the way of power poses, priming, or other psych dead ends Scott mentions.
The remaining 70% is smeared across much less impressive results (and worth noting SSRIs are hardly a miracle cure): maybe sort-of helpful for one condition, maybe helpful but only for a subset of motivated individuals, etc. etc.
Do you feel like you updated after reading the studies I point to?
e.g. were you initially like “there’s literally a 0% chance this is real” and now you’re like “well, maybe there’s a 3% chance that psychedelics are an effective treatment & 70% that psychedelics do something but aren’t more efficacious than SSRIs” ?
I see, thanks.
Do you assign a non-negligible chance to psychedelics meaningfully outperforming current treatments like SSRIs? (“3% for similar to or better than SSRIs” blurs together the case where psychedelics are just as efficacious as SSRIs and the case where they are massively more efficacious.)
This is important because if there’s a small, non-negligible chance of a large effect over that of current treatment, investment could still be warranted.
Comparison point: GiveWell has directed tens of millions USD to deworming programs, even though most GiveWell staffers think there’s only a 1-2% chance that deworming effects are real.
1) Generally my probability mass is skewed to the lower ends of the intervals I’m noting. Thus the 70% band is weighted more toward results with multiple caveats rather than just one (e.g. a bit like—as Scott describes it—ketamine: only really useful for depression, and even then generally modest effects even as second-line therapy). Likewise the 3% is mostly ‘around SSRIs and maybe slightly better’, with subpercentile mass on the dramatic breakthrough I think you have in mind.
2) Re. updates: There wasn’t a huge update on reading the studies (not that I claim to have examined them closely), because I was at least dimly aware since medical school of psychedelics having some promise in mental health.
Although this was before I appreciated the importance of being quantitative, I imagine I would have given higher estimates back then, with the difference mainly accounted for by my appreciation of how treacherous replication has proven in both medicine and psychology.
Seeing that at least some of the studies were conducted reasonably given their limitations has attenuated this hit, but I had mostly priced this in as I expected to see this (i.e. I wasn’t expecting to see the body of psychedelic work was obviously junk science etc.).
3) Aside: GiveWell’s view doesn’t appear to be “1-2% that deworming effects are real”, but:
I.e. their central estimate prices across a range of ‘no effect’, ‘modest effect’, and ‘as good as the index study advertised’, but is weighted towards the lower end.
One could argue whether, if applied to psychedelics, the discount factor they suggest should be higher or lower than this (multiple studies would probably push towards a more generous discount factor, but an emphasis on quality might point to more pessimistic ones, as the Kremer index study has, I think, a stronger methodology—and a lot more vetting—than the work noted here). But even something like a discount of ~0.1 would make a lot of the results noted above considerably less exciting (e.g. the Carhart-Harris effect size drops to d~0.3, which is good but puts it back into the ranges seen with existing interventions like CBD).
VoI is distinct from this best guess (analogously, a further deworming RCT to reduce uncertainty may have higher or lower value than ‘exploiting’ based on current uncertainty), but I’d return to my prior remarks to suggest the likelihood of ending up with something ‘(roughly) as good as initial results advertise’ is low/negligible enough not to make it a good EA buy.
4) Further aside: Given the OP was about psychedelics generally (inc advocacy and research) rather than the particular points on whether confirmatory research was a good idea, I’d take other (counter-) arguments addressed more generally than this to be in scope.
Let’s treat the number of EA causes as finite. We must prioritize issues and approaches so as to maximize the relevant altruistic impact.
Psychedelics have a not insignificant history and period of use, and a limited set of initial research that shows some promise and some bias.
As highlighted by the recent crystallization of LSD bound to SERT, the serotonin transporter, the mechanisms of psychedelics identified so far generally map onto pathways for which we already have agents, and those pathways have known limitations.
Pursuing psychedelics as a primary target may come at the cost of not finding new mechanisms or avenues for treating mental disorders. We would be pursuing agents with known deleterious factors (e.g. MDMA shows serotonergic neurotoxicity in some models at certain doses, and psychedelics can exacerbate disorders and trigger psychotic breaks) that impose a dose ceiling or limited room for treatment. Compare optimizing coumarin use, with its risks, rather than finding a new agent (like a factor Xa inhibitor).
There’s a particular subset of experiences and associations in psychedelic use that is decidedly non-scientific and resistant to experimentation. Metaphysical explanations and decidedly ‘hippie’ mentalities show resistance to some authorities and funding sources.
Psychedelics comprise a loud, distracting avenue that, while it may bring promise to some areas, may drain funding from novel approaches. Choosing psychedelics as a cause also defines a substance area as a cause, rather than a humanistic approach (one can argue about the psychedelic mentality). Does it truly match other EA causes, or is it choosing a ‘hot topic’ because it comes from an alternative point of view?
Note: these are mainly just random sentences rather than an argument.
lol is this GPT2-generated?
[Deleted.]
Semantic point: I can’t see any way that psychedelics are a ‘cause area’. Either they’re one of many possible interventions in the cause area of mental health, or they’re one of many interventions in the cause area of the long term future, or possibly both. Psychedelics are a means to an end, not an end in themselves.
Sure, framing this as “psychedelic interventions in the cause areas of mental health & longterm future” seems okay.
(I’m advocating for the EA community to pay more attention to psychedelic interventions, and I’m agnostic about how to frame that.)
I don’t have much to contribute beyond the many things that have already been said, but I suspect my overall opinion may be shared by many others: I think psychedelics could plausibly (but not >50%) be a very effective mental health intervention. One could perhaps call them a promising EA intervention, although the evidence base is quite thin at the moment. However, psychedelics don’t seem likely to be a particularly effective long-term intervention at the moment. They perhaps might be once they are legalized and there is some more evidence behind this, but that seems quite a long way away. Trying to legalize psychedelics or improve research for the long term impacts seems quite implausible as an effective intervention.
Curious for your thoughts on the long-termist argument I made in the OP?
I’m not really sure what you mean by “improve research for the long term impacts.”
Could you say a bit more about why liberalizing psychedelic access and conducting more academic research on psychedelics seem implausible as effective interventions?
Argument in OP:
I view this as a weak argument. I think one could make this sort of argument for a large number of interventions: reading great literature, yoga, a huge number of productivity systems, participating in healthy communities, quantified self, volunteering for local charities like working at a soup kitchen, etc. Some of these interventions focus more on the increasing-capability aspect (productivity systems, quantified self) and some focus more on improving intentions (participating in healthy communities, volunteering). Some focus on both to some degree.
The reason it seems like a weak argument to me is because:
(a) the average effects of psychedelics on increasing capability seem unlikely to be strong. They may be high for a small percentage of people, but I’m not aware of any particularly strong reason to think that the average effects are large.
They may be large for people with mental health issues, but then it’s not really an intervention for increasing capability in general, it’s a mental health intervention. These are distinct, and as I said above, psychedelics could plausibly be a top intervention for mental health.
(b) The improving intentions aspect looks to be on even shakier grounds. What is the evidence that taking psychedelics is an effective treatment for improving intentions in a manner relevant to working on the long term? I’ve never heard of any psychedelic or spiritual community being focused on long termism in an EA relevant manner. Some people report ego dissolution, but I’m not even aware of any anecdotal reports that ego dissolution led to non-EAs thinking and working on long term things. It sounds like you know some cases where it may have been helpful, but I’m skeptical that a high quality study would report something amazing.
Some discussion about this in this thread.
A crux here is probably that I’m modeling “mental health disorders like depression & anxiety” as on the far end of a continuous spectrum of unendorsed behavior patterns (and the unendorsed behavior patterns of “healthy-typed” people are also on this spectrum), and it seems like you are modeling “mental health disorders” as being in a separate conceptual bucket from the unendorsed behavior patterns of healthy-typed people.
Because I’m modeling all of these patterns on a continuous spectrum, I expect treatments that help with the pathologized cases (e.g. diagnosed depression) will also help with not-pathologized cases (e.g. bad-feeling thought patterns in people without a diagnosis).
Also, I do want to say that I appreciate you trying hard to engage with skeptical people and to independently figure out promising new areas! That’s valuable work for the community, even if this particular intervention doesn’t pan out.
Thank you :-)
Thanks for the clarification. I also share your model of mental health disorders being on the far end of a continuous spectrum of unendorsed behavior patterns. The crux for me here is more what the effect of psychedelics is on people not at the far end of the spectrum. I agree that it might be positive, it might even be likely to be positive, but I’m not aware of any compelling empirical evidence or other reason to think that it is strong.
I have essentially a mathematical objection, in that I think the math is unlikely to work out, but I don’t have a problem with the idea in principle (putting aside PR risks).
Thanks for linking your thread with Kit in your other reply. I think my objection is very similar to Kit’s. Consider:
Total benefit = effect from boosting efficacy of current long-termist labor (1) + effect from increasing the amount of long-termist labor (2) + effect from short-termist benefits (3)
I expect (1) to be extremely not worth it given the costs of making any substantial improvement in the availability of psychedelics, and (2) to be speculative and to almost certainly not be worth it. By (3), do you mean the mental health benefits for people in general?
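As a minimal sketch of that decomposition, with placeholder weights chosen purely for illustration (not anyone’s actual estimates):

```python
# Minimal sketch of the "total benefit" decomposition above.
# All weights are illustrative placeholders, not actual estimates.

def total_benefit(boost_current_labor: float,
                  increase_labor: float,
                  short_termist: float) -> float:
    """Sum the three channels: (1) boosting the efficacy of current
    long-termist labor, (2) increasing the amount of long-termist labor,
    (3) short-termist (e.g. mental health) benefits."""
    return boost_current_labor + increase_labor + short_termist

# The position sketched here: discount (1) to ~0, treat (2) as speculative,
# and ask whether (3) alone carries the case.
print(total_benefit(boost_current_labor=0.0,
                    increase_labor=0.1,
                    short_termist=1.0))  # 1.1
```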
Got it. I’m happy we clarified this!
Griffiths et al. 2008 & Griffiths et al. 2017 found highly positive effects for psychedelics in healthy-typed people. (Both studies are RCTs & quite well done, as far as I can tell.)
Here’s some commentary on the studies.
Yes. Because Kit doesn’t include short-termist considerations in his moral calculus (he’s not moved by parliamentary theories of moral uncertainty), we discounted short-termist considerations to 0 in our discussion.
Personally, I include short-termist considerations in my moral calculus.
Part of the reason I’m bullish on psychedelic interventions is that there’s both a plausible long-termist story & a plausible short-termist story (which seems somewhat additive, when aggregating).
Right, as Kit & I hashed out, I think it makes sense to discount (1) to 0.
(Probably almost all of the benefit of increasing capabilities of current researchers can be captured without further liberalizing psychedelics, as most current researchers live in enclaves where de facto psychedelic access is quite liberal (though illicit)).
I agree that (2) is speculative, but the possible benefit here is large enough that further research seems justified.
(If the psychedelic experience in a certain context can reliably boost altruism without incurring costs that nullify the effect, that seems like a really big deal that’d be worth knowing about. It would be straightforward to design & execute a study on this, if someone were willing to fund it.)
If psychedelics are to actually become accepted as a legitimate treatment for any psychiatric disorders and to become widely used, the spearhead will almost certainly have to be a new compound designed and tested by a mainstream pharmaceutical company. This will solve the legality problem: the drug in question will be one that has not actually been banned, because it will be different (a genuinely new compound? An enantiomer of an existing one?) from any banned substance. It will solve the skepticism problem because it will undergo rigorous clinical trials run by non-ideologues. Once it’s on the market, the floodgates will *then* open for testing of well-known illegal drugs to see if they have the same benefits. But until then, EA-sponsored research will not be widely accepted. It will look like the scientists performing the study are funded by an advocacy group interested specifically in psychedelics and willing to accept any application, rather than one interested in the disease to be treated and willing to accept any treatment. That isn’t a look that engenders confidence in positive results. The psychiatry world would likely ignore positive results generated under that sort of condition, and few individuals would try the substance as a treatment. In contrast, if a mainstream pharmaceutical company patents the compound and sells it in a mainstream fashion, the number of people who would take the medication for the indicated purpose would be orders of magnitude higher.
Investing in psychedelic research will damage the “brand value” of EA. While many people’s opinions of EA would likely improve if the organization began investing in psychedelic research, I imagine most ultra-wealthy donors would not fall into this group and are more likely to dislike this pursuit. Donating may even become associated with promoting drug use, and even if the ultra-wealthy donors are personally okay with this, they may see donating to EA as damaging the reputation of their business ventures.
Furthermore, while researching psychedelic therapies would likely be beneficial, there already exist (many?) organizations pursuing this avenue. Hence, EA’s involvement likely wouldn’t totally transform the field, and this non-massive impact comes at the cost of EA’s ability to secure funding for other cause areas and to serve as an advocate for effective philanthropy.
EDIT: I see that agdfoster is already making an argument along these lines (and posted it before I did). Although perhaps my focus on “damage to ultra-wealthy donor recruitment” is useful for driving home the importance of EA maintaining its brand.
I can’t believe I didn’t read this until just now. You are attacking an unstated assumption of the philanthropy community writ large, which includes EA: that better psychology is an area for philanthropy- and altruism-minded people to care about. Most people in our society put the needs of the body far above psychological/“spiritual” needs (and neglect caring for the psychological distress of others as a work of charity). I think this argument would actually have to be won in order for the psychedelics argument to work as a promising new subset of that line, which I can buy that it may be. There is also the metaphysical assumption that the mind matters as an issue separate from the needs of the body, and that there are big gains to be had from better psychological tools for improving the mind, or at least for reducing suffering. Once again, however, I suspect that most people have a hard time believing that better psychological states for people like them would have visible real-world effects. They don’t believe in a tight coupling between greater psychological health and real-world improvement for people who are already doing fairly well on both fronts.
I want to note that mental health as a general EA cause area was the subject of another EA forum contest in December 2018.
My biggest qualm about most psychedelics is not that they don’t work per se, but that they are somewhat redundant and not the most effective option long term, when compared to the various meditation practices we have.
Mindfulness-based practices have been found to reduce anxiety, depression, pain, and stress, and can help people bounce back faster from negative events. They can be used alongside CBT or be employed when CBT fails to make an impact. The jhanas are highly pleasurable and virtually unknown to most people. The boundless attitudes / viharas may make one feel better about oneself and improve positive affect, along with the willingness to express compassion and kindness and to engage in altruistic activities. Vipassana and Dzogchen can chip away at ego construction and the illusion of permanent selfhood, and are calming. Zazen has parallels with all-day awareness practices. And like yoga, these are all easily secularizable if that is ever an issue.
Besides mindfulness and mindfulness-based practices, not much clinical research (RCTs and the like) has been done on these other forms of meditation. There aren’t many meta-analyses and systematic reviews of Vipassana, the jhanas, and the viharas, particularly because RCTs involving them are in their infancy and many are still underway.
Various meditation practices may be able to improve cortical thickness and make brains faster and more efficient, countering the effects of aging, among other benefits like emotional/behavioral regulation. Perhaps they can be used to make people think more like consequentialists, now that I think about it...
Taking a page out of Buddhist ethics, psychedelics aren’t inherently bad; they are just usually unskillful, as they may easily lead to heedlessness and short-term ethical carelessness (as opposed to vigilance or awareness). That is to say, while meditative states are harder to enter, and take some practice before one can reach them at whim, they are easier to exit; a psychedelic state, by contrast, cannot be exited until the physiological effects of the substance, or of a very good or very bad trip, wear off. This argument may not apply to microdosing, though, or to future biohacking medications that reduce pain.
Meditation is not a panacea, sure, but it is among the fastest-growing industries & practices (the number of US practitioners has doubled in the past decade). It’s something EA should seriously consider studying and possibly funding.
Lastly, once one knows how to engage in a meditative practice, it is free and can be done anyplace and anytime, with little risk or danger to others. It may help reduce physician or healthcare-worker burnout [a factor I’d posit as possibly responsible for costly medical errors], and has been used to treat many of the psychiatric conditions you referenced. There are a few conditions (e.g. schizophrenia) where some meditations may not be helpful and may indeed be harmful, but given the scope of contemplative practices, some may be clinically applicable.
Strong upvoted this one.
A prominent Buddhist monk in the Thai Forest Tradition (Ajahn Jayasaro) said the following, which I feel is highly relevant here:
“Someone had asked (Lama) Kohima, ‘What do you think of expanding minds through chemical means?’ He said, if you have an ignorant mind then you just get expanded ignorance. I thought he was just on the spot. It is all within the sphere of darkness, isn’t it? You are still playing around with different modes of ignorance. You are not actually going beyond. You are not transcending. You are transcending one particular state of ignorance, but you are still in the same building; you haven’t got out of the building, you still haven’t got out of prison. So this sobriety is that whole turning away from all the strange and unusual experiences and visions, physiological and mental states, that are available through chemical means, and taking a delight in the simple, down-to-earth clarity of awareness.”
We already have one gateway drug: poverty alleviation. We don’t need more. Psychedelics won’t change the civilisation’s path. Next.
“Psychedelics won’t change the civilisation’s path.”
It would help me if you unpacked the reasoning behind this claim.
I feel like the burden of proof is on you, no? how will psychedelics help avoid astronomical waste?
From the original post:
oh, my bad. apologies. thanks for the quote!
in terms of augmenting humans, my impression is that genetic engineering is by far the most effective intervention. my understanding is that we’re currently making a lot of progress in that area, yet some important research aspects seem neglected, and could have a transformative impact on the world.
I wonder if you disagree
Yes, I disagree.
There are currently several legal, high-quality psychedelic modalities on offer that I would be personally excited to work with (1, 2, 3, 4). Many more will be coming online within the next 5 years.
I haven’t heard of any genetic engineering interventions on the market that are currently having a transformative impact, and I wouldn’t feel comfortable personally participating in genetic engineering until it was way more battle-tested.
Humans have been using psychedelic healing modalities to good effect for thousands of years – that track record plus what we know from the research about their risk/reward profile makes me feel comfortable working with them (in a respectful way).
Genetic engineering doesn’t seem to have a comparable track record or a comparable evidence base.
thanks for your answer!
You get humans from primates with genetic modifications, not psychedelics :)
No intentionality though, just a blind process over millennia.
With intentionality, you can go from birds to 747s and F-16s in 70 years.
so I’m understanding that you have short AI timelines, and so don’t think genetic engineering would have time to pay off, but psychedelics would, and that you think it’s of similar relevance to working directly on the problem
There are many excellent reasons why funding research on psychedelics should NOT be a top priority for EA (or any other group either, such as NSF, NIMH, or NIH).
First, as a caveat, I think it’s hard to define ‘top priorities’. I think there are many priorities, some of which are unknown or overlooked, or for which the standard EA measures of importance, neglectedness and tractability are not computed (or estimated) correctly. No one knows which ones are at the top. Also, in my world, funds are always limited, so even if one has some good idea of which actions are candidates for a list of top-rated priorities, one may not be able to fund all of them. And sometimes it’s better to practice ‘triage’, and just fund a few adequately so they have a chance of success, rather than fund all of them at such low levels that they will likely all fail.
The best reasons NOT to fund psychedelic research are economic. There are huge industries in the USA based on promoting alcoholism, addiction to opiates and other pharmaceuticals, and tobacco, among other things, as well as industries based on curing people of these addictions or the problems they cause. These industries also generate a lot of feelings of social well-being, because there are many people who gain pleasure either from helping people self-medicate to feel better, or from curing them when they feel ill. If psychedelics were available and proved to be an alternative to currently available substances, it’s possible some jobs would be lost, and a lot of social unhappiness would follow. EA is generally against increasing unhappiness (though they might argue for the change if ‘gross national happiness’ increased. As has been argued for free trade, or any other ‘Pareto-improving’ economic reallocation, it is always possible to compensate the ‘losers’ if there is a net gain. For example, produce and sell psychedelics rather than alcohol, just as may occur with ‘synthetic meat’.)
Another, less recognized value is having a whole lot of sick alcoholics and drug addicts around. They are a useful source of social stigma and an often easily recognized ‘underclass’ from which many people not in that class can gain self-esteem. In the past few months I have also heard many experts in the ‘chattering class’ hold numerous discussions on the radio about the problem of this underclass, and I worry they wouldn’t have much to do if it weren’t around, though perhaps they could find some other group to stigmatize into an underclass. But this issue suggests one may not want to risk eliminating the current underclass, should that be an outcome of psychedelic research. Remember the tale of Pandora’s box: the cure may be worse than the disease, at least for some people.
There is some risk that adoption of psychedelics as a legal alternative, should research suggest that is reasonable, could lead to some of the same problems one has with other legal and illegal substances (including food). (Some countries use prescriptions for medical marijuana and opiates to try to control this problem.)
Another reason NOT to fund such research is that if psychedelics were available, they might change the way people look at the world. Their ‘doors of perception’ would be changed, leading to (this) ‘civilizational collapse’. Established institutions like religions, and possibly education and views on desirable entertainment (e.g. sports, TV and talk radio), might face major impacts or ruptures.
Finally, as someone with some experience using psychedelics quite a while ago (2 different wild species, which I found myself by slogging miles through fields, swamps and deserts), another reason NOT to fund research that might make them acceptable and legal is that, in my case, those experiences of finding them made me very (or at least reasonably) healthy, clear-minded (at least in my subjective opinion, which I have found is not worth much, at least to others) and happy. (After I took those literal and figurative trips, I went right back to college and took courses in molecular pharmacology and quantum theory; I was never a great student, partly because I preferred being outside, but I did pass.) In this culture, where I can’t go out and find them, I can walk a block to a store or corner up the street and get something else which makes me very unhealthy, makes it nearly impossible to think or even walk far, may be unpleasant to others, and is often miserable. (And that street can be dangerous to walk on at night or in the day.) There are health and mental health professionals and industries dependent on sick, confused and unhappy people. Also, to become such professionals they didn’t have to go through the ‘misery’ of taking quantum theory or pharmacology; they just took psychology or counseling, where they learn the dangers of psychedelics. If they had to change the expert curriculum to accept new knowledge, that could be traumatic, so that is another reason NOT to fund psychedelic research. Best not to upset the setup.
I don’t think that sarcastic comments like this, especially when they don’t include evidence or serious discussion of the question, are helpful to the post’s author or to other readers.