Tim,

Thanks enormously for this very thorough write-up, shared despite your nervousness(!). It was insightful not just for your thinking about psychedelics, but also about non-profit and for-profit investing.
You said lots. I’m just going to focus on two things here.
1. (Dis)analogies between investing and donating
You drew the analogy that GiveWell-recommended charities (evidence-based ‘micro-interventions’) are like index funds, whereas funding research is more like angel investing. I agree that the risk-return structure is similar, in the sense that we think the former has lower variance and lower expected value, while the latter has higher variance but also higher expected value. Crucially, ‘value’ is being used ambiguously here: for investing, we’re interested in financial value; for philanthropy, in moral value. Because of this, the analogy isn’t exact, and it doesn’t follow that we should think about investing and philanthropy the same way.
From an investor’s perspective, it does make sense to make both sorts of investments, but only because there are diminishing marginal returns to income on well-being. If there were no diminishing marginal returns to income on well-being, the best thing for your well-being would be whatever has the highest expected return on investing!
From the philanthropist’s perspective, because there aren’t diminishing marginal returns to value on, er, value—increasing happiness by 1 ‘unit’ is just as good, no matter how much happiness there already is—we really should just do the things that have the highest expected value and ignore concerns about variance.
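To make the asymmetry concrete, here is a toy calculation (all payoffs are invented for illustration): the very same pair of gambles gets ranked opposite ways depending on whether the objective is linear, as assumed above for moral value, or concave, as with money and well-being.

```python
import math

# Two hypothetical options, as (probability, payoff per unit) pairs:
sure = [(1.0, 1.5)]                   # index-fund-like: a certain 1.5x
moonshot = [(0.1, 20.0), (0.9, 0.0)]  # angel-like: 10% chance of 20x, else 0

def ev(outcomes, f=lambda x: x):
    """Expected value of f(payoff) over (probability, payoff) pairs."""
    return sum(p * f(x) for p, x in outcomes)

# Donor's objective: moral value is (assumed) linear in payoff, so the
# moonshot's higher expected value settles it, variance notwithstanding.
assert ev(moonshot) > ev(sure)  # 2.0 > 1.5

# Investor's objective: well-being is concave in money. log(1 + x) here
# stands in for diminishing marginal utility of income.
u = lambda x: math.log(1 + x)
assert ev(sure, u) > ev(moonshot, u)  # ~0.92 > ~0.30: prefer the sure thing
```

The numbers never change; only the shape of the objective does, and that alone flips the ranking. That is the disanalogy between the investor's and the philanthropist's problem.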
Hence, if you think funding some project in psychedelics really has higher expected (moral) value than anything else, including GiveWell’s picks, it would be better (by your lights) to give to that, and recommend your listeners to do likewise. Put another way, note there’s something odd about saying “yeah, I really do think A would have the most impact, and all that matters here is impact, but you and I should do B anyway.”
Admittedly, you might have some concerns about (1) asking your listeners to follow your recommendations rather than someone else’s (which wouldn’t be relevant to your own giving), and (2) it being psychologically motivating to have some low-risk wins, i.e. you think you will give donations with a higher total expected value if some of them are lower-expected-value ‘sure things’.
2. When is it worth doing detailed analyses of early-stage investments/philanthropy?
I’m not sure if we disagree here or not. In terms of a Value of Information approach, the less money you are putting towards something, and the less you expect to learn from investigating it—because e.g. you think there is no good evidence available, so you’d still be relying on your intuitions—the less valuable it is to do the investigation. For really big decisions, it can be worth doing this even if you’re very confident, because you might be wrong.
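The Value of Information logic can be sketched as a back-of-envelope formula. This is only an illustrative toy model with invented numbers, not any standard calculation: the expected gain from investigating is roughly the chance it flips your decision, times the gain if it does, minus the cost of doing it.

```python
# Back-of-envelope Value of Information for a funding decision.
# The function and all numbers are invented for illustration.

def voi(grant, p_flip, uplift, cost):
    """Expected net value of investigating before committing a grant.

    grant:  money at stake
    p_flip: chance the investigation changes which option you pick
    uplift: fractional gain in (moral) value per dollar if it does
    cost:   cost of doing the investigation
    """
    return p_flip * uplift * grant - cost

# Small grant, or little expected to be learned: skip the analysis.
assert voi(grant=10_000, p_flip=0.1, uplift=0.5, cost=5_000) < 0

# Really big decision: worth analysing even if you're fairly confident,
# i.e. even when p_flip is small.
assert voi(grant=1_000_000, p_flip=0.1, uplift=0.5, cost=5_000) > 0
```

Shrinking either the money at stake or the chance of learning anything drives the first term towards zero, which is exactly the "less money, less to learn, less worth investigating" point above.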
I suspect we probably agree on this in general, but we might disagree on exactly where ‘the bar’ is, that is, when it makes sense to sit down, write out one’s assumptions, put probabilities and values on things, and crunch some numbers. Broadly, I’m a fan of doing this: I find it helps clarify my thinking, and if the cost-effectiveness analysis doesn’t agree with my intuitive judgement, that’s a good spur to work out where the difference comes from. It’s possible I’m suffering from bias here: quantifying hard-to-quantify stuff is what the conceptual tools of effective altruism (primarily philosophy and economics) equip one to do, and it’s what I’m familiar with. To the man with only a hammer, etc.
That said, I think one specific, valuable project would be sifting through the landscape of psychedelic funding opportunities. As you say, even some of the best projects are not getting funded, so it seems useful to think through exactly which those are and make the case for them so they get the money they need. This is a more or less apples-to-apples comparison and could be done quite qualitatively because it’s things like “fund research into compound A for X or fund compound A for Y”, so you can just compare X to Y. However, this is pretty hard to do without lots of inside knowledge of the players and projects, particularly as they change over time. HLI doesn’t have this knowledge, so we’d need to partner with someone in the know.
The other obvious valuable project would be comparing (the best thing in) psychedelics to other things. This is the familiar-but-difficult quantitative analysis piece. Given the money at stake, it’s worth doing even if one is pretty confident about the answer. Further, at least for EA-minded donors, it’s crucial to see a good attempt to do this before switching where they put their money. Again, a key input is what the best-in-class psychedelics thing is.
I’m wondering if this is something you might be interested in collaborating on. I’ll send you a message on the EA forum privately to ask you about this.
P.S. Regarding the Founders Pledge comparison of Usona to StrongMinds, they say it’s comparable on e.g. p. 69 of the psychedelics report. Sorry, I thought that was somewhere more obvious.
> From an investor’s perspective, it does make sense to make both sorts of investments, but only because there are diminishing marginal returns to income on well-being. If there were no diminishing marginal returns to income on well-being, the best thing for your well-being would be whatever has the highest expected return on investing!

> [...]

> Hence, if you think funding some project in psychedelics really has higher expected (moral) value than anything else, including GiveWell’s picks, it would be better (by your lights) to give to that, and recommend your listeners to do likewise (emphasis added). Put another way, note there’s something odd about saying “yeah, I really do think A would have the most impact, and all that matters here is impact, but you and I should do B anyway.”
I basically agree with the model here — that there aren’t diminishing returns on moral value. That said, a couple of notes on the specific situation:
a) From the perspective of inspiring action, it would make sense to me if Tim saw his listeners as being somewhat risk-averse (as most people are!) and tried to recommend GiveWell in the expectation that this would raise more overall money than a higher-risk option. This approach might still be Tim’s best way to maximize his impact as a fundraiser. (No idea whether this is something he actually tries to do.)
b) Some of the opportunities Tim has supported (e.g. scientific studies by a particular lab) aren’t necessarily in a position to accept small donations, and so wouldn’t make sense to recommend for listeners. (That said, there are times when these opportunities have been available to small donors, and he’s advertised them.)
From Open Phil’s latest set of suggestions for individual donors:

> Many of these recommendations appear here because they are particularly good fits for individual donors—due to being able to make use of fairly arbitrary amounts of donations from individuals, and in some cases because the recommender thought they’d be particularly likely to appeal to readers. This shouldn’t be seen as a list of our strongest grantees overall (although of course there may be overlap).
Funnily enough, this actually is an analogy to investing; you need a certain amount of capital to invest in certain hedge funds, startups, etc. What a wealthy person does with their portfolio isn’t necessarily the same thing they can recommend to a broad audience.
> That said, I think one specific, valuable project would be sifting through the landscape of psychedelic funding opportunities.
This also strikes me as valuable, though in light of point (b) above, you might want to select “best in class” funding opportunities for donors of different sizes (e.g. the best place to give if you plan to donate under $1000).
That said, this is possibly worse than creating some kind of psychedelics fund that can combine many small donations into grants of a size that make sense for universities to process. (I wouldn’t be surprised if this existed already and I wasn’t aware of it.)
Hello Aaron,

Re (a), I agree that would be a sufficient justification: you suggest the option that is less cost-effective in the expectation that more people will take it up, so its expected value is higher nonetheless. My point was that, if you have a fixed total of resources, then as an investor the lower-risk, lower-ROI option can be better (due to diminishing marginal utility), but as a donor you just want to put the fixed total towards the thing with the higher ROI.
> That said, this is possibly worse than creating some kind of psychedelics fund that can combine many small donations into grants of a size that make sense for universities to process.
I am not aware of this, but I have had a bit of discussion with Jonas Vollmer about setting up a new EA fund that could do this. This hypothetical ‘human well-being fund’ would be an alternative to the global health and development fund. While the latter would (continue to) basically back ‘tried-and-tested’ GiveWell recommendations (which are in global health and development), the former could, inter alia, engage in hits-based giving and take a wider view.