Hi Jason,
I think your blog and work are great, and I’m keen to see what comes out of Progress Studies.
I wanted to ask a question, and also to comment on your response to another question, which I think has been incorrect since about 2017:
My perception of EA is that a lot of it is focused on saving lives and relieving suffering.
More figures here.
The following is more accurate:
I don’t see as much focus on general economic growth and scientific and technological progress.
(Though even then, Open Philanthropy has allocated $100m+ to scientific research, which would make it a significant fraction of the portfolio. They’ve also funded several areas of US policy research aimed at growth.)
However, the reason for the lesser emphasis on economic growth is that the community members who are not focused on global health are mostly focused on longtermism, and have argued that growth is not the top priority from that perspective. I’m going to try to give a (rather direct) summary of why, and would be interested in your response.
Those focused on longtermism have argued that influencing the trajectory of civilization is far higher value than speeding up progress (e.g. one example of that argument here.)
Indeed, if you’re concerned about existential risk from technology, it becomes unclear whether faster progress in the short term is even positive at all – though my guess is that it is.
In addition, longtermists have also argued that long-term trajectory-shaping efforts – which include reducing existential risk but are not limited to that – tend to be far more neglected than efforts to speed up economic growth.
This is partly because I think there are stronger theoretical reasons to expect trajectory-shaping efforts to suffer from market failure, but also from empirical observation: e.g. the fields of AI safety and reducing catastrophic biorisks both receive well under $100m of funding per year, and issues around existential risk receive little attention in policy. In contrast, the world spends over $1 trillion per year on R&D, and boosting economic growth is perhaps the main priority of governments worldwide.
I’d argue that the expected value of marginal work on an issue is proportional to the product of its importance and neglectedness, so these factors suggest that work on trajectory changes could be several orders of magnitude more effective.
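To make the arithmetic behind "several orders of magnitude" concrete, here is a minimal sketch of the heuristic above, using only the rough spending figures quoted in the previous paragraph. The function name and the importance weights are illustrative placeholders, not anything from the original argument:

```python
# Toy comparison under the heuristic above: the marginal value of work
# on a problem is taken as proportional to importance x neglectedness,
# with neglectedness proxied by 1 / current annual spending.
# Spending figures are the rough ones quoted in the text;
# importance weights are illustrative placeholders.

def marginal_value(importance, annual_spending):
    """Relative expected value of one extra dollar/person, up to a constant."""
    return importance / annual_spending

# Rough annual spending, from the text:
xrisk_spending = 100e6     # AI safety + biorisk: well under $100m/yr combined
growth_spending = 1e12     # global R&D alone: over $1 trillion/yr

# Even granting equal importance, the neglectedness factor alone gives:
ratio = marginal_value(1, xrisk_spending) / marginal_value(1, growth_spending)
print(f"{ratio:,.0f}x")  # 10,000x – four orders of magnitude
```

Of course the importance terms are doing real work too, and are contested; the point of the sketch is only that the neglectedness gap alone is already enormous.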
I agree Progress Studies itself is far more neglected than general work to boost economic growth. I expect that work on Progress Studies is very high-impact by ordinary standards, and I’d be happy if some more EAs worked on it, but I’d still expect marginal resources put towards research on topics like existential risk or longtermist global priorities research to be far more effective per dollar / per person.
I’ve never seen a proponent of boosting economic growth or Progress Studies clearly give their response to these points (though I have several of my own ideas). We tried discussing it with Tyler Cowen, but my impression of that interview was that he basically conceded that existential risk is the greater priority, defending economic growth mainly because it’s something the average person is better able / more likely to contribute to.
So my question would be: why should a longtermist EA work on boosting economic growth?
1) One way to see the problem is that in the past we used frugality as a hard-to-fake signal of altruism, but that signal no longer works.
I’m not sure that’s an entirely bad thing, because frugality seems mixed as a virtue e.g. it can lead to:
Not spending money on things that are clearly worth it (e.g. not paying for a larger table at a student fair even when it would result in more sign-ups; not getting a cleaner when you earn over $50/hour), which in turn can also make us seem not serious about maximising impact (e.g. this comment).
Even worse, getting distracted from the top priority by efforts to save relatively small amounts of money, or not considering high-upside projects that require a lot of resources but have a good chance of failure, due to a fear of not being able to justify the spending.
Feelings of guilt around spending and not being perfectly altruistic, which can lead to burnout.
Filtering out people who want a normal middle-class lifestyle and family but could have had a big impact (and who go work at FAANG instead). Filtering out people from low-income backgrounds or with dependents.
However, we need new hard-to-fake signals of seriousness to replace frugality. I’m not sure what these should be, but here are some alternative things we could try to signal, which seem closer to what we most care about:
That we nerd out hard about doing good.
Intense focus on the top priority.
Doing high-upside things even if there’s a good chance they might not work out and seem unconventional.
Giving 10% or more (which is compatible with non-frugality).
The difficulty is thinking of hard-to-fake, easy-to-explain ways to show we’re into these.
2) Another way to see the problem is that in the past we’ve used the following idea to get people into EA: “you can save a life for a few thousand dollars and should maximise your donations to that cause”. But this idea is in obvious tension with the activities that many now see as the top priorities (e.g. wanting to convince top computer scientists to work on the AI alignment problem).
My view is that we should try to move past this way of introducing effective altruism, and instead focus more on ideas like:
Let’s do the most we can to tackle big, neglected global problems. (I’d probably start by introducing climate change and/or pandemics rather than global health.)
Find high-upside projects that help tackle the biggest bottlenecks in those problems.
If you want to do good, do it effectively, and focus on the highest-leverage ways you can help (but ~no-one is perfectly altruistic and it’s fine to have a nice life too).