Thanks for your work and thanks for doing this!
In your interview with Patrick Collison, he says the following:
“I think of EA as sort of like a metal detector, and they’ve invented a new kind of metal detector that’s really good at detecting some metals that other detectors are not very good at detecting. But I actually think we need some diversity in the different metallic substances which our detectors are attuned to, and for me EA would not be the only one”
Discussion on the EA Forum here; link to the interview here.
First, do you broadly agree with that framework?
Second, given that you likely think progress studies is one of the most important things to work on, should it worry us that the EA detector did not, on its own, pick up on progress studies as an opportunity to do good before it became a more mainstream view? Why didn’t EAs launch this field years ago? Why isn’t it one of the main EA cause areas? Does this hint at a way our detector may be broken? (To be clear, I am personally agnostic for now as to whether this should be a main EA cause area.)
Third, how can we tune the EA metal detector to be more effective at finding new niches where there’s room to do good effectively? I think Patrick is probably right that the EA detector isn’t good enough to pick up on everything that you would want to pick up on. But unlike other detectors, we do have the explicit goal to find all the most important things to do at the margin. So how can we get closer to that goal?
I am broadly sympathetic to Patrick’s way of looking at this, yes.
If progress studies feels like a miss on EA’s part to you… I think folks within EA, especially those who have been deeply involved for a long time, are better placed to analyze why and how that happened. So rather than give an answer, let me suggest some hypotheses that might be fruitful to explore:
- A focus on saving lives and relieving suffering, with these seen as more moral or important than comfort, entertainment, enjoyment, or luxury; or economic growth; or the advance of knowledge?
- A data-driven focus that naturally leads to more short-term, measurable impact? (Vs., say, a more historical and philosophical focus?)
- A concern about existential risk from technology and progress?
- Some other tendency to see technology, capitalism, and economic growth as less important, less moral, or otherwise lower-status?
- An assumption that these things are already popular and well served by market mechanisms, and therefore not neglected?
As for “tuning the metal detector”: I think a root-cause analysis of progress studies, or of any other area you feel you “missed,” would be the best way to approach it.
Well, one final thought: the question of “how to do the most good” is deep and challenging enough that you can’t answer it with anything less than an entire philosophy. I suspect that EA is significantly shaped by a particular philosophical orientation, and that orientation is fundamentally altruistic. Progress isn’t really altruistic, at least not to my mind: altruism is about giving, whereas progress is about creating. They’re not unrelated, but they’re different orientations.
But I could be wrong here, and @Benjamin_Todd, above, has given me a whole bunch of stuff to read to challenge my understanding of EA, so I should go digest that before speculating any more.