I am broadly sympathetic to Patrick’s way of looking at this, yes.
If progress studies feels like a miss on EA’s part to you… I think folks within EA, especially those who have been deeply involved with it for a long time, are better placed to analyze why and how that happened. Rather than give an answer, let me suggest some hypotheses that might be fruitful to explore:
A focus on saving lives and relieving suffering, with these seen as more moral or important than comfort, entertainment, enjoyment, or luxury; or economic growth; or the advance of knowledge?
A data-driven focus that naturally leads to more short-term, measurable impact? (Vs., say, a more historical and philosophical focus?)
A concern about existential risk from technology and progress?
Some other tendency to see technology, capitalism, and economic growth as less important, less moral, or otherwise lower-status?
An assumption that these things are already popular and well served by market mechanisms, and therefore not neglected?
As for “tuning the metal detector”, I think a root-cause analysis on progress studies or any other area you feel you “missed” would be the best way to approach it!
Well, one final thought: The question of “how to do the most good” is deep and challenging enough that you can’t answer it with anything less than an entire philosophy. I suspect that EA is significantly influenced by a certain philosophical orientation, and that orientation is fundamentally altruistic. Progress isn’t really altruistic, at least not to my mind. Altruism is about giving, whereas progress is about creating. They’re not unrelated, but they’re different orientations.
But I could be wrong here, and @Benjamin_Todd, above, has given me a whole bunch of stuff to read to challenge my understanding of EA, so I should go digest that before speculating any more.