I agree with the substance but not the valence of this post.
I think it’s true that EAs have made many mistakes, myself included, some of which I’ve discussed with you :)
But I think that this post is an example of “counting down” when we should also remember the frame of “counting up.”
That is: EAs are doing badly in the areas you mentioned because humans in general are very bad at the areas you mentioned. I don’t know of any group that has actually-correct incentives, reliably drives toward truth, and gets big, complicated, messy questions like cross-cause prioritisation right. Like, holy heck, that is a tall order!!
So you’re right in substance, but I think your post has a valence of “EAs should feel embarrassed by their failures on this front”, which I strongly disagree with. I think EAs should feel damn proud that they’re trying.
Thanks for the feedback, and I’m sorry for causing that unintended (but foreseeable) reaction. I edited the wording of the original take to address your feedback. My intention for writing this was to encourage others to figure things out independently, share our thinking, and listen to our guts—especially when we disagree with the aforementioned sources of deference about how to do the most good.
I think EAs have done a surprisingly good job at identifying crucial insights, and acting accordingly. EAs also seem unusually willing to explicitly acknowledge opportunity cost and trade-offs (which I often find the rest of the world frustratingly unwilling to do). These are definitely worth celebrating.
However, I think our track record at translating the above into actually improving the future is nowhere near our potential.
Since I experience a lot of guilt about not being a good enough person, the EA community has provided a lot of much-needed comfort to handle the daunting challenge of doing as much good as I can. It’s been scary to confront the possibility that the “adults in charge” don’t have the important things figured out about how to do the most good. Given how the last few years have unfolded, they don’t even seem to be doing a particularly good job. Of course, this is very understandable. FTX trauma is intense, and the world is incredibly complicated. I don’t think I’m doing a particularly good job either.
But it has been liberating to allow myself to actually think, trust my gut, and not rely on the EA community/funders/orgs to assess how much impact I’m having relative to my potential. I expect that with more independent thinking, ambition, and courage, our community will do much better at realizing our potential moving forward.