Thanks for writing this out. I think it’s important to keep in mind that there’s a significant difference in lived experience between the median human being on this planet and the median EA.
As for hype: AI might or might not be hype. The question is whether we can accept the risk that it isn't. Even if development plateaued in the near future, AI is already powerful enough to have significant effects on (e.g.) world economies. I’d submit that we especially need non-Western perspectives in thinking about how AI will affect the lives of people in developing countries (cf. the discussion here). In my view, there’s a tendency in EA/EA-adjacent circles to assume technological progress will lift all boats, rather than considering that throughout history people have used technological advances to entrench their positions of power and privilege.
To be fair to 80K here, it is seeking to figure out where the people it advises can have the most career impact on the margin. That’s not necessarily the same question as which areas are most important in the abstract. For example, someone could believe that climate change is the most important problem facing humanity right now, but nevertheless believe that progress on climate change is bottlenecked by something other than new talent (e.g., money), and/or that existing recruitment is enough to fill the field’s capacity with excellent candidates without any work on 80K’s part. So I’d encourage you to consider refining your critique to also address how likely it is that devoting the additional resources in question to your preferred cause area(s) would make a difference.