Thanks for engaging, mhendric!

I would also be interested in whether they take into account recent discussions/criticisms of model choices in longtermist math that strike me as especially important for the kind of advising 80.000 hours does
My guess is that 80,000 Hours[1] is aware of these, but I would be curious to know the extent to which their longtermism article discusses such concerns. I added it to my list, but feel free to have a look yourself!
As a teacher at a university, I often try to encourage students to rethink their career choices from an EA angle.
Great that you do this!
80.000 hours is a natural place to recommend for interested students, but I am wary of recommending it to non-longtermist students. Probably good seems to offer a more shorttermist alternative, but are significantly newer and have less brand recognition. I think there would be considerable value in having the biggest career-advising organization (80k) be a non-partisan EA advising organization, whereas I currently take them to be strongly favoring longtermism in their advice.
I would say it makes sense for 80,000 Hours to tailor their advice to what they consider the most pressing problems. On the other hand, I think it may well be the case that 80,000 Hours is overestimating the difference in pressingness between areas traditionally classified as longtermist and neartermist[2]. Rather than picking one of these 2 views, I wonder whether 80,000 Hours would do better to rank problems along various metrics covering a wider range of areas. For example:
Increasing welfare in the next few decades. I expect improving the welfare of farmed animals would come out on top here.
Boosting economic growth in the next few decades. I guess global health and development as well as high-leverage ways to speed up economic growth would be strong candidates to come out on top here.
Decreasing global catastrophic risk in the next few decades. I guess decreasing biorisk would come out on top here.
Decreasing extinction risk this century. I guess decreasing AI risk would come out on top here.
Improving positive values. This is important, but vague, and applicable to many areas due to indirect effects, so producing a ranking would be difficult.
I believe having neartermism and longtermism as the only 2 categories would be overly reductive, as traditionally neartermist interventions have long-term effects (e.g. economic growth), and traditionally longtermist interventions have near-term effects (e.g. fewer deaths of people currently alive). Furthermore, the above decomposition may mitigate a problem I see in 80,000 Hours’ current ranking of problems:
Climate change is currently ranked 5th among the top problems, whereas animal welfare and global health and development are not classified as top problems.
However, I think animal welfare and global health and development have a greater chance than climate change of topping one of the above 5 rankings. So I would say they should be prioritised over climate change.
I strongly endorse expectational total hedonistic utilitarianism, and therefore agree that all metrics can in theory be mapped to a single dimension, such that all problems could be ranked together. However, doing this in practice is difficult because there is a lot of uncertainty.
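To illustrate the decomposition above, here is a toy sketch of ranking problems along the 5 metrics and then collapsing them into a single score. The scores and weights are purely hypothetical numbers I made up for illustration, not actual estimates.

```python
# Toy sketch (purely hypothetical numbers) of ranking problems along several
# metrics, and then collapsing them into a single expected-welfare score.

problems = [
    "Animal welfare",
    "Global health and development",
    "Biorisk",
    "AI risk",
    "Climate change",
]
metrics = [
    "Welfare in the next few decades",
    "Economic growth in the next few decades",
    "Global catastrophic risk in the next few decades",
    "Extinction risk this century",
    "Promoting positive values",
]

# Hypothetical scores (higher = more promising) for each problem along each metric.
scores = {
    "Animal welfare":                [9, 2, 1, 1, 6],
    "Global health and development": [6, 8, 2, 1, 4],
    "Biorisk":                       [2, 3, 9, 5, 3],
    "AI risk":                       [2, 4, 6, 9, 5],
    "Climate change":                [4, 5, 5, 3, 4],
}

# Per-metric rankings: which problem comes out on top along each metric?
for j, metric in enumerate(metrics):
    best = max(problems, key=lambda p: scores[p][j])
    print(f"{metric}: {best}")

# Collapsing to a single dimension requires weights on the metrics, which is
# where most of the practical uncertainty lies.
weights = [0.3, 0.2, 0.2, 0.2, 0.1]  # hypothetical
overall = {p: sum(w * s for w, s in zip(weights, scores[p])) for p in problems}
for p, score in sorted(overall.items(), key=lambda kv: -kv[1]):
    print(f"{p}: {score:.1f}")
```

The point of the sketch is that the per-metric rankings can be fairly robust, while the overall ranking hinges on the weights, which is where I see most of the uncertainty in practice.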
[1] Nitpick: there is a comma after “80”, not a dot.
[2] I have not seen the term shorttermist being used much.