One issue here with some of the latter numbers is that a lot of the work is being done by the expected value of the far future being very high, and (to a lesser extent) by us living in the hinge of history.
Among the set of potential longtermist projects to work on (e.g. AI alignment vs. technical biosecurity, EA community building, longtermist grantmaking, AI policy, or macrostrategy), I don’t think the present analysis of very high ethical value (in absolute terms) should be dispositive in causing someone to choose a career in AI alignment.
Yes, that is true. I’m sure those other careers are also tremendously valuable. Frankly I have no idea if they’re more or less valuable than direct AI safety work. I wasn’t making any attempt to compare them (though doing so would be useful). My main counterfactual was a regular career in academia or something, and I chose to look at AI safety because I think I might have good personal fit and I saw opportunities to get into that area.
Thanks, this makes sense!
I do appreciate you (and others) thinking clearly about this, and your interest in safeguarding the future.