I wasn’t proposing “follow your passion” as the alternative. I do think some combination of personal interest and external importance will probably be highest-utility for a given personality. I just wanted to make sure that AGI alignment wasn’t so overwhelmingly important that I would practically have to set my own feelings aside for the sake of humanity’s existence. I have also read a recent post questioning the basis of the “recommended careers” in EA and 80,000 Hours (post). Thanks for the post!