I also feel that there could be many other survival problems (jobs, equality, etc.) in society that make people feel this problem is still very far away. But I know a friend who is trying to work on AI governance and raise more awareness.
Appreciate the post, and I see this as very valuable to raise, as I have similar thoughts but haven't seen many posts discussing this. One of my worries is that without awareness and perspectives from different groups, we will make decisions that are biased by human nature (not by intention), which would probably also affect prioritization decisions. I have also encountered situations in a related community where people said "rape is a side effect of erotic excitement, and so probably should not work towards getting rid of it because that means getting rid of erotic excitement", which again may not be intentional, but reflects poor awareness/education.
Thanks for the post. Your post/chart is exactly what I have been thinking about recently, and I'm glad to see someone started the conversation in 2022.
I once saw a post (https://www.alignmentforum.org/posts/ho63vCb2MNFijinzY/agi-safety-career-advice) that is specific to AI and details the directions within both research and governance, and I found it useful. Maybe a general education post like that, but on broader EA topics, would be very helpful.
I think it might be fine if people have a genuine interest in research (though it has to be intrinsic motivation), which will make them learn faster with more devoted energy. But overall I see a lot of value in operations/management/application work, as it gives people opportunities to learn how to turn research into real impact, and how tricky the real world and applications can sometimes be.
Thanks for sharing this! I was looking at https://www.givingwhatwecan.org/best-charities-to-donate-to-2024# to find some good related NGOs to donate to for a friend's birthday, but didn't find a relevant section on the front page (maybe it's on a subpage?). I will donate to some of the orgs mentioned here!