Just noting that in the comments of the original post by Nathan Young that the authors linked to, the top-upvoted suggestion was to offset the gap in nuclear security funding created by the MacArthur Foundation’s exit from the field. I recently had an opportunity to speak to someone who was there at the time of MacArthur’s decision and can share more about that privately, but suffice it to say that our community should not treat the foundation’s shift in priorities as a strong signal about the importance or viability of work in this space going forward.
I wouldn’t treat the upvotes there as much evidence; I think most EAs voting on these things don’t have very good qualitative or quantitative models of x-risks or of what it would take to stop them.
A possible reductio ad absurdum here is that this is an indictment of the karma system in general. I don’t think it is, because (to pick a sample of other posts on the frontpage): posts about burnout and productivity can simply invoke readers’ internal sense of what worries them, so voting on affect isn’t terrible there; posts about internship or job opportunities can be voted on based on which jobs EAs would be excited for themselves or their acquaintances and coworkers to take; posts on detailed, specific topics contain enough detail that readers can evaluate them on their own merits; and so on.