But one could also reason:
(1) There should be (at least) one EA org focused on AI risk career advice; it is important that this org operate at a high level at the present time.
(2) If there should be such an org, it should be (or perhaps can only be) 80K; it is more capable of meeting criterion (1) quickly than any other org that could try. It already has staff with significant experience in the area and the organizational competence to deliver career advising services with moderately high throughput.
(3) Thus, 80K should focus on AI risk career advice.
If one generally accepts both your original three points and these three, I think one is left with a tradeoff to make, centered on questions like:
If both versions of premise (1) cannot be fulfilled in the next 1-3 years (i.e., until another org can sufficiently plug whichever hole 80K didn't fill), which version is more important to fulfill during that time frame?
Given the capabilities and limitations of other orgs (both extant and potential future), would it be easier for another org to plug the AI-focused hole or the general hole?
Good reply! I considered something similar as a possible objection to my premise (2) that 80k should fill the role of the cause-neutral org. Basically, there are opportunity costs to 80k filling this role, because it could instead fill the role of (e.g.) an AI-focused org. The question is how high these opportunity costs are, and you point out two important factors. What I take to be important, and plausibly decisive, is that 80k is especially well suited to fill the role of the cause-neutral org (more so than the role of the AI-focused org) due to its history and the brand it has built. Combined with a 'global' perspective on EA according to which there should be one such org, it seems plausible to me that that org should be 80k.