As I tried to communicate in my previous comment, I'm not convinced there is anyone who "will have their plans changed for the better by seeing OpenAI safety positions on 80k's board", and am not arguing for including them on the board.
EDIT: after a bit of offline messaging I realize I misunderstood Elizabeth; I thought the parent comment was pushing me to answer the question posed in the great-grandcomment, but actually it was accepting my request to bring this up a level of generality and not be specific to OpenAI. Sorry!
I think the board should generally list jobs that, under some combinations of values and world models that the job board runners think are plausible, are plausibly among the highest-impact opportunities for the right person. I think in cases like OpenAI's safety roles, where anyone who is the "right person" almost certainly already knows about the role, there's not much value in listing it but also not much harm.
I think this mostly comes down to a disagreement over how sophisticated we think job board participants are, and I'd change my view on this if it turned out that a lot of people reading the board are new-to-EA folks who don't pay much attention to disclaimers and interpret listing a role as saying "someone who takes this role will have a large positive impact in expectation".
If there did turn out to be a lot of people in that category, I'd recommend splitting the board into a visible-by-default section with jobs where, conditional on getting the role, you'll have high positive impact in expectation (I'd biasedly put the NAO's current openings in this category) and a you-need-to-click-show-more section with jobs where you need to think carefully about whether the combination of you and the role is a good one.