I don’t think the dishonesty entirely rules out working at OpenAI. Whether OpenAI safety positions should be on the 80k job board depends on the exact mission of the job board. I have my models, but let me ask you: who do you think will have their plans changed for the better by seeing OpenAI safety positions[1] on 80k’s board?
[1] I’m excluding information security (IS) positions from this question, because it seems possible that someone skilled in IS would not think to apply to OpenAI. I don’t see how anyone qualified for OpenAI safety positions could need 80k to inform them the positions exist.
I don’t object to dropping OpenAI safety positions from the 80k job board on the grounds that the people who would be highly impactful in those roles don’t need the job board to learn about them, especially when combined with the other factors we’ve been discussing.
In this subthread I’m pushing back on your broader claim: “I think a job board shouldn’t host companies that have taken already-earned compensation hostage”.
I still think the question of “who is the job board aimed at?” is relevant here, and would like to hear your answer.
As I tried to communicate in my previous comment, I’m not convinced there is anyone who “will have their plans changed for the better by seeing OpenAI safety positions on 80k’s board”, and I’m not arguing for including them on the board.
EDIT: after a bit of offline messaging I realize I misunderstood Elizabeth; I thought the parent comment was pushing me to answer the question posed in the great-grandcomment, but actually it was accepting my request to bring this up a level of generality and not be specific to OpenAI. Sorry!
I think the board should generally list jobs that, under some combination of values and world models the job board runners consider plausible, could be among the highest-impact opportunities for the right person. In cases like OpenAI’s safety roles, where anyone who is the “right person” almost certainly already knows about the role, there’s not much value in listing it, but also not much harm.
I think this mostly comes down to a disagreement over how sophisticated we think job board participants are, and I’d change my view on this if it turned out that a lot of people reading the board are new-to-EA folks who don’t pay much attention to disclaimers and interpret listing a role as saying “someone who takes this role will have a large positive impact in expectation”.
If there did turn out to be a lot of people in that category, I’d recommend splitting the board into a visible-by-default section with jobs where, conditional on getting the role, you’ll have a high positive impact in expectation (I’d biasedly put the NAO’s current openings in this category), and a you-need-to-click-show-more section with jobs where you need to think carefully about whether the combination of you and the role is a good one.