I don't object to dropping OpenAI safety positions from the 80k job board on the grounds that the people who would be highly impactful in those roles don't need the job board to learn about them, especially when combined with the other factors we've been discussing.
In this subthread I'm pushing back on your broader "I think a job board shouldn't host companies that have taken already-earned compensation hostage".
I still think the question of "who is the job board aimed at?" is relevant here, and would like to hear your answer.
As I tried to communicate in my previous comment, I'm not convinced there is anyone who "will have their plans changed for the better by seeing OpenAI safety positions on 80k's board", and am not arguing for including them on the board.
EDIT: after a bit of offline messaging I realize I misunderstood Elizabeth; I thought the parent comment was pushing me to answer the question posed in the great-grandparent comment, but actually it was accepting my request to bring this up a level of generality and not be specific to OpenAI. Sorry!
I think the board should generally list jobs that, under some combination of values and world models that the job board runners think are plausible, are plausibly one of the highest-impact opportunities for the right person. In cases like OpenAI's safety roles, where anyone who is the "right person" almost certainly already knows about the role, there's not much value in listing it, but also not much harm.
I think this mostly comes down to a disagreement over how sophisticated we think job board participants are, and I'd change my view on this if it turned out that a lot of people reading the board are new-to-EA folks who don't pay much attention to disclaimers and interpret listing a role as saying "someone who takes this role will have a large positive impact in expectation".
If there did turn out to be a lot of people in that category, I'd recommend splitting the board into a visible-by-default section with jobs where, conditional on getting the role, you'll have high positive impact in expectation (I'd biasedly put the NAO's current openings in this category) and a you-need-to-click-show-more section with jobs where you need to think carefully about whether the combination of you and the role is a good one.