I think job ads in particular are a filter for “being more typical.”
I expect the people who have a chance of doing a good job to be well connected to previous people who worked at OpenAI, with some experience under their belt navigating organizational social scenes while holding onto their own epistemics. I expect such a person to basically not need to see the job ad.
You’re referring to job boards generally, but we’re talking about the 80K job board, which is no typical job board.
I would expect someone who will do a good job to be someone going in wanting to stop OpenAI from destroying the world. That seems like someone who would read the 80,000 Hours job board. 80K is all about preserving the future.
They of course also have to be good at navigating organizational social scenes while holding onto their own epistemics, which in my opinion are skills commonly found in the EA community!
I think EAs vary wildly. I think most EAs do not have those skills – they are very difficult skills. Merely caring about the world is not enough.
I think most EAs do not, by default, prioritize epistemics that highly, unless they came in through the rationalist scene, and even then, I think holding onto your epistemics while navigating social pressure is a very difficult skill that even rationalists who specialize in it tend to fail at. (Getting into details here is tricky because it involves judgment calls about individuals, in social situations that are selected for being murky and controversial. But, no, I do not think the median EA, or even the 99th percentile EA, is going to be competent enough at this for it to be worthwhile for them to join OpenAI. I think ~99.5th percentile is the point where it seems even worth talking about, and I don’t think those people get most of their job leads through the job board.)
The person who gets the role is obviously going to be highly intelligent, probably socially adept, and highly qualified with experience working in AI, etc. OpenAI wouldn’t hire someone who wasn’t.
The question is: do you also want this person to care about safety? If so, I would think advertising on the EA job board would increase the chance of that.
If you think EAs, or people who look at the 80,000 Hours job board, are for some reason epistemically worse than others, then you will have to explain why, because I believe the opposite.