Yeah, there’s almost certainly some self-selection bias there. If someone thinks that talk of AI x-risk is merely bad science fiction, they will either choose not to become an EA or choose to go into a different cause area (and are unlikely to spend significant time thinking any further about AI x-risk or discussing their heterodox view).
For example, people in crypto have thought so much more about crypto than people like me . . . but I would not defer to the viewpoints of people in crypto about crypto. I would want to defer to a group of smart, ethical people who I had bribed so heavily that they were all willing to think deeply about crypto whether they thought it was snake oil or more powerful than AGI. People who chose to go into crypto without my massive bribery are much more likely to be pro-crypto than an unbiased sample of people would be.