I suggest that [pointing out a bias, like “interesting”, that pulls more people toward some direction] and [suggesting that we put slightly less focus on that direction because of that bias] isn’t such a good way to make decisions
I don’t understand your model here. You think that it’s wrong because it’s bad to actively work to counteract a bias, or because you don’t think the bias exists, or because it will predictably lead to worse outcomes?
Because [actively working to correct for a bias] is less good than [figuring out what the correct unbiased answer should be]
Especially when the bias is “do X a bit more”
(there are probably other cases where I would or wouldn’t use this strategy, but TL;DR: Deciding how many people should do something like AI safety seems like a “figure out the correct solution” situation and not an “adjust slightly for biases” situation. Do you agree with that part?)
As a naive example to make my point more clear:
“People are biased not to work on AI Safety because it often seems weird to their families, so we should push more people to work on it”—I don’t actually believe this, but I am saying that we can find biases like these pointing in many, many directions (and so it’s probably not a good way to make decisions)
What do you think?
I think that it’s reasonable to think about which biases are relevant, and consider whether they matter and what should be done to account for them. More specifically, AI safety being weird-sounding is definitely something that people in EA have spent significant effort working to counteract.
Also, “interesting” is subjective. Different people find different things to be interesting. I was surprised you called “theoretical research” interesting (but then reminded myself that “interesting” is subjective)
“Interesting” is subjective, but there can still be areas that a population tends to find interesting. I find David’s proposals of what the EA population tends to find interesting plausible, though ultimately the question could be resolved with a survey.
Any given person should look at what they find most interesting, and make sure to double-check that they aren’t claiming it’s impactful because they enjoy doing it. This was the point of the conclusion, especially footnote 6.