OK, thanks. Not sure I can pull it off, that was just a toy example. Probably even my best arguments would have a smaller impact than a factor of three, at least when averaged across the whole community.
I agree with your explanation of the ways this would improve things… I guess I’m just concerned about opportunity costs.
Like, it seems to me that a tripling of credence in Sudden Emergence shouldn’t change what people do by more than, say, 10%. When you factor in tractability, neglectedness, personal fit, doing things that are beneficial under both Sudden Emergence and non-Sudden Emergence, etc., a factor of 3 in the probability of Sudden Emergence probably won’t change the bottom line for what 90% of people should be doing with their time. For example, I’m currently working on acausal trade stuff, and I think that if my credence in Sudden Emergence decreased by a factor of 3 I’d still keep doing what I’m doing.
Meanwhile, I could be working on AI safety directly, or on acausal trade stuff (which I think could plausibly lead to a more than 10% improvement in EA effort allocation; or at least, that seems more plausible to me right now than it does for working on Sudden Emergence).
I’m very uncertain about all this, of course.
Did you end up writing this post? (I looked through your LW posts since the timestamp of the parent comment but it doesn’t seem like you did.) If not, I would be interested in seeing some sort of outline or short list of points even if you don’t have time to write the full post.
Thanks for following up. Nope, I didn’t write it, but comments like this one and this one are making me bump it up in priority! Maybe it’s what I’ll do next.