I do not want an epistemic culture that finds it acceptable to challenge an individual's overall credibility in lieu of directly engaging with their arguments.
I think that's unhealthy and contrary to collaborative knowledge-building.
Yudkowsky has laid out his arguments for doom at length. I don't fully agree with those arguments (I believe he's mistaken in 2–3 serious and important ways), but he has laid them out, and because of that I can disagree with him on the object level.
Given that the explicit arguments are present, I would prefer posts that engage with and directly refute those arguments if you find them flawed in some way.
I don't like this direction of attacking his overall credibility.
Attacking someone's credibility in lieu of engaging with their arguments feels like a severe epistemic transgression.
I am not convinced that the community is better off for a norm that accepts such epistemic call-out posts.
> I do not want an epistemic culture that finds it acceptable to challenge an individual's overall credibility in lieu of directly engaging with their arguments.
I think I roughly agree with you on this point, although I would guess I have at least a somewhat weaker version of your view. If discourse about people's track records or reliability starts taking up (e.g.) more than a fifth of the space that object-level argument does, within the most engaged core of people, then I do think that will tend to suggest an unhealthy or at least not-very-intellectually-productive community.
One caveat: For less engaged people, I do actually think it can make sense to spend most of your time thinking about questions around deference. If I'm only going to spend ten hours thinking about nanotechnology risk, for example, then I might actually want to spend most of this time trying to get a sense of what different people believe and how much weight I should give their views; I'm probably not going to be able to make a ton of headway getting a good gears-level understanding of the relevant issues, particularly as someone without a chemistry or engineering background.
> I do not want an epistemic culture that finds it acceptable to challenge an individual's overall credibility in lieu of directly engaging with their arguments.
I think it's fair to talk about a person's lifetime performance when we are talking about forecasting. When we don't have the expertise ourselves, all we have to go on is what little we understand and the track records of the experts we defer to. Many people defer to Eliezer, so I think it's a service to lay out his track record so that we can know how meaningful his levels of confidence and special insights into this kind of problem are.
> I do not want an epistemic culture that finds it acceptable to challenge an individual's overall credibility in lieu of directly engaging with their arguments.
I don't think this is realistic. There is far more important knowledge than anyone can engage with in a lifetime. The only way to form views about many things is to somehow decide who to listen to, or at least how to weight and aggregate the relevant opinions (that is, who to count as an expert, and with what weight).