I welcome attempts to steelman parts of this book[1], or associated works, as I do think it is important for EA to understand and to engage with our critics.
I think “steelmanning” a view is not something we should generally strive to do, and I wish people would stop treating the inclination to “steelman” an opponent as a sign of epistemic virtue. “Steelmanning” is a pretty poor heuristic for many reasons, including that (1) it lacks the resources to distinguish views that may deserve to be steelmanned from views that ought to be dismissed as nonsense; (2) it is an instance of countering a potential bias in one direction with a bias in the opposite direction, which will often result in excessive, insufficient, or unnecessary correction; and (3) it often impedes understanding and communication, since the “steelmanned” version of a view may bear little resemblance to what its proponent intended.
As superior alternatives to “steelmanning”, I would suggest trying to pass the ideological Turing test, writing a hypothetical apostasy, and being willing to turn disagreements into bets (a practice which, to paraphrase Dr Johnson, “concentrates the mind wonderfully”).
I disagree with this take, and fortunately now have a post to link to. I think steelmanning is a fine response to this situation.
I think your (3) is the one I spend the most time digging into in the post, and I feel quite confident it is not a good reason not to steelman.
Re: (1) and (2), I agree I’m not that bullish on getting a bunch of value from this book, but it looks like a bunch of people have already gotten value from its theme of excessive focus on measurability. And generally I want to see more constructive engagement with criticism, and don’t think “eh, low prior on it working” is a good critique of a good mental move.
Thanks. I hadn’t seen that post, nor most of the arguments against steelmanning that Rob Bensinger mentions. I thought I was expressing a less popular view than now seems to me to be the case. I found it particularly interesting to read that Holden Karnofsky finds it unsatisfying to engage with “steelmanned” versions of his views.
I agree with you that steelmanning in the context of a discussion with others or of interpreting the views of others is importantly different from steelmanning in your own inner monologue, and I think the latter may be justified in some cases. Specifically, I think steelmanning can indeed be useful as a heuristic device for uncovering relevant considerations for or against some view as part of a brainstorming session. This seems pretty different from how steelmanning is typically applied, though.
I think steelmanning is bad for understanding and engaging with your critics, but is still useful for engaging with criticism, and for challenging and refining your own ideas.
We ought to have a new word, besides “steelmanning”, for “I think this idea is bad, but it made me think of another, much stronger idea that sounds similar, and I want to look at that idea now and ignore the first idea and probably whoever was advocating it”.