Apologies, I don’t mean to imply that EA is unique in getting things wrong / being bad at steelmanning. Agree that the “and everyone else” part is important for clarity.
I think whether steelmanning makes sense depends on your immediate goal when reading things.
If the immediate goal is to improve the accuracy of your beliefs and work out how you can have more impact, then I think steelmanning makes sense.
If the immediate goal is to offer useful feedback to the author and better understand the author’s view, steelmanning isn’t a good idea.
There is a place for both of these goals, and importantly the second goal can be a means to achieving the first goal, but generally I think it makes sense for EAs to prioritise the first goal over the second.
Thanks, I think this is an excellent response and I agree both are important goals.
I’m curious to learn more about why you think that steelmanning is good for improving one’s beliefs/impact. It seems to me that this would only be true if you believe yourself to be much more likely to be correct than the author of a post. Otherwise, trying to understand their original argument seems better than trying to steelman it.
I could see that perhaps you should try to do both (i.e., engage with both the author’s literal intent and whether they are directionally correct)?
[EDIT: I’m particularly curious because my current understanding seems to imply that steelmanning like this would be hubristic, and probably that’s not what you’re going for. So almost certainly I’m missing some piece of what you’re saying!]