I find it annoying when people argue for a policy for reasons other than the ones that truly motivate them, even though doing so can be very instrumentally useful and I don’t think it’s necessarily morally wrong.
Declining to share an opinion on a topic (eg Kelsey not telling the public she was preparing for a global pandemic) seems completely fine? Unless I’m missing some context and she was writing about COVID in a way that contradicted her actual beliefs at the time? I agree it would have been better for her to share her beliefs, but there is no rule that people, even public intellectuals, need to share every thought!
In the EA space, people like Will sharing every thought they have could in some cases have negative effects, because people in EA have a history of deferring (and this gets worse the more Will talks to EAs about EA).
If Will had a weekly podcast where he was like “Does lobster welfare matter? Ehh, probably not” and “I’d love to see an EA working on every nuclear submarine”, that wouldn’t actually be a good thing, even if he believed both of those points. I predict a small group of EAs would love the podcast, defer to it way too much, and adopt Will’s opinions wholesale.
(I notice that all three of your examples are situations where you wanted people to adopt the public intellectual’s opinions wholesale)