Great post! I agree most with the point that we should be clearer that things are still very, very uncertain. I think there are several factors that push against this:
- The EA community and discourse doesn’t have any formal structure for propagating ideas, unlike academia. You are likely to hear about something if it’s already popular. Critical or new posts and ideas are unpopular by definition to begin with, so they fall by the wayside.
- The story for impact for many existing EA organizations often relies on a somewhat narrow worldview. It does seem correct to me that we should both be trying to figure out the truth and taking bets on worlds where we have a lot of important things to do right now. But it’s easy to mentally conflate “taking an important bet” and “being confident that this is what the world looks like”, both from inside and outside an organization. I personally try to pursue a mixed strategy, where I take some actions assuming a particular worldview where I have a lot of leverage now, and some actions trying to get at the truth. But it’s kind of a weird mental state to hold, and I assume most EAs don’t have enough career flexibility to do this.
I do think that the closer you get to people doing direct work, the more people are skeptical and consider alternative views. I think the kind of deference you talk about in this post is much more common among people who are less involved with the movement.
That being said, it’s not great that the ideas that newcomers and people who aren’t in the innermost circles see are not the best representatives of the truth or of the amount of uncertainty involved. I’m interested in trying to think of ways to fix that—like I said, I think it’s hard because there are lots of different channels and no formal mechanism for what ideas “the movement” is exposed to. Without formal mechanisms, it seems hard to leave an equilibrium where a small number of reputable people or old but popular works of literature have disproportionate influence.
At the same time, I really appreciate a lot of recent attempts by people to express uncertainty more publicly—see e.g. Ben’s podcast, Will’s talk, 80K’s recent posts, and my talk and interviews. For better or for worse, it does seem like a small number of individuals have disproportionate influence over the discourse, and as such I think they do have some responsibility to convey uncertainty in a thoughtful way.
Such a great comment. I agree with most of what you say; thank you for writing this up. I’m curious about a formal mechanism for communal belief formation and dissemination. What could this look like? Would it be net good compared to the baseline?