I worry that a naïve approach to complexity and pluralism is detrimental, but agree that this is important. As you said, “the complex web of impacts of research also need to be untangled. This is tricky, and needs to be done very carefully.”
I also think that you’re preaching to the choir, in an important sense. The people in EA working on existential risk reduction are aware of the complexity of these debates and discussions, while the average EA posting on the forum seems not to be. This is equivalent to the difference between climate experts’ views and those of the lay public.
To explain the example more, I think that most people’s view of climate risk isn’t that it destabilizes complex systems and may contribute to risk, understood broadly, in unpredictable ways. Their view is that it’s bad, and we need to stop it, and that worrying about other things isn’t productive because we need to do something about the bad thing now. But this leads to approaches that could easily contribute to risks rather than mitigate them—a more fragile electrical grid, or, as you cited from Tang and Kemp, more reliance on mitigations like geoengineering that are poorly understood and build in new systemic risks of failure.
Of course, popular science books don’t necessarily go into the details, or when read casually leave the lay public with an at least somewhat misleading view—but one that pushes in the direction of supporting actions that the experts recommend. (Note that, as a general rule, people working in the climate space are not pushing for geoengineering; they are pushing for emissions reductions, work increasing resilience to impacts, and the like.) The equivalent in EA is skimming The Precipice and ignoring Toby’s footnotes, citations, and cautions. Those first starting to work on risk and policy, or writing EA Forum posts, often have this view, but I think it’s usually tempered fairly quickly via discussion. Unfortunately, many who see those discussions simply claim longtermism is getting everything wrong, while agreeing with us on both priorities and approaches.
So I agree that we need to appreciate more sophisticated approaches to risk and blend them with cause prioritization and actual consideration of what might work. I also strongly applaud your efforts to inject nuance and push in the right direction, appropriately, without ignoring the complexity. And yes, squaring the circle with effectiveness is a difficult question—but I think it’s one that is appreciated.