In the section on the sources of short-termist biases, John and MacAskill write:

Cognitive biases include actors’ tendencies to respond more strongly to vivid risks than to information acquired from abstract, general social scientific trends, as well as over-optimism about their ability to control and eliminate risks under situations of uncertainty. The attention that political actors pay to the future and to the nearby past are asymmetric because voters and many other political actors “can readily observe past economic performance but have little information about future conditions.” Thus, to economize on cognitive effort, many political actors forego the task of making predictions about the future and choose policies which have worked in the recent past. [emphasis mine]
What John and MacAskill are describing here doesn’t sound like a bias. It sounds like an actual political philosophy, one which people like Matt Ridley or Steven Pinker would probably endorse. Many reasonable people believe that we should extrapolate from past performance rather than from “abstract, general social scientific trends”, and that we should be more optimistic about our ability to deal with risks in due time rather than rely on hastily implemented policies.

The people who believe this might be wrong, but you have to actually argue that they’re mistaken instead of just dismissing their worldview as a cognitive bias. Arguably, their philosophy was a useful corrective to past issues involving long-term trends. The people who responded to the concrete overpopulation scare of the 20th century with vague optimism about our ability to feed more people were correct, whereas the people who had “abstract social scientific” reasons for expecting resources to run out were wrong, and disastrously so, given the mass sterilization and population control programs they inspired in India and China. (Obviously, I don’t think John and MacAskill would endorse those atrocities; my point is simply that what they call a cognitive bias would have prevented all that unnecessary suffering.)

To zoom out a bit, we should be careful not to implement longtermist reform in a way that dismisses the optimistic philosophy of governance which places greater weight on past experience.
I disagree with this, for two reasons:
I disagree with the presumption that they were rational in being optimistic. While there has been real progress in history (though only if we count humans), I don’t agree that we should therefore expect a bright future. I would argue that technological x-risk has wiped out all of the expected value of the future. Under a longtermist view, if the future is positive, then x-risk reduction is our main priority; if the expected value of the future is negative, then moral-circle expansion is the most important thing to do.
I disagree with the inference that, because the population bomb didn’t happen, the sterilizations were wrong. This is a classic case of hindsight bias, and there was no mitigation against that bias here. More precisely, you need to argue that the population bomb couldn’t have happened, or was unlikely to happen, for your argument to go through. A longer comment by EricHerboso summarizes the miracles that were necessary to defuse the population bomb.
There was good reason back then to believe that overpopulation was a real problem whose time would come relatively soon. If it weren’t for technological breakthroughs with dwarf wheat and the IR8 rice variety, spearheaded by Norman Borlaug and others, our population would by now have far outstripped our ability to grow food: the so-called Malthusian trap.
Using overpopulation as an example here would be akin to using global climate change as an example today, if it turned out that a technological breakthrough in the next 5-10 years completely obviated the need to be careful about greenhouse gas emissions in the future.
Because of this, I don’t think overpopulation makes for the best example of the point you’re trying to make here.
I disagree with the presumption that they were rational in being optimistic. While there has been real progress in history (though only if we count humans), I don’t agree that we should therefore expect a bright future. I would argue that technological x-risk has wiped out all of the expected value of the future. Under a longtermist view, if the future is positive, then x-risk reduction is our main priority; if the expected value of the future is negative, then moral-circle expansion is the most important thing to do.
Or just making society wealthier overall (aka maximizing economic growth) so we can enjoy these last few hundred years more. Nonetheless, I don’t share your pessimism.
I disagree with the inference that, because the population bomb didn’t happen, the sterilizations were wrong. This is a classic case of hindsight bias, and there was no mitigation against that bias here. More precisely, you need to argue that the population bomb couldn’t have happened, or was unlikely to happen, for your argument to go through.
But my point is precisely that we couldn’t have known in advance what those solutions would look like, because knowledge growth is unpredictable. And given that we do end up solving many of these seemingly devastating problems, we should update in favor of a vague optimism about our future capability to deal with them. I give the example of peak-oil worries later in this post:
In the 70s, it was a common belief among the relevant technical experts that we would hit peak oil by the 90s. They could not have anticipated the new technologies that made more oil reserves accessible to us. If there had been a longtermist research institute within the government at that time, it would have recommended that we stock up on foreign oil, and the end result would have been unaffordable transportation and heating for the poorest people on the planet.