What do you think should be neoliberals' approach to existential risks to humanity? The two x-risks that the EA community focuses on the most are:
Advanced artificial intelligence
Genetically engineered pandemics
How would you trade off between mitigating x-risks and increasing economic growth?
Another longtermist question: What do you think about Stubborn Attachments?
I'm somewhat of a skeptic on the dangers of AI, so I may not be the best person to address that point. On pandemics, I think it's likely that Gain of Function research should be heavily curtailed, but I don't think that's a core neoliberal value or anything, just my personal opinion.
More broadly, I don't really think of x-risk and economic growth as things that necessarily have to be traded off against each other. I think that in many important ways a more prosperous society has more stability and less x-risk. One important area to worry about is the possibility of nuclear conflict. To me it seems pretty clear that the more that every country in the world can become a rich country, the more stable international geopolitics will be and the lower the risk of a catastrophic war will be.
Morally, I think any attempt to slow growth in the name of x-risk had better be really damn sure that the x-risk is truly intolerable, because in practical terms "trading off growth" means "potentially impoverishing millions or billions of people". Purposefully trying to slow economic growth seems to me to be a moral evil in almost all cases, barring some exceptional edge cases, simply because economic growth is so good (and this is especially true in developing countries).
I think Tyler Cowen's general idea that economic growth is extremely important is true and underrated in our political discourse. I'm not always in agreement with the prescriptions that Cowen believes would actually achieve that growth. There's an exciting revival in the general big-tent-neoliberal world of a "pro-growth progressive" attitude. Sometimes you hear this called a "supply-side liberal". Ezra Klein wrote about this in the NYTimes, but I would also recommend the general works of Sam Hammond at the Niskanen Center, Matt Yglesias, and Noah Smith for other versions of this.
To be clear, Tyler Cowen believes that growth is more important than averting x-risk as a logical consequence of believing that human extinction on the timescale of 700 years or so is inevitable (H/T AppliedDivinityStudies, pg. 12).
I often see people citing Tyler Cowen on the moral imperative to pursue economic growth for our descendants while simultaneously claiming that existential risk is low but nonzero. Of course there's nothing wrong with agreeing with someone on one thing but not another, but in this case I feel like there's a premise missing: Cowen's case for prioritizing growth rests on extinction being inevitable, which is exactly what the "low but nonzero" framing rejects.