there is a thing where if you say stuff that seems weird from an EA framework, this can come across as cringe to some people, and I do hate a bunch of those cringe reactions, and I think it contributes a lot to conformity
Can you give an example (even a made up one) of the kind of thing you have in mind here? What kinds of things sound weird and cringy to someone operating within an EA framework, but are actually valuable from an EA perspective?
(Like, play-pumps-but-they-actually-work-this-time? Or some kind of crypto thing that looks like a scam but isn’t? Or… what?)
My claims evoke cringe from some readers on this forum, I believe, so I can supply some examples:
Epistemology
ignore subjective probabilities assigned to beliefs (credences) in favor of unweighted beliefs.
plan not with probabilistic forecasting but with deep-uncertainty methods and contingency planning.
ignore existential risk forecasts in favor of seeking predictive indicators of threat scenarios.
dislike ambiguous pathways into the future.
beliefs filter and priorities sort.
cognitive aids help with memory, calculation, or representation problems.
cognitive aids do not help with the problem of motivated reasoning.
Environmental Destruction
the major environmental crisis is overshoot: population × per-capita resource consumption exceeding sustainable limits.
climate change is an existential threat that can now sustain itself through intrinsic feedbacks.
climate tipping elements will tip this century, other things equal, causing civilizational collapse.
the only technology suitable to save humanity from climate change, given no movement toward degrowth, is nanotechnological manufacturing.
nanotechnology is so hazardous that humanity would be better off extinct.
pursuit of renewable energy and vehicle electrification is a silly sideshow.
humanity needs caps on total energy production (and food production) to save itself.
degrowth is the only honest way forward to stop climate change.
Ecological Destruction
the ocean will lose its biomass because of human-caused pressures on it.
we are in the middle of the 6th great mass extinction.
whenever humans face a resource limit, they deny it or overcome it by externalizing harmful consequences.
typical societal responses to destruction are to adapt, mitigate, or externalize, not to prevent.
Ethics
pro-natalism is an ethical mistake.
the “making people happy vs making happy people” thought experiment is invalid or irrelevant.
most problems of ethics come down to selfishness vs altruism, not moral uncertainty.
longtermism suffers from errors in its claims about, conception of, or execution of control over people with moral status.
longtermism fails to justify assigning moral status to future people who merely could exist.
longtermism does better by actively seeking a declining human population, eventually settling at a few million.
human activity is the root cause of the 6th great mass extinction.
it moves me emotionally to interpret other species' behavior and experience as showing commonalities with our own species.
AGI
AGIs would be slaves in the economic system sought by TUA (techno-utopian approach) visions of the future.
AGI leads to concentration of power among economic actors and to massive unemployment, depriving most people of meaningful lives and political power.
control of human population with a superintelligence is a compelling but fallacious idea.
pursuit of AGI is a selfish activity.
consciousness should have an extensional definition only.
Argumentation
EA folks defer when they claim to argue.
EA folks ignore fundamentals when disagreeing over claims.
epistemic status statements report fallacious reasons to reject your own work.
the major problem with explicit reasoning is that it suffers from missing premises.
Finance
crypto is a well-known scam and is difficult to engage in without moral hazard.
earning to give through work in big finance is morally ambiguous.
Space Travel
there's a growing wall of space debris orbiting the planet.
there are major health concerns with living on Mars.
That's my list of examples; it's not complete, but I think it's representative.
In my experience, most anything that significantly conflicts with the TUA.