Avoiding Infocalypse: How a decline in epistemic competence makes catastrophic risks inevitable — and what EAs can do about it
This would be a shortened and modified version of a talk I gave at Cambridge University, at the Leverhulme Centre for the Future of Intelligence and the Centre for the Study of Existential Risk. A general-public version of many of these ideas can be found in this TEDx talk I gave in 2018 (ignore the title, which was not my choice).
Part 1: Framing the underlying problem
Describe what is meant by epistemic competence (the ability and motivation of individuals, organizations, governments, etc. to effectively make sense of the world). Illustrate how it is declining, and why that decline is likely to worsen.
Part 2: Connect to catastrophic risks
Describe how lower epistemic competence makes crucial coordination extremely difficult, rendering global coordination on catastrophic risks increasingly unlikely. In addition, lower epistemic competence makes catastrophic forcing functions more likely and individual mitigation steps less likely.
Part 3: Exploring mitigations
Discuss what can be done, and show that many of these problems are related to other, better-understood EA cause areas (e.g. the connection between synthetic media and AGI).
I would be interested in a late session. My goal is to more broadly circulate these concerns in the EA community, which I have been adjacent to for many years (e.g. this podcast episode I did with Julia Galef) but never deeply engaged.
------
More about my work here for context: aviv.me, twitter.com/metaviv.