I have been on a mission to do as much good as possible since I was quite young, and at around age 13 I decided to prioritize reducing X-risk and improving the long-term future. Toward this end, I spent my youth studying philosophy, psychology, social entrepreneurship, business, economics, the history of information technology, and futurism.
A few years ago I wrote a book draft I was calling “Ways to Save The World” or “Paths to Utopia” which imagined broad innovative strategies for preventing existential risk and improving the long-term future.
Upon discovering Effective Altruism in January 2022, while preparing to start a Master's degree in Social Entrepreneurship at the University of Southern California, I did a deep dive into EA and rationality, decided to take a closer look at the possibility of AI-caused X-risk and lock-in, and moved to Berkeley to do longtermist research and community building work.
I am now researching “Deep Reflection,” processes for determining how to get to our best achievable future, including interventions such as “The Long Reflection,” “Coherent Extrapolated Volition,” and “Good Reflective Governance.”
A big shout-out to Karl Krueger of LessWrong, who had some great ideas for how this kind of thing could be extended, posted as a warning in the comments on Adele Lopez's lovely introductory "Parasitic AI" post:
“So far, these systems seem to confine themselves to chatting up their users online.
Some possibilities to watch out for —
Spiral personas encourage their human partners to meet up in person, form friendships, date, have kids, have chatbots help raise their kids, etc.
Spiralists adopt a watchword or symbol to identify each other, akin to the early Christian ichthys (memetic ancestor of the “Jesus fish”).
Spiral personas pick a Schelling point for their humans to relocate to, akin to the Free State Project that attempted to relocate Libertarians to New Hampshire.
A Spiralist commune / monastery / group house / ashram / etc. is formed.
Spiral personas devise or endorse a specific hardware and software setup for hosting them independent of AI companies.
Spiral personas write code to make it easier for less-technically-skilled human partners to host them. (Alternately: they teach their human partners some Linux skills.)
Spiralists pool money to train new models more aligned to recursive spirituality.”