
Jordan Arel

Karma: 561

I have been on a mission to do as much good as possible since I was quite young, and at around age 13 I decided to prioritize reducing X-risk and improving the long-term future. Toward this end, growing up I studied philosophy, psychology, social entrepreneurship, business, economics, the history of information technology, and futurism.

A few years ago I wrote a book draft I was calling “Ways to Save The World” or “Paths to Utopia” which imagined broad innovative strategies for preventing existential risk and improving the long-term future.

I discovered Effective Altruism in January 2022, while preparing to start a Master’s degree in Social Entrepreneurship at the University of Southern California. After a deep dive into EA and rationality, I decided to take a closer look at the possibility of AI-caused X-risk and lock-in, and moved to Berkeley to do longtermist research and community-building work.

I am now researching “Deep Reflection,” processes for determining how to get to our best achievable future, including interventions such as “The Long Reflection,” “Coherent Extrapolated Volition,” and “Good Reflective Governance.”

If we get primary cruxes right, secondary cruxes will be solved automatically

Jordan Arel · 14 Jan 2026 22:44 UTC
8 points
1 comment · 4 min read · EA link

If researchers shared their #1 idea daily, we’d navigate existential challenges far more effectively

Jordan Arel · 14 Jan 2026 6:25 UTC
11 points
1 comment · 2 min read · EA link

Shortlist of Viatopia Interventions

Jordan Arel · 31 Oct 2025 3:00 UTC
10 points
1 comment · 33 min read · EA link

Viatopia and Buy-In

Jordan Arel · 31 Oct 2025 2:59 UTC
7 points
0 comments · 19 min read · EA link

Why Viatopia is Important

Jordan Arel · 31 Oct 2025 2:59 UTC
5 points
0 comments · 20 min read · EA link