This post got me considering opening a Facebook group called “Against World Destruction Israel” or something like that (as a sibling to “Effective Altruism Israel”)
Most of our material would be around prioritizing x-risks, because I think many people already care about x-risk, they’re just focusing on the wrong ones.
As a secondary but still major point, “here are things you can do to help”.
What do you (or others) think? (This is a rough idea after 2 minutes of thinking)
This sounds like a great idea. Maybe the answer to the question of whether to pitch longtermism or x-risk is "both"?
I wouldn’t discuss longtermism at all; it’s complicated and unintuitive, or more formally: at a far inferential distance from a huge number of people, compared to just x-risk. And most of the actionable conclusions are the same anyway, no?
That didn’t come off as clearly as I had hoped. What I meant was that leading with x-risk will resonate with some people and leading with longtermism will resonate with others. It seems worth having separate groups, one focused on each, to appeal to both types of people.