Yes! Thanks for this, Scott. X-risk prevention is a cause that both neartermists and longtermists can get behind. I think it should be reinstated as a top-level EA cause area in its own right, distinct from longtermism (as I’ve said here).
If you’re under ~50, unaligned AI might kill you and everyone you know. Not your great-great-(...)-great-grandchildren in the year 30,000 AD. Not even your children. You and everyone you know.
It’s a sobering thought. See also: AGI x-risk timelines: 10% chance (by year X) estimates should be the headline, not 50%.