Maybe the only way to really push for x-safety is with If Anyone Builds It-style "you too should believe in and seek to stop the impending singularity" outreach. That just feels like such a tough sell, even if people would accept the x-safety case conditional on believing in the singularity. Agh. I'm conflicted here. No idea.
I wish I had more strategic clarity here.
I believe there was a recent UN General Assembly session where world leaders were literally asking around for ideas for AI red lines.
I would be surprised if anything serious comes out of this immediately, but I really like this framing because it normalises the idea that we should have red lines.
Thanks for the detailed comments.