I’m curious about the pathways you have in mind. I may have missed something here.
I think it’s basically benefits flowing, in some form, through “the people working on the powerful technology spend time with people seriously concerned about large-scale risks”. From a very zoomed-out perspective it just seems obvious that we should be more optimistic about worlds where that’s happening than worlds where it’s not (which doesn’t mean that necessarily remains true when we zoom in, but it sure affects my priors).
If I try to tell more concrete stories, they include things like “the safety-concerned people have better situational awareness and may make better plans later”, and “when systems start showing troubling indicators, that’s culturally taken much more seriously”. (Ok, I’m not going super concrete in my stories here, but that’s because I don’t want to anchor things on a particular narrow pathway.)
Thanks for clarifying.