I kinda sorta answered Q2 above (I don’t really have anything to add to it).
Q3: I’m not too clear on this myself. I’m just an object-level AI alignment researcher :P
Q4: I broadly agree this is a problem, though I think this:

> Before PS and EA/XR even resolve our debate, the car might be run off the road—either as an accident caused by fighting groups, or on purpose.
seems pretty unlikely to me, where I’m interpreting it as “civilization stops making progress and permanently regresses to the lower quality of life of the past”.
I haven’t thought about it much, but my immediate reaction is that it seems a lot harder to influence the world in a good way through the public, and so other actions seem better. That being said, you could search for “raising the sanity waterline” (probably more so on LessWrong than here) for some discussion of approaches to this sort of social progress (though it isn’t about educating people about the value of progress in particular).