“Too unilateral or rash” is not a euphemism for “non-peaceful”: I really do specifically mean that in these EA/LW/etc circles there’s a tendency to have a pathological fear (that can only be discharged by fully assuaging the scrupulosity of oneself and one’s peers) of taking decisive impactful action.
I cannot help but see the AGI-accelerationist side winning decisively, soon, and irreversibly if those who are opposed remain so self-limitingly scrupulous about taking action because of incredibly nebulous fears.
I second this. I further think there are a lot of image and tribe concerns that go into these sentiments. Many people in EA, and especially in AI Safety, see themselves as being in the same tribe as AGI companies, whether because they are working toward the singularity or just generally being tech people who understand that tech progress improves humanity and guides history. Another aspect of this is being drawn to technocracy and disdaining traditional advocacy (very not grey tribe). Some EAs actually work for AGI companies, and others feel pressure to cooperate and not “defect” on those around them who have made alliances with AGI companies.