Another way EAs' focus on AI could be bad is if EAs are accelerating AGI capabilities / shortening timelines far more than we're helping with alignment (or otherwise increasing the probability of good outcomes).