What about simply growing the EA movement? That clearly seems like a more efficient way to address x-risk, and something where funding could be used more readily.
That is plausible. But “definitely” definitely wouldn’t be called for when comparing Yang with Grow EA. How many EA people who could be sold on an AI PhD do you think could be recruited with $20 million?
I meant that it’s definitely more efficient to grow the EA movement than to grow Yang’s constituency. That’s how it seems to me, at least. It takes millions of people to nominate a candidate.
Well, there are >100 million people who have to join some constituency (i.e., pick a candidate), whereas potential EA recruits aren’t otherwise picking between a small set of cults, er, philosophical movements. Also, AI-PhD-ready people are in much shorter supply than, e.g., Iowans, and they’d be giving up much, much more than someone just casting a vote for Andrew Yang.
There are numerous minor, subtle ways that EAs reduce AI risk. Small in comparison to a research career, but large in comparison to voting. (Voting can actually be one of them.)