I think that if we’re at the most influential point in history, “EA community building” doesn’t make much sense. As others have said, it would probably make more sense to be shouting about why we’re at the most influential point in history, i.e. to do “x-risk community building”, or of course to do more direct x-risk work.
I suspect we’d also do less global priorities research (although perhaps we don’t do that much as it is). If you think we’re at the most influential time, you probably have a good reason for thinking so (e.g. that x-risk is abnormally high), which then informs what we should do (reduce it). So you wouldn’t need much more global priorities research, though you would still need more granular research into how to reduce x-risk.
More is also being said about the possibility of investing financially for the future, which isn’t a great idea if we’re at the most influential time in history.
I agree the movement is mostly “hingy” in nature, though perhaps not to the same extent you do. 80,000 Hours is an influential organisation that promotes EA community building, global priorities research, and, to some extent, investing for the future.
I’m not sure I agree with that. It seems to me that EA community building is channelling quite a few people into direct existential risk reduction work.
My point is that you could engage in “x-risk community building”, which might get people working on reducing x-risk more effectively than “EA community building” does.
There are a bunch of considerations affecting that, including that we already do EA community building and that big switches tend to be costly. However that pans out in aggregate, I think “doesn’t make much sense” is an overstatement.
I never actually said we should switch, but if we had known from the start that “oh wow, we live at the most influential time ever because x-risk is so high”, we probably would have created an x-risk community, not an EA one.
And to be clear, I’m not sure where I personally come out in the hinginess debate. In fact, I’m probably more sympathetic than most to Will’s view that we aren’t currently at the most influential time.
My feeling is that it went a bit like this: people who wanted to attack global poverty efficiently decided to call themselves effective altruists, and then a bunch of LessWrongers came over and convinced (a lot of) them that “hey, going extinct is an even bigger deal”, but the name still stuck, because names are sticky things.
That also depends on how wide you consider a “point” to be. A lot of longtermists describe this as the “most important century”, not the most important year or even decade. Considering that EA as a whole is less than twenty years old, investing in EA and global priorities research might still make sense, even under a simplified model where 100% of the impact EA will ever have occurs by 2100, after which we don’t care any more. Given a standard explore/exploit algorithm, we should spend around 37% of the time horizon exploring, so if we assume EA started around 2005, we should still be exploring until 2040 or so before pivoting and going all-in on the best things we’ve found.
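The 37% figure comes from the classic secretary-problem threshold of 1/e ≈ 0.368. A minimal sketch of that arithmetic, assuming the 2005–2100 window from the comment (the function name and years are illustrative, not anything canonical):

```python
import math

def explore_cutoff(start_year: int, end_year: int) -> int:
    """Year at which to stop exploring, per the 1/e (secretary-problem) rule."""
    horizon = end_year - start_year       # total years in the impact window
    explore_years = horizon / math.e      # spend ~36.8% of the horizon exploring
    return start_year + round(explore_years)

# Comment's assumptions: EA starts ~2005, all impact occurs by 2100.
print(explore_cutoff(2005, 2100))  # → 2040
```

With a 95-year horizon, 95/e ≈ 35 years of exploration, which lands the pivot at roughly 2040, matching the comment's estimate.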