I feel subtweeted :p As far as I can tell, most of the wider world isn’t aware of the arguments for shorter timelines, and my pieces are aimed at them, rather than people already in the bubble.
That said, I do think there was a significant shortening of timelines from 2022 to 2024, and many people in EA should reassess whether their plans still make sense in light of that (e.g. general EA movement building looks less attractive relative to direct AI work compared to before).
Beyond that, I agree people shouldn’t be making month-to-month adjustments to their plans based on timelines, and should try to look for robust interventions.
I also agree many people should be on paths that build their leverage into the 2030s, even if there's a chance it's 'too late'. It's possible to get ~10x more leverage by investing in career capital / org building / movement building, and that can easily offset the chance of arriving too late. I'll try to get this message across in the new 80k AI guide.
Also agree that for strategy it's usually better to discuss the specific capabilities and specific transformative effects you're concerned about, rather than 'AGI' in general. (I wrote about AGI because it's the most commonly used term outside of EA, and I was aiming to reach new people.)
Honestly, I wasn’t thinking of you! People planning their individual careers is one of the better reasons to engage with timelines imo. It’s more the selection of interventions where I think the conversation is moot, not where and how individuals can connect to those interventions.
The hypothetical example of people abandoning projects that culminate in 2029 was actually inspired by PauseAI: there is a contingent of people who think protesting and irl organizing take too long and that we should just be trying to go viral on social media. I think the irl protests and community are what make PauseAI a real force, and we have greater impact, including by drawing social media attention, all along that path, not just once our protests are big.
That said, I do see a lot of people making the mistakes I mentioned about their career paths. I've had a number of people looking for career advice through PauseAI say things like, "well, obviously getting a PhD is ruled out", as if there is nothing they can do to have impact until they have the PhD in hand. I think being a PhD student can be a great source of authority and a flexible job (often with at least some income) where you have time to organize a willing population of students! (That's what I did with EA at Harvard.) The mistake here isn't even really a timelines issue; it's a failure to model the impact distribution along a career path. Seems like you've been covering this:
>I also agree many people should be on paths that build their leverage into the 2030s, even if there's a chance it's 'too late'. It's possible to get ~10x more leverage by investing in career capital / org building / movement building, and that can easily offset the chance of arriving too late. I'll try to get this message across in the new 80k AI guide.