(Responding as an 80k team member, though I’m quite new)
I appreciate this take; I was until recently working at CEA, and was in a lot of ways very very glad that Zach Robinson was all in on general EA. It remains the case (as I see it) that, from a strategic and moral point of view, there’s a ton of value in EA in general. It says what’s true in a clear and inspiring way, a lot of people are looking for a worldview that makes sense, and there’s still a lot we don’t know about the future. (And, as you say, non-fanaticism and pluralistic elements have a lot to offer, and there are some lessons to be learned about this from the FTX era)
At the same time, when I look around the EA community, I want to see a set of institutions, organizations, funders and people that are live players, responding to the world as they see it, making sure they aren't missing the biggest thing currently happening (or, if, like 80k, one of their main jobs is communicating important things, that they aren't letting their audiences miss it). Most importantly, I want people to act on their beliefs (with appropriate incorporation of heuristics, rules of thumb, outside views, etc). And to the extent that 80k staff and leadership's beliefs changed with the new evidence, I'm excited for them to be acting on it.
I wasn't involved in this strategic pivot, but when I was considering whether to join, I was excited to see a certain kind of leaping to action in the organization.
It could definitely be a mistake even within this framework (by causing 80k not to appeal to parts of its potential audience), or empirically (on the size of AI risk, or the sizes of other problems), or long term (because of the damage it does to the EA community or its intellectual lifeblood / eating the seed corn). In the past I've worried that various parts of the community were jumping too fast into what's shiny and new, but 80k has been talking about this for more than a year, which is reassuring.
I think the 80k leadership have thoughts about all of these, but I agree that this blog post alone doesn’t fully make the case.
I think the right answer to these uncertainties is some combination of digging in and arguing about them (as you've started here — maybe there's a longer conversation to be had) and waiting to see how these bets turn out.
Anyway, I appreciate considerations like the ones you’ve laid out because I think they’ll help 80k figure out if it’s making a mistake (now or in the future), even though I’m currently really energized and excited by the strategic pivot.
Thanks @ChanaMessinger, I appreciate this comment, and think that the tone you've taken here is healthier than the original announcement. One well-written sentence of yours captures many of the important issues:
"It could definitely be a mistake even within this framework (by causing 80k to not appeal parts of its potential audience) or empirically (on size of AI risk, or sizes of other problems) or long term (because of the damage it does to the EA community or intellectual lifeblood / eating the seed corn)."
FWIW I think a clear mistake is the poor communication here: the most obvious and serious potential community impacts have been missed, and the tone is poor. If this had been presented in a way that showed the most serious potential downsides had been considered, I would both feel better about it and be more confident that 80k has done a deep SWOT analysis here, rather than the really basic framing of the post, which reads more like...
“AI risk is really bad and urgent let’s go all in”
This makes the decision seem not only insensitive but also poorly thought through, which I'm sure is not the case. I imagine the chief concerns of the commenters were discussed at the highest level.
I’m assuming there are comms people at 80k and it surprises me that this would slip through like this.
Thanks for the feedback here. I mostly want to just echo Niel's reply, which basically says what I would have wanted to. But I also want to add, for transparency/accountability's sake, that I reviewed this post before we published it with the aim of helping it communicate the shift well. I focused mostly on helping it communicate clearly and succinctly, which I do think is really important, but I think your feedback makes sense, and I wish that I'd also done more to help it demonstrate the thought we've put into the tradeoffs involved and awareness of the costs. For what it's worth, we don't have dedicated comms staff at 80k — helping with comms is currently part of my role, which is to lead our web programme.