Hi Ollie, thanks for sharing your thoughts here. A lot has already been covered in the comments, so perhaps some unexplored points here:
Most moral and political ideologies at some point imply power-seeking? Highly revolutionary leftist ideologies imply, well… revolution. Conservative ideologies at some level imply gaining enough power to conserve the parts of society that are valued. After some reflection, I agree that I don't think longtermism necessarily implies power-seeking, or at least I'm not sure it stands apart from other political theories in this regard.
So I think what does seem to be causing this is not the philosophy but the practical, on-the-ground situation: a) longtermism growing rapidly in prominence within EA and gaining funding[1], and b) EA increasing in visibility and influence in the wider world (e.g. Toby Ord talking to the UN, EA being associated with the recent wave of concern about AIXR policy, the significant amount of promotion for WWOTF). From the outside, making a naive extrapolation[2], it would appear that longtermism is on its way to becoming a lot more influential in the near future.
The best examples of where this might actually be true come from some highly speculative/ambitious/dangerous ideas present in Bay-Area Rationalist orgs, though I think these happened before "longtermism" was actually a thing:[3]
There are suggestions that Leverage Research had a half-baked grand plan to parlay their hoped-for breakthroughs in human behaviour/psychology into taking over the US Government (source1, source2 - ctrl+F "take over")
There are also hints at a plan from MIRI involving a "pivotal act": a plan that seems to cash out as getting researchers to develop a somewhat-aligned proto-AGI and using it to take over the world and prevent any unaligned AGIs from being built (source1, source2 - not as sure what the truth is here, but I think there is some smoke)
Finally, I think a big thing that contributes to this idea of EA and longtermism as inherently and/or dangerously power-seeking is framing by its ideological enemies. For instance, I don't think Crary or Torres[4] are coming at this from a perspective of "mistake theory" - it's "conflict theory" all the way for them. I don't think they're misreading longtermist works by accident; I think they view longtermism as inherently dangerous because of their ideological differences, and they're shouting loudly about it. This is a lot of people's first impression of longtermism/EA, and unfortunately I think it often sticks. Critically, prominent longtermists seem to have ceded this ground to their critics and don't prominently push back against it, which I think is a mistaken strategy.
Would be really interested to hear what you (or others) think about this.
[1] though again, to the best of my knowledge GH&D is still the number 1 cause area by funding

[2] especially pre-FTX collapse

[3] IMPORTANT NOTE: I don't claim to know fully what the truth behind these claims is, but they stuck in my mind while thinking about the post. I'm happy to amend/retract if provided clearer evidence by those in the know. I don't think it was likely any of these plans had any chance of succeeding, but it still points to a concerning background trend if true.

[4] Especially Torres, who seems to be fighting a personal holy war against longtermism. I remain perplexed about what happened here, since he seems to have been an active EA concerned about xRisk for some time.