So EA isn’t “just longtermism,” but maybe it’s “a lot of longtermism”? And maybe it’s moving towards becoming “just longtermism”?
EA has definitely been moving towards “a lot of longtermism”.
The OP has already provided some evidence of this with funding data. Another signal that this is happening is the way 80,000 Hours has been changing their career guide. Their earlier guide opened with Seligman's factors from positive psychology and made the very simple claim that if you want a satisfying career, positive psychology says your life must involve helping others in a meaningful way. Then one fine day they made the key ideas page, and somehow longtermism has become the "foundation" of their advice. When the EA org that is most people's first contact with EA makes longtermism the "foundation" of its recommendations, that strongly suggests EA now wants to move towards "a lot of longtermism".
What if EA was just longtermism? Would that be bad? Should EA just be longtermism?
Yes, it would be bad if EA was just longtermism.
I believe that striving to make the simplest honest case for a cause area is not a choice but an intellectual obligation. It is irresponsible to put forth unnecessarily complicated ideas and chase away people who might otherwise have contributed to that cause. I think longtermism is currently an unnecessarily complicated way of making the case for most of the important EA cause areas. My thoughts here come from this excellent post.
I am willing to concede my stance on this second question if you can argue convincingly that:
striving to make the simplest honest case for a cause area is not an intellectual obligation we need to hold; or
there are some cause areas where longtermism is actually the simplest argument one can make to convince people to work on them. Of course, that would mean EA could then only work on the cause areas where this is true, but that might not be so bad if they are still highly impactful.
Some hedging:
I still believe longtermism is an important idea. If you are a philosopher, I would highly encourage you to work on it. But I just don't think it is as important for EA as EA orgs are currently making it seem. This is especially true of strong longtermism.
I also think this could all be a case of confusing nomenclature. Here is a post discussing that confusion. Maybe those who work on global health & development are also on the longtermism spectrum, just not as 'gaga' about it as strong longtermists seem to be. After all, it's not as though expected-value bets on the future (maybe not the far future) can't be made in a global health & development intervention! If we clarify the nomenclature here, then "longtermism" (or whatever the clarified nomenclature would call it) could become a clearer and simpler way to convince people to contribute to a cause area. In that case I would still be fine with EA becoming "longtermism" (in the clarified sense).