Thanks for your work, Claire. I am really grateful.
I feel frustrated that many people learned a new concept, “longtermism”, which is widely misunderstood and associated with EA, yet now even many EAs don’t think the concept is that high a priority. It feels like an error on our part, one that could have been predicted beforehand.
I am grateful for all the hard work that went into popularising the concept, and I think weak longtermism is correct. But I dunno, it seems like an oops moment that it would be helpful for someone to acknowledge.
I’m not sure I agree with:
“now even many EAs don’t think this concept is that high priority”
Could be true if you mean “high priority to communicate for community growth purposes”, but I still think it’s fairly fundamental to a lot of thinking about prioritisation (e.g. a large part of Open Phil’s spending is classified as longtermist).
I agree that there have probably been costly and avoidable misunderstandings.