LW4EA: Humans are not automatically strategic
Link post
Written by LW user AnnaSalamon.
This link post is part of LessWrong for EA, a LessWrong repost & low-commitment discussion group (inspired by this comment). Each week I will revive an EA-relevant post from the LessWrong Archives, more or less at random, from a list I started from the comments here (additional suggestions welcome via PM or in the comments).
Please feel free to:
Discuss in the comments
Subscribe to the LessWrong for EA tag to be notified of future posts
Tag other LessWrong reposts with LessWrong for EA.
Initially, I talked about hosting a Zoom discussion for those who were interested, but I think it’s a bit more than I can take on right now (not so low-commitment). If anyone wants to organize one, comment or PM me, and I will be happy to coordinate for future posts.