CEA’s Approach to Moderation and Content Curation
I recently heard that some people want CEA to publicly state how we approach content curation, and justify those decisions. We already do this, so I’m writing this linkpost to draw attention to this page on our website.[1]
I published this page in October, and I think it broadly represents how CEA tried to approach these topics during my tenure (Jan 2019 - Feb 2023).[2] I also think it’s still broadly representative of how CEA is currently doing moderation/content, but I no longer make those decisions.
Future CEA leadership might take things in a different direction, and I don’t want this post to tie their hands.
This is a really tricky topic, and I’m not sure if I handled this well while I was running CEA.
Some things that seem difficult about it:
- Everyone has different views about how to approach this, and I don’t think that there’s a version that will please everyone.
- If you’re doing moderation or content curation for something with “effective altruism” in the name, it seems like you should try to represent what “effective altruism” thinks. But this concept doesn’t really seem coherent (“effective altruism” doesn’t think anything—there are a bunch of people who identify to greater or lesser degrees with effective altruism, and they all think different things). So it’s hard to come up with good heuristics/principles for representing EA.
  - Around the same time as we published the content curation page, we also published a page setting out what we thought of as the core principles of EA.
- Even if you try to approach this question in some principled way, you’re going to be biased by your own views. If you ask other people for advice and input, then by default it’s going to be biased towards the views of your network (who probably have similar biases). You can try to compensate for this, but you probably won’t fully do so.
  - CEA staff tend to be pretty into “EA principles”, but I think we’re often into longtermism/X-risk stuff too, so you should expect us to be a bit biased towards this, though we try to get input from people who disagree with us.
  - But when I put together this document and our EA Handbook, I did try to get input from a variety of people, especially people with a different cause prioritization.
- If you get input and content from a bunch of different viewpoints (which seems kinda good from one point of view), you risk creating content that feels garbled or incoherent.[3]
Thoughts and feedback are welcome, but note that I no longer have control over these things, and CEA is biased against making large changes while we search for new leadership. But I do have some influence in my role as Interim Advisor, and we might be able to make tweaks. Obviously, new leadership might take things in a substantially different direction.
[1] I think it’s my fault for not publicizing this earlier: it’s reasonable for people not to know about every page on our website!
[2] Though we made some mistakes around this in the earlier part of that tenure.
[3] For the intro sequence, I nevertheless tried to get content from a bunch of different viewpoints, and I think that having a variety of voices can also be pretty good for showing the variety of thinking styles/attitudes within EA—different voices might resonate with different people.