As somebody currently involved in a university group, I am extremely sympathetic towards the EA Munich group, even though they might have made a mistake here. There is a huge amount of pressure to avoid controversial topics/speakers, and it seems like they did not have a lot of time to make a decision in light of new evidence. I have hosted Peter Singer for multiple events (and am glad to have done so), but it has led to multiple uncomfortable confrontations that the average student group (e.g., knitting society) just does not have to deal with.
This highlights why Larks’ post is so important. When groups face decisions about whether to proceed with or cancel an event, having an explicit framework for that decision-making would be incredibly helpful. I’m very glad to see Julia Wise/CEA engage with this post, as I think it would be helpful for both CEA and local groups to decide at the beginning of term, before inviting speakers, what qualifies people to be speakers.
The main (in my opinion, reasonable) principles elucidated in this post, as I read it, are:
1. Openness to unusual ideas is one of the guiding principles of Effective Altruism; groups should uphold and promote this.
2. Fundamental cause research that challenges the movement’s existing ideas is important; we should not punish people for engaging in it.
But it is also important to consider what *disqualifies* people from speaking.
The most critical thing to me would be a speaker’s history of promoting ideas in bad faith. (E.g., promoting ideas that have been clearly falsified by scientific evidence; deliberately falsifying data in order to push a specific agenda.) I am sure there are other factors that would also make sense to consider! It would be helpful for them to be spelled out somewhere.
In light of this and Robert Wiblin’s comment, I’m interested in whether unrepentant opponents of scientific replication should be considered beyond the pale in EA circles. It’s not a central problem in most people’s minds, but a) it’s uncontroversially bad in our circles and b) EAs have a stronger case than other groups for considering denial of truth very bad.
This is arguably not a hypothetical example (note that I do not have an opinion on the original research).
EDIT: Removed concrete examples since they might be a distraction.
I would actually be really interested in talking to someone like Baumeister at an event, or ideally someone a bit more careful. I do think I would be somewhat unhappy to see them given just a talk with Q&A, with no natural place to provide pushback and follow-up discussion, but if someone were to organize an event with Baumeister debating some EA with opinions on scientific methodology, I would love to attend that.
I think that’s roughly my position as well.
Same. Especially agree that the format of the event needs to be structured so that ideas are not presented as facts, but are instead open to (lots of public) criticism.