I think it’s fairly unsurprising that EA is mostly consequentialists or utilitarians. But often it goes way beyond that, into very specific niches that are not at all a requirement for trying to “do good effectively”.

For example, a disproportionate number of people here are capital-R “Rationalists”, referring to the subculture built around fans of the “Sequences”, the blog posts on LessWrong written by Yudkowsky. I think this subgroup in particular suffers from “not invented here” syndrome, where philosophical ideas that haven’t been translated into rationalist jargon are not engaged with seriously.
I think the note on “not invented here” syndrome is actually a great point, and I’m very happy you introduced that concept into this discussion.