I’m not sure I have as good a handle on the broader EA ecosystem as others, so consider my thoughts provisional, but I’d suggest adding:
A special subset of low-status blindness: there’s a bias toward more conventional projects that are easy to understand, since it’s easier to get affirmation from others if they understand what you’re working on. (Lifted from Jaan Tallinn’s Singularity Summit 2011 talk.)
I suspect EAs may prefer going down the nonprofit route, which seems very noble, but more overall long-term utility may often be produced by starting a for-profit business. E.g., Elon Musk is one of the most effective EAs on the planet precisely because he chose the capitalist route.
I’m not sure whether to add basic research to the list: the QALY is a pretty creaky foundation, but I grant there’s a lot of uncertainty as to how to improve it.