well, i think the rights framework is net-negative on the margin, and the world would be a better place if people talked less about human rights.
and i'm not… talking about things because they are useful. i also don't think that's a fair description of what you do. it looks to me like you actually believe in animal rights, and that's why you support it. and i just… don't.
the reason EAs don't use this framework, in my model, is that most EAs don't believe in animal rights. to change that, you need to convince people that it's a good framework and they should use it.
and then they will use it, not because it's useful, but because it's true. which is the right thing to do, in my opinion: saying things because one believes they are true. i don't even start the discussion of "will rights talk help animals and the world", because even if the answer is "yes", i won't be part of it as long as i believe it's the wrong framework.
so, in short: EAs tend to believe in the rights framework less than the general population does, and so use it less.
(it's also, in my opinion, one of the things that define EA, but Ozy explains it better than i can: https://thingofthings.substack.com/p/effective-altruism-maximizing-welfarist )