Meta: I’m concerned about the number of downvotes I see that aren’t accompanied by any justification. Consider that there is a lot of information value in a negative judgment. I imagine that the author would be very happy to hear the reasons, and more generally, I imagine that EA as a whole would skill up a *lot* faster if downvotes came with explanations.
Downvotes aren’t primarily there to help the person being downvoted. They help other readers, of whom there are, after all, many more than writers. Creating an expectation that every downvote should be explained significantly increases the burden on the downvoter, making downvotes less likely to be used and therefore less useful.
The title of this post is “Does EA need an underlying philosophy? Could Sentientism be that philosophy?”
I would consider both of those points to betray a lack of basic understanding of effective altruism. There are lots of good resources elsewhere; perhaps it would be good to have a basic FAQ article / wiki that could be linked in such cases?
Thanks all. I’d love to hear thoughts from anyone who has downvoted. No obligation of course.
Alasdair—I think I’m reasonably familiar with EA, but I could have been clearer. I was trying to explore two points:
1) Given that both Sentientism and EA focus on using evidence and reason and on having broad moral compassion, I thought the term and the philosophy might be of interest to EA people generally.
2) Many (all?) of the problems EA looks to address are exacerbated by the fact that billions of people believe and act without a basis in evidence, reason, or broad moral compassion. I’m interested in whether people think there is value in trying to bring large numbers of people up towards a simple, common philosophical baseline like Sentientism.