Dear @Charlie_Guthmann, sorry for correcting you, but I’m not confusing EA with utilitarianism at all.
Indeed, I consider myself a utilitarian, and I would give to A rather than B, because in my version of utilitarianism the relationship between subjective wellbeing and utility is nonlinear: I would consider an increase from 0 to 1 subjective wellbeing a larger increase in utility than one from 8 to 10.
My trouble is that a recent talk at the EAGxVirtual conference by someone from the Happier Lives Institute seemed to suggest that one should measure the amount of good done by adding up increases in subjective wellbeing (rather than utility), and that this would lead us to give to B, which to me seems a little unfair.
Don’t feel sorry about correcting me, I appreciate the discourse. However, I still disagree with your original comment. You originally said “according to EA…” EA doesn’t prescribe whether you should have a linear/log/etc. utility function with respect to subjective wellbeing. EA is not a monolith. That’s simply what someone at the Happier Lives Institute seems to believe, or implied without realizing what they were implying. There are definitely people in the community who share your views.
I’ll agree that my original statement was lacking nuance.
I’m receptive to the idea that most of the people in EA do xyz, or that EA, insofar as it is made up of certain institutions, seems to act or speak as if its values are xyz. But the phrasing “according to EA” gives me the sense that you think there is some normative Bible of EA, which, for better or worse, there isn’t. If you meant one of the above statements, then I’m in agreement. Large swaths of people who identify as EAs say things that basically equate a very simple/specific form of utilitarianism with doing good. This troubles me as well.
Thanks for this clarification. I was sloppy originally when saying “it seems that according to EA, …” instead of saying “it seems that according to prominent members of the EA community, …”.
So do you know of any statistics on what the values of people who identify with the EA community actually are, and what they would say to such questions as mine?
I looked briefly but wasn’t able to find anything. There is also the Demographic Survey, but its level of specificity re: moral views is not going to help you much.
EA Polls Facebook group ← I don’t think they have the poll you are looking for, but I would recommend posting your question there if you want more responses. Also, short-form posts generally don’t get much engagement, and the engagement they do get is often from the most committed EAs, so if you really care you might consider making a top-level post, though that is understandably more stressful.
Thanks, Charlie. Unfortunately I boycott Faceb**k, but I will consider making a top-level post on some thought experiments like this one.