Hm, what about saying that if one values different objectives, they can still try to do the most good with their spare resources, making some kind of conditional or weighted average in their mind? For example, one can think that if they enjoyed Van Gogh 'this much,' they can then focus on family 'that much,' and then make philanthropic investments. Those investments can enable other people to do the same 'emotions-based prioritization' (such as caring for their family's basic needs while enjoying the aesthetics of the family's presentation), or can enable communities to act this way (such as considering whether non-human animals should receive some attention, whether economic conditions should be improved, and whether time to enjoy relationships should be allocated). This may be more feasible in less industrialized areas, which may make more decisions by 'emotional consensus.'
The problem here is that it's still overtly utilitarian, with just a bit more wiggle room. It still forces people to weigh one thing against another, which is what I think they might be uncomfortable doing. Buck Shlegeris says 'everything is triage,' and I think you'd agree with this sentiment. However, I don't think everyone likes to think this way, and I don't want that hiccup to be the reason they don't investigate EA further.
Hmm, but is it more about the presentation of relative power between the one who offers EA and the one who is contemplating whether it is a reasonable framework? For example, if Buck Shlegeris (or anyone else) says he has "the utmost respect in [his] heart" for "dumb" people while implying that different thinking should be dismissed or ridiculed (asserting dominance by fear rather than inviting critical thinking), then regardless of the framework that person supports, the suggestion may be less well accepted among people who seek to cooperate.
So, if 'everything is triage' is meant to allude (or happens, in interpretation, to allude) to the notion that 'an exclusive group of decisionmakers does not have time for the emotional requests of a much larger group, whose persistent appeals can be perceived as almost disgusting,' then even a utilitarian framework with quite significant wiggle room may be accepted less well than when 'everything is triage' connotes, for example, 'everyone is a decisionmaker; decisions are challenging; we have to take care of our close ones but also of others; requests are received well and always welcome, but there is only so much one can do, so perhaps the best one can do is inspire and encourage.'
So, sure, I think that any framing which shows that the person pitching EA sincerely cares about the other person's perspective, while also being confident that EA is a great option, should work. What you are suggesting can work for many people, perhaps the stereotype of older, affluent decisionmakers who seek to be seen as righteous and as caring for others/their group by almost privileging them. It should actually be brilliant for appealing to such decisionmakers.
What I was suggesting can appeal to non-decisionmakers: those who perhaps do not much enjoy Van Gogh because they would rather save the gallery entrance fee and the time spent there to develop relationships with others, and who may understand decisionmaking more as an emotional consensus. Your pitch would not work there; those people would feel like they have nothing to contribute.
Between your pitch and a moral circles/triage pitch, the moral circles/triage one invites the critical thinking of decisionmakers and others about institutions and standards, by reason: it just makes sense to get a bit organized about one's impact. There is nothing personal in it, no 'failed hopes and dreams' of 'future potential,' no asking others to solve 'one's concerns' by appealing to their sense of fairness. Thus, your framing can attract a group of people who could come across as seeking to emotionally influence others whom they expect not to 'yield,' because those others do not have to. That is a movement stagnation risk.
So, I suggest the moral circles/triage/high-impact framing, applied to some resources (those which can be well spared), as the option one offers when pitching EA, with the reason for pitching it being that it actually benefits the person personally and is quite cool: almost a personal tip, offered with absolutely no feelings or judgment about how one thinks about such a tip.