I would like to emphasize that when we discuss community norms in EA, we should remember that the ultimate goal of this community is to improve the world and humanity’s future as much as possible, not to make our own lives as enjoyable as possible. Increasing the wellbeing of EAs is instrumentally useful for boosting productivity and attracting more people willing to make sacrifices like “donate tens of thousands of dollars” or “change your career plan to work on this problem”, but ultimately the point isn’t to create a jolly in-group of ambitious nerds. For example, if the meshing of polyamorous and professional relationships causes less qualified candidates to earn positions in EA organizations, this may be net negative, even if the polyamorous relationships make people really happy.
I think it is possible that things that make the community happy will end up being net negative for the world. But I do think that creating a happy, thriving social community that people feel comfortable in is going to be really important for the long-term success of this movement (as you acknowledge). And there’s a kind of tricky thing here: if I felt like “oh, the movement tolerates you having polyamorous relationships now, but if we decided one day that this had net negative consequences, we’d shun you”, then I’d feel way less good in the social community now, because my acceptance wouldn’t feel secure. I think people need to feel safe, and to feel that their acceptance is “unconditional”, rather than feeling that if their presence is no longer deemed net positive for the world, they’ll be rejected from their social network [cf. the “I didn’t get into EAG and am sad” discourse].
I put “unconditional” in inverted commas because obviously some conditions are always present and appropriate: if I went on a murdering spree (or committed a billion-dollar fraud :p), it would be reasonable for the community to shun me. But I think this bar should be pretty high, because the costs extend beyond the obvious harm to the people being shunned.
I would say the goal of EA is twofold:
1. Improve the lot of humanity as much as we can given the resources we can gather and are willing to put in;
2. Gather and put in a lot of resources.
In particular, pretty much no one puts in all their resources, or as much as they can possibly afford. Most EAs are not willing to entirely forgo being really happy themselves in the pursuit of a better world.
(There are instrumental reasons why being infinitely self-sacrificing is a bad idea anyway, but even if there weren’t, most people just aren’t that hardcore about their utilitarianism.)