I like this. I was surprised it hasn’t received more upvotes yet.
I suspect what’s going on is that most people here are focused on the arguments in the post—and quite rightly so, I suppose, for a red teaming contest—and are thinking, “Meh, nothing I haven’t heard before.” Whereas I’m a bit unusual in that I habitually focus on the way someone presents an argument and on the wider context, so I read this and am like, “Omg, an EA-adjacent person making an effort to share their perspective and offering a sensible critique, seemingly from a place of trying to help rather than to take the piss or vent their anger—this stuff is rare and valuable, and I’m grateful to you for it (and to the contest organisers) and want to encourage more of it.”
Thank you so much for this!
I’m really curious about the “nothing I haven’t heard before” in relation to the Social Capital Concern. Have people raised this before? If so, what’s being done about it? As I said, I think it’s the most serious of the four I mentioned, so if it’s empirically supported, what’s the action plan against it?
I think I occasionally hear people argue that others focus on longtermist issues in large part because it’s more exciting/creative/positive etc. to think about futuristic utopias; some of those longtermists then reply, “Actually, I really miss immediate feedback, tangible results, directly helping people etc.—it’s really hard to feel motivated by all this abstract stuff,” and the discussion kind of ends there.
But the broader Social Capital Concern is something that deserves more serious attention, I think. The ‘core’ of the EA community seems to be pretty longtermist (whether because longtermism is sexier, or because these people have thought about, discussed, and researched it a lot, or for some other reason), so you would expect the phenomenon you describe: people acting more longtermist than they actually are in order to gain social capital within the community.
Marisa encourages neartermist EAs to hold on to their values here. Luke Freeman encourages EA to stay broad here. Owen Cotton-Barratt says “Global health is important for the epistemic foundations of EA, even for longtermists”. [Edit: These are all community leaders (broadly defined), so as well as the specific arguments they make, I think the very fact that they’re more prominent members of the community expressing these views is particularly useful when the issue at hand is social capital.]
I also kinda get the sense that many EA orgs/groups cater to the neartermist side of EA mainly out of epistemic humility, collaborative norms etc. rather than because they personally prioritise the associated causes/projects. E.g. I’m pretty longtermist, but I still make some effort to help the more neartermist EAs find PAs—covering both sides felt like the default for a new community-focused organisation/project. And I remember some discussion a few years back about some of CEA’s projects being too focused on longtermism, and things seem to be more evenly distributed now.
(There are probably many more examples of public and private discussion along these lines, so apologies for not giving a more comprehensive response—it’s hard from this selection to get a sense of whether we’re doing enough, or even too much, to correct for the Social Capital Concern. My intention wasn’t actually to be like “Yeah, heard it all before”; otherwise I expect I would have included some links to similar discussions to start with. I was more theorising about what others might be thinking and explaining my own upvote. Sorry for not making this clearer; re-reading my first comment now, it seems a bit rude!)
I don’t think “people have mentioned this before” and “it’s empirically supported” are the same things!
This seems defensive lol. My entire thing here is that I’m asking whether there is support for this, because I’m not in the community and don’t know. It seems like you’re saying “it’s been mentioned but is not necessarily true.” If that’s the case, it would be helpful to say that. If it’s something else, it would be helpful to say that thing!
I didn’t mean to come across as defensive. Communicating across cultural barriers is hard.
I wholeheartedly agree with Holly Morgan here! Thank you for writing this up and for sharing your personal context and perspective in a nuanced way.