What’s an ideal name for the larger ecosystem that EA resides in, including things like the Progress Studies, Longtermist, and Rationality communities?
[Question] Name for the larger EA+adjacent ecosystem?
Could you say a little more about the context(s) where a name seems useful?
(I think it’s often easier to think through what’s wanted from a name when you understand the use case, and sometimes when you try to do this you realise it was a slightly different thing that you really wanted to name anyway.)
TBH, it’s a question that popped into my mind from background consciousness, but I can think of many possible applications:
- helping people in various parts of the EA-adjacent ecosystem know about the other parts, which they may be better-suited to helping
- helping people in various parts of this ecosystem understand what thinking (or doing) has already been done in other parts of the ecosystem
- building kinship between parts of the ecosystem
- academically studying the overall ecosystem—why have these similar movements sprung up at a similar time?
- planning for which parts are comparatively advantaged at which types of tasks
Thanks, makes sense. This makes me want to pull out the common characteristics of these different groups and use those as definitional (and perhaps realise we should include other groups we’re not even paying attention to!), rather than treat it as a purely sociological clustering. Does that seem good?
Like maybe there’s a theme about trying to take the world and our position in it seriously?
Makes sense—I guess they’re all taking an enlightenment-style worldview and pursuing intellectual progress on questions that matter over longer timescales...
Maybe the obvious suggestion then is “new enlightenment”? I googled, and the term has some use already (e.g. in a talk by Pinker), but it feels pretty compatible with what you’re gesturing at. I guess it would suggest a slightly broader conception (more likely to include people or groups not connected to the communities you named), but maybe that’s good?
I find “new enlightenment” very fitting, but I wonder whether it might at times be perceived as a not very humble name (that needn’t be a problem, but some, me included, might end up feeling uncomfortable calling ourselves part of it).
I agree that this is potentially an issue. I think it’s (partially) mitigated the more it’s used to refer to ideas rather than people, and the more it’s seen to be a big (and high prestige) thing.
As I mentioned above, cf. “Brights”.
Why not just call it the EA-adjacent ecosystem? I think there are lots of communities that intersect with EA, and it would probably be difficult to make one acronym that includes all of these communities.
Strong upvote for something like this unless there’s some context I’m missing? I.e. what is the sentence that this would belong in?
Maybe the “PEARL communities”? (Progress studies, effective altruism, rationality and longtermism)?
Another adjacent community you might want to mention is the forecasting community.
Yes, and I guess there are a lot of other components that could be added to that list: science reform (reproducibility, open science), fact-checking, governance reform (approval voting or empowerment of the technocracy), which vary from being possible small ingredients of any new “enlightenment” to being unlikely to come to much...
My sense is that the forecasting community overlaps more with the PEARL communities than, e.g., the fact-checking one does.
The FFLARRP ecosystem: forecasting, fact-checking, longtermism, altruism, rationality, reform, and progress! :P
Sexy.
Clarification question: why do you understand longtermism to be outside of EA?
It seems to me that a longtermist (I assume you mean someone who combines belief in strong longtermism (Greaves and MacAskill, 2019) with belief in doing the most good) is just one particular kind of effective altruist (an effective altruist with particular moral and empirical beliefs).
Just as environmentalism and animal rights intersect with EA without being subsets of it, the same could be true of longtermism. (I expect longtermism to grow a lot outside of EA, while remaining closer to EA than those other groups.)
Yeah – things like the Long Now Foundation have been around for decades and aren’t necessarily approaching longtermism from the same angle as EA.
Not bad but maybe not catchy enough? I’m also worried about the connotation of “pearl” as in a prized thing.
I’m worried about the analogy with when some atheists and rationalists started calling themselves “Brights” and everyone threw up in their mouths a little. :)
I think EA and Rationality are fine.
How would you define longtermism so that it isn’t pretty much by definition EA? Like longtermism that isn’t necessarily primarily motivated by consequences for people in the future? I think GPI may have explored some such views, but I think it’s close enough to EA that we don’t need a new term.
If we’re including progress studies, why not international development, global health, AI safety, biosecurity, nuclear security, social movements, animal ethics, vegan studies, conflict and peace studies, transhumanism, futurism, philosophy of mind, etc.? Is progress studies more cause-neutral?
Spontaneously, I find “Broad Rationality” a plausible candidate. (I found it used as a fairly specific concept mainly by Elster (1983), but a Google search for ‘“broad rationality” elster’ returns only 46 hits, though there are of course more hits on the word combination more generally.)
I typically refer to this as “EA+”, and people seem to understand what I mean.