This is a section of an EAF post I've begun drafting about the community and culture of EA in the Bay Area, and its impact on the rest of EA worldwide. That post isn't intended to be only about longtermism as it relates to EA, an overlapping philosophy/movement often originally attributed to the Bay Area. I still feel my viewpoint, even in this rough form, is worth sharing as a quick take post.
@JWS 🔸 self-describes as "anti-Bay Area EA." I get where anyone is coming from with that, but the issue is that, pro- or anti-, this particular subculture in EA isn't limited to the Bay Area. It's bigger than that, and pointing to the Bay Area as the source of EA's greatness or its setbacks strikes me as a wrongheaded sort of provincialism. To clarify, "Bay Area EA" culture specifically entails the stereotypes, both accurate and misguided, of the rationality community and longtermism, as well as the trappings of startup culture and other overlapping subcultures in Silicon Valley.
Prior even to the advent of EA, a sort of "proto-longtermism" was collaboratively conceived on online forums like LessWrong in the 2000s. Back then, as now, a plurality of those forums' users may have lived in California. Yet it wasn't only rationalists in the Bay Area who took up the mantle of consecrating those futurist memeplexes into what longtermism is today. It was academic research institutes and think tanks in England. It wasn't @EliezerYudkowsky, nor anyone else at the Machine Intelligence Research Institute or the Center for Applied Rationality, who coined the phrase "longtermism" and wrote entire books about it. That was @Toby_Ord and @William_MacAskill. It wasn't anyone in the Bay Area who spent a decade working to politically and academically legitimize longtermism as a prestigious intellectual movement in Europe. That was the Future of Humanity Institute (FHI), spearheaded by the likes of Nick Bostrom and @Anders Sandberg, and the Global Priorities Institute (GPI).
In short, if it's going to be made about culture like that, EA is an Anglo-American movement and philosophy (notwithstanding other features introduced by Germany via Schopenhauer). It takes two to tango. This is why I think calling oneself "pro-" or "anti-" Bay Area EA is pointless.
Maybe it's worth pointing out that Bostrom, Sandberg, and Yudkowsky were all on the same extropian listserv together (the one from the infamous racist email), and have been collaborating with each other for decades. So maybe it's not precisely a geographic distinction, but there is a very tiny cultural one.
I'm an anti-Bay Area EA because, from the stuff I've read about sexual harassment in EA, it's like 90% Bay Area EAs doing it, and it seems to be enabled by specific subcultural factors there that I don't see anywhere else in the movement.
I'm guessing this is going to be a controversial post, though I was satisfied when, like 10 minutes ago, it had net zero karma, because I wanted to make a screen cap for a Thanos "perfectly balanced, as all things should be" meme. This isn't to say that whoever sees this post and feels like voting on it should try upvoting or downvoting it to exactly zero karma. That would probably be futile, because someone would in short order push it to some non-zero value. I'm just making this extra comment to vent about how frustrating it is that I've waited over a year for one of the takes I drop on the EA Forum to sit at exactly zero karma so I can make a hella dope Thanos EA meme.
Can you say more about this?