The guy in the panda hat at EAG
Cornelis Dirk Haupt
I wanted to downvote this comment. I think the topic and the dynamics it raises are very much worth discussing without one being branded a bigot.
But then I did the exercise of replacing “polyamory” with “gay men” and “monogamous” with “straight” in the comment you responded to and was pretty horrified with the result.
The result reads like a comment that would have been socially acceptable not too many years ago, but that we now strongly condemn as homophobia.
I’m kinda just sitting with this info processing it, not entirely sure what conclusion to draw just yet.
So as a poly/poly-adjacent EA of many years I’ll start by saying I strong upvoted your post and that insofar as a vision for a better tomorrow is concerned, your comment was poetry to my ears. I am very much aligned. Beautiful stuff.
However, this little nugget just keeps coming back to me and it irks me:
“On the other hand, we do poly and flexible sexual connections and those of us who are engaged in those things will even try and help you figure out if it’s for you. Poly is fun. Sex is fun. Play and curiosity are fun. These are some of the major fun things our community does have going for it when comes to hedonism [and utopian way of life, over the rest of society.]”
I think you’re making the poly-part of the community sound way more accessible than it actually is. You possibly have a blindspot here because you don’t know what it is like to be on the outside trying to get in (?).
So here’s the thing. If poly for you in the community is this fantastically amazing, then it is a tragedy of a vastly worse degree than many EAs might even realize that they can’t be part of it.
I’m reminded of some study I read about years ago that showed that the mental health of people in third world countries decreased markedly when they were shown just how much better off people in first world countries are. Those in the control group that showed clear ignorance of how Europeans lived were… happier.
Honestly, after reading so much poly-discourse on the forum lately I’m very surprised this point hasn’t been raised. An obvious model for what is going on, in my head, is that the poly-backlash is part of a larger backlash against the “moral fulfillment and life fulfillment and career fulfillment and hedonism fulfillment and just-everything-important fulfillment” that is perceived to be held and disproportionately concentrated among few in EA.
Scott Alexander’s response is the first time I’ve seen that there is someone I can contact who has substantial claims of evidence that Kathy Forth’s accusations were false. I’ve heard of Kathy twice in the last month (I don’t remember hearing about her at all before then), as have others in my local community. Many find Scott Alexander’s response valuable, which is why it is the top comment. A large part of the EA community appears to only recently be learning about Kathy Forth.
it makes me sad that the top comment on this post is adversarial/argumentative and showing little emotional understanding/empathy (particularly the line “getting called out in posts like this one”). I think it unfortunately demonstrates well the point the author made about EA having an emotions problem
I personally think Scott shows immense emotional maturity responding to this in a context where he is opening himself up to huge scrutiny, including the criticism of being told he is being too adversarial and lacking empathy. He removed the sentence in question after some reflection, updating immediately and explaining his thought process, empathising with another’s perspective and recognizing his own emotional state that led him to include that sentence. To me these seem the hallmarks of an emotionally well-regulated individual. If they aren’t, what would a person with emotional understanding/empathy do differently in this situation?
Before you answer that question, let’s take a moment to actually highlight what the situation even is:
Kathy Forth was a human, a member of our community, who committed suicide. Given the serious implications of Kathy Forth’s accusations if they were to be true, it seems that we should place a lot of value on anything that can confirm or deny the veracity of Kathy Forth’s story. Do you disagree?
This is the sentence you don’t like that Scott brought up. He removed it.
But they wouldn’t do that, I’m guessing because they were all terrified of getting called out in posts like this one.
OP wrote the following words that, for lack of a better word, triggered Scott. She also has the opportunity to amend or qualify these words:
I read about Kathy Forth, a woman who was heavily involved in the Effective Altruism and Rationalist communities. She committed suicide in 2018, attributing large portions of her suffering to her experiences of sexual harassment and sexual assault in these communities. She accuses several people of harassment, at least one of whom is an incredibly prominent figure in the EA community. It is unclear to me what, if any, actions were taken in response to (some) of her claims and her suicide. What is clear is the pages and pages of tumblr posts and Reddit threats, some from prominent members of the EA and Rationalist communities, disparaging Kathy and denying her accusations.
Nowhere in what OP writes above does she even seem to entertain the possibility that at least some of Kathy’s major accusations could be false (she says “It is unclear to me what, if any, actions were taken in response to (some) of her claims and her suicide” but nothing akin to “It is unclear to me whether Kathy Forth’s accusations were true”).
Kathy Forth’s story is a really, really serious accusation, one where whether we believe it is true or false would and should significantly update our priors around how the EA community treats the concerns of women. If viewed to be true, it would frame the experiences of women and the EA community’s prior response to them in a very sinister light. If viewed to be false, then concerns 1+2 don’t have a broader sinister context and could be more optimistically corrected with improved management and a culture shift in EA. For example: I agree that Swapcard at EAGs should, in the vast majority of cases, not be used for dating (my dudes in EA… this past weekend at EAGx Berkeley I met a woman who was flat out frustrated that she was hit on through Swapcard one-on-ones twice within 24 hours… facepalm)
I agree profusely that ranking women by how much you want to have sex with them and publicly sharing that casually is creepy as all hell (my dudes in EA… how is it possible that so many of you can be so into “instrumental rationality” and yet act so obviously against your own self-interests when it comes to your behaviour around women is beyond me… facepalm)
These issues seem like something we can come together and fix. But if Kathy Forth’s accusations are true, then it is implied that EA as a community is much more sinister and not interested in addressing the concerns of women. If Kathy Forth’s accusations are true, then there is a deep rot within the EA community. There is no “facepalm” joke I’d be able to light-heartedly say about it, as if it were a thing we can reasonably fix. If Kathy Forth’s accusations are true, OP is right to be scared.
OP’s framing of Kathy Forth’s experience strongly implies they view Kathy Forth’s accusations as more true than false. I don’t know if she believes this because either:
a) She had bad experiences in EA and that made her update more towards “Kathy Forth’s story is probably true”
b) She thought Kathy Forth’s story is probably more true than false, and that made her update her experiences in EA as more bad (in a broader sinister context) than they otherwise would be
But the epistemic particulars are beside the point, because either way, if Scott is able to provide compelling evidence that parts of Kathy Forth’s story are false, this could help OP (and everyone else) feel less sad, disappointed and scared, and that can only be a good thing.

OP, if you are reading this, I realize the EA community can seem intimidating. I empathize strongly with you when you expressed concern about how others will judge your writing style and take you less seriously if it did not conform to forum standards (why hello there… my still haven’t-made-one-post self… something I am still embarrassed about given how long I’ve been a part of EA). That said, I am confident that if you had made a very short post just saying “Please, what happened to Kathy Forth and why? I need to know, I can’t sleep. I feel sad, disappointed, and scared.” the EA community would have responded with compassion—not caring that the post didn’t abide by some set of forum norms. But, regardless, you did put effort into a longer post here, so I just want to say I’m glad you posted this.
Good lord… am I the only one shedding a tear? This post is one helluva heavy dose of fuzzies and if you are a hug person I’d love to give you a virtual hug for making my morning, if you consent. Thank you so much for all the effort put into this warm post. It is appreciated. Gonna go off to do my earn-to-give coding with a smug smile on my face now.
Fwiw, I think your view that a leftward shift in EA would increase popularity is probably Americocentric. I doubt it is true if you were to consider EA as a global movement rather than just a western one.
Also, fwiw, I’ve lost track of how many people I’ve seen dismiss EA as “dumb left-wing social justice”. EAs also tend to think the consequence of saying something is what matters. So we tend to be disliked both by free speech absolutists and by people who will never concede that properly discussing some controversial topics might be more net positive than the harm caused by talking about them. Some also see EA as tech-phobic. Steven Pinker famously dismissed EA concerns about AI Alignment. If you spend time outside of EA in tech-optimism-liberal circles you see a clear divide. It isn’t culturally the same. Despite this, I think I’ve also lost count of how many people I’ve seen dismiss EA as “right-leaning libertarian tech-utopia make-billionaires-rich nonsense”.
We can’t please everyone and it is a fool’s errand to try.
One person’s “steven pinker style techno-liberalism, with a free speech absolutist stance and a vague unease about social justice activism” is another person’s “Luddite free speech blocking SJW”.
If following our principles does not clearly push EA one way or the other, then so be it.
Edit: he has responded
I was in the same conversation. It was at an EAGx conference, not EAG. I’ve pinged the person who shared the anecdote. They’ll respond here soon. Better to get it straight from the source. I don’t think they mentioned the staff member enforced any norms—just that their presence felt kind of odd given they are someone with the power to ban people from the conference (which I guess implicitly enforces norms). They also mentioned that this particular staff member was regardless a lovely person, and that their presence feeling odd or intimidating may have something to do with a cultural difference, since the EAGx was held in Asia.
But let us wait for the anecdote source to respond directly lest we let this hearsay evidence train derail itself.
I’d also like to say as someone who has been to multiple EAGs and even volunteered, I’ve not experienced there being any intimidating norm-enforcer. In fact, I’d be hard-pressed to think of even a single EAG staff member that wasn’t incredibly approachable, kind, professional and compassionate. No matter where they were on the implicit status hierarchy—even if they were virtually EA-celebrities—nobody felt intimidating to me.
Given it is the Giving Season, I’d be remiss not to point out that ACE currently has donation matching for their Recommended Charity Fund.
I am personally waiting to hear back from RC Forward on whether Canadian donations can also be made for said donation matching, but for American EAs at least, this seems like a great no-brainer opportunity to dip your feet in effective animal welfare giving.
I think another useful question to ask could be something like, “what is your fantasy partner/complement organization?”
This part here is where my eyes widened. Adding this as a standard question on EA grants is, in hindsight, so obviously a good idea to me that I am kinda in shock we don’t do so already.

Creating a group of EA free agents that can be allocated/rented to EA-aligned non-profits?
Actually, this already exists I believe! I know there is a website called “EA Services” that allows you to sign up to basically be allocated around EA/EA-aligned orgs. Can anyone link the website? I’ve lost the URL.
Already seeing the love on the Animals and Longtermism Discord! I concur, the website is exceptional. I hope you’re tracking hits and website interaction so in the long-term we can get some proxy-measures for the site’s long-term impact! It is a fantastic and superior alternative to just sending newbies to the Wild Animal Suffering wikipedia page—which is just too dense IMO.
If you’re an animal welfare EA I’d highly recommend joining the wholesome refuge that is the newly minted Impactful Animal Advocacy (IAA).
Website and details here. I volunteered for them at the AVA Summit, which I strongly recommend as the premier conference and community-builder for animal welfare-focused EAs. The AVA Summit has some features I have long thought missing from EAGs—namely people arguing in good faith about deep deep disagreements (e.g. why don’t we ever see a panel with prominent longtermist and shorttermist EAs arguing for over an hour straight at EAGs?). There was an entire panel addressing quantification bias, which turned into a discussion of how some believe EA has done more harm than good for the animal advocacy movement… but that people are afraid to speak out against EA, given it is a movement that has brought over 100 million dollars into animal advocacy. Personally I loved there being a space for these kinds of discussions.
Also, one of my favourite things about the IAA community is they don’t ignore AI, they take it seriously and try to think about how to get ahead of AI developments to help animals. It is a community where you’ll bump into people who can talk about x-risk and take it seriously, but for whatever reason are prioritizing animals.
“Resistance Raid” is a bizarre framing of deliberately targeting and slaughtering defenceless women and children in their homes with the deliberate goal of mass terror.
Unlike, say, the ANC from my home country of South Africa, which deliberately tried to only hit government targets, that is clearly not what Hamas did. They aren’t freedom fighters—maybe some are, but not their organisation as a whole. Any support for the organisation—given what their charter said pre-2017—cannot under any reasonable lens be seen as anything other than tantamount to, at the very least, supporting ex-Nazis insofar as explicit genocidal antisemitism is concerned. What reasonable counterargument justifying support for Hamas is there that isn’t “Israel is much worse”?
I do not understand why it is so hard for some people to comprehend that both the IDF and Hamas can be net-negative and evil. You don’t have to support the one you judge as the lesser evil and use euphemisms to describe their actions. You can oppose both and say both are savagely genocidal against the other.
“You claim responding against the emotional propaganda is wrong, but writing even close to the parallel from the Palestinian side would result in a perma-ban.”
I don’t believe this is true, given the contentious posts I’ve seen here over the years. I presume you have evidence of someone who is Palestinian and identifies as an EA that was perma-banned for writing from the Palestinian side? (i.e. not a political bot, someone who is actually part of the community) Because I’d be just as interested in reading that as I was reading this piece. And I wouldn’t be putting the two against each other, but be extending empathy to both authors as fellow human beings.
Also during the Oct 7th raid we know Israel killed many of it’s own civilians and it was a highly planned out military operation. If that’s a “terrorist” attack then what israel is doing is even worse than a genocide.
I had to do a double-take and am now only rereading this part after writing my response. You actually believe Israel deliberately perpetrated part of the Oct 7 raid? I’m at a complete loss for words...
Time-poor long-time deep EA with imposter syndrome here with a forum post draft now probably over a year old.
One point you didn’t hit on that I think strongly applies to me (and probably others like me) is just when I think I’ve struck upon some sort of insight or found a topic I want to dive into to write about… I find a lengthy well-written EA Forum post has already been written on the topic and not only that, that I agree with it and that there is nothing new I would add (or if there is it is better suited as a quick comment rather than a followup post).
In other online communities I have at times found myself writing profusely because I find areas of disagreement, areas where I can add value, influence discourse etc. In EA… I’m just “another EA” that has roughly the same views and values as so many other EAs. I’m weird and interesting and insightful outside of EA. Within EA… Not so much. And what I find myself wanting to say so often has already been said.
It actually seems to be a paradox of sorts. Due to being an EA “insider” I’m less likely to generate any unique valuable insight compared to someone who is more EA-adjacent or an EA “outsider.”
Strongly agree with you on everything you wrote.
Fun-fact: even though I’ve been in multi-year poly relationships even I don’t know if poly is right for me. I nominally identify as polyskeptic. This loosely means I believe more people than not are trying to be poly without realising that poly is sub-optimal for reaching their goals (whatever their goals are). I acknowledge I might be projecting here, because my dating life really only “took off” the way I wanted after I stopped trying to be poly and was nothing else other than “single.”

That said, I do also have some empirical backing for my belief: I’ve spoken to or know of at least 2-3 long-time poly EAs (i.e. poly for most of their dating life) that have since gone mono. I think the interesting thing was one of them saying they were shocked by how much more fulfilling mono was than they expected it would be, given “poly-metaphysics” is what they strongly subscribed to before.
Which also speaks to a broader point: if you’re poly you’re interesting and get invited to speak on the Clearer Thinking podcast, etc. You gain status just due to your private relationship preference in EA, or such is my perception. Nobody cares if you’re mono.
But, this is getting to a point where I need to go to work and I’d like to talk with you over video chat instead to continue—perhaps on EA Gather.town to make it public. DM’d you :)
I disagree. If anything EA has a problem that Alexrjl hinted at that you gain too much status for criticising EA. Scott Alexander’s recent post made me update in that direction.
(Sidenote: I gave your comment an upvote because I appreciate it, but an agreement downvote since I disagree. And it is just making me happy right now to see how useful explicitly separating these two voting systems can be)
This might sound silly, but naming can matter a lot: is there a name other than “tofu” that these “rare Chinese tofus” can be called that wouldn’t just be made up? If they are indeed so different, it might be worth marketing them with a different name to distinguish them strongly from what Westerners currently think of as tofu. I am a Westerner and long-time vegan, but throughout the article whenever I read the word “tofu” the image of only culinary blandness crosses my mind, because that is the only reference point I have. This image association I introspectively notice I cannot seem to break, despite being very excited now to try these rare Chinese tofus before I die.
On that note, I’d like to voice support for the sentiment that Chinese vegan cuisine is shockingly diverse. I travelled to Hunan pre-covid and thought I would have to cheat and eat meat from time to time, thinking veganism just doesn’t exist there. I was incredibly wrong.
It felt easier to find cheap incredibly tasty full vegan meals there even though I didn’t know the area than it is in my very vegan-friendly West Coast city despite living here for over a decade. Almost certainly my memories are being clouded by positive association due to being on vacation, but still a data point. Make of it what you will.
Consider buying an anti-hunting collar or bib, in case you don’t know these are options! Make sure whenever they are let out to play you put the bib/collar on first.
All the joy of playing with your cat outside and feeling like a cool cat dad giving your cat-kids what they want, with none of the guilt!
I strongly agree with this. And your footnote example is also excellent. I don’t see why it isn’t obvious that Constance’s goal of getting into EAG is merely instrumental to her larger goal of making the world a better place (primarily for animal suffering, since that is what she currently seems to believe is the world’s most pressing issue).
EDIT: Lukas Gloor does a much better job than me at getting across everything I wanted to in this comment here
There was a vague tone of “the goal is to get accepted to EAG” instead of “the goal is to make the world better,” which I felt a bit uneasy about when reading the post. EAGs are only useful in so far as they let community members to better work in the real world.
From my reading, her goal is not simply to get into EAG. It seems obvious to me that her goal to get into EAG is instrumental to the end of making the world a better place. The crux is not “Constance just wants to get into EAG.” The crux, I think, is that Constance believes she can help make the world a better place much more through connecting with people at EAG. The CEA does not appear to believe this to be the case.
The crux should be the focus. Focusing on how badly she wants to get into EAG is a distraction.
“EAG exists to make the world a better place, rather than serve the EA community or make EAs happy.”
For many EAs you cannot have a well-run conference that makes the world a better place without it also being a place that makes many EAs very happy. I’d think the two goals are synonymous for a great many EAs.
In their comment Eli says:
This unfortunately sometimes means EAs will be sad due to decisions we’ve made — though if this results in the world being a worse place overall, then we’ve clearly made a mistake.
Let’s also remember that EAs who get rejected from EAG and believe their rejection resulted in the world being a worse place overall will also be sad—probably more so, because they get both the FOMO and a deeper moral sting. In fact, they might be so sad it motivates them to write an EA Forum post about it in the hopes of making sure that the CEA didn’t make a mistake.
I like Eli’s comment. It captures something important. But I also don’t like it, because it can also provide a false sense of clarity—separating goals that aren’t actually always that separate—and this false clarity can possibly provide a motivated reasoning basis that can be used to more easily believe the EAG admission process didn’t make a mistake and make the world a worse place. Why? Because it makes it easier to dismiss an EA who is very sad about being rejected from EAG as just someone who “wants to get into EAG.”
So some other EAs and I just talked to the person in the tweet who got rejected.
As far as I could tell, they have a stellar “EA resume” and were even encouraged to apply by leaders in their EA community.
Why were they rejected? What is this “specific bar for admissions and everyone above that bar gets admitted” and why are so many applying and then surprised when they don’t meet this bar? Or is my perception off here?
This isn’t an accusation. I’m in the camp that thinks the conference should not be a free-for-all. But I can’t figure out why the person in the tweet would be rejected from EAG. And as a community organiser it would be great if I can know how best to help the bright-eyed enthusiastic young promising students in my community get into EAG.
See also my other comment asking if the rejection process is possibly too opaque. Maybe that’s the real issue here. Imagine if every person who got rejected knew exactly why and what they could do to not get rejected next time. I almost feel like we wouldn’t be having this discussion because far fewer people would be upset.
On a much more lighthearted note, you:
Have a prominent EA you consider your hero (with a photo op!)
Gave up one career to pursue another one you enjoyed less so you can do more good (EtG)
Have done some obscure project that you judged based on numbers instead of feelings was effective, so went for it (TNR kittens)
Have been rejected by at least one EAG
Wrote a lengthy EA Forum post criticizing the EA Community with proposed solutions
Don’t know all the LessWrong—EA jargon but is adamantly trying to learn more
Can we all just agree this is just (say this in a California Valley Girl accent) “So EA.” All these are so much the hallmarks of the EA experience that the Spirit of Gorgeous-Locks William MacAskill may well be bursting from you. So regardless of what any of us think about the EAG admissions process: you clearly belong in this community, lol.