Your patience is admirable :)
burner
lol when people use this burner account, it’s usually closer to “this argument could get a bit annoying” than “I feel the need to protect my anonymity for fear of retribution.” please don’t speak for all burners
I disagree with this. For one, OpenPhil has a higher bar now. There’s a lot of work that needs to be done. ASB and others might already think this was a very bad grant. There’s a cost to dwelling on these things, especially as EA Forum drama rather than a high-quality post mortem.
it’s not anywhere in any canonical EA materials
This seems a bit obtuse. In any local EA community I’ve been a part of, poly plays a big part in the culture.
Plenty of EAs are criticizing it in this very thread.
This is sort of true, but most of them are receiving a lot of downvotes. And this is the first time I’ve seen a proper discussion about it.
I don’t have a particular agenda about “what should happen” here. I’ve said we should scrutinize the ways that polyamorous norms could be abused in high trust communities. I’m not sure what the outcome would be, but I would certainly hope it’s not intolerance of poly communities.
I would readily agree that some—perhaps most—of these problems could also be solved by ensuring EA spaces are purely professional, but it does seem a bit obtuse to not understand that someone could feel more uncomfortable when asked to join a polycule at an EA meet up than simply being asked on a date.
I think an ideal outcome would be to reduce the association between EA and poly—such that poly is not a major cultural touchstone within EA—while keeping EA a welcoming and respectful place for poly people.
I certainly don’t think it’s conclusive, or even strong evidence. As I said, I think it’s one thing among many that should inform our priors here. There’s also a different vein of anthropological research that looks at non-monogamy and abuse in cults and other religious contexts, but I’m less familiar with it.
The alternative—accepting norms of sexual minorities without scrutiny—seems perfectly reasonable in many cases, but for those reasons I don’t think it should be adopted here, especially in light of these women’s accounts.
I emphasize there shouldn’t be any hostility or intolerance toward polyamorous people, only scrutiny of the way polyamorous norms might create the potential for abuse in EA spaces (or generally in high-trust, insular environments).
if you are saying “we shouldn’t tolerate this in the community”, that just is intolerant.
Ok, fortunately that is not what I am saying.
I’m very surprised by this. There are a number of anthropological findings which connect monogamous norms to greater gender equality and other positive social outcomes. Recently, arguments along these lines have been advanced by Joseph Henrich, one of the most prominent evolutionary biologists.
Something that is above question or criticism (see here), in this case because discourse is often cast as intolerant or phobic
This post is a bit weak in making its case, but it is blindingly obvious that Helena is a grift, and I’m a bit unimpressed by galaxy-brained reasons (hits-based giving, etc.) for thinking it might be good.
But in the big picture, occasionally a grant is bad. We can’t treat every bad grant as a scandal.
It’s surprising to me that polyamory continues to be such a sacred cow of EA. It’s been highly negative for EA’s public image, and now it seems to be connected to a substantial amount of abuse. There are a number of reasons our priors should suggest that non-monogamous relationships in high-trust, insular communities can easily lead to abuse. It’s always seemed overly optimistic to think EA could avoid these problems. Of course, there have been similar ongoing discussions in the Berkeley Rationalist community for a number of years now.
This seems like one of the most important community issues to reflect on.
No, that’s not really what I mean. I mean that I generally doubt these public apologies are able to give people the emotional reconciliation that they desire.
They can provide a few things, presumably including PR damage mitigation, a sincere account of their thinking, and perhaps some amount of reconciliation.
My criticism of your post is that it seems intent on optimizing for only one of those—indeed considering it entirely sufficient for a “good apology” without considering how these things trade off, nor considering what we might normatively want an apology to do. In my view, a sincere account of someone’s beliefs is very valuable.
Could you (or someone else) actually make the case for “good apologies” (in the sense you outline in this post) that goes beyond PR concerns?
I understand the desire to know what Bostrom really thinks, but the attention on the structural quality of his apology seems completely undue. None of these elements would presumably reveal more about how Bostrom really thinks than his actual apology.
In fact, it seems like if our preference is to understand how Bostrom really feels, your “good apology” approach might take us further away from that! Your emphasis is on appearing “sincere and genuine” which again, fair enough for PR concerns, but presumably we are after some sort of larger reconciliation here that necessitates being honest and forthright?
If an apology was terribly written—but was in fact genuine and sincere—that seems preferable? If a good apology is just to “sell forgiveness”, what could the point be beyond PR?
My apologies if I am missing something here, but you seem to be writing a guide for some kind of dishonesty? And if you mean it to be about true honesty, I think this scheme really fails.
Phrases like “EA elevates people” are becoming common, but it is very unclear what they mean. Nick Bostrom created groundbreaking philosophical ideas. Will MacAskill has written extremely popular books and built communities and movements. Sam Bankman-Fried became the richest man under 30 in a matter of months. All of these people have influenced and inspired many EAs because of their actions.
Under any reasonable sense of the word, people are elevating themselves. I think EA is incredibly free from ‘cult of personality’ problems—in fact it’s amazing how quickly people will turn against popular EAs. But in any group, some people are going to get status for doing their work well.
I am very surprised by the warm reception to this post. To my mind, this is exactly the type of rhetoric we should be discouraging on the Forums. It’s insinuating all kinds of scandals
(I am tired of drama, scandals, and PR. I am tired of being in a position where I have to apologize for sexism, racism, and other toxic ideologies within this movement)
without making any specific allegations or points, which becomes somehow acceptable within the emotional frame of “I am TIRED.” Presumably many other people, including those directly impacted by these things, are tired too, and we need to use reason to adjudicate how we should respond.
I think it is very difficult to litigate point three further without putting certain people on trial and getting into their personal details, which I am not interested in doing and don’t think is a good use of the Forum. For what it’s worth, I haven’t seen your Twitter or anything from you.
I should have emphasized more that there are consistent critics of EA who I don’t think are acting in bad faith at all. Stuart Buck seems to have been right early on a number of things, for example.
Your Bayesian argument may apply in some cases but it fails in others (for instance, when X = EAs are eugenicists).
Just apply Bayes’ rule: if P(events of the last week | X) > P(events of the last week | not-X), then you should increase your credence in X upon observing the events of the last week.
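For concreteness, the update described in the comment above can be written in odds form (this is just a standard restatement of Bayes’ rule, not anything specific to this thread):

```latex
\frac{P(X \mid E)}{P(\neg X \mid E)}
  = \underbrace{\frac{P(E \mid X)}{P(E \mid \neg X)}}_{\text{likelihood ratio}}
    \cdot \frac{P(X)}{P(\neg X)},
\qquad E = \text{events of the last week}.
```

Whenever the likelihood ratio exceeds 1, i.e. P(E | X) > P(E | ¬X), the posterior odds on X exceed the prior odds, which is exactly the sense in which observing E should raise your credence in X.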
I also emphasize there are a few people who I have strong reason to believe are engaged in a “deliberate effort to sow division within the EA movement,” and this was the focus of my comment, publicly evidenced (NB: this is a very small part of my overall evidence) by them “taking glee in this disaster or mocking the appearances and personal writing of FTX/Alameda employees.” I do not think a productive conversation is possible in these cases.
These areas all seem well-identified, but the essential problem is that EA doesn’t have nearly sufficient talent for top-priority causes already.
I don’t think that the EA community profits itself by not including artists and those with skills that aren’t squarely in the conventional Earning to Give purview.
I certainly wouldn’t claim this. Obviously art, in general, is ex ante a very unpromising earning to give path. My suggestion is that we should encourage artists to use their skills in high impact ways.
I don’t buy the premise that this is not high EV through a combination of direct impact and promoting a model that is potentially high EV.
This implies a very weird model. Why would you think this is high EV? Presumably things are neutral to low EV unless proven otherwise via research? Nothing about “a combination of direct impact (??) and promoting a model” innately suggests high EV, which, recall, is a very high bar for career paths.
I’m really glad OP is excited to help out, but we should encourage them to consider whether they could do more good given their skill set. That is, after all, the point of EA. Many EA orgs need help with brand and aesthetics for example. Maybe their skills would be a good fit.
This is really sad and frustrating to see: a community that prides itself on rigorous and independent thinking has taken to reciting the same platitudes as every left-wing organization. We’re supposed to hold ourselves to higher standards than this.
Posts like this make me much less interested in being a part of EA.