synonyms might be “SJW” or “DEI”.
River
You think that dating a coworker or whatever without sleeping with them is less likely to cause problems than the reverse? That does not ring true to me at all. It does ring of Christian purity culture, which I would not have expected to encounter in EA.
Is it true that other successful institutions generally have norms against dating within them? (I don’t want to use the term “sleeping around”, which feels derogatory in this particular context). My company only prohibits dating people in your chain of command, and I am certainly aware of relationships within the company that have not caused any objections or issues that I know of. Though my company is tens of thousands of people, with thousands in my building, so maybe it doesn’t qualify as tight-knit. I also haven’t perceived any of my friend groups as having a norm against dating. Family seems obviously different, because there is that incest norm, and that impossibility of stepping away on the off chance that things go really badly. Though again, maybe you have a family with different dynamics—to the best of my knowledge, I’ve never met a cousin’s spouse’s anything. Anyway, point is, I don’t think it’s actually true that the rest of society operates this way.
Even if the differences you pointed to in the OP are real on average (and for some of them that is a generous assumption), what makes you think they are large? Even where men and women are different on average, the differences are usually very small, much smaller than the variation within either gender.
I am opposed to any norm that asks different behavior of men than women.
while it’s possible to get to the truth with enough effort
This comes off as naive. Usually you never know the truth, no matter how much effort you put into investigation. If you think you do, you are probably doing more harm than good. (This applies to many areas, not just sexual misconduct.)
Suppose, hypothetically, that every individual EA would be just as effective, do just as much good, without an EA community as with one. In that case, how many resources should CEA and other EA orgs devote to community building? My answer is exactly 0. That implies that the EA community is a means to an end, the end of making EAs more effective.
That said, I wouldn’t necessarily generalize to other communities. And I agree that assessing a particular case of alleged wrongdoing should not depend on the perceived value of the accused’s contributions to EA causes, and I do not read CEA’s language as implying otherwise.
There are different ways to read the signal that the lack of a statement gives. Someone could read it to mean that these two firms have rampant racism/sexism. Alternatively, someone could read it to mean that these two firms have the same low rates of racism/sexism as the other ten, and choose to focus their energies on software accounting rather than identity politics. A third possible reading is that the 10 firms put out statements precisely because they had more problems with racism/sexism, and therefore the two firms without the statements probably have the fewest racism/sexism problems. How you read the lack of a statement will depend a lot on your priors about the dynamics of racism/sexism in your particular place and time. But if you adopt the second or third readings, then the signal from the lack of a statement seems positive.
Why would you take the TIME article at face value on this?
It doesn’t even get the language right. I’m poly, and I have never once heard people talk about “joining a polycule” as the thing someone chooses to do. That’s not how it works. You choose to date someone. “Polycule” just describes the set of people who you are dating, who your partner(s) are dating, who their partner(s) are dating, and so on. Dating someone doesn’t imply anything about how you have to relate to your metamours, much less people farther distant in the polycule. Sometimes you may never even know the full extent of your polycule.
I don’t know of a single poly person who would approve of the dynamic that the TIME article seems to describe, or any reason to think it is an accurate description of how EA works. Of course you shouldn’t shame people into dating you. Of course you shouldn’t leverage professional power for sexual benefit. Of course it’s good to be an EA and buy bed nets whether you are poly or monogamous. Nobody that I know of, poly or monogamous, disagrees with this. The fact that you think poly people do is what shows your prejudice. I suggest you try getting to know a poly person, talk to a poly person about their relationship(s), before opening your mouth on the subject again.
“predatory polyamorous rationalists” is pretty bigoted. What would we think if someone referred to “predatory gays”?
[Question] Who owns “Effective Altruism”?
I think EA is at its best when it takes the high epistemic standards of LW and applies them to altruistic goals. I see the divergence growing, and that worries me.
What is the EA that you think should do this re-examining? In what sense is something that has different beliefs still EA? If an individual re-evaluates their beliefs and changes their mind about core EA ideas, wouldn’t they leave EA and go do something else, so that EA gets smaller, newer and better philosophies get bigger, and resources are therefore allocated as they should be?
I actually think EA is inherently utilitarian, and a lot of the value it provides is allowing utilitarians to have a conversation among ourselves without having to argue the basic points of utilitarianism with every other moral view. For example, if a person is a nativist (prioritizing the well-being of their own country-people), then they definitionally aren’t an EA. I don’t want EA to appeal to them, because I don’t want every conversation to be slowed down by having to argue with them, or at least find another way to filter them out. EA is supposed to be the mechanism to filter the nativists out of the conversation.
David, how do you reconcile your implication that there is a norm to get a “special dispensation” from with CEA’s claim that EA “doesn’t say anything about how much someone should give”?
Consider Financial Independence First
I think the reason this hasn’t been proposed in a forum post yet is because it would be movement suicide. Suppose you are Moskovitz, and CEA comes to you and says they want you to hand over all of your personal financial records, all kinds of Facebook business records, etc., to some random law firm or whoever to do the audit. What do you think? How do you feel? Obviously you aren’t going to do it—that’s an insane invasion of your personal privacy, and would reveal a lot of confidential information about your Facebook business in violation of business norms if not legal agreements. Probably you are going to feel insulted. Certainly you can find plenty of other non-EA charities that would be happy to take your money and say “thank you” no questions asked. You’ll probably just do that.
The reality of the situation is that if Moskovitz is another SBF, then EA is screwed. We can’t mitigate the risk by seriously investigating Moskovitz. The mitigation strategies available are to see to our own governance better: be prepared to lose funding, take any opportunities to diversify funding sources, build up savings (“endowments”) to see us through hard times, and document our activities so that if we have to shut down for a few years due to lack of funding, people can pick up the pieces and restart when funding becomes available again.
I don’t think talking about fraud right now is a good move. If somebody asks you whether EAs should do fraud, of course your answer should be an unqualified ‘no’. But if you bring it up, you are implying that SBF actually did fraud, which (1) may not be true and (2) is bad PR.
I love this intro! I especially like that it defines EA in terms of finding the most effective interventions, the ones that do good most efficiently with whatever inputs they take, rather than doing the most good in an absolute sense.
I think part of the difficulty here is that “wokism” seems to refer to a genuine cluster of ideas and practices, but one without especially clear boundaries or a single easy definition.
What I do notice is that none of the ideas you listed, at least at the level of abstraction at which you listed them, are things that anyone, woke or anti-woke or anywhere in between, will disagree with. But I’ll try to give some analysis of what I would understand to be woke in the general vicinity of these ideas. Note that I am not asserting any normative position myself, just trying to describe what I understand these words to mean.
I don’t think veganism really has much to do with wokism. Whatever you think about EA event catering, it just seems like an orthogonal issue.
I suspect everyone would prefer that EA spaces be welcoming of trans people, but there may be disagreement on what exactly that requires on a very concrete level, or how to trade it off against other values. Should we start meetings by having everyone go around and give their pronouns? Wokism might say yes, other people (including some trans people) might say no. Should we kick people out of EA spaces for using the “wrong” pronouns? Wokism might say yes, others might say no, as that is a bad tradeoff against free speech and epistemic health.
I suspect everyone thinks reports of assault and harassment should be taken seriously. Does that mean that we believe all women? Wokism might say yes, others might say no. Does that mean that people accused should be confronted with the particular accusations against them, and allowed to present evidence in response? Wokism might say no, others might say yes, good epistemics requires that.
I’m honestly not sure what specifically you mean by “so-called ‘scientific’ racism” or “scourge”, and I’m not sure if that’s a road worth going down.
Again, I’m not asserting any position myself here, just trying to help clarify what I think people mean by “wokism”, in the hopes that the rest of you can have a productive conversation.