Thank you!
FYI anything you write to Teddy could end up in an article. I suggest you read some of his pieces before engaging. I worry this piece makes him sound more EA than he is, although of course he could be pivoting.
We clearly agree on your first point (and sorry, I don’t mean to single your comment out too much as a foil, it just came to mind as a recent example of the discourse).
The second strongest reason is if you think childrearing is actually the most cost-effective thing for you to do on the margin, because of the effects of the children themselves.
I thought about making some back-of-the-envelope EV calculations on this point, but it sort of lives or dies on certain assumptions and I didn’t want to make it just an argument about those. But it’s conceivable to me that, for the median EA, raising a child (and trying to instill EA values) would be the most cost-effective use of that marginal time. That might be crazy, but I’d like to see different people’s numbers on it.
Again, these are all tentative, but I think my main point in this post is that there is something of a collective action problem: it is higher value (and lower cost) if a lot of EAs have kids than it is cost-effective for any individual EA to have children.
I believe Abby’s take on this, but I don’t think it’s a misrepresentation of Caplan’s position (though maybe an unnuanced one), unless we’re really just disagreeing about the meaning of “significant amount.” I would say saving 10% of parenting time is “a significant amount.”
I think that low-hanging fruit, if it is there at all, is probably there for 8-15 year olds, give or take.
Makes sense, glad to clarify
Firstly, I want to address why effective altruism, as I’ve stated elsewhere, “cannot singlehandedly meet the civil purpose of philanthropy.”
I think Nadia is misreading EA as a fundamentally philanthropic movement. EA is about maximizing the amount of good we do. Longtermism is about maximizing the EV of the future. Philanthropy is part of that, but far from the whole picture. Neither has made any claims about fulfilling the civil purpose of philanthropy, which I take to be something like libraries and children’s hospitals. In their more extreme forms, EA and longtermism may claim that, on the margin, they are more important than those things, but not that they serve the same purpose.
I enjoy the piece, but do think it misses the mark in its comments on EA.
I have spent some time in and around Sandusky. I think you might be vastly overselling it in terms of general niceness and amenities.
I don’t get it. Why don’t you try to earn more money if you are going to give it away?
I’m not suggesting he shouldn’t advertise that he will donate profits. I’m suggesting he could do something more lucrative.
I’m really glad OP is excited to help out, but we should encourage them to consider whether they could do more good given their skill set. That is, after all, the point of EA. Many EA orgs need help with brand and aesthetics for example. Maybe their skills would be a good fit.
I don’t think that the EA community profits itself by not including artists and those with skills that aren’t squarely in the conventional Earning to Give purview.
I certainly wouldn’t claim this. Obviously art, in general, is ex ante a very unpromising earning to give path. My suggestion is that we should encourage artists to use their skills in high impact ways.
I don’t buy the premise that this is not high EV through a combination of direct impact and promoting a model that is potentially high EV.
This implies a very weird model. Why would you think this is high EV? Presumably things are neutral to low EV unless proven otherwise via research? Nothing about “a combination of direct impact (??) and promoting a model” innately suggests high EV, which, recall, is a very high bar for career paths.
These areas all seem well identified, but the essential problem is that EA doesn’t have nearly sufficient talent for its top-priority causes as it is.
I think it is very difficult to litigate point three further without putting certain people on trial and getting into their personal details, which I am not interested in doing and don’t think is a good use of the Forum. For what it’s worth, I haven’t seen your Twitter or anything from you.
I should have emphasized more that there are consistent critics of EA who I don’t think are acting in bad faith at all. Stuart Buck seems to have been right early on a number of things, for example.
Your Bayesian argument may apply in some cases but it fails in others (for instance, when X = EAs are eugenicists).
Just apply Bayes’ rule: if P(events of the last week | X) > P(events of the last week | not-X), then you should increase your credence in X upon observing the events of the last week.
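To make the update rule concrete, here is a minimal sketch of that Bayesian reasoning in Python. The function and the numbers are purely illustrative (none appear in the original comments); the point is only that whenever the evidence is more likely under X than under not-X, the posterior exceeds the prior.

```python
def posterior(prior, p_e_given_x, p_e_given_not_x):
    """Bayes' rule: P(X | E) = P(E | X) P(X) / [P(E | X) P(X) + P(E | not-X) P(not-X)]."""
    numerator = p_e_given_x * prior
    return numerator / (numerator + p_e_given_not_x * (1 - prior))

# Hypothetical numbers: evidence twice as likely under X than under not-X.
# A 10% prior then rises to roughly 18%.
updated = posterior(prior=0.10, p_e_given_x=0.2, p_e_given_not_x=0.1)
print(round(updated, 3))  # 0.182
```

The direction of the update depends only on the likelihood ratio, not on the absolute probabilities, which is why the condition in the comment above is stated as a comparison between the two conditional probabilities.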
I also emphasize that there are a few people who I have strong reason to believe are making a “deliberate effort to sow division within the EA movement,” and this was the focus of my comment, publicly evidenced (NB: this is a very small part of my overall evidence) by them “taking glee in this disaster or mocking the appearances and personal writing of FTX/Alameda employees.” I do not think a productive conversation is possible in these cases.
I am very surprised by the warm reception to this post. To my mind, this is exactly the type of rhetoric we should be discouraging on the Forums. It’s insinuating all kinds of scandals
(I am tired of drama, scandals, and PR. I am tired of being in a position where I have to apologize for sexism, racism, and other toxic ideologies within this movement)
without making any specific allegations or points, which becomes somehow acceptable within the emotional frame of “I am TIRED.” Presumably many other people, including those directly impacted by these things, are tired too, and we need to use reason to adjudicate how we should respond.
Phrases like “EA elevates people” are becoming common, but it is very unclear what they mean. Nick Bostrom created groundbreaking philosophical ideas. Will MacAskill has written extremely popular books and built communities and movements. Sam Bankman-Fried became the richest man under 30 in a matter of months. All of these people have influenced and inspired many EAs because of their actions.
Under any reasonable sense of the word, people are elevating themselves. I think EA is incredibly free from ‘cult of personality’ problems—in fact it’s amazing how quickly people will turn against popular EAs. But in any group, some people are going to get status for doing their work well.
Could you (or someone else) actually make the case for “good apologies” (in the sense you outline in this post) that goes beyond PR concerns?
I understand the desire to know what Bostrom really thinks, but the attention on the structural quality of his apology seems completely undue. Presumably none of these elements would reveal more about how Bostrom really thinks than the actual apology does.
In fact, it seems like if our preference is to understand how Bostrom really feels, your “good apology” approach might take us further away from that! Your emphasis is on appearing “sincere and genuine” which again, fair enough for PR concerns, but presumably we are after some sort of larger reconciliation here that necessitates being honest and forthright?
If an apology were terribly written, but was in fact genuine and sincere, that seems preferable? If a good apology is just meant to “sell forgiveness,” what could the point be beyond PR?
My apologies if I am missing something here, but you seem to be writing a guide for some kind of dishonesty? And if you mean it to be about true honesty, I think this scheme really fails.
No, that’s not really what I mean. I mean that I generally doubt these public apologies are able to give people the emotional reconciliation they desire.
They can provide a few things, presumably including PR damage mitigation, a sincere account of their thinking, and perhaps some amount of reconciliation.
My criticism of your post is that it seems intent on optimizing for only one of those—indeed considering it entirely sufficient for a “good apology” without considering how these things trade off, nor considering what we might normatively want an apology to do. In my view, a sincere account of someone’s beliefs is very valuable.
It’s surprising to me that polyamory continues to be such a sacred cow of EA. It’s been highly negative for EA’s public image, and now it seems to be connected to a substantial amount of abuse. There are a number of reasons our priors should suggest that non-monogamous relationships in high-trust, insular communities can easily lead to abuse. It’s always seemed overly optimistic to think EA could avoid these problems. Of course, there have been similar ongoing discussions in the Berkeley Rationalist community for a number of years now.
This seems like one of the most important community issues to reflect on.