Gleb Tsipursky has also repeatedly said he will leave the EA movement.
This is simply false. See what I actually said here
Let me first clarify that I see the goal of doing the most good as my end goal, and YMMV—no judgment on anyone who cares more about truth than doing good. This is just my value set.
Within that value set, using "insufficient" means to get to EA ends is just as bad as using "excessive" means. In this case, being "too honest" is just as bad as "not being honest enough." The correct course of action is to calibrate one's level of honesty to maximize positive long-term impact toward doing the most good.
Now, the above refers to the ideal-type scenario. IRL, different people are differently calibrated. Some tend too much toward exaggerating; others tend too much toward humility and understating the case. Either way, it's a mistake. So one should learn where one's bias lies and push against it.
Sarah’s post highlights some of the essential tensions at the heart of Effective Altruism.
Do we care about “doing the most good that we can” or “being as transparent and honest as we can”? These are two different value sets. They will sometimes overlap, and in other cases will not.
And please don’t say that “we do the most good that we can by being as transparent and honest as we can” or that “being as transparent and honest as we can” is best in the long term. Just don’t. You’re simply lying to yourself and to everyone else if you say that. If you can’t imagine a scenario where “doing the most good that we can” and “being as transparent and honest as we can” are opposed, you’ve fallen into a failure mode of flinching away from the truth.
So when push comes to shove, which one do we prioritize? When we have to throw the switch and have the trolley crush either “doing the most good” or “being as transparent and honest as we can,” which do we choose?
For a toy example, say you are talking to your billionaire uncle on his deathbed and trying to convince him to leave money to AMF instead of his current favorite charity, the local art museum. You know he would respond better if you exaggerate the impact of AMF. Would you do so, whether lying by omission or in any other way, in order to get much more money for AMF, given that no one else would find out about this situation? What about if you know that other family members are standing in the wings and ready to use all sorts of lies to advocate for their favorite charities?
If you do not lie, that’s fine, but don’t pretend that you care about doing the most good, please. Just don’t. You care about being as transparent and honest as possible over doing the most good.
If you do lie to your uncle, then you do care about doing the most good. However, you should consider at what price point you will not lie—at this point, we’re just haggling.
The people quoted in Sarah’s post (myself included) all highlight how doing the most good sometimes involves not being as transparent and honest as we can. Different people have different price points, that’s all. We’re all willing to bite the bullet and sometimes send that trolley over transparency and honesty, whether by questioning the value of public criticism (Ben), appealing to emotions (Rob), or using intuition as evidence (Jacy), for the sake of what we believe is the most good.
As a movement, EA has a big problem with believing that ends never justify the means. Yes, sometimes ends do justify the means, at least if we care about doing the most good. We can debate whether we are mistaken about the ends not justifying the means, but using insufficient means to accomplish the ends is just as bad as using excessive means to get to the ends. If we are truly serious about doing as much good as possible, we should let our end goal be the North Star and work backward from there, rather than hobbling ourselves with preconceived notions of “intellectual rigor” at the cost of doing the most good.
We have a number of collaborative venues, such as a Facebook group, blog, email lists, etc. for people who get involved.
Yup, we’re focusing on a core of people who are upset about lies and deceptions in the US election and the Brexit campaign, and aiming to provide them with means to address these deceptions in an effective manner. That’s the goal!
Broad social movement. We’re aiming to focus on social media organizing at first, and then spread to local grassroots organizing later. There will be a lot of marketing and PR associated with it as well.
Well, ok, are you really going to make this semantic argument with me? Trump is widely accepted by the Republican party as its leader. I’ll be happy to agree on using the term “Republican” instead of “conservative” to address your concerns.
I’ve made a strong and public decision to orient much more toward making politics less irrational. For me, this means not orienting toward party politics, but addressing the problems in political discourse and culture. It’s a bipartisan/nonpartisan stance.
You are mistaken; we have never claimed that we would distance InIn publicly from the EA movement.
We have previously talked about us not focusing on EA in our broad audience writings, and instead talking about effective giving—which is what we’ve been doing. At the same time, we were quite active on the EA Forum, and engaging in a lot of behind-the-scenes, and also public, collaborations to promote effective marketing within the EA sphere.
Now, we are distancing from the EA movement as a whole.
FYI, we decided to distance InIn publicly from the EA movement for the foreseeable future.
We will only reference effective giving and individual orgs that want to be promoted, as evidenced by their willingness to provide InIn with stats on how many people we send to their websites, and similar forms of collaboration (yes, I’m comfortable using the term “collaboration” for this form of activity). Since GWWC/CEA seem uninterested, we will not mention them in our future content.
Our work of course will continue to be motivated by EA concerns for doing the best things possible to improve the world in a cost-effective way, but we’ll shift our focus from explicitly EA-themed activities to our other area of work—spreading rational thinking and decision-making to reduce existential risk, address fundamental societal problems, and decrease individual suffering. Still, we’ll also continue to engage in collaborations and activities that have proved especially beneficial within the EA-related sphere, such as doing outreach to secular folks and spreading Giving Games, yet that will be a smaller aspect of our activities than in the past.
FYI, we removed references to GWWC and CEA from our documents.
Interesting to see how many downvotes this got. Disappointing that people choose to downvote instead of engaging with the substance of my comments. I would have hoped for better from a rationally-oriented community.
Oh well, I guess it is what it is. I’m taking a break from all this based on my therapist’s recommendation. Good luck!
I think it would be great to set up a formal panel. That way, we can have an actual calm discussion about the topics at hand. Furthermore, we can make sure that all points are thoroughly discussed and there is a clear resolution.
For example, InIn has been accused of astroturfing, etc. However, no one responded to my comments pointing out that astroturfing does not apply to our activities. The same goes for my other points of disagreement with the claims made by the authors of the document expressing concerns: no one has responded to those either. A formal panel would be a good way of making sure there is actually a discourse around these topics and that they can be hashed out.
So far, the impression I and many others are getting is that these accusations are unfair and unjust, and that they paint some of the top-level EA activists in a negative light. These concerns could be addressed in a formal procedure, and I’d be glad to take the InIn situation through one where these things can be hashed out.
This makes sense for spreading the message among EAs, which is why we have the Effective Altruist Accomplishments Facebook group. I’ll have to think further about the most effective ways of spreading this message more broadly, as I’m not in a good mental space to think about it right now.
1) I would prefer to hear Jeff’s answer to my questions—he’s more than capable of speaking for himself.
2) I will not stoop to engaging with the level of discourse you present in this comment.