I see a tension at the heart of your piece. On the one hand, you say you “want EA to change the attitudes of society as a whole”. But you seem willing to backpedal on the goal of changing societal attitudes as soon as you encounter any resistance. Yes, society as a whole believes that “it’s the thought that counts” and that you should “do something you’re passionate about”. These are the sort of attitudes we’re trying to change. If EA is watered down to the point where everyone can agree with it, it won’t mean anything anymore. (Indeed, we’ve been criticized from the other direction for being too watered down already: one economist called EA “vacuous”.)
You criticize EAs for believing that “their movement is correct and more important than all others”. But the implicit premise of your post is that EA should seek to improve its image in order to increase its influence and membership, almost necessarily at the expense of other movements. The implication being that EA is more correct and/or important than those other movements.
I’m skeptical of your implicit premise. I think that EA should play to its strengths and not try to be everything to everyone. We’re passionate about doing the most good, not passionate about problems that affect ourselves and our friends. We focus on evidence and reason, which sometimes comes across as cold-hearted (arguably due to cultural conditioning). Many of us are privileged and see EA as a way to give back. If someone doesn’t like any of this, they are more than welcome to borrow from EA as a philosophy—but EA as a movement may not be a good fit for them, and we should be honest and upfront about that.
I’m tempted to say this post is itself a symptom of the “EA is the last social movement we’ll ever need” type arrogance it criticizes :P
My vision of EA is not EA being the last social movement we ever need. It’s a vision of getting a bunch of smart, wealthy, influential critical thinkers in the same room together, trying to figure out what the world’s most important and neglected problems are and how they can most effectively be solved. If a cause isn’t a candidate for one of the world’s most important and neglected problems, we should leave it to some other movement, and I don’t think we need to apologize for that.
I’m not even sure I want EA to add more smart, wealthy, influential critical thinkers. EA has already captured a lot of mindshare and I’m in favor of intellectual diversity broadly construed. Additionally, EAs have (correctly in my view) pointed out that expanding the EA movement is not an evidence-based intervention… for example, for all we know most people who have recently pledged to donate 10% will burn out in a few years.
Final note: I’m doubtful that we can successfully split the “doing good” and “being a good person” semantic hair. And even if it were possible, I’m doubtful that it’s a good idea. As you suggest, I think we should set up incentives so the way to gain status in the EA movement is to do a lot of good, not merely to appear to be a good person.
I’m not even sure I want EA to add more smart, wealthy, influential critical thinkers. EA has already captured a lot of mindshare and I’m in favor of intellectual diversity broadly construed.
I was quite surprised by this, so I was wondering what your reference class was? It seems to me that—while much bigger than a few years ago—effective altruism is still an extremely small proportion of society, and a larger but still very small part of influential thinkers.
In the early days of the EA movement, when it was uncertain whether expansion was even possible, I can see “try and expand like crazy, see what happens” as being a sensible option. But now we know that expansion is very possible and there’s a large population of EA-amenable people out there. The benefits of reaching these people a bit sooner than we would otherwise seem marginal to me. So at this point I think we can afford to move the focus off of movement growth for a bit and think more deeply about exactly what we are trying to achieve. Brain dump incoming...
Does hearing about EA in its current form actually seem to increase folks’ effective altruist output? (Why are so many EAs on the survey not donating anything?)
Claiming to be “effective altruists” amounts to a sort of holier-than-thou claim. Mild unethical behavior from prominent EAs that would be basically fine in any other context could be easy tabloid fodder for journalists due to the target EA has painted on its own back. There have already been a few controversies along these lines (not gonna link to them). EA’s holier-than-thou attitude leads to unfavorable contrasts with giving to help family members etc.
EA has neglectedness as one of its areas of focus. But if a cause is neglected, it’s neglected for a reason. Sometimes it’s neglected because it’s a bad cause. Other times it’s neglected because it sounds like a bad cause but there are complicated reasons why it might actually be a good cause. EA’s failure to communicate neglectedness well leads to people saying things like “Worrying about sentient AI as the ice caps melt is like standing on the tracks as the train rushes in, worrying about being hit by lightning”. Which is just a terrible misunderstanding—EAs mostly think that global warming is a problem that needs to be addressed, but that AI risk is receiving much less funding and might be a better use of funds on the margin. The problem is that by branding itself as “effective altruism”, EA is implicitly claiming that any causes EA isn’t working on are ineffective ones. Which gets interpreted as a holier-than-thou attitude and riles anyone who’s working on a different cause (even if we actually agree it’s a pretty good one).
Some EAs cheered for the Dylan Matthews Vox article that prompted the tweet I linked to above, presumably because they agree with Matthews. But finding a reporter to broadcast your criticisms of the EA movement to a huge readership in order to gain leverage and give your cause more movement mindshare is a terrible defect/defect equilibrium. This is a similar conflict to the one at the heart of Tom_Davidson’s piece. EA is always going to have problems with journalists due to the neglectedness point I made above. Doing good and looking good are not the same thing and it’s not clear how to manage this tradeoff. It’s not clear how best to spend our “weirdness points”.
In line with this, you can imagine an alternate branding for EA that focuses on the weakest links in our ideological platform… for example, the “neglected causes movement” (“Neglected Causes Global”?), or the “thoughtful discussion movement”/“incremental political experimentation movement” if we decided to have a systemic change focus. (Willingness is not the limiting factor on doing effective systemic change! Unlike philanthropy, many people are extremely interested in doing systemic change. The limiting factor is people forming evidence filter bubbles and working at cross purposes to one another. As far as I can tell EA as a movement is not significantly good at avoiding filter bubbles. “Donate 10% of your time/energy towards systemic change” fails to solve the systemic problems with systemic change.) As far as I can tell, none of these alternate brandings have been explored. There hasn’t been any discussion of whether EA is better as a single EA tentpole or as multiple tentpoles, with an annual conference for neglected causes, an annual conference for avoiding filter bubbles, etc. etc.
There’s no procedure in place for resolving large-scale disagreement within the EA movement. EA is currently a “do-ocracy”, which leads to the unilateralist’s curse and other problems. In the limit of growth, we risk resolving our disagreements the same way society at large does: with shouting and/or fists. Ideally there would be some kind of group rationality best practices baked into the EA movement. (These could even be a core branding focus.) The most important disagreement to resolve in a cooperative way may be how to spend our weirdness points.
EA is trying to be a “big tent”, but its members don’t realize how difficult this is. The most diverse groups are the ones that are able to engineer their diversity: universities and corporations can hold up a degree/job carrot and choose people in order to get a representative cross section of the population. In the absence of such engineering, groups tend to get less diverse over time. Even Occupy Wall Street was disproportionately white. That’s why people who say “I like the idea of altruistic effectiveness, but not the EA movement’s implementation” don’t hang around—it’s stressful to have persistent important disagreements with everyone who’s around you. (EA’s definitional confusion might also eventually result in EA becoming a pernicious meme that’s defined itself to be great. I’m somewhat in favor of trying to make sure we really have identified the world’s most high impact causes before doing further expansion. People like Paul Christiano have argued, convincingly IMO, that there are likely to be high-impact causes still not yet on the EA movement’s radar. And focusing on funneling people towards a particular cause also helps address “meta trap” issues.) EA is trying to appeal to people of all ages, races, genders, political orientations, religions, etc. with very little capability for diversity engineering. It’s difficult to imagine any other group in society that’s being this ambitious.
There’s no procedure in place for resolving large-scale disagreement within the EA movement. EA is currently a “do-ocracy”, which leads to the unilateralist’s curse and other problems. In the limit of growth, we risk resolving our disagreements the same way society at large does: with shouting and/or fists. Ideally there would be some kind of group rationality best practices baked into the EA movement. (These could even be a core branding focus.)
This seems particularly important to me. I’d love to hear more in-depth thoughts if you have any. Even if not, I think it might be worth a top level post to spur discussion.
One category of solutions is the various voting and governing systems. Score voting seems pretty solid based on my limited reading. There are also more exotic proposals like futarchy/prediction markets and eigendemocracy. The downside of systems like this is once you give people a way to keep score, they sometimes become focused on increasing their score (through forming coalitions, etc.) at the expense of figuring out what’s true.
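For concreteness, here’s a minimal sketch of how score voting works (the function name and ballot format are illustrative assumptions, not any existing EA tooling): each voter assigns every option a score within a fixed range, and the option with the highest total (equivalently, mean) score wins.

```python
def score_voting_winner(ballots):
    """Return the winning option under score voting.

    ballots: a list of dicts mapping option -> score (e.g. 0-5).
    The winner is the option with the highest total score.
    """
    totals = {}
    for ballot in ballots:
        for option, score in ballot.items():
            totals[option] = totals.get(option, 0) + score
    return max(totals, key=totals.get)

# Three voters scoring three options on a 0-5 scale:
ballots = [
    {"A": 5, "B": 3, "C": 0},
    {"A": 3, "B": 4, "C": 1},
    {"A": 4, "B": 4, "C": 2},
]
print(score_voting_winner(ballots))  # prints "A" (total 12 vs 11 for B)
```

Note how the incentive problem mentioned above shows up even here: once totals are visible, voters can strategically max-score or zero-score options to move the outcome, rather than reporting honest scores.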
There are also “softer” solutions like trying to spread beneficial social norms. Maybe worrying about this is overkill in a group made up of do-gooders anyway, as long as moral trade is emphasized enough that people with very different value systems can still find ways to cooperate.
You’re more than welcome to think things over and write a top level post.
Why are so many EAs on the survey not donating anything?
This I can answer at least. The vast majority of the EAs who were down as giving $0 in the survey matched at least one (and often more) of these criteria: (i) full-time student, (ii) donated a large amount in the past already (even if not in that particular year), (iii) pledged to give a substantial amount. The same applied for EAs merely giving ‘low’ amounts, e.g. <$500. I give the figures in a comment somewhere on an earlier thread where this was raised (probably the survey thread).
Some EAs cheered for the Dylan Matthews Vox article that prompted the tweet I linked to above, presumably because they agree with Matthews. But finding a reporter to broadcast your criticisms of the EA movement to a huge readership in order to gain leverage and give your cause more movement mindshare is a terrible defect/defect equilibrium.
Matthews is an EA, and identifies as one in that piece. This wasn’t about finding someone to broadcast things, this was someone within the movement trying to shape it.
(I do agree with you that we shouldn’t be trying to enlist the greater public to take sides in internal disagreements over cause prioritization within EA.)
Thanks for the reply! I would like to pick you up on a few points though...
“On the one hand, you say you “want EA to change the attitudes of society as a whole”. But you seem willing to backpedal on the goal of changing societal attitudes as soon as you encounter any resistance… If EA is watered down to the point where everyone can agree with it, it won’t mean anything anymore.”
I think all the changes I suggested can be made without the movement losing the things that currently makes it distinctive and challenging in a good way. Which of my suggested changes do you think are in danger of watering EA down too much? Do you take issue with the other changes I’ve suggested?
“Yes, society as a whole believes that “it’s the thought that counts” and that you should “do something you’re passionate about”. These are the sort of attitudes we’re trying to change.”
I completely agree we should try to change people’s attitudes about both these things. I argued that we should say
“An action that makes a difference is much better than one that doesn’t, regardless of intention” rather than
“An agent that makes a difference is much better than one who doesn’t” because the latter turns people against the movement and the former says everything we need to say.
Again, I’m interested to know which of my suggested changes you think would stop the movement from challenging society in the ways it should?
“I think that EA should play to its strengths and not try to be everything to everyone. We’re passionate about doing the most good, not passionate about problems that affect ourselves and our friends. We focus on evidence and reason, which sometimes comes across as cold-hearted (arguably due to cultural conditioning).”
Again, I completely agree. The things you mention are essential parts of the movement. In my post I was trying to suggest ways in which we can minimize the negative image that is easily associated with these things.
“But the implicit premise of your post is that EA should seek to improve its image in order to increase its influence and membership, almost necessarily at the expense of other movements… I’m skeptical of your implicit premise.”
You’re right, although it’s not implicit—I say explicitly that I want EA to change the attitudes of society as a whole. This is because I think EA is a great movement and, therefore, that if it has more appeal and influence it will be able to accomplish more. FWIW I don’t think it’s the last social movement we’ll ever need.
“It’s a vision of getting a bunch of smart, wealthy, influential critical thinkers in the same room together, trying to figure out what the world’s most important & neglected problems are and how they can most effectively be solved.”
I think comments like these make the movement seem inaccessible to outsiders who aren’t rich or privileged. It seems like we disagree over whether that’s a problem or not though.
Overall it seems like you think that paying attention to our image in the ways I suggest would harm the movement by making it less distinctive. But I don’t know why you think the things I suggest would do that. I’m also interested to hear more about why you don’t think getting more members and being more influential would be a good thing.
I guess I’m not totally sure what concrete suggestions you’re trying to make. You do imply that we should stop saying things like “It’s better to become a banker and give away 10% of your income than to become a social worker” and stop holding EAs who earn and donate lots of money in high regard. So I guess I’ll run with that.
High-earning jobs are often unpleasant and/or difficult to obtain. Not everyone is willing to get one or capable of getting one. Insofar as we de-emphasize earning to give, we are more appealing to people who can’t get one or don’t want one. But we’ll also be encouraging fewer people to jump through the hoops necessary to achieve a high-earning job, meaning more self-proclaimed “EAs” will be in “do what you’re passionate about” type jobs like going to grad school for pure math or trying to become a professional musician. Should Matt Wage have gone on to philosophy academia like his peers or not? You can’t have it both ways.
I don’t think high-earning jobs are the be-all and end-all of EA. I have more respect for people who work for EA organizations, because I expect they’re mostly capable of getting high-paying jobs but they chose to forgo that extra income while working almost as hard. I guess I’m kind of confused about what exactly you are proposing… are we still supposed to evaluate careers based on impact, or not? As long as we evaluate careers based on impact, we’re going to have the problem that highly capable people are able to produce a greater impact. I agree this is a problem, but I doubt there is an easy solution. Insofar as your post presents a solution, it seems like it trades off almost directly against encouraging people to pursue high-impact careers. We might be able to soften the blow a little bit but the fundamental problem still remains.
Just in terms of the “wealthy & privileged” image problem, I guess maybe making workers at highly effective nonprofits more the stars of the movement could help some? (And also help compensate for their forgone income.)
I think comments like these make the movement seem inaccessible to outsiders who aren’t rich or privileged. It seems like we disagree over whether that’s a problem or not though.
EA has its roots in philanthropy. As you say, philanthropy (e.g. in the form of giving 10% of your income) is fundamentally more accessible to rich people. It’s not clear to me that a campaign to make philanthropy seem more accessible to people who are just scraping by is ever going to be successful on a large scale. No matter what you do you are going to risk coming across as demeaning and/or condescending.
I discuss more about why I’m skeptical of movement growth in this comment. Note that some of the less philanthropy-focused brandings of the EA movement that I suggest could be a good way to include people who don’t have high-paying jobs.
I think we’re talking past each other a little bit. I’m all for EtG and didn’t mean to suggest otherwise. I think we should absolutely keep evaluating career impacts; Matt Wage made the right choice. When I said we should stop glorifying high earners I was referring to the way that they’re hero-worshipped, not our recommending EtG as a career path.
Most of my suggested changes are about the way we relate to other EAs and to outsiders, though I had a couple of more concrete suggestions about the pledge and the careers advice. I do take your point that glorifying high earners might be consequentially beneficial though: there is a bit of a trade-off here.
As long as we evaluate careers based on impact, we’re going to have the problem that highly capable people are able to produce a greater impact… Insofar as your post presents a solution, it seems like it trades off almost directly against encouraging people to pursue high-impact careers.
I hope my suggestions are compatible with encouraging people to pursue high-impact careers, but would reduce the image problem currently associated with it. One hope is that by distinguishing between doing good and being good we can encourage everyone to do good by high earning (or whatever) without alienating those who can’t by implying they are less virtuous, or less good people. We could also try and make the movement more inclusive to those who are less rich in other ways: e.g. campaigning for EA causes is more accessible to all.
I guess maybe making workers at highly effective nonprofits more the stars of the movement could help some?
When I said we should stop glorifying high earners I was referring to the way that they’re hero-worshipped
Hm, maybe I just haven’t seen much of this?
Regarding the pledge, I’m inclined to agree with this quote:
I recently read a critique of the Giving What We Can pledge as classist. The GWWC pledge requires everyone with an income to donate 10% of their income. This disproportionately affects poor people: if you made $20,000 last year, giving 10% means potentially going hungry; if you made a million dollars last year, giving 10% means that instead of a yacht you will have to have a slightly smaller yacht. This is a true critique.
Of course, there’s another pledge that doesn’t have this problem. It was invented by the world’s most famous effective altruist. It even comes with a calculator. And I bet you half the people reading this haven’t heard of it.
The problem is that the Giving What We Can pledge is easy to remember. “Pledge to give 10% of your income” is a slogan. You can write it on a placard. “Pledge to give 1% of your before-tax income, unless charitable donations aren’t tax-deductible in your country in which case give 1% of your after-tax income, as long as you make less than $100,000/year adjusted for purchasing power parity, and after that gradually increase the amount you donate in accordance with these guidelines” is, um, not.
So, I’m inclined to think that preserving the simplicity of the current GWWC pledge is valuable. If someone doesn’t feel like they’re in a financial position to make that pledge, there’s always the Life You Can Save pledge, or they can skip pledging altogether. Also, note that religions have been asking their members for 10% of their income for thousands of years, and for many hundreds of those years their members were much poorer than people typically are today.
I don’t think the existence of another pledge does much to negate the harm done by the GWWC pledge being classist.
I agree there’s value in simplicity. But we already have an exception to the rule: students give only 1%. There are two points here.
Firstly, it doesn’t seem to harm our placard-credentials. We still advertise as “give 10%”, but on further investigation there’s a sensible exception. I think something similar could accommodate low-earners.
Secondly, even if you want to keep it at one exception, students are in a much better position to give than many adults. So we should change the exception to a financial one.
Do you agree that, all else being equal, the suggestions I make about how we relate to each other and to outsiders are good?
I see a tension at the heart of your piece. On the one hand, you say you “want EA to change the attitudes of society as a whole”. But you seem willing to backpedal on the goal of changing societal attitudes as soon as you encounter any resistance. Yes, society as a whole believes that “it’s the thought that counts” and that you should “do something you’re passionate about”. These are the sort of attitudes we’re trying to change. If EA is watered down to the point where everyone can agree with it, it won’t mean anything anymore. (Indeed, we’ve been criticized from the other direction for being too watered down already: one economist called EA “vacuous”.)
You criticize EAs for believing that “their movement is correct and more important than all others”. But the implicit premise of your post is that EA should seek to improve its image in order to increase its influence and membership, almost necessarily at the expense of other movements. The implication being that EA is more correct and/or important than those other movements.
I’m skeptical of your implicit premise. I think that EA should play to its strengths and not try to be everything to everyone. We’re passionate about doing the most good, not passionate about problems that affect ourselves and our friends. We focus on evidence and reason, which sometimes comes across as cold-hearted (arguably due to cultural conditioning). Many of us are privileged and see EA as a way to give back. If someone doesn’t like any of this, they are more than welcome to borrow from EA as a philosophy—but EA as a movement may not be a good fit for them, and we should be honest and upfront about that.
I’m tempted to say this post is itself a symptom of the “EA is the last social movement we’ll ever need” type arrogance it criticizes :P
My vision of EA is not EA being the last social movement we ever need. It’s a vision of getting a bunch of smart, wealthy, influential critical thinkers in the same room together, trying to figure out what the world’s most important and neglected problems are and how they can most effectively be solved. If it’s not a candidate for one of the world’s most important and neglected problems, we should leave it to some other movement, and I don’t think we need to apologize for that.
I’m not even sure I want EA to add more smart, wealthy, influential critical thinkers. EA has already captured a lot of mindshare and I’m in favor of intellectual diversity broadly construed. Additionally, EAs have (correctly in my view) pointed out that expanding the EA movement is not an evidence-based intervention… for example, for all we know most people who have recently pledged to donate 10% will burn out in a few years.
Final note: I’m doubtful that we can successfully split the “doing good” and “being a good person” semantic hair. And even if it was possible, I’m doubtful that it’s a good idea. As you suggest, I think we should set up incentives so the way to gain status in the EA movement is to do a lot of good, not have the appearance of being a good person.
I was quite surprised by this, so I was wondering what your reference class was? It seems to me that—while much bigger than a few years ago—effective altruism is still an extremely small proportion of society, and a larger but still very small part of influential thinkers.
In the early days of the EA movement, when it was uncertain whether expansion was even possible, I can see “try and expand like crazy, see what happens” as being a sensible option. But now we know that expansion is very possible and there’s a large population of EA-amenable people out there. The benefits of reaching these people a bit sooner than we would otherwise seem marginal to me. So at this point I think we can afford to move the focus off of movement growth for a bit and think more deeply about exactly what we are trying to achieve. Brain dump incoming...
Does hearing about EA in its current form actually seem to increase folks’ effective altruist output? (Why are so many EAs on the survey not donating anything?)
Claiming to be “effective altruists” amounts to a sort of holier-than-thou claim. Mild unethical behavior from prominent EAs that would be basically fine in any other context could be easy tabloid fodder for journalists due to the target EA has painted on its own back. There have already been a few controversies along these lines (not gonna link to them). EA’s holier-than-thou attitude leads to unfavorable contrasts with giving to help family members etc.
EA has neglectedness as one of its areas of focus. But if a cause is neglected, it’s neglected for a reason. Sometimes it’s neglected because it’s a bad cause. Other times it’s neglected because it sounds like a bad cause but there are complicated reasons why it might actually be a good cause. EA’s failure to communicate neglectedness well leads to people saying things like “Worrying about sentient AI as the ice caps melt is like standing on the tracks as the train rushes in, worrying about being hit by lightning”. Which is just a terrible misunderstanding—EAs mostly think that global warming is a problem that needs to be addressed, but that AI risk is receiving much less funding and might be a better use of funds on the margin. The problem is that by branding itself as “effective altruism”, EA is implicitly claiming that any causes EA isn’t working on are ineffective ones. Which gets interpreted as a holier than thou attitude and riles anyone who’s working on a different cause (even if we actually agree it’s a pretty good one).
Some EAs cheered for the Dylan Matthews Vox article that prompted the tweet I linked to above, presumably because they agree with Matthews. But finding a reporter to broadcast your criticisms of the EA movement to a huge readership in order to gain leverage and give your cause more movement mindshare is a terrible defect/defect equilibrium. This is a similar conflict to the one at the heart of Tom_Davidson’s piece. EA is always going to have problems with journalists due to the neglectedness point I made above. Doing good and looking good are not the same thing and it’s not clear how to manage this tradeoff. It’s not clear how best to spend our “weirdness points”.
In line with this, you can imagine an alternate branding for EA that focuses on the weakest links in our ideological platform… for example, the “neglected causes movement” (“Neglected Causes Global”?), or the “thoughtful discussion movement”/”incremental political experimentation movement” if we decided to have a systemic change focus. (Willingness is not the limiting factor on doing effective systemic change! Unlike philanthropy, many people are extremely interested in doing systemic change. The limiting factor is people forming evidence filter bubbles and working at cross purposes to one another. As far as I can tell EA as a movement is not significantly good at avoiding filter bubbles. “Donate 10% of your time/energy towards systemic change” fails to solve the systemic problems with systemic change.) As far as I can tell, none of these alternate brandings have been explored. There hasn’t been any discussion of whether EA is better as a single EA tentpole or as multiple tentpoles, with an annual conference for neglected causes, an annual conference for avoiding filter bubbles, etc. etc.
There’s no procedure in place for resolving large-scale disagreement within the EA movement. EA is currently a “do-ocracy”, which leads to the unilateralist’s curse and other problems. In the limit of growth, we risk resolving our disagreements the same way society at large does: with shouting and/or fists. Ideally there would be some kind of group rationality best practices baked in to the EA movement. (These could even be a core branding focus.) The most important disagreement to resolve in a cooperative way may be how to spend our weirdness points.
EA is trying to be a “big tent”, but they don’t realize how difficult this is. The most diverse groups are the ones that are able to engineer their diversity: universities and corporations can hold up a degree/job carrot and choose people in order to get a representative cross section of the population. In the absence of such engineering, groups tend to get less diverse over time. Even Occupy Wall Street was disproportionately white. That’s why people who say “I like the idea of altruistic effectiveness, but not the EA movement’s implementation” don’t hang around—it’s stressful to have persistent important disagreements with everyone who’s around you. (EA’s definitional confusion might also eventually result in EA becoming a pernicious meme that’s defined itself to be great. I’m somewhat in favor of trying to make sure we really have identified the world’s most high impact causes before doing further expansion. People like Paul Cristiano have argued, convincingly IMO, that there are likely to be high-impact causes still not yet on EA movement radar. And focusing on funneling people towards a particular cause also helps address “meta trap” issues.) EA is trying to appeal to people of all ages, races, genders, political orientations, religions, etc. with very little capability for diversity engineering. It’s difficult to imagine any other group in society that’s being this ambitious.
Thanks.
This seems particularly important to me. I’d love to hear more in depth thoughts of you have any. Even if not, I think it might be worth a top level post to spur discussion.
One category of solutions is the various voting and governing systems. Score voting seems pretty solid based on my limited reading. There are also more exotic proposals like futarchy/prediction markets and eigendemocracy. The downside of systems like this is once you give people a way to keep score, they sometimes become focused on increasing their score (through forming coalitions, etc.) at the expense of figuring out what’s true.
There are also “softer” solutions like trying to spread beneficial social norms. Maybe worrying about this is overkill in a group made up of do-gooders anyway, as long as moral trade is emphasized enough that people with very different value systems can still find ways to cooperate.
You’re more than welcome to think things over and write a top level post.
This I can answer, at least. The vast majority of the EAs recorded as giving $0 in the survey matched at least one (and often more) of these criteria: i) full-time student, ii) had already donated a large amount in the past (even if not in that particular year), iii) had pledged to give a substantial amount. The same applied to EAs giving merely ‘low’ amounts, e.g. <$500. I give the figures in a comment somewhere on an earlier thread where this was raised (probably the survey thread).
Matthews is an EA, and identifies as one in that piece. This wasn’t about finding someone to broadcast things, this was someone within the movement trying to shape it.
(I do agree with you that we shouldn’t be trying to enlist the greater public to take sides in internal disagreements over cause prioritization within EA.)
Thanks for the reply! I would like to pick you up on a few points though...
I think all the changes I suggested can be made without the movement losing the things that currently makes it distinctive and challenging in a good way. Which of my suggested changes do you think are in danger of watering EA down too much? Do you take issue with the other changes I’ve suggested?
I completely agree we should try to change people’s attitudes about both these things. I argued that we should say “An action that makes a difference is much better than one that doesn’t, regardless of intention” rather than “An agent that makes a difference is much better than one who doesn’t”, because the latter turns people against the movement and the former says everything we need to say. Again, I’m interested to know which of my suggested changes you think would stop the movement from challenging society in the ways it should?
Again, I completely agree. The things you mention are essential parts of the movement. In my post I was trying to suggest ways in which we can minimize the negative image that is easily associated with these things.
You’re right, although it’s not implicit—I say explicitly that I want EA to change the attitudes of society as a whole. This is because I think EA is a great movement and, therefore, that if it has more appeal and influence it will be able to accomplish more. FWIW I don’t think it’s the last social movement we’ll ever need.
I think comments like these make the movement seem inaccessible to outsiders who aren’t rich or privileged. It seems like we disagree over whether that’s a problem or not though.
Overall it seems like you think that paying attention to our image in the ways I suggest would harm the movement by making it less distinctive. But I don’t know why you think the things I suggest would do that. I’m also interested to hear more about why you don’t think getting more members and being more influential would be a good thing.
I guess I’m not totally sure what concrete suggestions you’re trying to make. You do imply that we should stop saying things like “It’s better to become a banker and give away 10% of your income than to become a social worker” and stop holding EAs who earn and donate lots of money in high regard. So I guess I’ll run with that.
High-earning jobs are often unpleasant and/or difficult to obtain. Not everyone is willing or able to get one. Insofar as we de-emphasize earning to give, we are more appealing to people who can’t get such a job or don’t want one. But we’ll also be encouraging fewer people to jump through the hoops necessary to achieve a high-earning job, meaning more self-proclaimed “EAs” will be in “do what you’re passionate about” type jobs, like going to grad school for pure math or trying to become a professional musician. Should Matt Wage have gone on to philosophy academia like his peers or not? You can’t have it both ways.
I don’t think high-earning jobs are the be all and end all of EA. I have more respect for people who work for EA organizations, because I expect they’re mostly capable of getting high-paying jobs but they chose to forgo that extra income while working almost as hard. I guess I’m kind of confused about what exactly you are proposing… are we still supposed to evaluate careers based on impact, or not? As long as we evaluate careers based on impact, we’re going to have the problem that highly capable people are able to produce a greater impact. I agree this is a problem, but I doubt there is an easy solution. Insofar as your post presents a solution, it seems like it trades off almost directly against encouraging people to pursue high-impact careers. We might be able to soften the blow a little bit but the fundamental problem still remains.
Just in terms of the “wealthy & privileged” image problem, I guess maybe making workers at highly effective nonprofits more the stars of the movement could help some? (And also help compensate for their forgone income.)
EA has its roots in philanthropy. As you say, philanthropy (e.g. in the form of giving 10% of your income) is fundamentally more accessible to rich people. It’s not clear to me that a campaign to make philanthropy seem more accessible to people who are just scraping by is ever going to be successful on a large scale. No matter what you do you are going to risk coming across as demeaning and/or condescending.
I discuss more about why I’m skeptical of movement growth in this comment. Note that some of the less philanthropy-focused brandings of the EA movement that I suggest could be a good way to include people who don’t have high-paying jobs.
Thanks a lot, this cleared up a lot of things.
I think we’re talking past each other a little bit. I’m all for EtG and didn’t mean to suggest otherwise. I think we should absolutely keep evaluating career impacts; Matt Wage made the right choice. When I said we should stop glorifying high earners I was referring to the way that they’re hero-worshipped, not our recommending EtG as a career path.
Most of my suggested changes are about the way we relate to other EAs and to outsiders, though I had a couple of more concrete suggestions about the pledge and the careers advice. I do take your point that glorifying high earners might be consequentially beneficial though: there is a bit of a trade-off here.
I hope my suggestions are compatible with encouraging people to pursue high-impact careers, but would reduce the image problem currently associated with it. One hope is that by distinguishing between doing good and being good, we can encourage everyone to do good by high earning (or whatever) without implying that those who can’t are less virtuous, or less good people. We could also try to make the movement more inclusive to those who are less rich in other ways: e.g. campaigning for EA causes is accessible to everyone.
This seems like a good idea.
Good to hear we’re mostly on the same page.
Hm, maybe I just haven’t seen much of this?
Regarding the pledge, I’m inclined to agree with this quote:
So, I’m inclined to think that preserving the simplicity of the current GWWC pledge is valuable. If someone doesn’t feel like they’re in a financial position to make that pledge, there’s always the Life You Can Save pledge, or they can skip pledging altogether. Also, note that religions have been asking their members for 10% of their income for thousands of years, during most of which people were much poorer than they typically are today.
I don’t think the existence of another pledge does much to negate the harm done by the GWWC pledge being classist.
I agree there’s value in simplicity. But we already have an exception to the rule: students give only 1%. There are two points here. First, the exception doesn’t seem to harm our placard credentials. We still advertise as “give 10%”, but on further investigation there’s a sensible exception. I think something similar could accommodate low earners. Second, even if you want to keep it to one exception, students are in a much better position to give than many adults. So we should change the exception to a financial one.
Do you agree that, all things equal, the suggestions I make about how to relate to each other and other EAs are good?