Thank you for reading the post and for the helpful comment! I totally agree that “do a lot of good” isn’t a particularly unique or sexy message. As I mentioned in the post, I believe that part of EA’s early appeal was that maximization was so radical and elusive. I do think there is a fairly big difference between the two messages, though, especially when it comes to elite donors. Academics like Anand Giridharadas (in Winners Take All) and Rob Reich (in Just Giving) argue that elite philanthropy is often a well-disguised charade to boost the donor’s own power and status. I’m sympathetic to the argument that sometimes these elites are unaware of the ways in which their giving (e.g. to private schools, religious institutions, the arts, etc.) increases existing inequalities. However, it’s also easy to argue that some of these wealthy individuals are not doing “a lot of good”. I don’t think “do a lot of good” is a magic bullet that will fix billionaire philanthropy, but I do think it might make it easier to convince elites that they need to change their philanthropy (and other actions...) than the rhetoric of maximization does.
Great point about 80k Hours. It’s certainly interesting to take a macro-level view of what a large group or a whole society could do with respect to cause areas. It reminds me a little of trying to reason like a Rawlsian about ideal theory. Our decisions as individuals become incredibly contingent on what other members of the group decide, especially other members with wealth and power. In this ideal world, the “Neglectedness” criterion arguably becomes the most important, which seems to take away from the power of EA (because, as you say, our niche is taking on the big things, not just the things that nobody else is doing). I think we’re in a time when there are tons of Important and Tractable causes which aren’t altogether neglected but still need more resources. All of this is just a roundabout way of saying I’m skeptical of both the usefulness and the feasibility of trying to take the macro perspective and maximize a group’s impact.
I’m particularly sympathetic to your point about elitism, and part of my motivation for this post was to try to temper that problem within EA. In my conversations with friends about EA, it’s never the idea of maximization that changes their worldview. Instead, they’re usually more interested in the argument that there are big ways to make an impact which elite philanthropy largely ignores. If you’re talking with somebody who dedicates her free time to volunteering for an org like Make-A-Wish, maximization is a non-starter, but sowing the seed that there might be additional avenues for impact will at least allow for some discourse.
On your last point, I think that the “everybody just does a little good” world is already the world we live in! I agree that there is a serious need for groups of people to tackle the big things, but in an ideal world, this is what governments do. As many nonprofits say of themselves, EA’s main goal should be to render itself unnecessary (because institutions are tackling high-ITN causes efficiently).