Jack Lewars
Executive Director at One for the World; chair of trustees at High Impact Athletes.
Hi guys—Jack here (Executive Director at OFTW). Sabrina makes some great points here. I would add as well that we need to be mindful of how much the average American is giving to effective charities. You’re right, @Larks, that persuading people to give 1% to effective charities would be odd, if the average person would give 3% to them—but, of course, the average American gives ~0% to effective charities! We have introduced some checkout questions to give us an indication of this ‘counterfactual’ argument (‘how much do you already give (if anything) to GiveWell’s charities?’) and also have some students at Yale researching typical giving trends for graduates to give us a proxy control group.
Hi Brian—Jack here (ED at OFTW). Thanks for your thoughtful questions and it looks like Sabrina has answered them really well :-)
Some quick additions:
as Sabrina says, activation is moderate (~2/3). However, our models suggest that we would still provide decent ROI even at 50% activation rates. One advantage that we have is that we process all our donations ourselves and so actually know our activation and retention rates, whereas a lot of pledge orgs have to estimate them. My reading of our data is that most orgs are extremely optimistic in their assumptions around this.
definitely open to this and some individual chapters have done it already. It’s on the roadmap!
this is where processing our donations is so helpful—we see in real time how donors behave, and so know if people stop their donations. We do ask people who cancel if they have changed cause area but we only get the usual response rates to cancellation surveys (~1%)
we do not currently plan to expand our cause areas, for a few reasons. First, and for transparency, GiveWell is one of our main funders and that influences this decision. Second, our founders were very focussed on global health and poverty and so I would want their input before any change. But third, and most importantly—OFTW on average engages a donor for ~10-60 mins before they pledge (and pre-COVID this was sometimes as little as 2 mins when our volunteers were tabling). When you are recruiting people with this level of engagement, message clarity is essential. Using global health and poverty, which is both the most popular EA cause area and the simplest ‘sell’ to someone who isn’t part of EA yet, makes a lot of sense to me in this context. All that being said, this may well evolve over time!
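The activation/ROI point in the first bullet can be illustrated with a toy cohort model (all numbers here are hypothetical, purely for illustration, and are not OFTW's actual figures): at a fixed retention rate, expected donations scale linearly with the activation rate, so dropping from ~67% to 50% activation reduces value by a third rather than eliminating it.

```python
# Hypothetical back-of-envelope model; none of these numbers are OFTW's real figures.

def cohort_value(pledgers, activation_rate, avg_annual_donation, annual_retention, years):
    """Expected total donations from a cohort of pledgers over `years`."""
    active = pledgers * activation_rate  # pledgers who actually start donating
    total = 0.0
    for _ in range(years):
        total += active * avg_annual_donation
        active *= annual_retention  # some donors cancel each year
    return total

# Same cohort at two activation rates: value scales linearly with activation.
high = cohort_value(100, 2 / 3, 300, 0.85, 5)
low = cohort_value(100, 0.50, 300, 0.85, 5)
print(round(high / low, 2))  # → 1.33, i.e. (2/3) / 0.5
```

The point of the sketch is just that activation enters the model as a linear multiplier, so a programme can remain worthwhile at 50% activation provided the per-donor value is high enough.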
Hi Alex—these are very good points and largely correct, I think—thanks for contributing them. I’ve added some thoughts and mitigations below:
Yes, we definitely do anchor around poverty. I think this can be good ‘scaffolding’ to come into the movement; but sometimes it will anchor people there. It is worth noting, though, that global health and poverty is consistently the most popular cause area in the EA survey, so there are clearly other factors anchoring to this cause area—it’s hard to say how much OFTW counterfactually increases this effect (and whether it counterfactually stops people from progressing beyond global health and poverty). In terms of mitigation for competing with GWWC—we are in close touch with them and both sides are working hard to foster collaboration and avoid competition.
On point 2, our experience so far is that OFTW and EA groups actually coexist very well. I think (without any systematic evidence) some of this may be because a lot of EA groups don’t prioritise donations, preferring to focus on things like career advice, and so OFTW chapters can sort of ‘own’ the donation space; sometimes, though, they just find a way to work alongside each other. I’m not sure it follows that we have to ‘compete for altruistically motivated people’ - in fact, I don’t really see any reason why someone couldn’t take the OFTW pledge and then carry on engaging with EA uninterrupted—but I agree that we could compete on this front. A lot seems to depend on OFTW’s approach/message/ask. Maybe a virtue of OFTW is that we really only need people’s attention for a short period to get them to take one action—so we aren’t competing for their sustained attention, in a way that would crowd out EA programming. Indeed, we can actually be a funnel to get them to pay attention to this content—see for example our recent webinar with Toby Ord on x-risk, which attracted ~200 people, many of whom came from OFTW chapters.
Yes, fair. I’d just bear in mind, though, how many EAs were introduced via global health and poverty (again, see the EA survey for how many people came in via poverty-focussed writing from Peter Singer or Will MacAskill) and did ultimately develop/broaden/change their thinking, so again I’m not sure how much counterfactual anchoring there is from OFTW.
I haven’t watched this yet but will shortly—from your brief précis above, this criticism looks like it might apply equally to any pledge/donation org that supports health and poverty causes (GWWC, Founders Pledge, GiveWell, EA Funds etc.).
This is absolutely true—I actually think it’s a strength of OFTW, as it happens. The reason I don’t worry overly about anchoring people at 1%/distracting from other cause areas is that I actually think most OFTW pledgers were never good candidates to be super engaged with EA in the first place—but those that are end up getting into EA anyway, and we can be a useful first point of contact to make that happen. To be fully transparent, this is basically all from anecdata, but I have met very, very few OFTW pledgers who I (subjectively) think were ever likely to be a GWWC pledger/dedicate their career to AI research. In a perfect world, we would hoover up all the ‘well I might give 1% but I really wouldn’t give 10%’ crowd and not stop any of the ‘I might give 10%/change my career’ crowd. One project to support this is to give more GWWC and other EA content to our members, so that those who were predisposed to give 10% end up doing it anyway (which has happened for a subset of OFTW members in the past, certainly).
Hey Vaidehi—I hope you’re well :-)
Just on the factual questions:
~12 - this includes some where the EA group explicitly runs the OFTW content, and some where the two just peacefully coexist. Collaboration is broadly positive but not consistent in method or depth.
Hard to say—I would guess that around 1⁄3 know about nothing except effective giving, 1⁄3 know a bit about EA but are mainly focussed on effective giving and 1⁄3 are very knowledgeable about EA/fully committed EAs themselves.
To pick up two of your risks above:
OFTW chapters are certainly vulnerable to changes in leadership, but this point would seem to apply just as strongly to EA groups on campuses, I think? So I’m not sure that we should expect leadership turnover to have any more or less of a negative effect on OFTW-EA relations than it does on EA-student relations.
In fairness, we don’t teach people those memes, or ever reference them in any of our materials or training (at least not in any of the materials or training that I have reviewed/contributed to). OFTW never mentions ETG and in general we don’t really make claims about what EA cares about or focusses on. You helped us with this page, I recall, which is probably the best summary of how we talk about EA—and it reads to me as very neutral in its phrasing: https://chapters.1fortheworld.org/info/effective-altruism-thinking/
It might be useful context to some of the comments below to highlight this page on our resource for our volunteers, which encapsulates for me how we talk about EA within OFTW (and how we signpost people to find out more): https://chapters.1fortheworld.org/info/effective-altruism-thinking/ (props to Vaidehi for helping us revise this and make it better this summer)
Will send you an email :-) you might also be interested in my post here, although it’s only tangentially related
Thanks Jan—looking forward to hearing from you!
Interesting project mate. One use case—I am always interested to know what the total ‘value’ of the community’s donation is. Indeed, I ended up doing a back-of-an-envelope version of this for a presentation in December, using publicly available donations data. The issue is that everyone reports data over different time periods/in different formats, and there’s also a very real risk of double-counting quite a few donations, and so it’s tricky to do.
I’d be interested to keep track of a) total donations influenced by EA; and b) trends in giving over time.
Thanks for the mention :-)
Not sure how helpful this is, but grad schools typically move more money (certainly per pledger/per student/per class etc. and often in naive terms). We have no idea yet of the long-term changes in attitudes/actions and how those relate to school type.
Also FWIW someone just started raising OFTW pledges at HLS and is absolutely crushing it—about $20k/annum of pledges in about a fortnight!
What is the legal and practical feasibility of a global DAF that could facilitate tax deductible donations from any wealthy country to any charity registered in a different country (but not registered in the home country of the donor)? Practical considerations would need to include FX risk.
Or, to put it another way, how might we build a platform to let people in India or China donate to Malaria Consortium?
This exactly chimes with my experience. I’ve been hiring for 10 years now, and the range in application volume has been 10-200 for a position.
In particular, I’ve been using an opt-in for feedback for years and my experience has also been that this is requested by a very small proportion of people (I’d actually guess at 5% for early rejections, rising to 75% if they did an interview, at which point most people seem to want feedback).
For what it’s worth, I think this is a moral issue as well—we have a duty to the community to try to give useful feedback when we can; and to treat people with kindness.
I try to take it in good faith when people say “I’m too busy to give feedback” but I feel that this is often not literally true; and in the rare cases where it is (maybe someone running one of the big ‘legacy’ EA organisations and getting hundreds of applicants per position), the solutions in other comments are viable.
Is there an argument here for trying to spread more of a Growth Mindset in EA? I don’t want to diminish the hurt of rejection that people feel by implying that they just need to reframe it and everything will be fine—but the approach of seeing challenges/failures as learning experiences can be genuinely transformative for people.
In general, I think developing a growth mindset is incredibly valuable, and I wonder if this is something Training for Good could look at.
I think you should be cautious about statements like:
”More cynically, maybe the data did not demonstrate a significant reduction in malaria and that in itself was taken as evidence that the data was low quality.”
From reading your post, you don’t seem to have any evidence for this at all and it doesn’t seem like you’ve made any effort to find any evidence (e.g. by asking AMF). If that is indeed the case, this suggestion is baseless.
Hi team—this sounds exciting!
I have some questions about how you will align with other university-focussed organisations (with a healthy dose of self-interest, obviously!).
If you are plausibly providing $17m-$54m in annual budget for university organising at just 17 schools, you’re likely going to dwarf every other organisation on campus (and definitely every other EA org). How will you avoid completely eclipsing other organisations with a different focus/approach/merits? A particular concern here would be if this capital overwhelmingly favoured longtermism, for example, which already benefits from an enormous imbalance in allocated EA funding.
I’m also interested in where the funding for this is coming from—this represents a dramatic increase on CEA’s existing budget and spend on university organising, and you seem confident that you can source it.
I’m especially interested in this because this is a huge amount of money on a new initiative, I think without a track record or tested methodology. So in essence it’s a new idea, seeking to develop and test new approaches, but one being supported with a huge influx of capital (relative to other EA resources directed at these campuses). There are obviously risks here, the most obvious being that the programme might be ineffective/much less effective than alternatives, or make mistakes that have quite wide-ranging consequences, but become the dominant player anyway because of a huge asymmetry of resources.
All of this being said, I want to be clear that I’m actually super excited about this ambition and focus, and One for the World will of course support it as much as we can :-)
Thanks for this Yonatan and for emphasising that this is a point made in good faith.
I don’t know a lot about cults but I think those that ask people for money usually ask for it to use themselves, rather than to e.g. alleviate the suffering of farmed animals or people in extreme poverty.
There could definitely be a danger of EA becoming quite incestuous, as quite a few of the recommendations above are to donate to EA orgs; it could get to the point where we ask members of the community to fund the community.
However, there are a lot of orgs that are really very independent of EA that are heavily promoted in effective giving. ‘Meta’ giving (giving to charities that work within/on EA) is a very small slice of the pie, at least at the moment.
Finally, I would emphasise that almost all the places effective giving recommends are public charities of some sort and so open to a lot of scrutiny and transparency requirements. Again, I don’t think that’s very characteristic of cults.
I think there are lots of ways to advocate for the opportunity of effective giving without pressuring people.
Any group worried about this should reach out—we have a lot of training and resources on how to talk about giving without unduly or insensitively pressuring people.
Thanks Luke. You guys are also an option in the contact form, so I’ll forward anything relevant
Hi Yonatan,
A full answer to this would be very detailed, so do fill out our form if you’d like us to share resources and tactics in more detail.
In brief, I think the main thing is to frame giving as an opportunity, rather than an obligation. There are some pretty robust arguments that it actually is an obligation, if we have disposable income in high income countries—but this tends to be less effective as a persuasion strategy and has more risks around people feeling unduly pressured.
If we talk about the incredible opportunity we have to save a life, or improve animal welfare, without really making any noticeable sacrifice in our own lives, we can inspire people to give. I don’t think we need to pressure people (e.g. by saying ‘you’re a bad person’ or ‘if you don’t do this you’re not an effective altruist’). But we can absolutely raise awareness and persuade people.
Many people, especially at universities, already have some sense that they are in a position of privilege and would like to ‘make a difference’, and for these people it’s just a case of raising their awareness—you’re actually solving a problem for them. Others can be persuaded if we highlight, for example, where the median graduating salary from a university places them in the income distribution of their home country, or indeed globally.
And I think it’s worth emphasising that we’re not saying that everyone should take a pledge that will meaningfully reduce their income—if you’re earning substantially above the median wage, it’s likely that you can give something like 1% with literally no effect at all on your material quality of life. So, again, I don’t think that explaining this framing to people is pressuring them.
Ultimately, of course, any movement seeks to persuade people—we persuade people to change career plans, or majors, or eat less meat—and persuading them to give falls within this spectrum.
Thanks for this Mauricio. It’s good to have an alternative perspective added to this, which was written by quite convinced advocates for one way of thinking!
I think you make a good point that this is a theory that seems to align very closely with the reality of EA, rather than an absolutely established phenomenon. So, for example, we don’t have data in the EA survey that says ‘people say they would likely drop out if they weren’t donating’ or ‘we see higher rates of drop out amongst people who don’t donate versus those who do’. That’s not to say those statements aren’t plausibly true—it’s just that the survey isn’t set up to capture them.

It seems unlikely, though, that it’s coincidental that the foremost and most longstanding members of EA have given throughout their engagement and often seem to increase their giving over time (cf. Julia, Will, Toby, everyone at Longview, ~everyone at GiveWell). This also aligns with our experience of talking to the EA community. Obviously anecdotal evidence is weaker than some sort of systematic evidence, but if you have a theory that is plausibly true, aligns with common sense and then is supported by a lot of individual cases, that seems enough to think this is ‘signal’ rather than coincidence.
To address some specific points:
- Careers advice may be more popular than programming about giving—it makes sense, as both parties want the thing on offer. It’s the opposite of asking for some sacrifice—you can receive careers advice purely out of self-interest. Equally, though, lots of students are passionate about social justice, making a difference etc. and can be attracted to EA precisely by talking about giving. Career change isn’t for everyone, especially when EA careers advice can focus on careers that need significant technical expertise, like biorisk or AI safety. Careers advice also has some hazier routes to impact in its theory of change than a lot of effective giving does.
- I’d challenge the idea that the majority of students are charity sceptics. A quick Google suggests exactly the opposite: Gen Z gives more, and more widely, than older generations. Gen Z and Millennials are seen as activist generations, so I’d be really surprised if the median Gen Z-er is a donation sceptic, and the data seems to undermine this idea reasonably firmly.
- I’m surprised a) that you haven’t seen the result of any donations and b) that you’re sceptical that it has shorter feedback loops than a career change. If you’re at university and alter your career plans, I’d guess you’d have to wait at least 2-3 years to see any impact from that? And plausibly way, way longer? If you donate $10 to AMF today, you’ll be able to see the bednet distribution you funded in a much shorter timescale. I can see a donation I made in November ’21 has already funded nets that are ready in the factory for distribution in the Congo. Maybe this changes depending on what you donate to?
- costly signalling is a widely-referenced theory (the Wikipedia pages on it are instructive), although in fairness it’s more broadly cited in relation to signalling to others rather than necessarily deepening your personal commitment (a costly signal is seen as more honest and therefore more powerful)
- Candidly, I think opportunity costs are frequently overstated. We do acknowledge this above and give examples of how giving can be incorporated into existing programming. However, we also think there’s a frequent fallacy in EA, where we make all decisions as if they are zero sum (e.g., to pick a particularly odd example, ‘we shouldn’t give blood because we could spend that time earning $x and giving it to an effective charity’, when of course almost everyone in EA can do both simultaneously). Often this choice isn’t real. Of course EA groups need to make some decisions about prioritisation; but are most EA groups genuinely so maxed out that they couldn’t weave giving into their existing programming or even run an extra session?
Overall, I think you do a good job of laying out possible drawbacks of this approach. I’m not convinced they add up to a really robust argument to neglect effective giving entirely, though. And I’d challenge you in return that maybe you’re understating the opportunity costs of only focussing on careers advice, while overstating some of these drawbacks.
Hi Naryan—this sounds like great work, well done. One for the World may be able to help you with your 2020 plans (www.1fortheworld.org). We’re a network of people giving 1% of their income to the GiveWell charities and have a couple of chapters in Canada. My email is jack [at] 1fortheworld [dot] org—please do get in touch!