Master of Public Policy student at Georgetown University. Previously worked in operations at Rethink Charity and other organizations, and co-founded EA Anywhere.
Marisa
Not an existing nonprofit, but you might also be interested in creating one of the Software Platforms in this post.
Great post! I know a little bit about the US side of things from both watching orgs I’ve worked with go through the process, and working at a start-up that helped charities get 501(c)(3) status, so I can offer some data points from that perspective.
The IRS estimates that the DIY method takes 100+ hours. It’s also worth considering that this method is the most likely to lead to mistakes, which can force you to re-submit the application and delay processing.
US charity lawyers cost around the same, although there are companies that’ll do this for you at a cheaper rate. Harbor Compliance is the most popular I’ve seen, and most orgs I know who’ve used this service pay around $3k. There are also smaller companies that will do this for even less (the company I worked for offered it for ~$750 at the time, but lots of prospects told us they’d found even cheaper options), but these companies often have fewer resources and/or lower success rates.
Worth noting that, if you don’t want to deal with the 501(c)(3) process off the bat, fiscal sponsorship is also a good option. (Shameless plug, Rethink Charity is offering this service for EA projects and organizations.)
Great post! I think yours is one of the first uni groups I’ve seen that’s particularly selective with its fellowship—I wouldn’t initially have agreed with that strategy, but you give convincing reasons for it, and it sounds like it paid off. :) Congrats on a successful fellowship round!
I think one of the best things about hearing about EA pre-college is it would let you set up your college plan (e.g., major, internships) in an EA-directed way
To me, this seems like the best case for engaging with high schoolers over college students. I seem to meet a lot of EAs who study something that doesn’t correlate well with most high-impact careers and find themselves wishing they’d heard about EA sooner so they could have studied something more practical.
The major questions I have with this are 1) can you actually convince high schoolers to change their career plans, and 2) if so, will they actually apply EA ideas in a way that increases their impact (as opposed to just blindly following 80k recommendations and doing something they don’t like or aren’t good at)? I guess both are also risks of trying to get anyone to make an EA-related career change, but high schoolers seem more vulnerable to me, particularly to #2, since I think they have less self-awareness regarding their skills and interests.
Matthew Dellavedova is on Momentum’s board, and they’re an EA-aligned org, so I suspect he might be EA-sympathetic (or at the very least familiar with it).
I’ve spent a lot of time thinking about this, and I largely agree with you. I also think studying “pure” value drift (as opposed to “symptoms” of value drift, which is what a lot of the research in this area focuses on, including, to some extent, my own) comes with a few challenges. (Epistemic status: Pretty uncertain and writing this in haste. Feel free to tell me why I’m wrong.)
EA isn’t (supposed to be) dogmatic, and hence doesn’t have clearly defined values. We’re “effective” and we’re “altruistic,” and those are more or less the only requirements for being EA. But what is altruism? Is it altruistic to invest in yourself so you can have more of an impact later in life? Effectiveness, on the surface, seems more objective, since it mostly means relying on high-quality evidence and reasoning. But evidence and reason can get messy and biased, which makes defining effectiveness more difficult. For example, even if valuing effectiveness leads you to favor the interventions with the most robust evidence, that robust evidence might come from p-hacking, publication bias, or studies that over-represent middle-class people from high-income countries. At some point, effectiveness (from the EA point of view) also hinges on valuing certainty vs. risk-taking, and probably on a number of other sub-values as well.
Measuring raw values relies primarily on self-reporting, which is a notoriously unreliable social science method. People often say they value one thing and then act in a contradictory manner. Sometimes it’s a signaling thing, but sometimes we just don’t really understand ourselves that well. Classic example: a young college student says they don’t care much about financial stability, until they actually enter the workforce, get a not-super-well-paid job, and realize that maybe they actually do care. I think this is a big reason why people have chosen to focus on behavior and community involvement. It’s the closest thing to objective data we can get.
This isn’t an argument against what you’ve written. I still think a lot of people err in assigning the label “value drift” to things like leaving the EA community, which could be caused by any number of scenarios in which doing that thing actually reflects your values perfectly. I guess I don’t know what the solution is here, but I do think it’s worth digging into further.
Hmm. On the one hand, I think these are all useful topics for an EA to know. But I don’t think it’s necessary for all EAs to know these things. There are a lot of EAs who don’t have this technical knowledge but are happy to outsource decisions relying on it (such as where to donate) to people who do. That said, I think that often leads to donating less-than-effectively (e.g., giving to whatever EA Fund appeals to you personally, rather than rationally thinking about trade-offs and probabilistic outcomes).
I guess this is, in part, a big-tent vs. elite EA trade-off question. If EA is best as an elite movement, it makes sense that all the members should have this knowledge. But if we want to take an “everyone has a place in EA” approach, then it might not make sense to have a central curriculum.
Also, I don’t think we want everyone in EA to have the same skillset. EA isn’t, in my view, a single professional field; it’s perhaps more like a company (although this is probably an oversimplification). If a company gave all of its employees a handbook on How to Be a Great Project Manager, it’d be helpful… for project managers. But the rest of the team ought to be rounding out skills that others in the company don’t have, skills that suit their comparative advantage and will move the company forward. The only thing everyone at the company really needs to know is the product. Basic time management and other soft skills are also useful. I don’t think we need 100% of EAs to have a solid grounding in economics. Maybe we need ~100% of EAs to trust economics. But I’d rather have some EAs focusing on building skills like movement-building, communications, fundraising, operations/management, entrepreneurship, policy, qualitative research, etc.
Granted, I’m thinking about this from the perspective of careers, rather than being able to participate in discussions in EA spaces. To answer that aspect of it—although I certainly think it’s possible to discuss EA without knowledge of economics, statistics, or decision analysis, the conversation does sometimes go in this more technical direction and leave newcomers behind. The question, then, might be whether newcomers should hold the responsibility of learning these subjects so they can participate in these discussions, or whether the people discussing things at such a technical level should adjust the way they discuss these issues to make them more accessible to a non-technical audience. I lean more towards the latter (though it depends on the context).
I haven’t actually heard of any EA organizations laying off staff due to COVID-19. I wouldn’t be surprised if there has been very little job loss at EA orgs specifically, or even none at all, over the last couple of months. Most EA organizations seem to have a decent amount of runway, and with a lot of EA donors employed in big tech, which seems to have been relatively stable throughout COVID, there is fortunately still a decent amount of income coming through.
That said, I suspect most of the financial (and hence employment) impacts of COVID on EA orgs are still to come, as organizations eat into their runway, especially this giving season if donors give significantly less than usual. If there’s a big loss of jobs at EA orgs, I’d expect it to hit around January or February, once organizations take stock after giving season.
If you haven’t already, I’d reach out directly to GPR organizations and mention that you’re interested in applying your skillset to their work. They might be able to provide you with some concrete examples and a better idea of what’s available in the field.
I heard in mid-2019 that Open Phil was interviewing EAs to identify common characteristics and backgrounds among the people who are most receptive to EA. I don’t think it was published, though.
Strongly agree with your points, although I also don’t think they’re mutually exclusive to the content of this post.
I think some of the most value I got out of university (and high school, to be honest) came from the ability to try out a bunch of things at once with relative ease. I have a lot of interests that come and go rather quickly, and in the university setting it was strangely easy to get involved in whatever new thing caught my attention, whether via a course, a club, meetings with a professor, an internship, a volunteer opportunity, etc. (Though I attended a small liberal arts college, which might have made this easier.) I learned what I like, what I don’t like, what I’m good at, and what I suck at much more quickly than I think I could have outside of university, and a lot of this became valuable data for deciding on a career, in addition to opening doors to opportunities.
A common mistake I see in university students is thinking “I just want to focus on school” for their first three years, trying to secure an internship the summer of their junior year, and then hoping that’s sufficient to get them a job. I don’t think this is a great idea. At the same time, narrowly focusing on identifying and pursuing a high-income, stable career path (or whatever one’s ideal career plan looks like) carries a lot of risk of burnout, poor performance, and misery if you’re unlucky enough to get it wrong. I see more students err in the former direction than the latter, although I imagine EA students probably have a higher tendency to over-optimize their career path.
I guess I somewhat lucked out in that a) my courseload was light enough to let me get very involved outside of class, and b) a lot of the things I was excited about were also employable skills. If that isn’t the case for someone, the “seek joy” and “plan your career” approaches might come more into conflict, but that wasn’t my experience.
I’m afraid I don’t have any original recommendations, but have you read the EA Handbook Motivation Series? Nate Soares’ ‘On Caring’ might be particularly relevant.
I was also talking with some other EAs about this recently, and one of them mentioned Metta meditation, which focuses on cultivating an expanding circle of goodwill that could hypothetically include the long-term future. If meditation is your thing, it might be worth a shot.
Great post! Been meaning to comment for a while—better late than never, I suppose.
One thing I wanted to add—I’ve talked with ~50 people interested in working at EA orgs over the last six months or so, and it seems like a lot of them come to the decision through a process of elimination. Common trends I see:
They don’t feel well-suited for policy, often because it’s too bureaucratic or requires a high level of social skills.
They don’t feel well-suited for academia, usually because they have less-than-stellar marks or dislike the expected output or bureaucracy of academia.
And they aren’t interested in earning-to-give, almost always because of a lack of cultural fit. (They want to have colleagues who are also motivated to do good in the world.)
Per 80,000 Hours’ recommended career paths, that pretty much leaves working at effective nonprofits as the only option. And conveniently, nonprofit work (especially in non-research roles) doesn’t usually come with a high bar of qualifications. A lot of positions don’t require a bachelor’s degree, and depending on the role, it’s not uncommon to find a year of vaguely defined experience as the only minimum qualification for an entry-level job. So that seems like a reasonable choice for a lot of people… except that hundreds of other EAs also see it as a reasonable choice, and the competition grows very quickly.
I’ve certainly met EAs who seem really well-suited for direct work at EA orgs. But, in part for the reasons mentioned above, I think the majority of people would be better off focusing their jobseeking efforts elsewhere. That said, I do worry about swinging the pendulum too far in the opposite direction, where talented people stop applying to EA organizations.
I guess my recommendation for people interested in direct work would be to apply to EA organizations that interest you and fit your skillset, but at the same time to also apply to EA-aligned organizations and/or impactful non-EA jobs where replaceability is likely to be lower. If you’re uncertain about whether to apply for or accept an EA direct-work role, you can usually talk to the hiring manager about what they think your counterfactual impact might be. The nice thing about applying to EA orgs is that they understand these concerns, and raising them likely won’t hurt your application—in fact, it might reflect positively on you for thinking critically and altruistically (for lack of a better word) about your career.
Came here to recommend Replacing Guilt as well! Was very impactful for me :)
And we’d love to have you at one of our EA Virtual Group meetups! You can join our new Slack workspace here.
Great response—thank you!
I found the Unlocking Your Employability course on edX had a lot of useful activities for improving self-marketing. Learning How to Learn on Coursera was also helpful, though it doesn’t have as many interactive activities. I’ve also heard good things about this Creative Problem Solving course, but I haven’t had the chance to try it myself.
Lynette Bye’s Productivity Tips also has a lot of useful resources for improving personal productivity.
I would also add CFAR as probably the most helpful resource I’ve found for improving productivity and decision-making in both personal and professional contexts. :)
Thanks for this post! I found it quite helpful.
I have a couple of questions about the checklist you linked, though I’m not sure how strongly you endorse it.
First:
Is there a substantial amount of literature in your field?
and
Was there a major discovery in the field in recent years?
seem, if answered in the negative, to be indicators of neglectedness, which might make those topics more appealing to EAs. Do you think such topics are better pursued outside of academia? Or not at all?
Second:
Do you want a career in academia?
Is there a better option for prospective PhD students who want a career in research outside of academia?
I found this post really interesting—thank you!
One question I have after reading is the tractability of increasing benevolence, intelligence, and power. I get the sense that increasing benevolence might be the least tractable (though 80,000 Hours seems to think it might still be worth pursuing), though I’m less sure about how intelligence and power compare. (I’m inclined to think intelligence is somewhat more tractable, but I’m highly uncertain about that.)
Believe it or not, you’re not the first person to think about this. There’s an EA dating site called reciprocity.io, created years ago, although I’m not sure it gets much use anymore.
Some arguments I’ve seen in favor of this:
Dating another EA might prevent value drift.
If a relationship with a non-EA goes sour, that person might have a negative association with EA as a result.
Having a partner is generally associated with more happiness, which is perhaps intrinsically good, and perhaps good for helping one feel more motivated to do good in the world.
Some arguments against:
The skewed gender ratio in EA might make this difficult. (I’m not sure how this plays out when you take LGBTQ+ people into account.)
Dating a non-EA might persuade them to become an EA.
Personally, I feel a bit icky about actively encouraging inter-EA dating, as it strikes me as a bit culty and as further insulating us from the rest of the world (which I think is bad, though others might disagree). But, at the same time, a lot of subcultures have their own dating apps and mingling events, and I don’t find those culty, so maybe my concerns aren’t well-founded.
Congrats! Sounds like a great fit for you. :)