Master in Public Policy student at Georgetown University. Previously worked in operations at Rethink Charity et al. and co-founded EA Anywhere.
Marisa
Maybe I’m misunderstanding this, but I disagree. I think the average person thinks spending tons of money on global health and poverty is good, particularly because it has concrete, visible outcomes that show whether or not the work is worthwhile (and these quick feedback loops mean the money can usually be spent on projects we have stronger confidence in).
But I think that spending lots of money on people who might have a .000001% chance of saving the world (in ways that often seem absurd to the average person) is pretty bad optics. A lot of non-EAs don’t think we can realistically make traction on existential risk because they haven’t seen any evidence of traction. Plus, longtermists/x-risk people can come across as having an unfounded sense of grandiosity: there are a whole bunch of people out there who think their various projects will drastically transform the world, and most people won’t assume that the longtermist approach is the only one that’ll actually work.
Love this post and would love to see more like it on the forum! Congrats on a successful EA Student Summit.
I especially want to emphasize this:
there are so many EAs who would genuinely like to talk to you.
In my experience, EAs are almost always super willing to provide advice to others within the EA movement, often because they’re nice people, but also because they get to help you have an impact, which helps them have an impact, so everybody wins!
As a single data point, nothing makes my day more than getting emails from random EAs. :)
Hmm. On the one hand, I think these are all useful topics for an EA to know. But I don’t think it’s necessary for all EAs to know these things. I think there are a lot of EAs who don’t have this technical knowledge but are happy to outsource decisions relying on it (such as where to donate) to people who do. That said, I think that often leads to donating less-than-effectively (e.g. giving to whatever EA Fund appeals to you personally, rather than rationally thinking about trade-offs/probabilistic outcomes).
I guess this is, in part, a big-tent vs. elite EA trade-off question. If EA is best as an elite movement, it makes sense that all the members should have this knowledge. But if we want to take an “everyone has a place in EA” approach, then it might not make sense to have a central curriculum.
Also, I don’t think we want everyone in EA to have the same skillset. EA isn’t, in my view, a single professional field, but perhaps more like a company (although this is probably an oversimplification). If a company gave all of its employees a handbook on How to Be A Great Project Manager, it’d be helpful… for project managers. But the rest of the team ought to be rounding out skills the company would otherwise lack, ones that suit their comparative advantage and will move the company forward. The only thing everyone at the company really needs to know is the product. Basic time management and other soft skills are also useful. I don’t think we need 100% of EAs to have a solid grounding in economics. Maybe we need ~100% of EAs to trust economics. But I’d rather have some EAs focusing on building skills like movement-building, communications, fundraising, operations/management, entrepreneurship, policy, qualitative research, etc.
Granted, I’m thinking about this from the perspective of careers, rather than being able to participate in discussions in EA spaces. To answer that aspect of it: although I certainly think it’s possible to discuss EA without knowledge of economics, statistics, or decision analysis, the conversation does sometimes go in this more technical direction and leave newcomers behind. The question, then, might be whether it’s the newcomers who should hold the responsibility of learning this so that they can participate in these discussions, or if the people who are discussing things at such a technical level should adjust the way they discuss these issues to make them more accessible to a non-technical audience. I lean more towards the latter (though it depends on the context).
I’ll offer a data point: I’m not particularly motivated to post on the Forum by a monetary prize. It hasn’t led me to post on the Forum more than I ordinarily would. I am somewhat interested in social rewards, but the karma feature seems to do that better than the Forum Prize.
Also, as someone who doesn’t read every single post on the Forum, I find the Prize useful for highlighting what content is actually worth reading, but again, I think highlighting posts based on karma instead (with or without a monetary prize) would work just as well.
If the Forum Prize does continue, I do think there should be separate categories for professional researchers and “amateurs.”
Not academic or outside of EA, but this Forum comment and this Facebook post may be good starting points if you haven’t seen them already.
I agree that CEEALAR (I’m pronouncing it see-uh-lar, almost like CLR, in my head) looks a little odd and might be hard to remember the acronym for. But I also agree that to get charitable status, dropping “hotel” was probably a good choice. A lot of nonprofits in the US use “house” (e.g. Covenant House) to give more of a charitable vibe. “EA House” already sounds less for-profit, though maybe less distinctive since EA houses are all over the place. Also, Centre gives me think tank vibes, which may or may not be what you’re looking for.
If you’re tied to the name, I’d recommend dropping the first E to make it CEALAR (Centre for Effective Altruism Learning And Research) to make it more pronounceable, aesthetic, and brief.
Overall, names are hard, and I’m not sure it’s worth stressing over, as people will probably keep informally calling it the EA Hotel anyway.
Props for putting in the work to keep this organization alive and well. It’s a wonderful asset to the EA community. :)
I definitely agree that religious outreach is a neglected but promising area of EA community-building.
I think a big part of what makes reaching out to religious groups at least somewhat promising is that a lot of them are already trying to do good. If we focus EA outreach on the general population, or most other subpopulations that EA currently focuses outreach on, you’ll likely have some people who care about doing good, and others who have different motivations. But in many religious spaces, an obligation to help others is already at the heart of what they do. And it’s a lot easier to sell EA to someone who already agrees that we have an obligation to help others as much as possible. Of course, different sects and individual religious communities have varying degrees of commitment to service and doing good, but I would imagine there’s some research already available on which groups are most oriented towards doing good (and if not, this is certainly doable research).
Also, from friends’ and ex-colleagues’ anecdotes as well as my own personal experience, I know a lot of agnostics/atheists who are involved in religious groups because they’re looking for a community, and often more specifically, a community oriented towards thinking deeply about the world’s truths and/or doing good in the world. I think EA groups would fulfill this need for a lot of people (and perhaps relieve them from having to pretend to believe something they don’t in exchange for social support).
I’ve been thinking about this also so I’m glad to see this post!
Anecdote: I’ve been talking to friends and family about COVID-19 since late January/February, and started my first attempts at social distancing towards the beginning of March. In those first few days, a lot of my (non-EA) friends seemed to think this response was an overreaction. Later on, a lot of them came around to say, “wow, you were right,” which I’ve tried to use to lend some more credibility to EA.
Some not-fully-formed ideas I have about this:
I think there’s an opportunity, once this begins to resolve, to hopefully get some media attention and say something along the lines of “there’s a community of people who were thinking about this before it happened, and who are thinking of other possible threats. Here’s how you can help.”
I also think that we might see an influx of young not-currently-EAs interested in helping address future pandemics, but I suspect a lot of them will be drawn to becoming doctors and nurses. So perhaps some publicity for these careers and/or a talent pipeline for biosecurity careers might be useful.
I had similar concerns about our Operations AMA recently. It wasn’t wildly popular, but we got 7 questions and I still felt like it was a good use of my time. Several people in the group said they really enjoyed it and would be interested in doing another one, and I liked it enough that I’m planning to do another AMA for one of my other projects as well.
I’ll also mention that it’s a (relatively) low-effort way to create content (and get karma, if you care). I often feel like I should post to the Forum more but either don’t feel like I have anything worth posting, or don’t have the time to write anything out, but the nice thing about AMAs is that you don’t have to come up with a novel topic that fits neatly into a typical EA Forum post, and the standard for quality as far as formatting/organization/etc. is lower.
The only downsides of posting that I see are the time spent creating the post (I estimate we collectively spent about an hour on this, though I think you could do a less detailed one in 15 minutes) and, I suppose, the possible embarrassment of not getting asked any questions. But I think this is unlikely (I don’t think it’s ever happened on the Forum), and you can always delete the post if you’re really concerned about that.
FWIW I think you’d be well-suited to do an AMA :)
That makes sense, though I feel like this still applies. It’s still not great optics to pay lots of money to people working on global poverty, but it’s far from unheard of, and if there’s concrete evidence that those people are having an impact, then I think a lot of people would consider it justified.
I think the reason it’s acceptable for AI researchers to bring in large sums of money is more because of the market rate for their skillset and less because of the cause directly. If someone were paid a high salary to build complex software that solved poverty (if such a thing existed), I would guess that would be viewed roughly equally. On the other hand, if you pay longtermist and/or global poverty community-builders lots of money, this looks much worse.
Good catch, thank you!
Strongly agree with your points, although I also don’t think they’re mutually exclusive to the content of this post.
I think some of the most value I got out of university (and high school, to be honest) was the ability to try out a bunch of things at once with relative ease. I have a lot of interests that change and come and go rather quickly, and in the university setting, it was strangely easy to get involved in whatever new thing that caught my attention, whether via a course, a club, meetings with a professor, an internship, a volunteer opportunity, etc. (Though I attended a small liberal arts college, which might have made this process easier.) I learned a lot about what I like, what I don’t like, what I’m good at and what I suck at a lot more quickly than I think I could outside of university, and I think a lot of this became valuable data for deciding on a career, in addition to opening doors to opportunities.
I think a common mistake I see in university students is thinking “I just want to focus on school” for their first three years, trying to secure an internship during the summer of their junior year, and then hoping that’s sufficient to get them a job. I don’t think this is a great idea. At the same time, I think narrowly focusing on identifying and pursuing a high-income, stable career path (or whatever one’s ideal career plan looks like) carries a lot of risk of burnout, poor performance, and misery if you’re unlucky enough to get it wrong. I think I see more students err in the former direction than the latter, although I imagine EA students probably have a higher tendency to over-optimize their career path.
I guess I somewhat lucked out in that a) my courseload was light enough that it allowed me to get very involved outside of class, and b) a lot of the things I was excited about were also employable skills. If that isn’t the case for someone, the “seek joy” and “plan your career” approaches might come more into conflict, but that wasn’t my experience.
Great post! Been meaning to comment for a while—better late than never, I suppose.
One thing I wanted to add: I’ve talked with ~50 people who are interested in working at EA orgs over the last six months or so, and it seems like a lot of them come to that decision through a process of elimination. Common trends I see:
They don’t feel well-suited for policy, often because it’s too bureaucratic or requires a high level of social skills.
They don’t feel well-suited for academia, usually because they have less-than-stellar marks or dislike the expected output or bureaucracy of academia.
And they aren’t interested in earning-to-give, almost always because of a lack of cultural fit. (They want to have colleagues who are also motivated to do good in the world.)
Per 80,000 Hours’ recommended career paths, that pretty much leaves working at effective nonprofits as the only option. And conveniently, nonprofit work (especially non-research roles) doesn’t usually come with a high bar of qualifications. A lot of positions don’t require a bachelor’s degree. Depending on the role, it’s not uncommon to find a year of vaguely-defined experience as the only minimum qualification for an entry-level job. So that seems like a reasonable choice for a lot of people… except that hundreds of other EAs also see this as a reasonable choice, and the competition grows very quickly.
I’ve certainly met EAs who seem really well-suited for direct work at EA orgs. But, in part because of the reasons mentioned above, I think the majority of people would be better off focusing their jobseeking efforts somewhere else. I do worry about swinging the pendulum too far in the opposite direction, where talented people stop applying for EA organizations.
I guess my recommendation for people interested in direct work would be to apply to EA organizations that interest you and that you think fit your skillset, but, at the same time, to also apply for EA-aligned organizations and/or impactful non-EA jobs where replaceability is likely to be lower. I also think, if you’re uncertain about whether to apply for or accept an EA direct work role, you can usually talk to the hiring manager about what they feel your counterfactual impact might be. The nice thing about applying to EA orgs is that they understand those concerns, and it likely won’t negatively affect your application—in fact, it might reflect positively on you for thinking critically and altruistically (for lack of a better word) about your career.
I found the Unlocking Your Employability course on EdX had a lot of useful activities for improving self-marketing. Learning How to Learn on Coursera was also helpful, though it doesn’t have as many interactive activities. I’ve also heard good things about this Creative Problem Solving course, but I haven’t had the chance to try it myself.
Lynette Bye’s Productivity Tips also has a lot of useful resources for improving personal productivity.
I would also add CFAR as probably the most helpful resource I’ve found for improving productivity and decision-making in both personal and professional contexts. :)
This is a really helpful list! I noticed a couple of organizations that I consider EA aligned that should perhaps be on the list:
Development Media International, a GiveWell-recommended charity
Organization for the Prevention of Intense Suffering, which gave a talk at EAGxVirtual Unconference. They seem to be doing a lot of research on treatments for cluster headaches.
If you haven’t already, I’d reach out directly to GPR organizations and mention that you’re interested in applying your skillset to their work. They might be able to provide you with some concrete examples and a better idea of what’s available in the field.
Great post! I know a little bit about the US side of things from both watching orgs I’ve worked with go through the process, and working at a start-up that helped charities get 501(c)(3) status, so I can offer some data points from that perspective.
The IRS estimates that the DIY method would take 100+ hours. It’s also worth considering that this method is most likely to lead to mistakes, which can lead to having to re-submit the application and delays in processing time.
US charity lawyers cost around the same, although there are companies that’ll do this for you at a cheaper rate. Harbor Compliance is the most popular I’ve seen, and most orgs I know who’ve used this service pay around $3k. There are also smaller companies that will do this for even less (the company I worked for offered it for ~$750 at the time, but lots of prospects told us they’d found even cheaper options), but these companies often have fewer resources and/or lower success rates.
Worth noting that, if you don’t want to deal with the 501(c)(3) process off the bat, fiscal sponsorship is also a good option. (Shameless plug, Rethink Charity is offering this service for EA projects and organizations.)
I think one of the best things about hearing about EA pre-college is it would let you set up your college plan (e.g., major, internships) in an EA-directed way
To me, this seems like the best case for engaging with high schoolers over college students. I seem to meet a lot of EAs who study something that doesn’t correlate well with most high-impact careers and find themselves wishing they’d heard about EA sooner so they could have studied something more practical.
The major questions I have with this are 1) can you actually convince high schoolers to change their career plans, and 2) if so, will they actually apply EA ideas in a way that increases their impact (as opposed to just blindly following 80k recommendations and doing something they don’t like or aren’t good at)? I guess both are risks associated with trying to get anyone to make an EA-related career change, but high schoolers seem more at risk to me, particularly with #2, since I think they have less self-awareness regarding their skills and interests.
Very valid! I guess I’m thinking of this as “approaches EA values” [verb] rather than “values” [noun]. I think most if not all of the most abstract values EA holds are still in place, but the distinction between core and secondary values is important.
I totally understand your concerns. FWIW as a former group organizer, as the Torres pieces were coming out, I had a lot of members express serious concerns about longtermism as a result of the articles and ask for my thoughts about them, so I appreciate having something to point them to that (in my opinion) summarizes the counterpoints well.