Master in Public Policy student at Georgetown University. Previously worked in operations at Rethink Charity et al. and co-founded EA Anywhere.
Marisa
I agree that CEEALAR (I’m pronouncing it see-uh-lar, almost like CLR, in my head) looks a little odd, and the acronym might be hard to remember. But I also agree that to get charitable status, dropping “hotel” was probably a good choice. A lot of nonprofits in the US use “house” (e.g. Covenant House) to give more of a charitable vibe. “EA House” already sounds less for-profit, though maybe less distinctive, since EA houses are all over the place. Also, “Centre” gives me think tank vibes, which may or may not be what you’re looking for.
If you’re tied to the name, I’d recommend dropping the first E to make it CEALAR (Centre for Effective Altruism Learning And Research) to make it more pronounceable, aesthetic, and brief.
Overall, names are hard, and I’m not sure it’s worth stressing over, as people will probably keep informally calling it the EA Hotel anyway.
Props for putting in the work to keep this organization alive and well. It’s a wonderful asset to the EA community. :)
I worry that without it, it’s too similar to CEA, though.
Valid.
You could keep the name but drop the first ‘A’: CEELAR
I also like this.
Thank you! :)
I agree with this. But I also want to add—I think a lot of EAs are put off from the rationalist community for different reasons (e.g. seemingly less-than-altruistic motivations, inaccessible language, discussions about things that don’t always feel practically relevant, etc.)
A personal anecdote: I’ve had an eye on LessWrong and other rationalist spaces for some time, but never thought it was my territory for some of the reasons mentioned above. It wasn’t until I went to a CFAR workshop that I finally felt I knew enough about rationality to actually contribute to rationality discussions.
I see a lot of work being done to make EA more accessible to people who don’t have personal ties to the EA community, but not as much effort from the rationalist community on this. I feel like this could potentially be impactful for contributing to the personal development of EAs.
I’ve been thinking about this also so I’m glad to see this post!
Anecdote: I’ve been talking to friends and family about COVID-19 since late January/February, and started my first attempts at social distancing towards the beginning of March. In those first few days, a lot of my (non-EA) friends seemed to think this response was an overreaction. Later on, a lot of them came around to say, “wow, you were right,” which I’ve tried to use to lend some more credibility to EA.
Some not-fully-formed ideas I have about this:
I think there’s an opportunity, once this begins to resolve, to hopefully get some media attention and say something along the lines of “there’s a community of people who were thinking about this before it happened, and who are thinking of other possible threats. Here’s how you can help.”
I also think that we might see an influx of young not-currently-EAs interested in helping address future pandemics, but I suspect a lot of them will be drawn to becoming doctors and nurses. So perhaps some publicity for these careers and/or a talent pipeline for biosecurity careers might be useful.
You’re probably right—mostly wondering if someone had more rigorous evidence on this (or ideas on how to get it) or examples beyond the mainstream ones.
Not academic or outside of EA, but this Forum comment and this Facebook post may be good starting points if you haven’t seen them already.
I’ll offer a data point: I’m not particularly motivated to post on the Forum by a monetary prize. It hasn’t led me to post on the Forum more than I ordinarily would. I am somewhat interested in social rewards, but the karma feature seems to do that better than the Forum Prize.
Also, as someone who doesn’t read every single post on the Forum, I find the Prize useful for highlighting what content is actually worth reading, but again, I think highlighting posts based on karma instead (with or without a monetary prize) would work just as well.
If the Forum Prize does continue, I do think there should be separate categories for professional researchers and “amateurs.”
I found this post really interesting—thank you!
One question I have after reading is the tractability of increasing benevolence, intelligence, and power. I get the sense that increasing benevolence might be the least tractable (though 80,000 Hours seems to think it might still be worth pursuing), though I’m less sure about how intelligence and power compare. (I’m inclined to think intelligence is somewhat more tractable, but I’m highly uncertain about that.)
Thanks for this post! I found it quite helpful.
I have a couple of questions about the checklist you linked, though I’m not sure how strongly you endorse it.
First:
Is there a substantial amount of literature in your field?
and
Was there a major discovery in the field in recent years?
seem to be indicators of neglectedness, which might make the topics more appealing to EAs. Do you think these are better pursued outside of academia? Or not at all?
Second:
Do you want a career in academia?
Is there a better option for prospective PhD students who want a career in research outside of academia?
I found the Unlocking Your Employability course on EdX had a lot of useful activities for improving self-marketing. Learning How to Learn on Coursera was also helpful, though it doesn’t have as many interactive activities. I’ve also heard good things about this Creative Problem Solving course, but I haven’t had the chance to try it myself.
Lynette Bye’s Productivity Tips also has a lot of useful resources for improving personal productivity.
I would also add CFAR as probably the most helpful tool I’ve gotten for improving productivity and decision-making in both personal and professional contexts. :)
Great response—thank you!
Great post! Been meaning to comment for a while—better late than never, I suppose.
One thing I wanted to add—I’ve talked with ~50 people who are interested in working at EA orgs over the last six months or so, and it seems like a lot of them come to the decision through process of elimination. Common trends I see:
They don’t feel well-suited for policy, often because it’s too bureaucratic or requires a high level of social skills.
They don’t feel well-suited for academia, usually because they have less-than-stellar marks or dislike the expected output or bureaucracy of academia.
And they aren’t interested in earning-to-give, almost always because of a lack of cultural fit. (They want to have colleagues who are also motivated to do good in the world.)
Per 80,000 Hours’ recommended career paths, that pretty much leaves working at effective nonprofits as the only option. And conveniently, nonprofit work (especially non-research roles) doesn’t usually come with a high bar of qualifications. A lot of positions don’t require a bachelor’s degree. Depending on the role, it’s not uncommon to find a year of vaguely-defined experience as the only minimum qualification for an entry-level job. So that seems like a reasonable choice for a lot of people… except that hundreds of other EAs also see it as a reasonable choice, and the competition grows very quickly.
I’ve certainly met EAs who seem really well-suited for direct work at EA orgs. But, in part because of the reasons mentioned above, I think the majority of people would be better off focusing their jobseeking efforts somewhere else. I do worry about swinging the pendulum too far in the opposite direction, where talented people stop applying for EA organizations.
I guess my recommendation for people interested in direct work would be to apply to EA organizations that interest you and that you think fit your skillset, but, at the same time, to also apply for EA-aligned organizations and/or impactful non-EA jobs where replaceability is likely to be lower. I also think, if you’re uncertain about whether to apply for or accept an EA direct work role, you can usually talk to the hiring manager about what they think your counterfactual impact might be. The nice thing about applying to EA orgs is that they understand those concerns, and it likely won’t negatively affect your application—in fact, it might reflect positively on you for thinking critically and altruistically (for lack of a better word) about your career.
Strongly agree with your points, although I also don’t think they’re mutually exclusive with the content of this post.
I think some of the most value I got out of university (and high school, to be honest) was the ability to try out a bunch of things at once with relative ease. I have a lot of interests that come and go rather quickly, and in the university setting, it was strangely easy to get involved in whatever new thing caught my attention, whether via a course, a club, meetings with a professor, an internship, a volunteer opportunity, etc. (Though I attended a small liberal arts college, which might have made this process easier.) I learned what I like, what I don’t like, what I’m good at, and what I suck at a lot more quickly than I think I could have outside of university, and I think a lot of this became valuable data for deciding on a career, in addition to opening doors to opportunities.
I think a common mistake I see in university students is thinking “I just want to focus on school” for their first three years, trying to secure an internship during the summer of their junior year, and then hoping that’s sufficient to get them a job. I don’t think this is a great idea. At the same time, I think narrowly focusing on identifying and pursuing a high-income, stable career path (or whatever one’s ideal career plan looks like) carries a lot of risk of burnout, poor performance, and misery if you’re unlucky enough to get it wrong. I see more students err in the former direction than the latter, though I imagine EA students probably have a higher tendency to over-optimize their career path.
I guess I somewhat lucked out in that a) my courseload was light enough that it allowed me to get very involved outside of class, and b) a lot of the things I was excited about were also employable skills. If that isn’t the case for someone, the “seek joy” and “plan your career” approaches might come more into conflict, but that wasn’t my experience.
I heard in mid-2019 that Open Phil was interviewing EAs to identify common characteristics / backgrounds among people who are most receptive to EA. I don’t think it was published, though.
If you haven’t already, I’d reach out directly to GPR organizations and mention that you’re interested in applying your skillset to their work. They might be able to provide you with some concrete examples and a better idea of what’s available in the field.
Hmm. On the one hand, I think these are all useful topics for an EA to know. But I don’t think it’s necessary for all EAs to know these things. There are a lot of EAs who don’t have this technical knowledge, but are happy to outsource decisions relying on it (such as where to donate) to people who do. That said, I think that often leads to donating less-than-effectively (e.g. giving to whatever EA Fund appeals to you personally, rather than rationally thinking about trade-offs/probabilistic outcomes).
I guess this is, in part, a big-tent vs. elite EA trade-off question. If EA is best as an elite movement, it makes sense that all the members should have this knowledge. But if we want to take an “everyone has a place in EA” approach, then it might not make sense to have a central curriculum.
Also, I don’t think we want everyone in EA to have the same skillset. EA isn’t, in my view, a single professional field, but perhaps more like a company (although this is probably an oversimplification). If a company gave all of its employees a handbook on How to Be A Great Project Manager, it’d be helpful… for project managers. But the rest of the team ought to be rounding out skills that others in the company lack, that suit their comparative advantage, and that will move the company forward. The only thing everyone at the company really needs to know is the product. Basic time management and other soft skills are also useful. I don’t think we need 100% of EAs to have a solid grounding in economics. Maybe we need ~100% of EAs to trust economics. But I’d rather have some EAs focusing on building skills like movement-building, communications, fundraising, operations/management, entrepreneurship, policy, qualitative research, etc.
Granted, I’m thinking about this from the perspective of careers, rather than being able to participate in discussions in EA spaces. To answer that aspect of it—although I certainly think it’s possible to discuss EA without economics / statistics / decision analysis knowledge, the conversation does sometimes go in this more technical direction and leave newcomers behind. The question, then, might be whether the newcomers should hold the responsibility of learning these things so that they can participate in these discussions, or whether the people who are discussing things at such a technical level should adjust the way they discuss these issues to make them more accessible to a non-technical audience. I lean more towards the latter (though it depends on the context).
I’ve spent a lot of time thinking about this, and I largely agree with you. I also think studying “pure” value drift (as opposed to “symptoms” of value drift, which is what a lot of the research in this area focuses on, including, to some extent, my own) comes with a few challenges. (Epistemic status: Pretty uncertain and writing this in haste. Feel free to tell me why I’m wrong.)
EA isn’t (supposed to be) dogmatic, and hence doesn’t have clearly defined values. We’re “effective” and we’re “altruistic,” and those are more or less the only requirements for being EA. But what is altruism? Is it altruistic to invest in yourself so you can have more of an impact later on in life? Effectiveness, on the surface, seems more objective, since it mostly means relying on high-quality evidence and reasoning. But evidence and reason can get messy and biased, which can make defining effectiveness more difficult. For example, even if valuing effectiveness leads you to prioritize interventions that have the most robust evidence, it’s possible that that robust evidence comes from p-hacking, publication bias, or studies with an over-representation of middle-class people from high-income countries. At some point, effectiveness (from the EA point of view) also hinges on valuing certainty vs. risk-taking, and probably a number of other sub-values as well.
Measuring raw values relies primarily on self-reporting, which is a notoriously unreliable social science method. People often say they value one thing and then act in a contradictory manner. Sometimes it’s a signaling thing, but sometimes we just don’t really understand ourselves that well. Classic example: a young college student says they don’t care much about financial stability, until they actually enter the workforce, get a not-super-well-paid job, and realize that maybe they actually do care. I think this is a big reason why people have chosen to focus on behavior and community involvement. It’s the closest thing to objective data we can get.
This isn’t an argument against what you’ve written. I still think a lot of people err in assigning the label “value drift” to things like leaving the EA community, which could arise from a number of different scenarios in which doing that thing actually perfectly reflects your values. I guess I don’t know what the solution is here, but I do think it’s worth digging into further.
I think one of the best things about hearing about EA pre-college is it would let you set up your college plan (e.g., major, internships) in an EA-directed way
To me, this seems like the best case for engaging with high schoolers over college students. I seem to meet a lot of EAs who study something that doesn’t correlate well with most high-impact careers and find themselves wishing they’d heard about EA sooner so they could have studied something more practical.
The major questions I have about this are 1) can you actually convince high schoolers to change their career plans, and 2) if so, will they actually apply EA ideas in a way that increases their impact? (As opposed to just blindly following 80k recommendations and doing something they don’t like or aren’t good at.) I guess both are also risks associated with trying to get anyone to make an EA-related career change, but high schoolers seem more at risk to me, particularly with #2, since I think they have less self-awareness regarding their skills and interests.
I definitely agree that religious outreach is a neglected but promising area of EA community-building.
I think a big part of what makes reaching out to religious groups at least somewhat promising is that a lot of them are already trying to do good. If we focus EA outreach on the general population, or on most of the other subpopulations EA currently targets, some people will care about doing good and others will have different motivations. But in many religious spaces, an obligation to help others is already at the heart of what they do. And it’s a lot easier to sell EA to someone who already agrees that we have an obligation to help others as much as possible. Of course, different sects and individual religious communities have varying degrees of commitment to service and doing good, but I would imagine there’s some research already available on which groups are most oriented towards doing good (and if not, this is certainly doable research).
Also, from the anecdotal experiences of friends and ex-colleagues, as well as my own, I know a lot of agnostics/atheists who are involved in religious groups because they’re looking for a community—often, more specifically, a community oriented towards thinking deeply about the world’s truths and/or doing good in the world. I think EA groups would fulfill this need for a lot of people (and perhaps relieve them from having to pretend to believe something they don’t in exchange for social support).