Interesting! Didn’t know that.
Not academic or outside of EA, but this Forum comment and this Facebook post may be good starting points if you haven’t seen them already.
I have seen a few EAs discuss selling eggs as a way of earning-to-give. If you’re not willing to compensate at market rates, getting a donation may be a bit more difficult, but I don’t expect it to be impossible.
If I were to look for an egg cell donor, I would probably make some sort of a post or Google Doc outlining exactly the type of person you’re looking for, what you’d expect from them, and what you’re willing to compensate. Then I’d share it on some EA platforms—I imagine you could generate some leads from Bountied Rationality, LessWrong, some EA Facebook groups (e.g. EA Hangout), maybe local EA groups, and perhaps the Forum.
I don’t have much to add beyond what the other two responses have said, except that I think it’s possible to have opportunities that both develop the inner self and benefit others. I probably wouldn’t endorse spending all your time on these activities, but looking out for them and prioritizing them seems like a good decision to me.
I don’t think there’s a One Right Answer or a one-size-fits-all approach, but I do think that using the comparative advantage framework may be helpful here.
You’re probably right—mostly wondering if someone had more rigorous evidence on this (or ideas on how to get it) or examples beyond the mainstream ones.
First: before you schedule any one-on-ones at EAG (or wherever you are), think about what you want to get out of them/the conference in general. This post includes some sample goals to consider. What your goals are will pretty much dictate what one-on-ones will be most valuable for you.
I’m coming at this as someone whose primary goal for EAGs has generally been to clarify my career plans, and whose secondary goal has been to make more EA friends, so my advice will likely be skewed towards that.
Who should I meet with?
I usually find myself scheduling three types of one-on-ones:
1. People who have a clear connection with my interests or projects (~70%)
2. Peers who share similar personal interests, hobbies, etc. (~20%)
3. People working in areas I don’t know much about but would like to learn more about and could plausibly see as changing my mind about a particular cause area / career option / etc. (~10%)
You can also get more people to schedule one-on-ones with you by filling out your conference app profile in full and including a couple things you’re interested in talking with people about. Also, if you’re using the Grip app, indicate “Interested” on other attendees’ profiles. The more you do this, the more meeting requests you’ll likely get!
What’s the best way to score a meeting with someone I want to talk to? (especially if I don’t have any immediately useful skills or knowledge to share)
First, speaking from experience, I find that EAs are more likely than average to hold a meeting with you even if you don’t have anything tangible to offer them. When you think about it, by helping you have more of an impact, they’re also increasing their own impact, which is motivating for most EAs. Don’t let not having anything to offer immediately keep you from reaching out to someone you think you could have a valuable conversation with!
That said, when you ask someone to meet, it helps if you add a sentence or two about what you’d like to talk about / why you think they’d be useful to talk to. This helps them prepare for the call better and (from anecdotal evidence) makes you more likely to get a response since they know exactly how the call will be useful for you/them.
What if someone I really want to talk to doesn’t respond? Should I follow up?
IMO, it depends, and it helps if you can read some non-obvious social cues here (or can get advice from someone who can).
Some things that have worked for me and others:
Offering to meet (perhaps virtually) later in the week / the following week, in case the person has a full schedule during the conference weekend
Meeting during their office hours (another thing first-timers are sometimes intimidated by but is actually really useful!)
What do I say? What sorts of questions should I ask?
This varies quite a bit based on what you want to know. Whatever that is, you’ll want to spend some time thinking about it beforehand.
If you’re totally new to one-on-ones, a quick Google search on sample informational interview questions will help get you started.
If you’re the type of person who gets anxious about one-on-ones, I find it helpful to run through the conversation in my head ahead of time, jotting down points I want to talk about / questions I want to ask. My conversations don’t usually follow the script in actuality, but it soothes my nerves a bit to have something to fall back on in case the conversation lulls.
Also helps to have a few pocket small-talk questions handy, particularly EA-specific ones. Things like “how did you get involved in EA?” and “what cause areas are you most excited about?”
What should I do after the conference? Should I follow up? How often/in what way?
EAG is a bit more informal than most conferences, so I find that the rules are a little more relaxed. It’s pretty typical to friend EAs on Facebook after the conference to keep in touch. It’s also nice to send a quick thank you email/message after the conference, especially to people more senior than you, and let them know down the line how you’ve used their advice.
One last thing: If one-on-ones are nerve-wracking for you, you’re not alone! As someone who had my first in-person EA interactions at EAG a year ago, I wasn’t sure what to expect meeting with EAs for the first time, but I’ve found EAs to be incredibly helpful and friendly. And if, for whatever reason, you find your experience to be anything less than that, CEA has an awesome community health team available to help you out. :)
I’ve been thinking about this also so I’m glad to see this post!
Anecdote: I’ve been talking to friends and family about COVID-19 since late January/February, and started my first attempts at social distancing towards the beginning of March. In those first few days, a lot of my (non-EA) friends seemed to think this response was an overreaction. Later on, a lot of them came around to say, “wow, you were right,” which I’ve tried to use to lend EA some extra credibility.
Some not-fully-formed ideas I have about this:
I think there’s an opportunity, once this begins to resolve, to hopefully get some media attention and say something along the lines of “there’s a community of people who were thinking about this before it happened, and who are thinking of other possible threats. Here’s how you can help.”
I also think that we might see an influx of young not-currently-EAs interested in helping address future pandemics, but I suspect a lot of them will be drawn to becoming doctors and nurses. So perhaps some publicity for these careers and/or a talent pipeline for biosecurity careers might be useful.
Who are you?
I’m Marisa :) I’m a recent grad currently working in operations at Rethink Charity and volunteering with social research at ALLFED.
What are some things people can talk to you about? (e.g. your areas of experience/expertise)
I talk to a lot of people about EA ops and getting a job at an EA org, but I generally see myself as a starting point for these conversations and will usually try to connect you with someone else who works more closely in the area you’re interested in or has more experience.
My coursework and experience with nonprofit boards and nonprofit communications
Value drift in EA
Social science research techniques
My (awesome) recent experience at CFAR
Also love talking to other college students / recent grads, particularly those making big career decisions!
What are things you’d like to talk to other people about? (e.g. things you want to learn)
Potentially impactful research topics in moral psychology, especially those that support moral and philanthropic education.
US and international public policy—particularly what kinds of roles are impactful and what determines personal fit for those roles.
Other applications for the social sciences in EA that I haven’t considered yet
Advice on starting a local or uni group from scratch
How can people get in touch with you?
Email me at marisajurczyk[at]gmail[dot]com or schedule a time on Calendly!
This might not be exactly what you’re looking for, but I’ll put it here in case anyone else who reads this might be interested.
Saulius Simcikas wrote an EA Forum post about potential meta-projects, which includes a subsection for software development projects.
I agree with this. But I also want to add—I think a lot of EAs are put off by the rationalist community for different reasons (e.g. seemingly less-than-altruistic motivations, inaccessible language, discussions about things that don’t always feel practically relevant, etc.)
From a personal anecdote: I’ve had an eye on LessWrong and other rationalist spaces for some time, but never thought it was my territory for some of the reasons mentioned above. It wasn’t until I went to a CFAR workshop that I finally felt I knew enough about rationality to actually contribute to rationality discussions.
I see a lot of work being done to make EA more accessible to people who don’t have personal ties to the EA community, but not as much effort from the rationalist community on this. I feel like this could potentially be impactful for contributing to the personal development of EAs.
Thank you! :)
I worry that without it, it’s too similar to CEA though.
You could keep the name but drop the first ‘A’: CEELAR
I also like this.
I agree that CEEALAR (I’m pronouncing it see-uh-lar, almost like CLR, in my head) looks a little odd and might be hard to remember the acronym for. But I also agree that to get charitable status, dropping “hotel” was probably a good choice. A lot of nonprofits in the US use “house” (e.g. Covenant House) to give more of a charitable vibe. “EA House” already sounds less for-profit, though maybe less distinctive since EA houses are all over the place. Also, Centre gives me think tank vibes, which may or may not be what you’re looking for.
If you’re tied to the name, I’d recommend dropping the first E to make it CEALAR (Centre for Effective Altruism Learning And Research) to make it more pronounceable, aesthetic, and brief.
Overall, names are hard, and I’m not sure if it’s worth stressing as people will probably keep informally calling it the EA Hotel as is.
Props for putting in the work to keep this organization alive and well. It’s a wonderful asset to the EA community. :)
I might be too late, but I was just cleaning through some notes of mine and found some questions related to this from various research agendas that I found interesting:
Why do people donate to ineffective charities?
Why do people want to donate directly and not to overhead?
Why do people care about future people more than current people?
How do social norms and expectations influence giving?
Some of these can be incorporated into your current questions (e.g. in the most important qualities when giving, you can mention low overhead). Maybe you can also have people choose or rank who they want to benefit from their giving the most (e.g. their community, animals, future generations, etc.).
Not sure if this is helpful, but glad that you’re doing this regardless. :)
This is interesting! Have other non-EA organizations been doing this? My main concern would be coming off as self-promoting (for individuals—such is expected for nonprofit orgs). I think EAs are particularly conscious about coming across as people who genuinely care about helping the world, rather than people who are just doing good for social status.
I also wonder if there is actually a stigma worth fighting around donating to EA and EA-aligned orgs. In my perspective, it seems that the biggest barriers to EA(-aligned) orgs getting the funding they need are: a) not enough big donors know about, or are sympathetic to, EA ideas, or b) orgs themselves aren’t giving compelling enough reasons to donate, in the form of quantitative data about cost-effectiveness, impact, and the like.
On the other hand, I could see a scenario where I saw on social media that Person A, who I greatly respect, donates to Organization B, and I might be more compelled to donate to them as a result (though this probably isn’t a particularly good way to go about deciding where to donate). I think this is part of why peer-to-peer fundraising is one of the most effective funding techniques. (P2P might be a way to accomplish similar goals to what you mention, while bypassing some of the challenges, though it of course brings up different challenges.)
Interesting idea! I do see some advantages to this—it seems like there are a lot of people trying to get experience working for EA-related causes, and quite a few of them seem interested in gaining ops experience. This seems like a good way to get those people connected with EA, and to get people involved who don’t currently have local communities.
One worry I have is that people who aren’t going to meetups probably don’t have a good sense of the culture of the local EA group and how well events are going. This could be worked around by collecting feedback from local group participants, but I think you could get much more information much more easily by being part of the meetups.
An alternative that I think might be useful: A lot of EAs have remote jobs or some flexibility on where they live. As someone with a remote job who currently lives in a community without an active local group and seemingly without people who would be interested in being involved with one, I would be open to moving to a community with lots of EAs but without an active EA group for the sake of helping as a group organizer. Perhaps I’m odd in that way, but perhaps there are a few other EAs that are similar.
In either case, I’d be interested in seeing a list of EA cities that have this problem and coming up with ways to match interested people with open roles. Feel free to reach out to me if you’d like help with this. :)
I’m surprised no one has recommended ‘Doing Good Better’ by MacAskill. I would say that and ‘Strangers Drowning’, as mentioned in a previous comment, were most responsible for my engagement with EA. ‘Strangers Drowning’ I think somewhat primed me to be EA—it made the ideas of EA seem less foreign and odd when I actually came across them. ‘Doing Good Better’ helped me understand the EA argument quite a bit better and was probably the thing that tipped me from being interested in EA to identifying more or less as an EA.