I very much agree with your second point. I think it would be especially useful to encourage EAs to announce recent donations they have made, so we can celebrate those donations, evoke warm fuzzies, and maybe even get some subtly competitive giving going.
ClaireZabel
Where do you think would be a good place? The points you brought up are concerning, but I think it works decently on LW (although the samey-ness potential is higher here...) and I’m not sure where a better place for it would be.
Thanks for writing this! I’m interested in how you think one should prepare for interviews like these. Did you write out your thoughts and the messages you wanted to communicate beforehand, then try to work them into the conversation? Do you meditate or exercise etc. to calm yourself down before the interview takes place?
Here are two small things I’m proud of, that I’ve been sustaining for a while now.
- I took on an extra job so that I can donate more, even though doing it requires me to face my most severe phobia, and stuck with it even when it was more difficult than I thought it would be.
- I stopped purchasing milk, eggs, and butter.
I asked my grandparents to donate to ACE on my behalf for my birthday. I just saw that the $500 donation went through :)
I don’t understand… are your events less than 30 minutes long (because it says that they now happen more than once a month)? It seems like most of the time in running our group doesn’t go into social media promotion and reminders, but into the group meetings, discussions, and activities, and the socializing before and after that makes people stick around.
I felt like this post was misleading, because the author appears to spend more than 30 minutes, as any group leader should expect. I am concerned that the author used a surprising title to get attention but then added lower-value, common-sensey content.
Are we not double-counting “good done” if we follow the guidelines in that post? I.e., by attributing full “credit” for good done to the recruiter, we implicitly can’t also attribute it to the EA convert. But usually we also “credit” the person doing the EtG (or other EA activity) with the full amount of good they do.
Like, it would seem strange if all the EA good I did for the rest of my life was credited to the people who helped recruit me. They are awesome and deserve lots of praise and credit as well, but perhaps not all or most of the credit for the EA hours I work.
If most people who will become EAs are basically EAs-in-waiting (they just need to hear the magic EA words or whatever), then the recruiter is probably partly responsible for how much faster he or she made this happen, but not the lifetime good done. If he or she made a person learn about EA a year earlier, that’s maybe 1-2 lives saved (if the convert makes like 33-66k per year).
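A back-of-envelope sketch of that last estimate (the assumptions here are hypothetical fillings-in, not from the comment itself: the convert donates 10% of a $33-66k salary to a top charity that saves a life per roughly $3,300, a GiveWell-era figure):

```python
# Rough counterfactual value of recruiting someone one year earlier.
# Hypothetical assumptions: 10% donation rate, ~$3,300 per life saved.
COST_PER_LIFE = 3_300
DONATION_RATE = 0.10

def lives_saved_per_year(salary):
    """Lives saved per year of earning-to-give at the given salary."""
    return salary * DONATION_RATE / COST_PER_LIFE

for salary in (33_000, 66_000):
    print(f"${salary:,}/yr -> ~{lives_saved_per_year(salary):.1f} lives/yr earlier")
```

Under those assumptions the $33-66k range works out to roughly 1-2 lives per year of earlier involvement, matching the figure above; different donation rates or cost-per-life estimates would shift it proportionally.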
We’re also considering removing other personal questions that we’ve already gathered recent data on, such as gender, diet and religion.
Really don’t think you should cut these. We’re trying to diversify the movement, and the survey seems like one of the only comprehensive ways to see if it’s working. W/r/t the diet question, I think that’s incredibly valuable and useful information for EAA. I’d also be interested to know how many EAs changed their diets after becoming EAs.
Also, EA is young and growing so fast. A lot can change in a year.
People might also feel judged if asked how much they donate, but asking this is part of assessing what part of EA principles they’re adopting, and how effective EA is at actually driving behavior change in self-identified EAs, which is sort of the point of the survey.
I think EAs irrationally avoid giving to “second-best” charities (like GiveWell’s standouts), but that’s a relatively weak impression. It might be helpful to talk more about top giving opportunities in a given moment/year, rather than about top charities, which can become less “top” as donations fill their funding gaps, until donating doesn’t feel so shiny anymore (also saying this as a random EA, not as a soon-to-be GW staffer).
Of course, it might be better to ask people to give later in general, but there’s no reason as far as I know to believe the best order would be ‘donate to room-for-funding-remaining top charity’ > ‘donate later’ > ‘donate to second-best charity.’
Also, as Eliezer and Jacy pointed out on Facebook, this sufficient funding argument is far less true of existential risk and animal-focused charities than global poverty ones (in fact, many of those are somewhat strapped for cash).
They do. My comment was in reference to the fact that the top charities may run out of room for funding. When that occurs, in my experience (some) EAs tend to forget about or avoid opportunities to help fund standout charities, which are still very effective, although slightly less so, out of a bias against “second-best” opportunities.
I’ve heard about 7 EAs discuss hesitating to donate to GW’s top charities because they thought the top charities would run out of room for funding without their contributions. They all decided to either give later, or give to a non-global poverty charity. None mentioned the possibility of donating to a standout charity.
We used:
“Hey, welcome to the Effective Altruism facebook group! If you have a moment, would you mind telling us where you first heard about EA?
Thanks!
Claire
(moderator)”
We are considering a/b testing some new questions, and would love suggestions on different phrasing.
And when someone in the group adds a new member, we still have to approve them. We messaged them as well.
How we can make it easier to change your mind about cause areas
I also think it would be great to have a centralized location, probably with both some general arguments about different cause areas and charities, and write-ups of individual decisions about donation and other forms of support. Unfortunately I just don’t have time, but if you guys wanted to work on this some people in the Cause Prioritization Facebook group might help (perhaps Issa).
Very much looking forward to the full write-up, Tom!
Copied from my comment on a Facebook post:
I especially liked Nick’s sapling analogy, and found it fitting. I worry that EAs are drawn from subgroups with a tendency to believe relatively simple formalistic and mechanistic processes essentially describe complex ones, with perhaps a decrease in accuracy (relative to more complex models) but not in the general sign and magnitude of the result. This seems really dangerous.
“Imagine a Level 1 event that disproportionately affected people in areas that are strong in innovative science (of which we believe there are a fairly contained number). Possible consequences of such an event might include a decades-long stall in scientific progress or even an end to scientific culture or institutions and a return to rates of scientific progress comparable to what we see in areas with weaker scientific institutions today or saw in pre-industrial civilization.” It seems likely that any Level 1 event will have disproportionate effects on certain groups (possibly ones that would be especially useful for bringing civilization back from a Level 1 event), and this seems like a pretty under-investigated consideration. Consider a pandemic that was extremely virulent but only contagious enough to spread fully in big cities, or extreme climate change or geoengineering gone awry knocking out mostly the global north, or mostly equatorial or coastal regions.
He doesn’t really discuss the possibility of a Level 1 event immediately provoking a Level 2 event, but that also seems possible (for example, one catastrophic use of biowarfare could incentivize another country to develop even more powerful bioweapons, or to develop some sort of militaristic AI for defense. Or catastrophic climate change could cause the use of extreme and ill-tested geoengineering). This actually seems moderately likely, and I wonder why he didn’t discuss it.
This is a comment from Jim Terry, reposted with permission (none of it mine)
“There is essentially no precedent for level 1 catastrophes” is immediately followed by a list of at least one level 1 catastrophe, by his previous definition. (“Hundreds of millions of people”; the Black Death qualifies by body count, depending on your estimates; the others would count if you adjust proportionally for world population.) If we retroactively apply the threshold of 5% or more of the global population dying (350m, in today’s terms), the Mongol conquests (100m, 20-25%), the Wars of the Three Kingdoms (40m, 10-25%), the Plague of Justinian (25-50m, 10-25%), and potentially the Native American die-off consequent to the Columbian exchange (estimates are hard) count. (Note that all of these except the Plague of Justinian were spread over decades, but even doing some generational amortization, all of them except the Native American die-off likely make the cut anyway.)
“For the most part, these events don’t seem to have placed civilizational progress in jeopardy.”
Wild speculation! We don’t know the counterfactual scenarios. My off-the-cuff counter-speculation is that if not for the Plague of Justinian, we might be settling Alpha Centauri by now, and looking back at the possibility of still being an earthbound civilization in the third millennium as a grim dark alternate history.
To point to specific past events that probably should be considered level 1 catastrophes, not just by death toll but by impact: the Mongol conquests are a plausible explanation for why the Muslim world didn’t continue to be dramatically more enlightened and advanced than non-ERE Europe. 1258 was one of those watershed years in history, after which the future of Islam was a lot grimmer. The Mongols also had a dramatically bad impact on the Russian cultural bloc (viz., they overran it and infected it with their values), which did some bad things to human progress. Generally speaking, Nick’s worries about what might happen to social progress following a level 1 catastrophe all in fact did happen in this instance. Worries about the stall to scientific progress are validated here, too; the loss of the House of Wisdom is probably the most dramatic example, but a flourishing scientific culture took a downturn.
Also, consider the fall of the Western Roman Empire. It was a catastrophic event widely thought to have had a significant negative impact on technological and social progress but without a particularly impressive direct death toll.
Both of these tie into the disastrous repercussions of the plague of Justinian—Justinian later became known as the emperor who reconquered Italy and large portions of the Med. If not for the plague weakening the ERE by killing 40% of his bros, things might have gone very differently. (Potential outcomes: no Middle Ages, ERE hegemony over the West and Arab world; Mongol aggression confined to East Asia, because horse archers don’t do as well against automatic weapons.)
Interestingly enough, the one non-modern global catastrophe the author was aware of actually may have had some positive social impact. It’s a controversial historical view, but the thought is that the Black Death may have created some space (....where a lot of people used to be...) for the Renaissance to blossom.
http://www.gmanetwork.com/.../the-black-death-gave-life...
Hi, I’m Claire. I’m a master’s student at Stanford, and I also work at Harvard’s Kennedy School of Government. I research climate engineering (manipulating the world’s climate to reduce the effects of climate change and otherwise make Earth more habitable). I’d really like to go into government work and this year I’ll be thinking and possibly blogging about what EA policy-making might look like.
I used to write a lot, and I’m interested in how fiction can be used to spur activism and reshape ethical frameworks, especially in young people. I’m also curious about how the relationship between EA and transhumanism, and between EA and animal rights activism, might evolve in the coming years. I’m interested in how evolution shaped human intuitions about altruism and utilitarianism, and how the EA movement can use or respond to these instincts.
I also do some earning to give on the side, and plan to do much more in the coming years. My not-so-EA activities are hiking (who doesn’t like hiking?), cave-diving and other unusual types of scuba diving, reading (mostly speculative fiction and classics), and painting with watercolors.