The book this is based on, Schindler’s Ark by Thomas Keneally, is also great if you want to delve more into the characters’ psychology.
Perhaps not EA-related, but as a refreshing change from the apparent focus on villains in both “literary” and “genre” fiction, I’d recommend Anna Funder’s ‘All That I Am’ and ‘Stasiland’. They are based on the stories of resisters to the Nazi regime and the East German (GDR) regime, respectively. ‘Stasiland’ is non-fiction (long-form journalism) and also includes stories from informers and ex-Stasi police, but most of the focus is on the resisters. ‘All That I Am’ is based on the life of a person whom the author, Anna Funder, knew personally.
I don’t feel qualified to comment on this myself, but I found an interview with Peter Singer, published yesterday, that touches on the topic of politics and EA. One relevant extract:
“[Singer] proudly recalls how many of his own students have been turned towards Effective Altruism and have decided to integrate it into their future lives. He then briefly alludes to students’ political leanings, and I decide to probe a little further, asking, more generally, about how the philosophy plays out in the political domain.
“It’s clearly political in so far as it is trying to get away from the views of people on the right, like Ayn Rand. It is a movement away from the idea that it is good to be selfish, that somehow under capitalism people thinking and acting selfishly works under this hidden hand to do the most good. It doesn’t do the most good, and we need to think about directly aiming at doing good for people who don’t have the same chance to get into the global economy. So in that sense it is taking a stance against a certain political and economic thinking. On the other hand, it is also taking a stance against the idea that the solution to all these problems is a revolutionary overthrow of the capitalist system. It is saying, look, capitalism has been around a long time, it doesn’t look like we are going to overthrow it very soon and it is not clear what the best alternative would be. So while we are here, let’s try to do what we can within that system. In fact, it is kind of ironic that sometimes Marxists object to this, and yet that is exactly what Engels did. He was a capitalist running a factory in Manchester, and without his financial support, Marx wouldn’t have had the leisure to write the works that he did.””
Full article: https://cherwell.org/2019/05/17/interview-peter-singer/
Currently reading Cal Newport’s ‘Digital Minimalism’. Even as an older millennial who has been implementing some of his practical tips for some time, I find that he explains the detrimental effects of social media (mostly focussed on harms at the individual/social-group level rather than the societal level) in an accessible yet detailed way.
In terms of practical advice, I’m personally not in favour of “willpower alone” approaches (though arguably I “use” them for social media I’ve never been drawn to, e.g. Instagram). At this point I believe that social media is intended to be addictive, and there’s no reason to forcibly expose yourself to an addictive substance. Options available include:
Time blockers: Block websites/apps during a specific time window. Freedom, while costing around $30 a year on a subscription basis, works very well on PC, and it’s not obvious to me how you’d circumvent it when the anti-deactivation features are enabled (you probably still need the willpower not to Google how to circumvent it). AppBlock is a workable option for Android, though very easy to circumvent if you’re semi-determined.
Time limiters: Limit your time on specific sites. The benefit is that you can still access the site (many of these also include time-blocker options), just not for long. LeechBlock and StayFocusd are well-known ones. They’re easy enough to circumvent, but can often be enough.
Feature blockers: Good for when you “need” to use a social media platform but basically want it without the addictive features. Examples include News Feed Eradicator for Facebook. A great phone-based option is to delete/block the Facebook app on your phone but keep Messenger and/or Facebook Local (the events app). Anything to get you away from features like autoplay-next-video, suggestions/recommended content, and algorithmically generated “feeds”. Despite it being trivially easy to circumvent, in over a year of using News Feed Eradicator I have never felt the slightest desire to circumvent it to view my news feed, which is startling when you consider how much time I previously spent doing something I apparently had no specific desire to do...
“EAs make sacrifices by being prepared to accept the substantial probability of themselves never having impact. This would be hard to take psychologically, but might be the right thing to do in a crowded talent space.”
My impression has always been that even the most qualified person going into the most promising field (say, AI risk reduction) has a low absolute chance of being the one to make a breakthrough in that field. Rather, part of the point of EA is to get more talented people into those fields (e.g. by increasing the number of jobs) and thereby increase the chance that a breakthrough is made by someone.
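To illustrate the arithmetic behind this (with purely made-up numbers, not actual estimates): even if each individual has only a small chance of being the one to make the breakthrough, the chance that someone does rises quickly as more people enter the field, assuming their chances are roughly independent. A minimal sketch:

```python
# Illustrative sketch only: the 1% per-person success probability is a
# made-up number, and independence between researchers is an assumption.
def p_someone_succeeds(n_researchers: int, p_individual: float = 0.01) -> float:
    """Chance that at least one of n researchers makes the breakthrough."""
    return 1 - (1 - p_individual) ** n_researchers

for n in (1, 10, 100, 500):
    print(n, round(p_someone_succeeds(n), 3))
# 1 -> 0.01, 10 -> 0.096, 100 -> 0.634, 500 -> 0.993
```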
A problem with this post is that its conclusion that the “left” poses more “risk” is based on the number of individual perceived objections from the left. Even if that count were accurate, it conflates the number of separate issues with some measure of the overall “magnitude” of risk: it takes no account of how many people actually raise each objection, or of the “intensity”/impact of their complaints. And, as Halffull points out, that impact could even be positive if they’re identifying a real problem with EA.
I don’t want to be overly pedantic, but there are also inconsistencies in this post that make even its count of objections appear more lopsided than it is. The total number of objections from the left is inflated by listing several closely related criticisms of Peter Singer (autism rights, disability rights, and others) as separate items. In contrast, in the “problems with the right wing” section, similarly related complaints (the abortion-related and zoophilia objections) are listed in a single point. This inconsistency increases the apparent number of left-wing objections, the number on which the author then bases their conclusion.
I also think the post is missing relevant research: a recent podcast (80,000 Hours? Someone help me if you can remember; it’s really hard to search content in podcasts) suggested that the rise of extreme right-wing populist nationalist politics is creating risks in the nuclear warfare space.
Another point: the EA survey consistently suggests that most EAs are left-wing. Anecdotally, most of those I know seem to be reformist and centre-left. Both the statistics and my experience suggest that the centre left, perhaps those disillusioned with more extreme leftist positions such as proposals for revolutionary communism, may be a significant source of people coming into EA, often bringing with them motivation, experience of community organising and other useful skills.
In my experience, this overvaluation of depression, or fear about what might happen if you feel happier, is a really common concern among some types of creatives (though in their case it has more to do with inspiration than motivation). In both cases, I’d say it’s probably an incorrect perception that results from the depressive state itself.
Correction: the article summary says “quarter of a million”, but the article makes it clear that the figure is 250 million, i.e. a “quarter of a billion”. Feel free to delete this comment. :)
Interesting research. I first became aware of this issue through involvement in the animal welfare movement, specifically with small/“pocket” pets, where the sale of breeding “overstock” for reptile consumption is sadly common. Unfortunately, some people simply enjoy the spectacle of their pet consuming live prey. More generally, it’s part of the broader issue of carnivorous pets: the meat produced for consumption by dogs and cats is likely to come from factory farms similar to those raising meat for human consumption, where conditions may be little better than those of the mice pictured here. This has led me to a personal decision to refrain from having non-vegetarian pets, and I know that other EAs have done likewise.
Depending on the level of government involved in making and/or enforcing animal welfare regulations, this sounds plausible even in a First World country. While discovering major, overt bribery in the federal government would be a shocking scandal, a lot of bribery and corruption occurs at lower levels of government, particularly between businesses and local councils. It sounds like John_Maxwell_IV has stats on bribery in animal welfare organisations, and I’d definitely be interested to see those.
General vegan movement, including to a greater or lesser extent people becoming vegan for health, environmental, and/or animal suffering reasons
Left-wing people who are disillusioned with the prospects of systemic change (or with the avenues normally promoted for it), or with their ability to help people in other countries
People who are accustomed to donating money, e.g. for religious reasons: potentially a large demographic, but their donations may be hard to sway, though I have had some success introducing GiveWell to people who have left religion and are looking for secular charities
This is a great article! Given the extremely high cost and demanding debt structure of college in the US, do you think that those of us who are lucky enough to live in countries with free, cheaper, or more easily paid-off college tuition should remain in those countries at least for undergrad (while aiming for top world colleges if/when we do a PhD)?
Yes, it’s great. I was talking to some people about this topic on New Year’s Eve, wish I’d had this stat and the link to this article then!
I just saw this post and came onto this comment thread to post that (had the Amazon link open and everything)! I’m home living with family for the holidays and while moving my bookshelf a few days ago I came across ‘Children Just Like Me’. It led me down a whole pathway of reflections about how much I loved that book as a kid and whether it was something that prompted me toward EA values.
I must have read it at least half a dozen times as a child, as I can remember parts verbatim. I am so amazed that other EAs grew up reading it! Wow, this has made my day. I’m tempted to order a copy for my little cousin now.
This reminds me of a pretty excellent Simone de Beauvoir quote: “We must decide upon the opportuneness of an action and attempt to measure its effectiveness without knowing all the factors that are present.” (From The Ethics of Ambiguity) I quite like it because I don’t interpret it as an argument against trying to measure and predict the consequences of an action, but rather as an acknowledgement that uncertainty and incomplete information are facts of life, and that at some point we must act anyway rather than becoming paralysed. We should always be at least passively open to the possibility of new and unknown factors, and compassionate toward people (including our past selves) who have made mistakes or held views that turned out to be incorrect.
To put a brighter spin on what others have said about tractability: many EA-backed cause areas already reduce trauma, too. If a child doesn’t die of malaria, their siblings and parents are also spared the huge trauma of experiencing the death of a family member.