Computer science student at UCL. Previously finance lead at CEA, Nov 2019 - Aug 2021.
Ben
I’ve been learning to code with Python and I did my first tiny bit of machine learning—I figured out how to do a polynomial regression to look at global average sea surface temperatures!
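The polynomial regression mentioned above can be sketched in a few lines of NumPy. This is a minimal illustration on made-up data (the temperature values below are invented for the example; real sea surface temperature records would be loaded in their place):

```python
import numpy as np

# Hypothetical yearly global average sea surface temperature anomalies (degrees C).
# These values are synthetic, generated just to illustrate the fitting step.
years = np.arange(1980, 2020)
rng = np.random.default_rng(0)
temps = (0.0002 * (years - 1980) ** 2
         + 0.01 * (years - 1980)
         + rng.normal(0, 0.05, years.size))

# Fit a degree-2 polynomial; polyfit returns coefficients, highest power first.
coeffs = np.polyfit(years, temps, deg=2)
trend = np.poly1d(coeffs)

# Evaluate the fitted trend, e.g. extrapolate to 2030.
print(trend(2030))
```

`np.poly1d` wraps the coefficients in a callable, which makes it easy to evaluate the fitted curve at new points, though extrapolating a polynomial far beyond the data range should be treated with caution.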
Thanks for your feedback! I agree that people should read the essay and make up their minds for themselves.
To address the points you raised:
- As a set of notes, it claims to address the most important economic and political priorities, and I think this is the criterion on which it should be judged. My view is that it fails to do so.
- My main beef with Cummings is that he overreaches in areas he’s not familiar with and shows uncharitable disdain for the work of others, and I think this is consistent throughout the piece.
- In my view there is a tension between his admiration of big government projects and his failure to talk about democracy or link government activity to public needs. If I rewrite the piece, I’ll make this more explicit.
- This is a nice idea, but I agree with Hauke that this risks increasing the extent to which EA is an echo chamber. Perhaps you’re not aware of the (over)hype around some of these books in EA.
I think Rationally Speaking is particularly good at engaging with a range of people and perspectives.
I believe Founders Pledge is working on a climate change fund with similar objectives to be launched later this year. Their current recommendations are here. There’s also ClimateWorks.
In 2019, I planned to donate 5% of my income. For three months, I allocated that 5% via payroll giving as follows: EA Funds Animal Welfare 5%, LTFF 35%, EA Meta 30%, ALLFED 25%, and, after reading this, CFRN 5%. Then I became more concerned about GCRs and switched to 50% each to GCRI and ALLFED, again through EA Funds.
I used my old company’s matching scheme to provide £500 (plus GiftAid) through EA funds to ALLFED, which was free of charge for me. I donated £100 to Climate Outreach when they had a week of matching. I’ve also previously donated £20/month to the Vegan Society, because of their public campaigns to increase the availability of plant-based food, but I stopped donating there so I could invest more in GCR reduction.
In the last few months of the year, I watched Phil’s talk about optimal philanthropy and decided that (a) I was in an optimal stopping problem where I hadn’t yet explored enough options, and (b) there may well be higher marginal benefits to future spending on x-risks. Since then, I’ve maintained a spreadsheet of my income (of which I’ve spent about 35%) and have invested the rest using this advice.
I tentatively plan to donate to long-term causes, but potentially not any time soon, once I’ve done more research on the most tax-efficient way to invest and donate. For 2020, my only outgoing donations so far have been to CATF and CFRN because of this talk on climate and x-risk, which I’m planning to write up in a forum post soon.
How would effective altruism be different if we’re living in a simulation?
Thanks for sharing this piece and looking for constructive feedback. I’d agree with most of the points made by other commenters. I would also suggest:
Engage more with primary sources and more things written by people outside of effective altruism. There are thousands of climate scientists with interesting things to say, and a relatively small number of people in EA thinking about this.
General humility about this field—we don’t have great data on what the climate and society will be like in 50, 100, 200, or 500+ years’ time, and it’s hard to know what the limits for habitation and adaptation will be.
How would you define existential threat? I’ve heard David Wallace-Wells say that he thinks climate change is already an existential threat because it’s already leading us to change how we live our lives. You seem to use Bostrom’s definition. Why do you think it’s better?
Hey there, interesting article! In this talk from the most recent EA Global, Niel Bowerman (climate physics PhD and now AI specialist at 80,000 Hours) gives some thoughts on the relationship between climate change and existential risk. Essentially, I think there’s some evidence on point 2 of your list.
In his talk, Niel argues that climate change could cause human extinction by itself under some scenarios. These are quite unlikely, but have non-zero probabilities. And given that emissions are likely to continue well beyond 2100, we should beware the "2100 fallacy" of cutting impact analyses short at an arbitrary point in time.
The larger contributions, very roughly, probably come from climate change fuelling social collapse and conflict, which themselves lead to existential risks. Toby Ord has called this an ‘existential risk factor’. I think the question isn’t “Is climate an existential risk?” but “Does climate change contribute to existential risk?”, in which case it seems the answer might be yes. Or perhaps “Is climate change important in the long term?”, in which case, if we’re thinking across multiple centuries and looking at, say, >6C in 2300, then even with lots of technological development I think the answer is yes.
All of this being said, I still think it’s fair to argue that AI, bio, and nuclear are more neglected and tractable relative to climate change.
What do you think of Niel’s talk and this framing?
I would say don’t get an MBA unless you are really, really sure, as they are mega-expensive and, I think, marketed very broadly to people who often don’t benefit from them.
Hey Jeremy! Joan Gass and I at CEA, and Markus Anderljung at FHI, all use skills like the ones you mention above, gained from our consulting backgrounds, at non-profits.
I sometimes look at this filter on the 80K job board; one example of a role you might like is this one. Working in government is often a good thing to do, so there may be some US trade/aid organisations you’d find interesting—see also this talk. And if you think consulting can boost companies’ productivity and contribute to economic growth overall, that could be interesting too.
In a blog post from 2019, Kimberly Huynh from the GiveWell team mentioned that they were intending to do further research on climate change mitigation. At present, it seems that only Founders Pledge is doing this research. Is climate change something GiveWell is looking into more generally?
Interesting idea! It might be nice to embed the image, or maybe multiple images. If you don’t know how, you can do this by uploading the image to imgur, typing a placeholder word like “photo”, selecting it, then choosing the image icon. You can then resize the image by dragging it.
Thanks all for your comments. A few friends have emailed me and made a couple of points about this post.
1. On the first-order effects of warming, the Stern Review figures are now 10 years out of date, and the IPCC SR1.5 projects worse welfare impacts than previously estimated, under all trajectories. See Chapter 5 of the report, and Byers et al. 2018.
2. A good source on the impact on GDP and societies through sea level rise is Pretis et al. (2018, Philosophical Transactions of the Royal Society)
3. The goal for climate change mitigation should be reaching net zero emissions as fast as possible, as anything short of that still causes warming; this goal is absent from many EA discussions and from the 80,000 Hours write-up.
4. Absent from these discussions are climate economists, who would be able to help us grapple with this more concretely. Some suggested economists to research (and for the 80,000 Hours Podcast) are Adair Turner, Simon Dietz, Cameron Hepburn, and Joeri Rogelj.
Thank you for your thoughtful feedback. I’ve thought about the points you raise and I think they are all good challenges. I agree that Cummings raises interesting and relevant points in a range of areas.
I think you and I have different views on several points. The most important one seems to be that I think the piece at times aspires to represent the whole, even as a sketch (‘An Odyssean education’). And my view is that two of the most important areas (if I were writing such an essay) would be voting and political systems, and climate change, neither of which I feel get sufficient attention in the piece. It seems that on each of these topics you take a different viewpoint, which I respect. Again, thanks for your feedback. Unless there’s much else you think we can do to resolve this difference, I’d probably leave it there.
Thanks, this is useful. You mentioned above that you’re planning to list more roles looking at biosecurity and climate change. What are 80K’s current thoughts and potential plans, if any, in relation to climate change?
Have you tried contacting him to discuss? https://rdanielbressler.com/
Many animal welfare initiatives seem to focus on farmed animals. Farmers are experiencing weather extremes, less predictable seasons, wildfires, and flooding from climate change, all of which are likely to affect farmed animal welfare. How, if at all, does this influence the strategy of the animal welfare movement?
I think this is an interesting area of research. I’m not aware of much writing by EAs, but bear in mind that the EA community is pretty small compared to the total number of people researching this and related fields across the world, so you might find other organisations or researchers who’ve looked into it more.
Different people in the community will have different views, but my own take is that capitalism and markets can be great for growth and improving productive capacity, provided you make sure the benefits are spread throughout society (see the book Why Nations Fail).
I’m sorry to hear that you’re feeling overwhelmed by things. I’ve felt the same way at times. It’s important to look after yourself, take time off, and connect with other people. For me, I love watching The Simpsons, going for runs with my friends, and drinking coffee!
My own take on this is that the world is big and messy, and there are a lot of bad things that we, as individuals, have to accept we can’t control. But if you can find a niche doing something that hits the sweet spot of being both enjoyable and improving the world, then you can have a pretty good time!
I suspect you might be able to find lots of ways to use AI to make things better—I’ve seen some great work in improving agricultural production using machine learning which seems pretty good. And I’m sure there are lots of businesses and charities that would be interested in someone with your skillset.
Hi Peter,
I’d like to make the eligibility criteria clear to any prospective applicants:
“The Paycheck Protection Program is a loan designed to provide a direct incentive for small businesses to keep their workers on the payroll.” (link)
The boards and directors of the business have to sign in good faith that “Current economic uncertainty makes this loan request necessary to support the ongoing operations of the Applicant” (link)
Providing misleading or incomplete information is a federal crime.
This is an emergency support loan exclusively for businesses seeking to retain workers they’d otherwise be forced to make redundant. My reading of your summary is that you could make this point more prominent; at the moment, it’s only included in the required documents section.
I think the wording ‘make sure to mention that uncertainty of current economic conditions makes necessary the loan request’ could be misread as encouraging people to exaggerate this factor, though I appreciate this may not be your intention.
I think it would be safer to say that ‘this loan is exclusively available to businesses which are struggling to maintain their staff on the payroll and meet bill payments, and if this condition applies to your organisation, then please report this accurately in the documents you provide’.