I’d like to make the eligibility criteria clear to any prospective applicants:
“The Paycheck Protection Program is a loan designed to provide a direct incentive for small businesses to keep their workers on the payroll.” (link)
The board and directors of the business must certify in good faith that “Current economic uncertainty makes this loan request necessary to support the ongoing operations of the Applicant” (link)
Providing misleading or incomplete information is a federal crime.
This is an emergency support loan exclusively for businesses to retain workers they’d otherwise be forced to make redundant. My reading of your summary is that this point could be more prominent: at the moment, it’s only included in the required documents section.
I think the wording ‘make sure to mention that uncertainty of current economic conditions makes necessary the loan request’ could be misinterpreted as encouraging people to exaggerate this factor, though I appreciate this may not be your intention.
I think it would be safer to say something like: ‘this loan is exclusively available to businesses that are struggling to keep their staff on the payroll and meet bill payments; if this condition applies to your organisation, please report it accurately in the documents you provide’.
Re: 9 - I wrote this back in April 2019. There have been more recent comments from Will in his AMA, and Toby in this EA Global talk (link with timestamp).
In a blog post from 2019, Kimberly Huynh from the GiveWell team mentioned that they were intending to do further research on climate change mitigation. At present, it seems that only Founders Pledge is doing this research. Is climate change something GiveWell is looking into more generally?
Many animal welfare initiatives seem to focus on farmed animals. Farmers are experiencing weather extremes, less-predictable seasons, wildfires, and flooding from climate change, all of which are likely to affect farmed animal welfare. How, if at all, does this influence the strategy of the animal welfare movement?
Hey, I can personally vouch that the EA London group is great—I was a member for a year or so before moving to Oxford. EA London list their events here, and the site also has an email address and member directory.
Sounds great! You could also see if there’s an EA group near you.
Yes, the way that worked in consulting for me was that the referral bonus was (very approximately) something like 10% of the hire’s salary. So if someone very senior got hired, you could maybe double your annual paycheck. Not sure that would be appropriate for FHI though...
How do you see climate change affecting the work of AMF? Do changes to water and temperatures mean that the bednet strategy is still likely to produce similar results in the future as it has in the past?
A post from EA at Harvard from 2017 recommends the following:
- Working with local partners to advocate against coal power in China, India and Southeast Asia
- Growing capacity and coordination at state and local levels in the U.S.
- Contributing to one or more climate philanthropy bodies that strategically target climate finance interventions
No worries—edit made.
Interesting idea! It might be nice to embed the image, or maybe multiple images. If you’re not sure how, you can upload the image to imgur, write a placeholder word like ‘photo’, select it, then choose the image icon. You can then resize the image by dragging it.
I think that sounds like a great idea. You could put forward a proposal on the EA Forum, with a form for people to express interest, and share it in other places where EA survey respondents expressed an interest. If the EA survey data is accurate, I’d expect enough interest to get it running.
In 2019, I planned to donate 5% of my income. I used payroll donations in the following proportions of this 5% for three months: EA Funds Animal Welfare 5%, LTFF 35%, EA Meta 30%, ALLFED 25%, and due to reading this, CFRN 5%. Then I became more concerned about GCRs, and switched to 50% to GCRI and ALLFED, again through EA Funds.
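For anyone wanting to sanity-check a split like this, the arithmetic above can be sketched in a few lines of Python (the income figure here is purely hypothetical; the proportions are the ones stated):

```python
# Hypothetical sketch of the allocation described above:
# 5% of income, split across funds in the stated proportions.
income = 30_000        # hypothetical annual income, for illustration only
donation_rate = 0.05   # the 5% pledge

split = {
    "EA Funds Animal Welfare": 0.05,
    "LTFF": 0.35,
    "EA Meta": 0.30,
    "ALLFED": 0.25,
    "CFRN": 0.05,
}

# The proportions should cover the whole donation budget.
assert abs(sum(split.values()) - 1.0) < 1e-9

budget = income * donation_rate
amounts = {fund: budget * share for fund, share in split.items()}
```

With these example numbers, the 5% budget is £1,500, so e.g. the LTFF share works out to £525.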
I used my old company’s matching scheme to provide £500 (plus GiftAid) through EA funds to ALLFED, which was free of charge for me. I donated £100 to Climate Outreach when they had a week of matching. I’ve also previously donated £20/month to the Vegan Society, because of their public campaigns to increase the availability of plant-based food, but I stopped donating there so I could invest more in GCR reduction.
In the last few months of the year, I watched Phil’s talk about optimal philanthropy and decided that (a) I was facing an optimal stopping problem where I hadn’t explored enough options yet, and (b) there may well be higher marginal benefits from future spending on x-risks. Since then, I’ve maintained a spreadsheet of my income (of which I’ve spent about 35%), and have invested the rest using this advice.
I tentatively plan to donate to long-term causes, but potentially not any time soon, once I’ve done more research on the most tax-efficient way to invest and donate. For 2020, my only outgoing donations so far have been to CATF and CFRN because of this talk on climate and x-risk, which I’m planning to write up in a forum post soon.
Hey there, interesting article! In this talk from the most recent EA Global, Niel Bowerman (climate physics PhD and now AI specialist at 80,000 Hours) gives some thoughts on the relationship between climate change and existential risk. Essentially I think that there’s some evidence about point 2 on your list.
In his talk, Niel argues that climate change could cause human extinction in itself under some scenarios. These are quite unlikely, but have non-zero probabilities. And since emissions are likely to continue well beyond 2100, we should beware the ‘2100 fallacy’ of cutting impact analyses short at an arbitrary point in time.
The larger contributions, very roughly, probably come from climate change contributing to social collapse and conflict, which themselves lead to existential risks—Toby Ord calls this an ‘existential risk factor’. I think the question isn’t “Is climate change an existential risk?” but “Does climate change contribute to existential risk?”, where the answer seems to be yes. Or perhaps “Is climate change important in the long term?”, in which case, if we’re thinking across multiple centuries and looking at, say, >6°C in 2300, then even with lots of technological development I think the answer is yes.
All of this being said, I still think it’s fair to argue that AI, bio, and nuclear risks are more neglected and tractable relative to climate change.
What do you think of Niel’s talk and this framing?