Thanks, will do :)
Zoe Williams
EA & LW Forum Weekly Summary (23rd − 29th Jan ’23)
Summary of this post (feel free to suggest edits!):
Pax Fauna recently completed an 18-month study of messaging for accelerating the transition away from animal farming in the US. The study involved literature reviews, interviews with meat eaters, and focus groups and online surveys to test messaging.
They found that most advocacy focuses on the animal, human health, and environmental harms of animal farming. However, the biggest barrier to action for many people tended to be “futility”—the feeling that their actions didn’t matter, because even if they changed, the world wouldn’t.
Based on this, they suggest reframing messaging to focus on how we as a society / species are always evolving and progressing forward, and that evolving beyond animal farming is something we can do, should do, and are already doing. They also suggest refocusing strategy around this—eg. focusing on advocacy for pro-animal policies, as opposed to asking consumers to make individual changes to their food choices.
(This will appear in this week’s forum summary. If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)
EA & LW Forum Weekly Summary (16th − 22nd Jan ’23)
Great to hear :)
EA & LW Forum Summaries (9th Jan to 15th Jan 23′)
Great read, thanks for posting! A quick heads up that many of the links in the table of contents are broken (either linking to start of post, or to non-existent websites).
Summary of this post, and the sequel post Technological Bottlenecks for PCR, LAMP, and Metagenomics Sequencing (feel free to suggest edits!):
Biosurveillance systems help with the early identification of pathogens that could cause pandemics. The authors scored existing methods on 10 criteria, including usefulness, quality of evidence, feasibility, and potential risks.
High-scoring methods included: point-of-person (non-lab tests, eg. rapid antigen), clinical (lab tests, eg. PCR), digital (reporting cases to a database), and environmental methods (eg. monitoring in wastewater). Technological developments in point-of-person and clinical surveillance (ie. faster, easier, cheaper, home-based tests) are seen as promising. Environmental surveillance would benefit from increasing the sensitivity of wastewater testing equipment, and from developing new concentration techniques that work for a wide variety of pathogens (bacteria, viruses, fungi). Specific bottlenecks and potential solutions (eg. improving the performance of LAMP, a cheaper PCR alternative, under cold temperatures) are discussed in the second post.
Slightly lower-scoring methods were animal surveillance (frequent sampling and wearable devices) and syndromic surveillance (monitoring symptoms). Data sharing between key parties (preferably cross-country) could assist with syndromic and digital methods. Animal monitoring is less promising because, while 60% of known infectious diseases are zoonotic, we lack the capability to predict virulence and transmissibility to humans.
(If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)
EA & LW Forum Summaries—Holiday Edition (19th Dec − 8th Jan)
EA & LW Forums Weekly Summary (12th Dec − 18th Dec 22′)
Post summary (feel free to suggest edits!):
The authors broadly recommend the following for EAs from low and middle income countries (LMICs):
Build career capital early on
Work on global issues over local ones, unless there are clear reasons for the latter
Have some individuals do local versions of: community building, priorities research, charity-related activities, or career advising
They discuss pros, cons, and concrete next steps for each. Individuals can use the scale / neglectedness / tractability framework, marginal value, and personal fit to assess options. They suggest looking for local comparative advantage at global priorities, and taking the time to upskill and engage deeply with EA ideas before jumping into direct work.
(If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)
Post summary (feel free to suggest edits!):
The Centre for Enabling EA Learning & Research (CEEALAR) is an EA hotel that provides grants in the form of food and accommodation on-site in Blackpool, UK. They have lots of space and encourage applications from those wishing to learn or work on research or charitable projects in any cause area. This includes study and upskilling with the intent to move into those areas.

Since opening 4.5 years ago, they’ve supported ~100 EAs with their career development, and hosted another ~200 visitors for events / networking / community building. It costs CEEALAR ~£800/month to host someone—including free food, logistics, and project guidance. This is ~13% of the cost of an established EA worker, and an example of hits-based giving.
They have plans to expand, and are fixing up a next-door property that will increase capacity by ~70%. They welcome donations, though aren’t in imminent need (they have 12 − 20 months of runway, depending on factors covered in the post). They’re also looking for a handy-person.
(If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)
Post summary (feel free to suggest edits!):
The author argues that “the crypto industry as a whole has significant problems with speculative bubbles, ponzis, scams, frauds, hacks, and general incompetence”, and that EA orgs should avoid being significantly associated with it until the industry becomes stable.
In the last year, at least 4 crypto firms collapsed, excluding FTX. Previous downturns included the collapse of Mt. Gox, then the largest crypto exchange. Crypto’s use is dominated by people trying to get rich—after 14 years, there are almost no widespread uses outside of this. This all means it’s a speculative bubble, and it will likely collapse again (maybe not in the same way). If we’re associated with it, this could lead to a reputation that EA “keeps getting scammed”.
(If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)
EA & LW Forums Weekly Summary (5th Dec − 11th Dec 22′)
Post summary (feel free to suggest edits!):
r.i.c.e. collaborates with the Government of Uttar Pradesh and an organization in India to promote Kangaroo Mother Care (KMC), a well-established tool for increasing survival rates of low birth weight babies. They developed a public-private partnership to get the government’s KMC guidelines implemented cost-effectively in a public hospital.

Their best estimate, based on a combination of implementation costs and pre-existing research, is that it costs ~$1.8K per life saved. However, they are unsure, and are planning to compare survival rates in the targeted hospital vs. others in the region next year.
Both Founders Pledge and GiveWell have made investments this year. They welcome further support—you can donate here. Donations will help maintain the program, scale it up, do better impact evaluation, and potentially expand to other hospitals if they find good implementation partners.

(If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)
Post summary (feel free to suggest edits!):
Smallpox was confirmed as eradicated on December 9th, 1979. Our World in Data has a great explorer on its history and how eradication was achieved.

Smallpox killed ~300 million people in the 20th century alone, and is the only human disease to have been completely eradicated. It also led to the first ever vaccine, after Edward Jenner demonstrated that exposure to cowpox—a related but less severe disease—protected against smallpox. In the 19th and 20th centuries, further improvements were made to the vaccine. In 1959, the WHO launched a global program to eradicate smallpox, including efforts to vaccinate (particularly those in contact with infected individuals - ‘ring vaccination’), isolate those infected, and monitor spread. They eventually contained the virus primarily to India (86% of cases were there in 1974), and with a final major vaccination campaign, dropped cases there to zero in 1976.
(If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)
Post summary (feel free to suggest edits!):
Some interventions are neglected because they have less emotional appeal. EA typically tackles this by redirecting more resources there. The authors suggest we should also tackle the cause, by designing marketing that makes these interventions more emotionally appealing. This could generate significant funding, more EA members, and faster engagement.

As an example, the Make-A-Wish website presents specific anecdotes about a sick child, while the Against Malaria Foundation website focuses on statistics. Psychology shows the former is more effective at generating charitable behavior.
Downsides include potential organizational and personal value drift, and a reduction in relative funding for longtermist areas if these are harder to produce emotional content for. They have high uncertainty and suggest a few initial research directions that EAs with a background in psychology could take to develop this further.
(If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)
Post summary (feel free to suggest edits!):
AI startups can be big money-makers, particularly as capabilities scale. The author argues that money is key to AI safety, because money:
Can convert into talent (eg. via funding AI safety industry labs, offering compute to safety researchers, and funding competitions, grants, and fellowships). Doubly so if the bottleneck becomes engineering talent and datasets instead of creative researchers.
Can convert into influence (eg. lobbying, buying board seats, soft power).
Is flexible and always useful.
The author thinks another $10B AI company would be unlikely to counterfactually accelerate timelines by more than a few weeks, and that the money / reduced-time-to-AGI tradeoff seems worth it. They also argue that the transformative potential of AI is becoming well-known, and that now is the time to act to benefit from our foresight on it. They’re looking for a full-stack developer as a cofounder.
(If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)
Post summary (feel free to suggest edits!):
SoGive is an EA-aligned research organization and think tank. In 2022, they ran a pilot grants program, granting £223k to 6 projects (out of 26 initial applicants):
Founders Pledge - £93,000 - to hire an additional climate researcher.
Effective Institutions Project - £62,000 - for a regranting program.
Doebem - £35,000 - a Brazilian effective giving platform, to continue scaling.
Jack Davies - £30,000 - for research improving methods to scan for neglected X-risks.
Paul Ingram - £21,000 - to poll how information about nuclear winter affects support for nuclear armament.
Social Change Lab - £18,400 - 2x FTE for 2 months, researching social movements.
The funds were sourced from private donors, mainly people earning to give. If you’d like to donate, contact isobel@sogive.org.
They advise future grant applicants to lay out their theory of change (even if their project is one small part of it), reflect on how they came to their topic and whether they’re the right fit, and consider downside risk.
They give a detailed review of their evaluation process, which was heavy-touch and included a standardized bar to meet, the ITN+ framework, delivery risks (eg. is getting 80% of the way there 80% of the good?), and the information value of the project. They tentatively plan to run it again in 2023, with a lighter-touch evaluation process (the extra time didn’t add much value).
They also give reflections and advice for others starting grant programs, and are happy to discuss this with anyone.
(If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)
Thanks, and that makes sense, edited to reflect your suggestion
Post summary (feel free to suggest edits!):
In November 2022, Open Philanthropy (OP) announced a soft pause on new longtermist funding commitments, while they re-evaluated their bar for funding. This is now lifted and a new bar set.
The process for setting the new bar was:
Rank past grants by both OP and now-defunct FTX-associated funders, and divide these into tiers.
Under the assumption of 30-50% of OP’s funding going to longtermist causes, estimate the annual spending needed to exhaust these funds in 20-50 years.
Play around with what grants would have made the cut at different budget levels, and, using a heavy dose of intuition, come to an all-things-considered new bar.
They landed on funding everything that was ‘tier 4’ or above, and some ‘tier 5’ under certain conditions (eg. low time cost to evaluate, potentially stopping funding in future). In practice this means ~55% of OP longtermist grants over the past 18 months would have been funded under the new bar.
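For intuition on step 2, here's a minimal sketch of the kind of budget arithmetic it implies, using placeholder numbers rather than OP's actual figures (the total-funds figure below is purely hypothetical):

```python
# Purely illustrative sketch of the step-2 arithmetic.
# All numbers are hypothetical placeholders, not Open Philanthropy's actual figures.

total_funds = 10_000_000_000          # hypothetical total available funds ($)
longtermist_share = (0.30, 0.50)      # assumed 30-50% allocated to longtermist causes
spend_down_years = (20, 50)           # assumed 20-50 year spend-down horizon

# Most conservative combination: low share spent over the longest horizon.
low = total_funds * longtermist_share[0] / spend_down_years[1]
# Most aggressive combination: high share spent over the shortest horizon.
high = total_funds * longtermist_share[1] / spend_down_years[0]

print(f"Implied annual longtermist budget: ${low/1e6:.0f}M - ${high/1e6:.0f}M")
# With these placeholders: $60M - $250M per year, which then determines how far
# down the ranked tiers of past grants the new funding bar can reach.
```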
(This will appear in this week’s forum summary. If you’d like to see more summaries of top EA and LW forum posts, check out the Weekly Summaries series.)