Retro funder profile & Manifund team recs (ACX Grants 2024: Impact Market)

The Astral Codex Ten (ACX) Grants impact market is live on Manifund — invest in 50+ proposals spanning biotech, AI alignment, education, climate, economics, social activism, chicken law, and more. You can now invest in projects that you think will produce great results, and win charitable dollars if you are right! (Additional info about the funding round here.)

For this round, the retroactive prize funders include:

  • next year’s ACX Grants

  • the Survival and Flourishing Fund

  • the Long-Term Future Fund

  • the Animal Welfare Fund, and

  • the Effective Altruism Infrastructure Fund

Combined, these funders disburse roughly $5-33 million per year. A year from now, they’ll award prize funding to successful projects, and the investors who bet on those projects will receive their share in charitable dollars. This post profiles each of the funders and highlights a few grants that the Manifund team are particularly excited about.

Click here to browse open projects and start investing.

ACX Grants 2024 Impact Markets

Astral Codex Ten (ACX) is a blog by Scott Alexander on topics like reasoning, science, psychiatry, medicine, ethics, genetics, AI, economics, and politics. ACX Grants is a program in which Scott helps fund charitable and scientific projects — see the 2022 round here and his retrospective on ACX Grants 2022 here.

In this round (ACX Grants 2024), some of the applications were given direct grants; the rest were given the option to participate in an impact market, an alternative to grants or donations as a way to fund charitable projects. You can read more about how impact markets generally work here, a canonical explanation of impact certificates on the EA Forum here, and an explanation thread from the Manifund twitter here.

If you invest in projects that end up being really impactful, then you’ll get a share of the charitable prize funding that projects win proportional to your original investment. All funding remains as charitable funding, so you’ll be able to donate it to whatever cause you think is most impactful (but not withdraw it for yourself). For example, if you invest $100 into a project that wins a prize worth twice its original valuation, you can then choose to donate $200 to any charity or project of your choice.

Meet the retro funders

Five philanthropic funders have so far expressed interest in giving retroactive prize funding (“retro funding”) to successful projects in this round. They’ll be assessing projects retrospectively using the same criteria they would use to assess a project prospectively. Scott Alexander explains:

[Retro] funders will operate on a model where they treat retrospective awards the same as prospective awards, multiplied by a probability of success. For example, suppose [the Long Term Future Fund] would give a $20,000 grant to a proposal for an AI safety conference, which they think has a 50% chance of going well. Instead, an investor buys the impact certificate for that proposal, waits until it goes well, and then sells it back to LTFF. They will pay $40,000 for the certificate, since it’s twice as valuable as it was back when it was just a proposal with a 50% success chance.

Obviously this involves trusting the people at these charities to make good estimates and give you their true values. I do trust everyone involved; if you don’t, impact certificate investing might not be for you.
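To make the arithmetic concrete, here is a minimal sketch of the payout model described above. The function names and figures are illustrative only; real payouts also depend on funder discretion, partial sales, multiple investors, and other market mechanics.

```python
# Illustrative sketch of the impact-market payout arithmetic.
# Numbers mirror the two examples in the post; everything else is hypothetical.

def retro_prize(prospective_grant: float, p_success: float) -> float:
    """Prize a retro funder would pay a *successful* project, given the grant
    they would have offered prospectively and their estimated odds of success."""
    return prospective_grant / p_success

def investor_payout(investment: float, original_valuation: float, prize: float) -> float:
    """Charitable dollars an investor can re-donate, proportional to their
    share of the project's original valuation."""
    share = investment / original_valuation
    return share * prize

# Scott's example: a $20k prospective grant at 50% odds becomes a $40k retro prize.
print(retro_prize(20_000, 0.5))        # 40000.0

# The earlier example: $100 into a project whose prize is twice its valuation.
print(investor_payout(100, 10_000, 20_000))  # 200.0
```

Note that all of this stays inside the charitable system: the $200 in the second example can be donated onward, not withdrawn.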

As a (very) rough approximation, the five philanthropic retro funders usually disburse about $5-33 million per year. They are:

1. ACX Grants 2025

Next year’s ACX Grants round (2025) will spend some of the money it normally gives out as grants on prizes for the projects that succeeded in this year’s (2024) round. ACX Grants 2025 will be giving out prizes to people who pursue novel ways to change complex systems, whether through technological breakthroughs, new social institutions, or targeted political change.

Previous rounds of ACX Grants have disbursed about $1-2 million per round, and you can find the lists of grants that those rounds gave money to here (1, 2).

2. The Survival and Flourishing Fund (SFF)

From their website:

[SFF] is a website for organizing the collection and evaluation of applications for donations to organizations concerned with the long-term survival and flourishing of sentient life.

Since 2019, SFF has recommended about $2-33 million per year in philanthropic disbursements ($75 million in total).

To find out more about the philanthropic priorities of the SFF’s largest grant-maker, Jaan Tallinn, see here. To see past grants SFF has made, see here.

3. The Long-Term Future Fund (LTFF)

From their website:

The Long-Term Future Fund aims to positively influence the long-term trajectory of civilization by making grants that address global catastrophic risks, especially potential risks from advanced artificial intelligence and pandemics. In addition, we [the LTFF] seek to promote, implement, and advocate for longtermist ideas, and to otherwise increase the likelihood that future generations will flourish.

The LTFF usually disburses around $1-5 million per year, and sometimes disburses much more. You can view their yearly payout data here.

You can read more about the LTFF’s scope and expected recipients here, and find their public grants database here.

4. The Animal Welfare Fund (AWF)

From their website:

The Animal Welfare Fund aims to effectively improve the well-being of nonhuman animals, by making grants that focus on one or more of the following:

  • Relatively neglected geographic regions or groups of animals

  • Promising research into animal advocacy or animal well-being

  • Activities that could make it easier to help animals in the future

  • Otherwise best-in-class opportunities

The AWF usually disburses around $0.5-3 million per year, and sometimes disburses much more. You can view their yearly payout data here.

You can read more about the AWF’s scope and expected recipients here, and find their public grants database here.

5. The Effective Altruism Infrastructure Fund (EAIF)

From their website:

The Effective Altruism Infrastructure Fund (EA Infrastructure Fund) recommends grants that aim to improve the work of projects using principles of effective altruism, by increasing their access to talent, capital, and knowledge.

The EA Infrastructure Fund has historically attempted to make strategic grants to incubate and grow projects that attempt to use reason and evidence to do as much good as possible. These include meta-charities that fundraise for highly effective charities doing direct work on important problems, research organizations that improve our understanding of how to do good more effectively, and projects that promote principles of effective altruism in contexts like academia.

The EAIF usually disburses around $1-3 million per year, and sometimes disburses much more. You can view their yearly payout data here.

You can read more about the EAIF’s scope and expected recipients here, and find their public grants database here.

…and (possibly) more.

If you want to join these five institutions as a potential final oracular funder of impact certificates, see this document and email rachel@manifund.org.

Some projects we like

Many of the projects are really great! We don’t have enough time or space to talk about all of the ones we’re excited about, but here are a few of our favorites, from each of us:

Austin

“Run a public online Turing Test with a variety of models and prompts,” by camrobjones.

Cam created a Turing Test game with GPT-4. I really like that Cam has already built & shipped this project, and that it appears to have gotten viral traction and had to be shut down due to costs; rare qualities for a grant proposal! The project takes a very simple premise and executes well on it. Playing with the demo made me want to poke at the boundaries of AI, and made me a bit sad that it was just an AI demo (no chance to test my discernment skills); I feel like I would have shared this with my friends, had it been live.

Research on AI deception capabilities will be increasingly important, but I also like that Cam created a fun game that interactively helps players think a bit about how far the state of the art has come, especially with the proposal to let users generate prompts too!

“Quantifying the costs of the Jones Act,” by Balsa Research.

Balsa Research is funding an individual economist or a team to conduct a counterfactual analysis assessing the economic impact if the Jones Act were repealed, to be published in a top economics journal.

I like this project because the folks involved are great. Zvi is famous enough to almost not need an introduction, but in case you do: he’s a widely read blogger whose coverage of AI is the best in the field, as well as a former Magic: the Gathering pro and Manifund regrantor. Meanwhile, Jenn has authored a blog post about non-EA charities that has significantly shaped how I think about nonprofit work, runs an awesome meetup in Waterloo, and on the side maintains this great database of ACX book reviews. (Seriously, that alone is worth the price of admission.)

I only have a layman’s understanding of policy, economics, and academia (and am slightly bearish on the theory of change behind “publish in top journals”), but I robustly trust Zvi and Jenn to figure out the right way to move forward with this.

“Publish a book on Egan education for parents,” by Brandon Hendrickson.

Brandon wants to publish a book on education for parents based on Kieran Egan’s educational theory. He walks the walk when it comes to education; his ACX Book Review contest entry on the subject was not only well written, but also well structured, with helpful illustrations and varied text formats to drive home a point. (And the fact that he won is extremely high praise, given the quality of the competition!) I’m not normally a fan of educational interventions, as their path to impact feels very long and uncertain, but I’d be excited to see what Brandon specifically can cook up.

(Disclaimer: I, too, have some skin in the game, with a daughter due in ~July.)

Lily

“Start an online editorial journal focusing on paradigm development in psychiatry and psychology,” by Jessica Ocean.

Jessica’s project takes up the mantle of a favorite crusade of mine, which is “actually it was a total mistake to apply the scientific method to psychology, can we please do something better.” She’s written extensively on psychiatric crises and the mental health system, and I would personally be excited to read the work of people thinking seriously about an alternative paradigm. I’m not sure whether the journal structure will add anything on top of just blogging, but I’d be interested to see the results of even an informal collaboration in this direction.

(Note that I probably wouldn’t expect the SFF or LTFF to fund this; ACX Grants 2025 maybe, and the EAIF I’m not sure. But I’d be happy to see something like it exist.)

“An online science platform,” by Praveen Selvaraj.

I think generating explanatory technical visuals is both an underrated use of image models, compared to generating images of mysteriously alluring women roaming the streets of psychedelic solarpunk utopias, and an underrated use of genAI for education, compared to chatbots that read your textbook over your shoulder. I’d like to see more 3Blue1Brown in the world, and in general I’m optimistic about people building tools they already want for their personal use, as Praveen does.

Saul

“Educate the public about high impact causes,” by Alex Khurgin.

Alex wants to build a high-quality YouTube show, and seeks funding to make three episodes of the show on AI risk, antimicrobial resistance, and farmed animal welfare. This is something that I could pretty easily imagine the LTFF, EAIF, and possibly SFF retrofunding, and I’d additionally be excited about more people knowing about these problems & working to reduce their expected negative impact on the world.

Alex’s (and his team’s) track record is also pretty great: they’re clearly experienced & know what they’re talking about. I’d be interested in getting a better path to impact — what do they plan to do after they click publish on the videos? — but I’m sufficiently excited that I’ve invested a token $50 in Alex’s project to credibly signal my interest.

Distribute HPMOR copies in Bangalore, India, by Aditya Arpitha Prasad.”

Anecdotally, the answer “I got into HPMOR” has been quite a common response to the question “how did you become interested in alignment research?” Mikhail Samin has had (from what I’ve seen) a lot of success doing something like this in Russia, and I’m excited about starting a similar initiative in India. This grant seems to fall pretty clearly within the range of retrofunding from the LTFF and/or EAIF. I’ve invested a token $50 in Aditya’s project to credibly signal my interest.

Links & contact

Click here to browse open projects and start investing; click here to apply to our micro-regranting program.

If you’re interested in learning more about investing on an impact market, donating to projects directly, or even just chatting about this sort of thing, you can email saul@manifund.org or book a call here.


Note: this is a slightly edited linkpost of “ACX Grants 2024: Impact market is live!” In particular, that writeup included details about our micro-regranting program, applications for which are now closed. We posted a separate announcement about it, which you can find here.

Crossposted to LessWrong.