Thanks so much for all of these ideas! Would you be up for submitting these as separate comments so that people can upvote them separately? We’re interested in knowing what the forum thinks of the ideas people present.
Nick_Beckstead
Good question! We’ve re-written the question to say:
"If you are launching a new organization (especially one less than 12 months old), please submit a link to a one-minute video (unlisted YouTube video). Please follow the Y Combinator application video guidelines: https://www.ycombinator.com/video/"
Feel free to use your judgment about what would be informative for borderline cases!
Hi Carla and Luke, I was sad to hear that you and others were concerned that funders would be angry with you or your institutions for publishing this paper. For what it’s worth, raising these criticisms wouldn’t count as a black mark against you or your institutions in any funding decisions that I make. I’m saying this here publicly in case it makes others feel less concerned that funders would retaliate against people raising similar critiques. I disagree with the idea that publishing critiques like this is dangerous / should be discouraged.
Hi Evan, let me address some of the topics you’ve raised in turn.
Regarding original intentions and new information obtained:
At the time that the funds were formed, it was an open question in my mind how much of the funding would support established organizations vs. emerging organizations.
Since then, the things that changed were that EA Grants got started, I encountered fewer emerging organizations that I wanted to prioritize funding than expected, and Open Phil funding to established organizations grew more than I expected.
Together, these three factors meant I had fewer grants to make that couldn’t be made through other channels than I had expected.
The first two factors contributed to a desire to focus primarily on established organizations.
The third cuts the other way, but I still see the balance of considerations favoring a focus on established organizations.
Regarding my/CEA’s communications about the purposes of the funds: It seems you and some others have gotten the impression that the EA Funds I manage were originally intended to focus on emerging organizations over established organizations. I don’t think this is communicated in the main places I would expect it to be communicated if the fund were definitely focused on emerging organizations. For example, the description of the Long-Term Future Fund reads:
“This fund will support organizations that work on improving long-term outcomes for humanity. Grants will likely go to organizations that seek to reduce global catastrophic risks, especially those relating to advanced artificial intelligence.”
And “What sorts of interventions or organizations might this fund support?” reads:
“In the biography on the right you can see a list of organizations the Fund Manager has previously supported, including a wide variety of organizations such as the Centre for the Study of Existential Risk, Future of Life Institute and the Center for Applied Rationality. These organizations vary in their strategies for improving the long-term future but are likely to include activities such as research into possible existential risks and their mitigation, and priorities for robust and beneficial artificial intelligence.”
The new grants also strike me as a natural continuation of the “grant history” section. Based on the above, I’d have thought the more natural interpretation was, “You are giving money for Nick Beckstead to regrant at his discretion to organizations in the EA/GCR space.”
The main piece of evidence that these funds were billed as focused on emerging organizations that I see in your write-up is this statement under “Why might you choose not to donate to this fund?”:
“First, donors who prefer to support established organizations. The fund manager has a track record of funding newer organizations and this trend is likely to continue, provided that promising opportunities continue to exist.”
I understand how this is confusing, and I regret the way that we worded it. I can see that this could give someone the impression that the fund would focus primarily on emerging organizations, and that isn’t what I intended to communicate.
What I wanted to communicate was that I might fund many emerging organizations, if that seemed like the best idea, and I wanted to warn donors about the risks involved with funding emerging organizations. Indeed, two early grants from these funds were to emerging orgs: BERI and EA Sweden, so I think it’s good that some warning was here. That said, even at the time this was written, I think “likely” was too strong a word, and “may” would have been more appropriate. It’s just an error that I failed to catch. In a panel discussion at EA Global in 2017, my answer to a related question about funding new vs. established orgs was more tentative, and better reflects what I think the page should have said.
There are also a couple of other statements on the page that could have been misinterpreted in similar ways, and I have regrets about them as well.
Thanks for sharing your concerns, Evan. It sounds like your core concerns relate to (i) delay between receipt and use of funds, (ii) focus on established grantees over new and emerging grantees, and (iii) limited attention to these funds. Some thoughts and comments on these points:
I recently recommended a series of grants that will use up all EA Funds under my discretion. This became a larger priority in the last few months due to an influx of cryptocurrency donations. I expect a public announcement of the details after all grant logistics have been completed.
A major reason I haven’t made many grants is that most of the grants that I wanted to make could be made through Open Phil, and I’ve focused my attention on my Open Phil grantmaking because the amount of funding available is larger.
I am hopeful that EA Grants and BERI will provide funding to new projects in these areas. CEA and BERI strike me as likely to make good choices about funding new projects in these areas, and I think this makes sense as a division of labor. EA Grants isn’t immediately available for public applications, but I’m hopeful they’ll have a public funding round soon. BERI issued a request for proposals last month. As these programs mature, I expect that most of what is seen as funding gaps in these areas will be driven by taste/disagreement with these grantmakers rather than lack of funding.
For now, I don’t have any plans to change the focus or frequency of my grantmaking with these funds from what was indicated in my April 2018 update.
I think it’s probably true that a fund manager who has more time to manage these funds would be preferable, provided we found someone with suitable qualifications. This is a possibility that’s under consideration right now, but progress toward it will depend on the availability of a suitable manager and further thinking about how to allocate attention to this issue relative to other priorities.
In addition to that: 35 days total. (I work at Open Phil.)
I don’t mean to make a claim re: averages, just relaying personal experience.
I am a Program Officer at Open Philanthropy who joined as a Research Analyst about 3 years ago.
The prior two places I lived were New Brunswick, NJ and Oxford, UK. I live in a house with a few friends. It is a 25-30 minute commute door-to-door via BART. My rent and monthly expenses are comparable to what I had in Oxford but noticeably higher than what I had in New Brunswick. I got pay increases when I moved to Open Phil, and additional raises over time. I’m comfortable on my current salary and could afford to get a single-bedroom apartment if I wanted, but I’m happy where I am.
Overall, I would say that it was an easy adjustment.
Differential Technological Development: Some Early Thinking
To avoid confusing people: my own annual contributions to charity are modest.
You might consider having a look at http://www.flamingswordofjustice.com/ . It’s a podcast of interviews with activists of various types (pretty left-wing). I’ve listened to a few episodes and found it interesting. It was the closest thing I could think of that already exists.
I would love to see some action in this space. I think there is a natural harmony between what is best in Christianity—especially regarding helping the global poor—and effective altruism.
One person to consider speaking with is Charlie Camosy, who has worked with Peter Singer in the past (see info here). A couple other people to consider talking with would be Catriona Mackay and Alex Foster.
One attractive feature about cosmopolitanism in contrast with impartial benevolence is that impartial benevolence is often associated with denying that loved ones and family members are worthy targets of special concern, whereas I don’t think cosmopolitanism has such associations. Another is that I think a larger fraction of educated people already have some knowledge about cosmopolitanism.
Niel, thanks for writing up this post. I think it’s really worthwhile for us to discuss challenges that we encounter while working on EA projects with the community.
I noticed that the link in this sentence is broken:
Creating more disaster shelters to protect against global catastrophic risks (too weird)
I think that comment is mostly Holden being modest.
I agree with all of that, though maybe I’m a bit more queasy about numbers >100.
Reasonable question! Our work is highly continuous with Open Phil’s work, and our background worldview is very similar. At the moment, we’re experimenting with our open call for proposals (combined with our areas of interest and project ideas) and a regranting program. We’ll probably experiment with prizes this year, too. We’re hoping these things will help us launch some new projects that wouldn’t have happened otherwise.
I also endorse Jonas’s answer that just having more grantmaking capacity in the area will probably be helpful as well.