Nonprofit accounting researcher. I mostly study private foundations and how donors use accounting information in their giving decision-making. My agenda aligns strongly with the effective giving topic.
Trustee at CEEALAR
I think it’s great that EAIF is not funding constrained.
Here’s a random idea I had recently if anyone is interested and has the time:
An org that organizes a common application for nonprofits applying to foundations. There is enormous economic inefficiency and inequality in matching private foundation (PF) grants to grantees, and PF application processes are extremely opaque and burdensome. Attempts to create common applications have largely been unsuccessful, I believe mostly because they tend to be limited to a specific geographic region. Instead, I think it would be interesting to create different common applications by cause area. A key part of each common application could be outcome reporting specific to its cause area, which I believe would push PFs toward more impact-focused grants, making EAs happy.
I joined the audit committee of the Berkeley Existential Risk Initiative, which found me through the EA Good Governance Project. As someone who wants more experience serving on boards, I had been looking for opportunities, and the EA Good Governance Project made this one possible!
I’d encourage people interested in serving on boards to join as a candidate and for organizations to use it as a tool for finding board members.
I generally agree with you that people should be careful not to pull the trigger too early on closing down a project. However, in the broader philanthropic landscape, organizations persist because of the lack of incentives to close them down, which is, of course, inefficient. EA does a good job of trying to correct this, but as with other areas of EA, it is possible that it takes the correction “too far”.
I tend to think the people involved are best equipped to make this determination, and we have additional reason to trust their judgment because closing a project down likely goes against their self-interest.
I think a related discussion could be had around funders deciding to quit on projects too early, which is likely a much more prevalent issue.
And as an aside—I am interested in this topic for a research project. I think doing some qualitative analysis (interviews?) with folks who have closed down projects would make for a fairly interesting research paper.
I think the grantmaker training program may be the best intervention in this space. Training grantmakers and getting them hired at major US private foundations, where they will ultimately have great influence over where funds are directed, has massive potential.
That’s fair. Though I would counter that GiveWell says it has directed $2 billion to effective charities over 10+ years, while US private foundations collectively give around $100 billion a year. So there is a lot of money out there with the potential to be influenced.
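To put those two figures side by side, here’s a quick back-of-the-envelope calculation (the only assumption beyond the numbers above is annualizing “10+ years” as 10, which if anything overstates GiveWell’s share):

```python
# Back-of-the-envelope comparison using only the rough figures cited above.
givewell_total = 2e9                     # ~$2B directed by GiveWell over 10+ years
givewell_per_year = givewell_total / 10  # annualized, treating "10+ years" as 10
pf_per_year = 100e9                      # ~$100B given per year by US private foundations

share = givewell_per_year / pf_per_year
print(f"GiveWell-directed giving is ~{share:.2%} of annual US PF giving")
# -> GiveWell-directed giving is ~0.20% of annual US PF giving
```

Even on generous assumptions, EA-directed money is a fraction of a percent of what US private foundations move every year.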
There certainly was a perception among those we spoke with that EA was very “white/tech bro/elitist”, so I think you are correct in that assessment.
Your point about PF accountability is well taken, too. There is functionally zero accountability with regard to effectiveness.
It’s pretty clear to me that these constraints are bad (to me, core EA is partly about breaking the self-imposed constraints of giving), but the simple reality is that private foundations are legally required to follow their charters. If a board wanted to radically change its charter, in most instances it could (as I understand it), but boards tend to be extremely deferential to the founder’s original intent. They begin from a fundamental premise, “We will focus our giving on X cause area or Y geographic area,” and only exercise discretion within those bounds.
The concern I have is that EA has basically written off all private foundations that are not already EA-aligned as a lost cause.
Yes, I think this is an area of misconception that could be explored further and ultimately addressed.
Yeah, I am really only referring to a perception of all-or-nothing. And as you say, I think it is a product of a maximizing philosophy.
At the end of the day, it really just seems to be an EA marketing/outreach problem, and I think it is entirely addressable by the community. I think the paper idea I mention (discussing the perceived incompatibility of trust-based philanthropy (TBP) and EA) could be a step in the right direction.
I think this is key. I get the impression (and others do as well) that EA is all-or-nothing. Either you give 100% to AMF or you are not EA.
A private foundation that is focused on the state of New York can use EA principles to identify the biggest impact it can have within its constraints, and that is still EA. I think even foundations with the least EA-aligned cause area constraints (say, the arts) can still find ways to improve their impact using EA principles, though of course that would be more difficult.
As someone who has been a huge believer in CE and its theory of change, I’m honestly just not really seeing it for this one.
A few thoughts:
I buy that a group like CE can identify numerous high-value, high-impact nonprofit ideas (because there is not really a robust “market” for doing so), and I think this is CE’s major innovation. But I am very skeptical that the same can be said of for-profit ideas. If you are expecting participants to bring their own ideas, I’m not sure the value proposition is really there for the program.
I’m skeptical that investing resources into trying to create successful entrepreneurs that are EA-aligned is more effective than just trying to convince existing successful entrepreneurs to become EAs.
Your assumed “average chance of a graduate of the Founding to Give program getting into YC” of 40% feels WILDLY optimistic to me. YC and this program may bring in very strong applicants along many dimensions, but I actually think the overlap between those interested in EA and those who found unicorns is weak. In addition, the “not funding a company that will make the world worse” constraint on this program likely makes unicorn status substantially less likely, and YC doesn’t really have that constraint. I think it would be helpful to re-estimate the model with a much lower percentage here (5%? 10%?); see the sketch below.
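For what it’s worth, here is a minimal sensitivity sketch of why this one parameter matters so much. I don’t know your full model, so everything besides the 40% is a made-up placeholder:

```python
# Hypothetical sensitivity check. Only the 40% figure comes from the post;
# the downstream parameters are placeholders I made up for illustration.
P_UNICORN_GIVEN_YC = 0.01    # placeholder: chance a YC company becomes a unicorn
VALUE_PER_UNICORN = 100e6    # placeholder: $ ultimately donated per unicorn founder

for p_yc in (0.40, 0.10, 0.05):  # the post's 40% vs. my suggested 5-10%
    ev_per_graduate = p_yc * P_UNICORN_GIVEN_YC * VALUE_PER_UNICORN
    print(f"P(YC) = {p_yc:.0%}: expected donations per graduate = ${ev_per_graduate:,.0f}")
```

Because everything downstream multiplies through, the bottom line moves one-for-one with this assumption: cutting 40% to 5% cuts the estimated impact by a factor of eight, regardless of what the placeholder values actually are.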
All that said, I am rooting for you!
I’ve been thinking very similarly for a while. Would love to read it.
I study US private foundations, and I’ve observed two things: EA has made virtually no progress in influencing grantmaking, while trust-based philanthropy (TBP) has seen massive adoption. Many seem to believe that EA and TBP are in conflict with one another, but I don’t think that is necessarily the case. I am thinking about writing a post/research paper that makes the case that these two movements are compatible.
I generally agree with this critique.
A while back I wrote about an idea for an org that focuses on redirecting US private foundation grants toward more effective causes. Got a lot of feedback, and the consensus was that existing private foundations just aren’t tractable. And I tend to agree with that.
But I have been working on a research paper in which we interview private foundation grantmakers to better understand how they operate and the information they use in their decision making. One of the takeaways is that trust-based philanthropy has had HUGE influence on private foundation grantmaking despite being very new (every participant we interviewed indicated their foundation had implemented at least some trust-based philanthropy practices).
This got me thinking: has EA had any influence? Not a single participant indicated that it had. I would say 75% were neutral and 25% were openly hostile to the idea of EA influencing their grantmaking.
I think EA would benefit from conversations around how to sell EA ideas to these other groups. I think it would require what some would view as “watering down”[1] of EA principles, but it could substantially increase EA’s overall impact. It is definitely interesting to think about which aspects of EA could be compromised before it ceases to be EA at all.
For example, most US private foundations are severely constrained by the original founder’s intent, such as a requirement to spend funds in X geographic area. Could these foundations be persuaded, and made more effective, by a version of EA that encourages effective giving within existing foundation constraints?
I think this is a good question. To me, EA is pretty ruthless in how it assesses effectiveness, and that leads to many causes feeling left out (especially when those causes are close to you personally).
Taken to an extreme, if all charitable acts/giving were done through an EA lens, it would feel pretty brutal to any cause not included in its scope. Though through an EA lens, this would be a *more effective* charitable sector that would ultimately reduce suffering and increase overall wellbeing.
But the simple reality is EA is small relative to the universe of charitable acts. And I think having a portion of charitable acts approached with an EA lens is a good thing. And I think the actual % is significantly lower than the optimal %.
Quick question: my wife is a provisionally licensed LPC in the US. I know there are a lot of rules on how LPCs are allowed to practice (e.g., the client must be located within the counselor’s state of licensure). Do these rules just not apply when working internationally?
This isn’t really a comment regarding the content of your post, but it made me think of it. I think EAs who write good forum posts should consider submitting them to large philanthropy magazines such as Stanford Social Innovation Review and the Chronicle of Philanthropy.
I think you make an interesting argument here, but I can’t help but feel you are preaching to the choir. It is important to make this argument to the people in philanthropy who complain that EA doesn’t address root causes! And those people don’t read the EA Forum; they read SSIR and the Chronicle.
I do think the article would need some work and would probably need to be toned down a bit, but I don’t think it’s too much of a stretch that posts like this could be published in these other outlets. And more articles defending EA principles in those outlets could influence the exact people EA should be trying to influence.