I’m in favour of addressing this coordination problem, but I think this particular solution is a bad idea and should not be started. The main problem is the unilateralist’s curse. Suppose there is a really bad project which only 1 out of 20 altruistically minded funders (and no professional grantmaker) would support. The structure you’ve designed would make it much more likely that such a project gets funded.
In general, effective altruism has a lot of value tied up in its brand, goodwill, and epistemic standards/culture (on the order of billions of dollars). It seems relatively easy to create large negative impact by destroying part of this, which can be “achieved” even by a relatively small project with modest funding. A public donor list seems to be literally the worst structural option if we want to avoid bad projects.
I’d like to point out for the benefit of other forum readers that EAs have different views on the average expected value of projects, the variance in projects’ expected value, and the prevalence and severity of negative-expected-value projects. Based on the applications that the EA Angel Group has received, as well as lengthy lists of projects that have existed or currently exist in the EA community, I do not at present think that many immediately obvious negative-EV projects exist (it is possible that people come up with negative-EV projects but then receive feedback on potential harms before the project’s existence becomes known to many people). I have seen a lot of projects that could have near-zero EV by failing to achieve their intended objectives or underperforming a top-rated EA charity, but people will often have highly varied opinions on a project’s EV.
Jan has a focus on x-risk and the long-term future. A project seeking to directly impact x-risks by doing something like AI safety research has not yet applied to the EA Angel Group, and I have rarely if ever seen projects like that in EA project lists. It is possible that people behind such projects are already aware of the risks of sharing information, or do not see the need either to share the project’s existence with many people or to apply for early-stage funding from funders that do not focus exclusively on x-risk. It is possible that complex projects doing direct work to impact the long-term future have a greater potential to create harm and should be reviewed more rigorously.
If most EA projects are EV-positive and in need of funding, then this article’s suggestion is likely net positive. Also, essentially all individual funders I’ve spoken with already consult other funders and experts, if they see the need, before making funding decisions. If this is the norm, which I think it is, the unilateralist’s curse is much less likely to occur in practice with regard to EA project funding.
Most importantly, this article’s proposal will probably only have a marginal impact on project and funder discoverability. Historically, many resources have existed online to enable EAs and funders to discover projects, like the .impact Hackpad (which shut down when Hackpad was acquired), various lists of projects that have popped up on the EA forum and elsewhere on the internet, and the EA Work Club. Announcing that a project exists or is seeking funding is simply sharing information, and there doesn’t appear to be any easy way to prevent people from sharing information if they want to.
Therefore, I do not think it is fair to label this proposal a “bad idea.” Implementing it would only make it marginally easier for funders to learn about projects than existing methods do, such as someone posting a project idea directly on the EA Forum and even seeking funding there, as has been done many times in the past. Someone motivated enough about seeking funding can simply speak with many EAs they encounter and ask for funding, sidestepping this article’s list of funders.
Nevertheless, there may still be a risk of one funder making a mistake and not seeking additional evaluations from others before funding something. That is why I created the EA Angel Group, which has an initial staff review of projects followed by funders sharing their evaluations with each other, eliminating the possibility of one funder funding something while unaware of other funders’ opinions. A setup like the EA Angel Group is safer than publicly posting everyone’s contact information online, and it seems to achieve the same overall objectives as this article’s proposal.
Ok, the project is now on indefinite hiatus. I’ll seek to deeply understand all the critiques of it first if I come back with another attempt in the future.
I hope other groups will try to address the problem that this concept was designed to address, in a way that is highly net +EV.
Jan or others, what ideas might you suggest for addressing the challenge of helping an EA without a strong donor network get a super-early-stage grant, in a way that avoids significant potential downsides?
Hey, saw your other post so just wanted to give some feedback. FWIW I think this is a good idea and a good post. It builds on a concept that’s already been somewhat discussed, does a good job brainstorming pros, cons, challenges, and ideas, and overall is a very good conversation starter and continuer.
As for the negative feedback, one possibility is that people disliked your “hard to abandon” concept. There’s a fair bit of focus in EA on not causing harm when trying to do good, and one of the most advocated ways to avoid doing harm is to be cautious before taking irreversible actions. I could see someone arguing that a poor implementation of this idea is worse than none at all (because it would undermine possible future attempts, or damage the reputation of startup EA projects that could actually succeed). I’d personally agree that a poor rollout could well be worse than none, and that the general mindset here should probably be to do it right or not at all, though I don’t see that as reason enough to downvote.
Also, as another newcomer who feels self-conscious/nervous beginning to post on here, just my encouragement to stick with it. It seems very likely that our input is valuable and valued, even when it feels ignored.
Hi there! Ben Pence and I launched the EA Angel Group several months ago, which seems related to your proposal. We wrote an EA Forum post announcing the launch. It’d be great to jump on a call to compare thoughts on what we’re working on and how we might be able to collaborate! One thought: some funders may be uncomfortable with being publicly listed (perhaps due to concerns about lots of people contacting them), but a certain subset of funders could be pretty on board with the idea.
Thank you Brendon, I’ve sent you a PM now!
Agreed. One thought I’ve had is that donors who have concerns like this but are still interested could set up an anonymous email address that forwards to their main inbox, and list themselves under that. That way, if it ever becomes too much for them, they can be removed from the site, and the listing is somewhat more separate/compartmentalized from their real name. I’m open to additional community input on other ways to let donors be listed semi-anonymously, preserving much of the upside for donors and potential grantees while reducing concerns from donors like these.
Thanks for thinking of this! My experience is that, in both for-profit and nonprofit spaces, the limiting constraint is not knowledge that fundable projects exist. Rather, it’s the lack of due diligence on the projects (and people who can do that sort of DD).
In for-profit angel investing, usually one investor will take the “lead”, meaning that they do a full examination of the startup: speak with customers, audit the financials, do background checks on the founders, etc. Other investors will invest conditional on the lead signing off. Groups usually prefer either to lead or to follow; some invest in hiring lawyers, accountants, etc. to help them do this due diligence, whereas others prefer simply to defer to other lead investors.
I’m not aware of any entity similar to a lead investor in the EA community. People sometimes suggest just following on with OpenPhil (i.e. only donating to organizations which OpenPhil grants to) – this doesn’t seem unreasonable, but it does mean that many organizations will be left unfunded.
I agree with this point. Even in the startup world, where due diligence is common, most projects fail after spending a lot of money, achieving very little impact in the process.
In the case of EA projects, even a project that doesn’t have negative value can still lead to a lot of “waste”: There’s a project team that spent time working on something that failed (though perhaps they got useful experience) and one or more donors who didn’t get results.
Hits-based giving (which focuses on big successes even at the cost of some failure) is a useful approach, but in order for that to work, you do need a project that can at least plausibly be a hit, and no idea is strong enough to create that level of credibility by itself. Someone needs to get to know the team’s background and skills, understand their goals, and consider the reasons that they might not reach those goals.
Side note: I hope that anyone who independently funds an EA project considers writing a post about their decision, as Adam Gleave did after winning the 2017 donor lottery.
I quite like this idea, and think that the unilateralist’s curse is less important than others make it out to be (I’ll elaborate on this in a forum post soon).
Just wanted to quickly mention https://lets-fund.org/ as a related project, in case you hadn’t already heard of it.
Thanks, and I’ll look forward to reading your upcoming post!
How is this different from EA Grants?
Thanks Michael!
My understanding is that EA Grants and this proposal are both working to address roughly the same problem: the difficulty of getting seed or “pre-seed” grants for new EA organizations, especially for people who are not well connected.
When someone is seeking grants, the more possible grantors the better. In startup terms, EA Grants is like an individual angel investor (and, it seems, one that isn’t currently accepting pitches), while this concept is analogous to a list of active angel investors, so the two are complementary.
So EA Grants would be listed on this site as one of the sources of grants.