GWWC board member, software engineer in Boston, parent, musician. Switched from earning to give to direct work in pandemic mitigation. Married to Julia Wise. Speaking for myself unless I say otherwise. Full list of EA posts: jefftk.com/news/ea
Jeff Kaufman 🔸
Thanks for trying this!
Reviewing its judgements:
- I think YIMBY is not very left or right. Here's how Claude put it:

JK: Where does the YIMBY movement fall on the left-right spectrum in the US?

Claude: The YIMBY (Yes In My Backyard) movement tends to fall on the center-left to center-right of the political spectrum in the US. YIMBYs generally support increasing housing supply and density to address housing affordability, which aligns with liberal/progressive goals. However, their support for market-based solutions and property rights puts them at odds with some further left positions. Overall, YIMBY is considered a centrist or "third way" approach to housing and urban development issues.
- I don't know much about the CHAI or ASG, but given that they were founded by politicians on the US left it seems reasonable to guess they're left of center. Like, I think if OP were recommending grants to equivalent international orgs founded by US right politicians we'd count that the other way? Though I think "political think tank or organization within the United States" doesn't really apply.
- It seems like it thinks animal advocacy and global health are left coded, which on one hand isn't totally wrong (I expect global health and animal advocates to be pretty left on average), but on the other isn't really what we're trying to get at here.
- Since the GPT-o1-preview response reads to me as "these grants don't look politically coded" I'd be curious if you'd also get a similar response to:

Here is a spreadsheet of all of Open Philanthropy's grants since January 2024. Could you identify whether any of them might meaningfully constitute a grant to a "left of center" political think tank or organization within the United States?
I really appreciate you writing up the Voting Norms section! Making it clear when you see "tactical" participation as beneficial vs harmful is very helpful.
Consider funding the Nucleic Acid Observatory to Detect Stealth Pandemics
if somebody thinks Open Phil is underinvesting in longtermism compared to the ideal allocation, then they should give to longtermist charities; the opportunities available to Open Phil might be significantly stronger than the ones available to donors
"Topping up" OP grants does reasonably well in this scenario, no?
Personal AI Planning
While I think this piece is right in some sense, seeing it written out clearly it feels like there is something uncooperative and possibly destructive about it. To take the portfolio management case:
- Why do the other fund managers prefer 100% stocks? Is this a thoughtful decision you are unthinkingly countering?
- Each fund manager gets better outcomes if they keep their allocation secret from others.
I think I'm most worried about (2): it would be bad if OP made their grants secret or individuals lied about their funding allocation in EA surveys.
Tweaking the fund manager scenario to be a bit more stark:
- There are 100 fund managers.
- 50 of them prefer 100% stocks, 50 prefer an even split between stocks and bonds.
- If they each decide individually you'd get an overall allocation of 75% stocks and 25% bonds.
- If instead they all are fully following the lessons of this post, the ones that prefer the even split go 100% bonds, and the overall allocation is 50% stocks and 50% bonds.
It feels to me that the 75-25 outcome is essentially the right one, if the two groups are equally likely to be correct. On the other hand, the adversarial 50-50 outcome is one group getting everything they want.
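For concreteness, the arithmetic behind the two outcomes can be sketched in a few lines of Python. The numbers are the ones from the scenario above; the variable names and structure are just mine for illustration:

```python
# Aggregate allocation in the fund-manager scenario above.
managers = 100
stock_pref = 50   # managers who prefer 100% stocks
split_pref = 50   # managers who prefer an even stocks/bonds split

# If each manager allocates according to their own view:
independent_stocks = (stock_pref * 1.0 + split_pref * 0.5) / managers
print(f"independent: {independent_stocks:.0%} stocks")  # 75% stocks, 25% bonds

# If the even-split managers instead counterbalance by going 100% bonds:
adversarial_stocks = (stock_pref * 1.0 + split_pref * 0.0) / managers
print(f"adversarial: {adversarial_stocks:.0%} stocks")  # 50% stocks, 50% bonds
```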
Note that I don't think this is an issue with other groups covering the gaps left by the recent OP shift away from some areas. It's not that OP thought that those areas should receive less funding, but that GV wanted to pick their battles. In that case, external groups that do accept the case for funding responding by supporting work in these areas seems fine and good. Which Moskovitz confirms: "I'm explicitly pro-funding by others." And: "I'd much prefer to just see someone who actually feels strongly about that take the wheel."
(This also reminds me about the perpetual debate about whether you should vote things on the Forum up/down directionally vs based on how close the vote total currently is to where you think it should be.)
- I don't think of putting a small orange diamond only in my EA Forum username as targeting EAs first, but instead that I want to communicate differently with different audiences?
On the Forum mostly people know what the diamond is, and putting it in my username helps communicate that pledging is normal and common.
Elsewhere, I think it would work more as you describe, as a potential conversation starter and an opportunity to introduce people to effective giving. But because of the downsides I describe in the post, in other environments I prefer to do this in words. This also works better as I advocate for more different things: I can write some posts advocating effective giving, other posts advocating letting people build more housing, etc.
I do think that if I were more shy and less willing to discuss effective giving (and if I didn't have a range of other things I was advocating for) putting a diamond in my general social media profiles would make more sense.
Signaling with Small Orange Diamonds
Advisors for Smaller Major Donors?
Good post, thanks for writing it!
A quibble:
we should have different expectations for a 20-person organization with a $1 million budget than a 2-person $100,000 budget organization.
I know this is a sketch, but even if 100% of costs are labor both of these come out to fully-loaded costs of $50k/employee, which seems quite low to me?
As someone who has raised funds from larger funders and is currently considering participating in marginal funding week, I don't think that would work very well:
- Our main funders have a lot of context on our work, and so our grant applications are missing a lot of information that a typical Forum reader would need. This includes basic stuff like "what problem are you trying to solve?"
- Because we have engaged with these funders previously, portions of a funding request can be discussion of specific issues they have previously raised, which might be pretty in the weeds for a Forum reader and require extra context.
- There is a lot of information you can share in a private grant request that you can't make public. For example, specific quotes you've received from potential partners on pricing, some kinds of strategic planning, potential partnership opportunities, or frank assessments of the capabilities of other organizations.
- Writing for public consumption requires more attention to how a wide range of potential readers, including both low-context Forum readers and potential partners, would interpret things.
- I was specifically asking (and am still wondering) whether you stand by every individual point in your original post, such that it would be worth it for me to write a point-by-point response.
(Sometimes people give high-level instructions to an LLM, which results in output where they're willing to stand by the general message but some of the specific claims aren't actually what they believe. The same thing can also happen when hiring people: if I were trying to deeply engage with a company on one of their policies it wouldn't be productive to write a point-by-point response to an answer I'd received from a first-line support representative.)
I'd be much more interested in reading your prompts to ChatGPT than the output it produced. I suspect this would make it much easier for me (and others) to understand your position.
I'm confused: this seems to me to be a restatement of your main point and not a response to my question?
I think the average community member is pretty savvy, and the community's demonstrated deliberative skill in evaluating funding issues seems pretty strong.
I don't know, this seems overly optimistic to me. The average community member doesn't come in with much skill in evaluating nascent orgs, and is unlikely to get the kind of practice-with-feedback that would allow them to develop this skill.
people deferring somewhat to a ~randomly selected community screening jury (which could hopefully be at least medium-context)
Donor lottery winners?
Or, less flippantly, this seems to me what EA Funds and the other granting groups that give seed funding do.
I do think there are cases where someone has a good idea that isn't a good match for any of these funders (ex: the Global Health and Development Fund isn't accepting applications) or where the grantmakers are overworked, not omniscient, and not able to consider everything that they would ideally fund. In these cases I do think making a public case is good, but then it should either look like:
- An appeal for "angels" who are interested in engaging somewhat deeply with the org to advise and fund it.
- An appeal for seed funders that gives enough detail that they can make an informed decision without personal engagement. I think @Habiba Banu and Roxanne Heston's "Spiro - New TB charity raising seed funds" post is an example of doing this well.
generated what I wanted to say
Overall, do you stand by your comment? If I wrote a point-by-point response would some points get a "that's just something the LLM put in because it seemed plausible and isn't actually my view"?
Appealing to the Public
Diana Fleischman, an evolutionary psychologist at the University of New Mexico, has a part-time role hosting Aporia's podcast, and is the author of an article on the website headlined: "You're probably a eugenicist."
That article (Aporia: "You're probably a eugenicist") seems to be the same article she has on her Substack (Dissentient: "You're probably a eugenicist") and that you refer to above (EA Forum: "Most people endorse some form of 'eugenics'"), which was also initially titled the same.
Which is to say: don't double-count, and don't treat the non-linked "You're probably a eugenicist" as if it has worse content than the linked "Most people endorse some form of 'eugenics'".
For what it's worth this hasn't been my experience: most of the people I know personally who are working on x-risk (where I know their animal views) think animal welfare is quite important. And for the broader sample where I just know diet the majority are at least vegetarian.