Thanks for engaging as well. I think I disagree with much of the framing of your comment, but I’ll try my best to only mention important cruxes.
I don’t think wordcount is a good way to measure information shared.
I don’t think “per amount granted” is a particularly relevant denominator when different orgs have very different numbers of employees per amount granted.
I don’t think grantmakers and incubators are a good like-for-like comparison.
As a practical matter, I neither want to write 500-1000 pages/year of grants nor think it’s the best use of my time.
Here is an easy way of seeing LTFF shares way less information than CE. The 2 grants you evaluated for which there is a “long” write-up have 1058 words[...]
I don’t think wordcount is a good way to measure information shared.
I don’t think wordcount is a fair way to estimate (useful) information shared. I mean it’s easy to write many thousands of words that are uninformative, especially in the age of LLMs. I think to estimate useful information shared, it’s better to see how much people actually know about your work, and how accurate their beliefs are.
As an empirical crux, I predict the average EAF reader, or EA donor, knows significantly more about LTFF than they do about CE, especially when adjusting for number of employees[1]. I’m not certain that this is true since I obviously have a very skewed selection, so I’m willing to be updated otherwise. I also understand that “EAF reader” is probably not a fair comparison since a) we crosspost often on the EA Forum and maybe CE doesn’t as much, and b) much of CE’s public output is in global health, which at least in theory has a fairly developed academic audience outside of EA. I’d update towards the “CE shares a lot more information than EA Funds” position if either of the following turned out to be true:
In surveys, people, and especially donors, are more empirically knowledgeable about CE work than LTFF work.
CE has many more views/downloads of their reports than LTFF.
I looked at my profile again, and I think I wrote a post about EA Funds work ~once a month, which makes it fairly comparable to the 4-8 reports/year CE writes.
For context, skimming analytics, we have maybe 1k-5k views/post.
EA Forum has a “reads” metric for how many people spend over x minutes on a post; for my posts I think it’s about 1/3 to 1/2 of views.
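(Combining those rough numbers: on the order of 350-2,500 reads/post, taking the low and high ends of both ranges at face value.)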
(To be clear, quantity =/= quality or quality-adjusted quantity. I’d also update towards your position if somebody from CE or elsewhere tells me that they have slightly fewer views but their reports are much more decision-relevant for viewers.)
I don’t think “per amount granted” is a particularly relevant denominator when different orgs have very different numbers of employees per amount granted.
I don’t know how many employees CE has. I’d guess it’s a lot (e.g. 19 people on their website). EA Funds has 2 full-time employees and some contractors (including for grantmaking and grants disbursement). I’m ~ the only person at EA Funds who has both the time and inclination to do public writing.
Obviously if you have more capacity, you can write more.
I would guess most EA grantmakers (the orgs you mentioned, but also GiveWell) have a $-granted-per-FTE ratio closer to EA Funds’ than to CE’s.
If anything, looking at the numbers again, I suspect CE should be devoting more effort to fundraising and/or finding more scalable interventions. But I’m an outsider and there is probably a lot of context I’m missing.
I don’t think grantmakers and opinionated incubators are a good like-for-like comparison.
Does that mean I think CE staff aren’t doing useful things? Of course not! They’re just doing very different things. CE calls itself an incubator, but much of their staff is better understood as “researchers” trying to deeply understand an issue. (Presumably they understand the interventions their incubees work on much better than, say, YC does.) It makes a lot of sense to me that such researchers can and will go into a lot of depth about an intervention[2]. Similarly, EA Funds’ grantees also can and often do go into a lot of depth about their work.
The main difference is that EA Funds, as a superstructure, doesn’t provide research support for our grantees. Whereas you can think of CE as an org that provides initial research support for their incubees so the incubees can think more about strategy and execution.
Just different orgs doing different work.
As a practical matter, I neither want to write 500-1000 pages/year of grants nor think it’s the best use of my time.
Nobody else at EA Funds has the time/inclination to publicly write detailed reports. If we want to see payout reports for any of the funds this year, most likely I’d have to write them myself. I personally don’t want to write up upwards of a thousand grants this year. It frankly doesn’t sound very fun.
But I’m being paid for impact, not for having fun, so I’m willing to make such sacrifices if the utility gods demand it. So concretely, I’d be interested in what projects I ought to drop to write up all the grants. Eg, to compare like-for-like, I’d find it helpful if you or others could look at my EA Funds-related writings and tell me which posts I ought to drop so I can spend more time writing up grants.
[1] EA Funds has ~2 full-time employees, and maybe 5-6 FTEs including contractors.
[2] And indeed when I did research I had a lot more time to dive into specific interventions than I do now.
Thanks for the detailed comment. I strongly upvoted it.
I don’t think wordcount is a good way to measure information shared.
I don’t think wordcount is a fair way to estimate (useful) information shared. I mean it’s easy to write many thousands of words that are uninformative, especially in the age of LLMs. I think to estimate useful information shared, it’s better to see how much people actually know about your work, and how accurate their beliefs are.
I agree the number of words per grant is far from an ideal proxy. At the same time, the median length of the write-ups in EA Funds’ database is 15.0 words, and accounting for what you write elsewhere does not change the median, because you only write longer write-ups for a small fraction of the grants. So the median information shared per grant is just a short sentence, and I assume donors do not have enough information to assess the median grant.
On the other hand, donors do not necessarily need detailed information about all grants, because they could infer how much to trust EA Funds based on longer write-ups for a minority of them, such as the ones in your posts. I think I have to recheck your longer write-ups, but I am not confident I can assess the quality of the grants with longer write-ups based on these alone. I suspect trusting the reasoning of EA Funds’ fund managers is a major reason for supporting EA Funds. I guess others and I like longer write-ups because transparency is often a proxy for good reasoning, but we had better look into the longer write-ups, and assess EA Funds based on them rather than on the median information shared per grant.
I don’t think “per amount granted” is a particularly relevant denominator when different orgs have very different numbers of employees per amount granted.
At least a priori, I would expect the information shared about a grant to be proportional to the amount of effort put into assessing it, and this to be proportional to the amount granted, in which case the information shared about a grant would be proportional to the amount granted. The grants you assessed in LTFF’s most recent report were of 200 and 71 k$, and you wrote a few paragraphs about each of them. In contrast, CE’s seed funding per charity in 2023 ranged from 93 to 190 k$, but they wrote reports of dozens of pages for each of them. This illustrates that CE shares much more information about the interventions they support than EA Funds shares about the grants for which there are longer write-ups. So it is possible to have a better picture of CE’s work than EA Funds’. This is not to say CE’s donors actually have a better picture of CE’s work than EA Funds’ donors have of EA Funds’ work. I do not know whether CE’s donors look into their reports. However, I guess it would still be good for EA Funds to share some in-depth analyses of their grants.
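(To put rough numbers on the contrast, assuming these are the same two grants with 1058 words of write-ups mentioned above: 1058 words across 271 k$ of grants is about 4 words per k$ granted, whereas a CE report of, say, 30 pages at ~400 words per page for a ~140 k$ seed grant would be about 85 words per k$ granted, a ratio of over 20 to 1. The page and word counts here are just illustrative.)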
I don’t think grantmakers and opinionated incubators are a good like-for-like comparison.
Does that mean I think CE staff aren’t doing useful things? Of course not! They’re just doing very different things. CE calls itself an incubator, but much of their staff is better understood as “researchers” trying to deeply understand an issue.
I guess EA Funds would benefit from having researchers in that sense. I like that Founders Pledge produces lots of research informing the grantmaking of their funds.
As a practical matter, I neither want to write 500-1000 pages/year of grants nor think it’s the best use of my time.
How about just making some applications public, as Austin suggested? I actually think it would be good to make public the applications of all grants EA Funds makes, and maybe even rejected applications. Information which had better remain confidential could be removed from the public version of the application.
(Appreciate the upvote!)
At a high level, I’m of the opinion that we practice better reasoning transparency than ~all EA funding sources outside of global health, e.g. a) I’m responding to your thread here and other people have not, b) (I think) people can have a decent model of what we actually do rather than just an amorphous positive impression, and c) I make an effort of politely delivering messages that most grantmakers are aware of but don’t say because they’re worried about flak.
It’s really not obvious that this is the best use of limited resources compared to e.g. engaging with large donors directly or having very polished outward-facing content, but I do think criticizing our lack of public output is odd given that we invest more in it than almost anybody else.
(I do wonder if there’s an effect where because we communicate our overall views so much, we become a more obvious/noticeable target to criticize.)
This illustrates that CE shares much more information about the interventions they support than EA Funds shares about the grants for which there are longer write-ups. So it is possible to have a better picture of CE’s work than EA Funds’. This is not to say CE’s donors actually have a better picture of CE’s work than EA Funds’ donors have of EA Funds’ work. I do not know whether CE’s donors look into their reports.
Well, I haven’t read CE’s reports. Have you?
I think you have a procedure-focused view where the important thing is that articles are written, regardless of whether they’re read. I mostly don’t personally think it’s valuable to write things people don’t read (though again, for all I know CE’s reports are widely read, in which case I’d update!). And it’s actually harder to write things people want to read than to just write things.
(To be clear, I think there are exceptions. Eg all else equal, writing up your thoughts/cruxes/BOTECs is good even if nobody else reads them, because it helps with improving quality of thinking.)
How about just making some applications public, as Austin suggested? I actually think it would be good to make public the applications of all grants EA Funds makes, and maybe even rejected applications.
We’ve started working on this, but no promises. My guess is that making public the rejected applications is more valuable than accepted ones, eg on Manifund. Note that grantees also have the option to upload their applications (and there are fewer privacy concerns if grantees choose to reveal this information).
We’ve started working on this [making some applications public], but no promises. My guess is that making public the rejected applications is more valuable than accepted ones, eg on Manifund. Note that grantees also have the option to upload their applications (and there are fewer privacy concerns if grantees choose to reveal this information).
Manifund already has quite a good infrastructure for sharing grants. However, have you considered asking applicants to post a public version of their applications on the EA Forum? People who prefer to remain anonymous could use an anonymous account, and anonymise the public version of their application. At a higher cost, there could be a new class of posts[1] which would mimic some of the features of Manifund, but this is not strictly necessary. The posts with the applications could simply be tagged appropriately (with new tags created for the purpose), and include a standardised section with some key information, like the requested amount of funding and the status of the grant (which could be updated over time by editing the post).
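A minimal sketch of what that standardised section could look like (the fields and values are purely illustrative, not an existing Forum feature):
Requested funding: 50 k$
Fund: LTFF
Status: under evaluation (to be updated to accepted/rejected by editing the post)
Confidentiality: sensitive details removed from this public version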
The idea above is inspired by some thoughts from Hauke Hillebrandt.
[1] As of now, there are 3 types: normal posts, question posts, and linkposts/crossposts.
Grantees are obviously welcome to do this. That said, my guess is that this will make the forum less enjoyable/useful for the average reader, rather than more.
I think a dedicated area would minimise the negative impact on people who aren’t interested, whilst potentially adding value (to prospective applicants in understanding what did and didn’t get accepted, and possibly also to grant assessors if there was occasional additional insight offered by commenters).
I’d expect there would be some details of some applications that wouldn’t be appropriate to share on a public forum, though.
Hopefully grantees can opt-in/out as appropriate! They don’t need to share everything.
Right, but they have not been doing it. So I assume EA Funds would have to at least encourage applicants to do it, or even make it a requirement for most applications. There can be confidential information in some applications, but, as you said, applicants do not have to share everything in their public version.
That said, my guess is that this will make the forum less enjoyable/useful for the average reader, rather than more.
I guess the opposite, but I do not know. I am mostly in favour of experimenting with a few applications, and then deciding whether to stop or scale up.
(I do wonder if there’s an effect where because we communicate our overall views so much, we become a more obvious/noticeable target to criticize.)
To be clear, the criticisms I make in the post and comments apply to all grantmakers I mentioned in the post except for CE.
Well, I haven’t read CE’s reports. Have you?
I have skimmed some, but the vast majority of my donations have been going to AI safety interventions (via LTFF). I may read CE’s reports in more detail in the future, as I have been moving away from AI safety to animal welfare as the most promising cause area.
I think you have a procedure-focused view where the important thing is that articles are written, regardless of whether they’re read.
I do not care about transparency per se[1], but I think there is usually a correlation between it and cost-effectiveness (for reasons like the ones you mentioned inside parentheses). So, a priori, lower transparency updates me towards lower cost-effectiveness.
[1] I fully endorse expected total hedonistic utilitarianism, so I only intrinsically value/disvalue positive/negative conscious experiences.
We’ve started working on this, but no promises.
Cool!
My guess is that making public the rejected applications is more valuable than accepted ones, eg on Manifund.
I can see this being the case, as people currently get to know about most accepted applications, but nothing about the rejected ones.