Skeptical about the cost-effectiveness of several of these.
Ought − 50k. “Part of the aim of the grant is to show Ought as an example of the type of organization we are likely to fund in the future.” Is that really your aim now, being a grant dispenser for random AI companies? What happened to saving lives?
“Our understanding is that hiring is currently more of a bottleneck for them than funding, so we are only making a small grant.” If they have enough money and this is a token grant, why is it 50k? Why not reduce to 15-20k and spend the rest on something else?
Metaculus − 70k, Ozzie Gooen − 70k, Jacob Lagerros − 27k. These are small companies that need funding; why are you acting as grant-givers here rather than as special interest investors?
Robert Miles, video content on AI alignment − 39k. Isn’t this something you guys and/or MIRI should be doing, and could do quickly, for a lot less money, without having to trust that someone else will do it well enough?
Fanfiction handouts − 28k. What’s the cost breakdown here? And do you really think this will make you be taken more seriously? If you want to embrace this fanfic as a major propaganda tool, it certainly makes sense to get it thoroughly edited, especially before doing an expensive print run.
CFAR − 150k(!). If they’re relying on grants like this to survive, you should absolutely insist that they downsize their staff. This funding definitely shouldn’t be unrestricted.
Connor Flexman − 20k. “Techniques to facilitate skill transfer between experts in different domains” is very vague, as is “significant probability that this grant can help Connor develop into an excellent generalist researcher”. I would define this grant much more concretely before giving it.
Lauren Lee − 20k. This is ridiculous. I’m sure she’s a great person, but please don’t use the gift you received to provide sinecures to people “in the community”.
Nikhil Kunapuli’s research − 30k and Lucius Caviola’s postdoc − 50k. I know you guys probably want to go in a think-tanky direction, but I’m still skeptical.
The large gift you received should be used to expand the influence of EA as an entity, not as a one-off. I think you should reconsider grants vs investment when dealing with small companies, the CFAR grant also concerns me, and of course in general I support de-emphasizing AI risk in favor of actual charity.
This comment strikes me as quite uncharitable, but asks really good questions that I do think would be good to see more detail on.
I would be interested in other people creating new top-level comments with individual concerns or questions. I think I have difficulty responding to this top-level comment, and expect that other people stating their questions independently will overall result in better discussion.
While I’m not involved in EA Funds donation processing or grantmaking decisions, I’d guess that anyone making a large gift to the Far Future Fund does, in fact, support emphasizing AI risk, and considers funding this branch of scientific research to be “actual charity”.
It could make sense for people with certain worldviews to recommend that people not donate to the fund for many reasons, but this particular criticism seems odd in context, since supporting AI risk work is one of the fund’s explicit purposes.
--
I work for CEA, but these views are my own.
If the donation was specifically earmarked for AI risk, that aside isn’t relevant, but most of the comment still applies. Otherwise, AI risk is certainly not the only long-term problem.
I was not informed of any earmarking, so I don’t think there were any stipulations around that donation.
Creating good video scripts is a rare skill. So is being able to explain things on a video in a way many viewers find compelling. And a large audience of active viewers is a rare resource (one Miles already has through his previous work).
I share some of your questions and concerns about other grants here, but in this case, I think it makes a lot of sense to outsource this tricky task, which most organizations do badly, to someone with a track record of doing it well.
--
I work for CEA, but these views are my own.
I honestly think this was one of the more obvious ones on the list. 39k for one full year of work is a bit of a steal, especially for someone who already has the mathematical background, video production skills, and audience. I imagine if CEA were to try to recreate that it would have a pretty hard time, plus the recruitment would be quite a challenge.
I second this analysis and agree that this was a great grant. I was considering donating to Miles’ Patreon but was glad to see the Fund step in to do so instead. It’s more tax-efficient to do it that way. Miles is a credible, entertaining, informative source on AI Safety and could be a real asset to beginners in the field. I’ve introduced people to AIS using his videos.
It would be really useful if this was split up into separate comments that could be upvoted/downvoted separately.
+1. I have pretty different thoughts about many of the points you raise.
I don’t think the karma/voting system should be given that much attention, or be used as highly visible feedback on project funding.
I do think that it would help independently of that by allowing more focused discussion on individual issues.
To clarify: I agree with the benefits of splitting the discussion threads for readability, but I was unenthusiastic about the motivation being voting.
(Made this a top-level comment at Oli’s request.)
(Will reply to this if you make it a top-level comment, like the others)
K, it’s now top-level.
Ought: why provide $50,000 to Ought rather than ~$15,000, given that they’re not funding constrained?
(Top-level seems better, but will reply here anyway)
The Ought grant was one of the grants I was least involved in, so I can’t speak super much to the motivation behind that one. I think you will want to get Matt Wage’s thoughts on that.
Cool, do you know if he’s reading & reacting to this thread?
Don’t know. My guess is he will probably read it, but I don’t know whether he will have the time to respond to comments.
I’m not sure why you think all of these are companies. Metaculus is a company, but the other two aren’t.
Personally, I think it would be pretty neat if this group (or a similar one) were to later set up the legal infrastructure to properly invest in groups where that would make sense. But this would take quite a bit of time (both fixed and marginal costs), and with only a few such groups per year (one, in this case, I believe), it’s probably not worth it.