I’ve gotten a sense that the staff isn’t interested in increasing the number of intro workshops, that the intro workshops don’t feel particularly exciting to the staff, and that most staff are less interested in improving the intro workshops than in improving other parts of CFAR. This makes it less likely that those workshops will maintain their quality and impact, and I currently think those workshops are likely one of the best ways for CFAR to have a large impact.
...
CFAR is struggling to attract top talent, partially because some of the best staff left, and partially due to a general sense that the organization lacks forward momentum. This is a bad sign, because I think CFAR in particular benefits from having highly talented individuals teach at its workshops and serve as concrete examples of the skills it’s trying to teach.
Why a large, unrestricted grant to CFAR, given these concerns? Would a smaller grant catalyze changes such that the organization becomes cash-flow positive?
By the next grant round, I plan to have spent more time learning and thinking about CFAR’s trajectory and future, and to have a more confident opinion about what the correct funding level for CFAR is.
What is going to happen between now & then that will help you learn enough to have a higher-credence view about CFAR?
Seems like a large, unrestricted grant permits further “business-as-usual” operations. Are “business-as-usual” operations the best state for driving your learning as a grant-maker?
I assume that by “cash-flow positive”, you mean supported by fees from workshop participants?
I don’t consider that to be a desirable goal for CFAR.
Habryka’s analysis focuses on CFAR’s track record. But CFAR’s expected value comes mainly from possible results that aren’t measured by that track record.
My main reason for donating to CFAR is the potential for improving the rationality of people who might influence x-risks. That includes mainstream AI researchers who aren’t interested in the EA and rationality communities. The ability to offer them free workshops seems important to attracting the most influential people.
I assume that by “cash-flow positive”, you mean supported by fees from workshop participants?
Yes, that’s roughly what I mean.
I’m gesturing towards “getting to a business structure where it’s straightforward to go into survival mode, wherein CFAR maintains core staff & operations via workshop fees.”
Seems like in that configuration, the org wouldn’t be as buffeted by the travails of a 6-month or 12-month fundraising cycle.
I agree that being entirely supported by workshop fees wouldn’t be a desirable goal-state for CFAR. But having a “survival mode” option at the ready for contingencies seems good.
Why a large, unrestricted grant to CFAR, given these concerns? Would a smaller grant catalyze changes such that the organization becomes cash-flow positive?
I have two interpretations of what your potential concerns here might be, so it might be good to clarify first. Which of these two interpretations is closer to what you mean?
1. “Why give CFAR such a large grant at all, given that you seem to have a lot of concerns about their future”
2. “Why not give CFAR a grant that is conditional on some kind of change in the organization?”
I’m curious about both (1) and (2), as they both seem like plausible alternatives that you may have considered.
Seems good.
1. “Why give CFAR such a large grant at all, given that you seem to have a lot of concerns about their future”
I am overall still quite positive on CFAR. I have significant concerns, but the total impact CFAR has had over the course of its existence strikes me as very large, and easily worth the resources it has taken up so far.
I don’t think it would be correct for CFAR to be forced into irreversible action right now simply because they (correctly) decided not to run a fall fundraiser, and I still assign significant probability to CFAR actually being on the right track to continue having a large impact. My model here is mostly that whatever allowed CFAR to have a historical impact did not break, and so will continue producing value of the same type.
2. “Why not give CFAR a grant that is conditional on some kind of change in the organization?”
I considered this for quite a while, but ultimately decided against it. I think grantmakers should generally be very hesitant to make earmarked or conditional grants to organizations without knowing in close detail how the organization operates. Some things that might seem easy to change from the outside often turn out to be really hard to change for good reasons. Conditional grants also risk creating an adversarial relationship, in which the organization is incentivized to do the minimum amount of effort necessary to meet the conditions of the grant, which I think tends to make transparency a lot harder.
Overall, I much more strongly prefer to recommend unconditional grants, accompanied by concrete suggestions for the changes that would lead to future unconditional grants to the organization, while communicating clearly what long-term performance metrics or considerations would cause me to change my mind.
I expect to communicate extensively with CFAR over the coming weeks: talk to most of its staff members, get a better general sense of how CFAR operates, and think about the big-picture effects CFAR has on the long-term future and global catastrophic risk. I think I am then likely to either:
make recommendations for a set of changes with conditional funding,
decide that CFAR does not require further funding from the LTF,
or be convinced that CFAR’s current plans make sense and that they should have sufficient resources to execute those plans.
This is super helpful, thanks!
My model here is mostly that whatever allowed CFAR to have a historical impact did not break, and so will continue producing value of the same type.
Perhaps a crux here is whether whatever mechanism historically drove CFAR’s impact has already broken or not. (Just flagging, doesn’t seem important to resolve this now.)
Yeah, that’s what I intended to say. “In the world where I come to the above opinion, I expect my crux will have been that whatever made CFAR historically work, is still working”