I think that paragraph is quite misguided. “Becoming much more risk averse” is a great way to stop doing anything at all, because every decision has to pass through eight layers of garbage. On top of this, it’s not like “literally becoming the US federal government” and “not having any accounting or governance at all” are your only two options; that’s a sad false dichotomy. SBF was actively and flagrantly ignoring governance, regulation, and accounting. This is not remotely common for EA orgs.
Like, for the last couple of decades we’ve been witnessing over and over again how established, risk-averse institutions fail because they’re unable to compete with new, scrappy, risk-tolerant ones (that is, startups).
“Good governance” and bureaucracy are, while correlated, emphatically not the same thing. EA turning into a movement that fundamentally values these over just doing good in the world as effectively as possible will be a colossal failure, because bureaucracy is a slippery slope and the Thing That Happens when you emulate the practices that have been used for centuries is that you end up not being able to do anything. I’d be very sad if this was our final legacy.
The “move fast and break things” model of startups works great for something like software businesses where the failures are harmless and easily forgotten.
But we’re not in the software business. We’re in the charity business. And in the charity business, reputation matters in a real, monetary sense. Thanks to FTX, EA has now been associated in pretty much every major newspaper with reckless, harmful, and irresponsible behavior. If you make an EA startup that goes wrong somehow, it’s going to be written up in The Guardian or The Wall Street Journal, reminding everyone of FTX again.
And then potential donors are going to read those articles, and know that other people around them are reading said articles as well. If, when people see the words “effective altruism”, the words that come to mind are “fraud and mismanagement”, then most donors are going to go somewhere else, where their donations are met with applause rather than raised eyebrows. This damages everyone associated with EA, no matter how responsible they are for the latest mistake.
A small amount of bureaucracy and checks and balances is a very small price to pay, if we want to avoid being permanently hobbled by a poor reputation.
I think an increase in bureaucracy / risk-aversion is inevitable—and probably necessary—with increasing size/power/influence after a certain point. The 51% coin flip is great when the wager is $100, not so great when it is all life on Earth. I would submit that part of the answer is to prevent any one organization from getting too massive, so that it doesn’t get mired in bureaucracy and ossify. The one thing I will give FTXFF some credit for is the interest in regranting programs.
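The coin-flip point can be made concrete with a quick simulation (my own illustrative sketch—the thread doesn’t specify any particular bet structure): a bet that doubles your stake with probability 0.51 has positive expected value, yet repeatedly staking *everything* on it ends in ruin almost surely, because the positive EV is concentrated in vanishingly unlikely branches.

```python
import random

random.seed(0)  # for reproducibility

def all_in_doubling(p_win=0.51, flips=50, trials=100_000):
    """Fraction of bankrolls that survive `flips` consecutive
    all-in double-or-nothing bets, each won with probability `p_win`."""
    survived = sum(
        all(random.random() < p_win for _ in range(flips))
        for _ in range(trials)
    )
    return survived / trials

# Expected value per flip is 0.51 * 2 = 1.02x the stake, so fifty flips
# have an expected value of 1.02**50 ≈ 2.7x the starting bankroll.
# But the probability of surviving all fifty flips is 0.51**50 ≈ 2.4e-15,
# so essentially every simulated bankroll goes bust despite positive EV.
print(all_in_doubling())
```

This is the standard Kelly-style intuition behind the comment above: as the stakes scale from $100 to everything, the variance of a gamble starts to matter far more than its expectation, which is exactly why some extra risk aversion comes with size.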
Right, I wouldn’t want to over-correct, but personally “more respect for good governance (even at cost of some increase in bureaucracy)” is the major lesson I’ve drawn from recent events. (I expect I’m still more anti-bureaucratic than most people, but maybe finding a more balanced view than I previously had.)
I’m unsure whether “risk aversion” is the right way to put this, but even if it is I think we probably just want a bit more of it rather than much more.
FTX and Alameda definitely needed more bureaucracy—as in, doing stuff in a way that doesn’t resemble a scene from Idiocracy. https://docs.house.gov/meetings/BA/BA00/20221213/115246/HHRG-117-BA00-Wstate-RayJ-20221213.pdf “Although our investigation is ongoing and detailed findings will have to await its conclusion, the FTX Group’s collapse appears to stem from the absolute concentration of control in the hands of a very small group of grossly inexperienced and unsophisticated individuals who failed to implement virtually any of the systems or controls that are necessary for a company that is entrusted with other people’s money or assets.”
We should distinguish risk aversion, transparency, and bureaucracy. They’re obviously related but different concepts. I would argue that transparency is far more important than risk aversion—the more so the less risk averse you are—and unfortunately opacity often seems to be correlated with risk-taking. This is sometimes justified on infohazard logic (cf MIRI in general) or some harder-to-pin-down lack of urgency to communicate controversial decisions (cf Wytham Abbey). Increasing transparency necessarily increases bureaucracy, but there are many other ways bureaucracy can increase, so we shouldn’t expect it to balloon uncontrollably just because of one upward pressure.
I feel like most core EA organisations would come nowhere near meeting the transparency requirements GiveWell places on the charities it recommends (though GiveWell itself does impressively well on this score, so it’s clearly not impossible for metacharities).
Established procedures should be questioned. You should definitely use good business practices such as proper accounting and separation of entities with conflicts of interest, but you don’t want to copy the copious “established procedures” that amount to getting nothing done through piles of pointless paperwork, administrative bloat, and endless committees that talk about nothing. There are lots of teams in EA that get a lot done, specifically because they aren’t bogged down in bureaucracy and have a clear focus and mission.
Strongly approve of this comment.