Thank you for sharing this. As a distinct matter, the specific way FTX failed also makes me more concerned about a certain type of mindset that seems somewhat common and normalized among some in the EA community:
holding the belief that, by being very, very smart, you can work in areas where you have minimal experience and still know better than others
believing that having (experienced) adults in the room and adhering to formal compliance norms is overrated
understating the risks posed by conflicts of interest
accepting ends-justify-the-means reasoning
I believe Sam’s adherence to the beliefs listed above played a critical role in FTX’s story. I don’t think any one of these beliefs is inherently problematic on its own, but I have adjusted downwards on those who hold all of them.
While I agree with the substance of this comment to a great extent, I want to note that EA also has a problem of being much more willing to tolerate abstract criticism than concrete criticism.
If I singled out a specific person in EA and accused them of significant conflicts of interest or of being too unqualified and inexperienced to work on whatever they are currently working on, the reaction in the forum would be much more negative than it was to this comment.
If you really believe the issues raised in the comment are important, take it seriously when people raise these concerns in concrete cases.
This comment exactly proves the point I’m talking about: it gets upvoted because it’s an abstract comment that doesn’t accuse anyone in particular of anything. My comments that are critical of or adversarial towards specific people perform significantly worse, and plenty of them have fluctuated between −10 karma and +10 or +20 karma.
I’m reasonably confident this is happening because people in EA hate conflict in general. I’m sorry to say that genuine criticism has to involve conflict of some kind. I understand that people in EA are not used to this and prefer to shut their ears, accusing their counterpart of being “insufficiently truth-seeking” or “acting in bad faith” whenever they use adversarial discourse; indeed, this is exactly what happened to Guzey’s post from 2018.
I wish EA could move past these dysfunctional social norms, but I’m not getting my hopes up.
Go to any psychiatrist at the conference and criticize psychiatry in these terms—“Don’t you think our field is systemically racist and sexist and fails to understand that the true problem is Capitalism?” and they will enthusiastically agree and maybe even tell you stories about how their own experience proves that’s true and how they need to try to do better.
Is there any criticism that can touch these people at all?
Here’s my proposal: ask why they prescribe s-ketamine instead of racemic ketamine for treatment-resistant depression.
If you say this, psychiatrists will push back. If you say it in a confrontational way—maybe you hint that they’ve outsourced their thought processes to a handful of regulators and pharmaceutical companies even when this severely disadvantages patients, because thinking for themselves is hard and scary—they’ll get offended. The world is full of psychiatrists who will confess to systemic racism with a smile on their face, then get all huffy when you ask them about esketamine.
DMMF—I share your first concern, that many young EAs seem to have quite a bit of ageism, distrust of legacy systems, contempt for tradition, arrogance about being able to reinvent complex systems from first principles, and wariness of welcoming mid-career and senior experts into the EA community.
This youthful arrogance is partly merited. Traditional charity circa 2000 (before GiveWell) was an unaccountable, unempirical grift better suited to virtue signaling, status seeking, and wealthy in-group power games than to actually helping people or animals. EA challenging this legacy system was entirely right, proper, and helpful. Consequentialist moral philosophy, sentientism (e.g. about animal welfare), randomized trials, cost-benefit reasoning, scope-sensitivity, etc. were crucial and revolutionary developments at the foundation of EA.
However, applied to other domains, subcultures, professions, legacy systems, and cultural traditions, that youthful arrogance can be quite misguided, especially in the domains of financial accounting, corporate management, and business ethics. SBF seemed to think that he could ignore all the accumulated wisdom of the past few centuries about how to run a company and protect users and investors. The result was, from the perspective of experienced business managers, risk managers, and accountants, a highly predictable catastrophe (for anyone who actually knew the inner workings of Alameda/FTX).
So, one takeaway from this whole FTX debacle is that maybe EAs should be a little more selective about when to challenge tradition and try to reinvent things from first principles, versus when to respect tradition, expertise, and domain knowledge.
This is a tricky and delicate balancing act, but I think science provides an example of how to get that balance roughly right. Young scientists in grad school quickly learn that almost every new idea that they think is earth-shaking and ground-breaking was already thought about by the late 1800s, or 1950s, or whenever. With great scholarship comes great humility. Only after they’ve mastered a fair amount of the literature in some domain are they equipped to make even modest contributions at the cutting edge of thinking. And only in the rarest cases can a true genius overturn a major established paradigm.
The early EA geniuses managed to overturn the existing charity paradigm, and that was great. But with that initial success, we must be wary of developing a generalized hubris about overturning every tradition we see, and challenging every expert we meet.
This comment by Geoffrey Miller is one of the most insightful things I’ve read. I had a lot of these feelings, but lack the STEM background to say it as you have. EA did one very specific thing very well, but then it went to their heads and they saw themselves as able to do all things better. Hopefully this crisis will be the pivot where they see the difference and re-gather past collective wisdom where it’s still important.
And the fastest route to that is to bring in more seasoned veterans: just import the data straight in through their bodies into EA orgs.
Jeffrey—thanks for your kind comment! Appreciate it.
I agree, and I am also concerned about this. I have witnessed this many times. I do think there is tremendous merit in vigorously thinking from first principles on some subject matters. But others, such as risk management and regulation, do require expertise, as we have now seen in the case of FTX.