The measures you list would have prevented some financial harm to FTXFF grantees, but it seems to me that that is not the harm that people have been most concerned about. I think it’s fair for Ben to ask about what would have prevented the bigger harms.
Michael_PJ
> if any charity’s rationale for not being at least moderately open and transparent with relevant constituencies and the public is “we are afraid the CC will shut us down,” that is a charity most people would run away from fast, and for good reason
I do think a subtext of the reported discussion above is that the CC is not considered to be a necessarily trustworthy or fair arbiter here. “If we do this investigation then the CC may see things and take them the wrong way” means you don’t trust the CC to take them the right way. Now, I have no idea whether that is justified in this case, but it’s pretty consistent with my impression of government bureaucracies in general.
So it perhaps comes down to whether you previously considered the charity or the CC more trustworthy. In this case I think I trust EVF more.
> He’d need a catastrophic stock/bond market crash, plus almost all depositors wanting out, to be unable to honor withdrawals.
I think this significantly underestimates the likelihood of “bank run”-type scenarios. It is not uncommon for financial institutions with backing for a substantial fraction of their deposits to still suffer a run due to a simple loss of confidence snowballing.
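For intuition about the snowballing, here is a toy Granovetter-style threshold cascade in Python. All the numbers are invented and purely illustrative: each depositor has a personal “panic threshold” and withdraws once the fraction already withdrawing reaches it. A single exogenous withdrawal can cascade into a full run in one population while the same shock dies out in a slightly more tolerant one.

```python
def run_size(thresholds, shock):
    """Iterate until no new depositor's panic threshold is crossed.

    thresholds: each depositor withdraws once the withdrawing
    fraction reaches their personal threshold.
    shock: number of exogenous initial withdrawals.
    """
    n = len(thresholds)
    withdrawn = shock
    while True:
        frac = withdrawn / n
        new_total = max(withdrawn, sum(t <= frac for t in thresholds))
        if new_total == withdrawn:
            return withdrawn
        withdrawn = new_total

n = 100
# Fragile: some depositors panic at the first sign of trouble.
fragile = [i / n for i in range(n)]
# Robust: everyone tolerates at least a 5% outflow before panicking.
robust = [(i + 5) / n for i in range(n)]

print(run_size(fragile, shock=1))  # 100: one withdrawal snowballs into a full run
print(run_size(robust, shock=4))   # 4: a larger shock simply dies out
```

Note that the bank’s actual reserve ratio never appears: in this model a run is driven entirely by beliefs about other depositors, which is the point about confidence.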
I feel like “if you get legal advice, follow it” is a pretty widely held and sensible broad principle, and violating it can have very bad personal consequences. I think the bar should be pretty high for someone violating that principle, and I’m not sure “avoiding quite a lot of frustration” meets that bar, especially since the magnitude of the frustration is only obvious in hindsight.
Thanks, I think this is all right. I think I didn’t write what I meant. I want more specificity, but I do agree with you that it’s wrong to expect full specificity (and that’s what I sounded like I was asking for).
What I want is something more like “CEA should investigate the staff of EVF for whether they knew about X and Y”, not “Alice should investigate Bob and Carol for whether they knew about X and Y”.
I do think that specificity raises questions, and that this can be a good thing. I agree that it’s not reasonable for someone to work out e.g. exactly where the funding comes from, but I do think it’s reasonable for them to think in enough detail about what they are proposing to realise that a) it will need funding, b) possibly quite a lot of funding, c) this trades off against other uses of the money, so d) what does that mean for whether this is a good idea. Whereas if “EA” is going to do it, then we don’t need to worry about any of those things. I’m sure someone can just do it, right?
We do of course need to worry about the flip side: plenty of times (especially in political groups) you see people being told not to criticise the group’s positions because it will make it less likely that the public in general will buy the overall picture (which the critic probably still agrees with). This can be pretty toxic.
I don’t think Richard is advocating for that, but I think it’s a risk once you legitimize this kind of argument.
At some point it surely has to be the case that they’ve done enough.
This doesn’t seem true? It makes perfect sense for advocacy groups to continue advocating their position, since a lot of the point is to reach people for whom the message is new. 80k is (or at least was) all about how to use your career for good, I would expect them to always be talking about earning to give as an option.
I think I agree with Hypothetical EA that we basically know the broad picture.
- Probably nobody was actually complicit or knew there was fraud; and
- Various people made bad judgement calls and/or didn’t listen to useful rumours about Sam
I guess I’m just… satisfied with that? You say:
> But there are plenty of people, both within EA and outside of it, who legitimately just want to know what happened, and will be very reassured to have a clearer picture of the basic sequence of events, which orgs did a better or worse job, which processes failed or succeeded.

…why? None of this seems that important to me? Most of it seems like a matter for the person/org in question to reflect/improve on. Why is it important for “plenty of people” to learn this stuff, given we already know the broad picture above?
I would sum up my personal position as:
- We got taken for a ride, so we should take the general lesson to be more cautious of charismatic people with low scruples, especially bearing large sums of money.
- If you or your org were specifically taken for a ride you should reflect on why that happened to you and why you didn’t listen to the people who did spot what was going on.
I’m against doing further investigation. I expressed why I think we have already spent too much time on this here.
I also think your comments are falling into the trap of referring to “EA” like it was an entity. Who specifically should do an investigation, and who specifically should they be investigating? (This less monolithic view of EA is also part of why I don’t feel as bothered by the whole thing: so maybe some people in “senior” positions made some bad judgement calls about Sam. They should maybe feel bad. I’m not sure we should feel much collective guilt about that.)
On the object level, the original question was:
> Are financial statements going to be released? In particular, how much was spent on estate agent fees, maintaince and bills? And the value of the events hosted. Is the reason for the change that EA has less money or that there was an error in the initial reasoning for buying it?

Even given the context I think this is asking too much. I would support a question like “I would love to know what the reasoning was: in particular, was the project financially unsustainable or were there other reasons?”.
Asking whether the financial statements for the project are going to be published is asking for way more information than is necessary or useful. From the follow-up question, it sounds like Dean is interested in doing a financial audit of the project and maybe thinks they were paying too much on maintenance or something? (Apologies if I’m putting words in your mouth, Dean, but I’m trying to talk about how it comes across to me as a reader.) This doesn’t seem like a reasonable avenue for Forum commenters to be encouraged to push down.
On the meta level, while there are not that many items at this level, the calls for random transparency are constant. Look at almost any post made by an EA org and you will see people asking for everything from full financial statements, to written documentation of hiring processes, to extensive reports on all kinds of internal operations.
To put it another way, by all means ask for transparency when it matters… but stick to when it matters, please!
I think there is a bit of tendency to assume that it is appropriate to ask for arbitrary amounts of transparency from EA orgs. I don’t think this is a good norm: transparency has costs, often significant, and constantly asking for all kinds of information (often with a tone that suggests that it ought to be presented) is I think often harmful.
I wonder if we would benefit from something like a system that hides the karma (and treats it as zero for visibility purposes) of posts that have less than <some quantity> of engagement. That way posts would get a “grace period” before getting hidden.
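A minimal sketch of the rule I have in mind (the threshold of 5 and the idea that “engagement” means something like comments plus votes are placeholder assumptions, not a concrete proposal):

```python
def display_karma(karma, engagement, min_engagement=5):
    """Hide karma until a post has seen a minimum amount of engagement.

    While hidden, the karma is treated as zero for visibility/ranking
    purposes, giving new posts a grace period before low scores bury them.
    """
    if engagement < min_engagement:
        return None  # hidden: render as "-" and rank as if karma were 0
    return karma

# A freshly downvoted post stays hidden during its grace period...
print(display_karma(-4, engagement=2))   # None
# ...but the score shows (and counts) once enough people have engaged.
print(display_karma(-4, engagement=10))  # -4
```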
> Again, I don’t think my picture here is a stretch from the normal English sense of the word “wholesomely”.
The more I read of these essays the less I agree with this. On my subjective authority as a native English speaker, your usage seems pretty far from the normal sense to me. I think what you’re gesturing at is a reasonable concept but I think it’s quite confusing to call it “wholesome”.
As some evidence, I kept finding myself having to reinterpret sentences to use your meaning rather than what I would consider the more normal meaning. For example, “What is wholesome depends on the whole system.” This is IMO kind of nonsensical in normal English.
I don’t think we would have been able to use the additional information we would have gained from delaying the industrial revolution, but I think if we could have, the answer might be “yes”. It’s easy to see in hindsight that it went well overall, but that doesn’t mean that the correct ex ante attitude shouldn’t have been caution!
100% agree. I think it is almost always better to be honest, even if that makes you look weird. If you are worried about optics, “oh yeah, we say this to get people in but we don’t really believe it” looks pretty bad.
I would qualify this statement by saying that it would be nice for OP to have more reasoning transparency, but it is not the most important thing and can be expensive to produce. So it would be quite reasonable for additional marginal transparency to not be the most valuable use of their staff time.
> Some EAs knew about his relationship with Caroline, which would undermine the public story about FTX<->Alameda relations, but didn’t disclose this.
> Some EAs knew that Sam and FTX weren’t behaving frugally, which would undermine his public image, but also didn’t disclose.
FWIW, these examples feel hindsight-bias-y to me. They have the flavour of “we now know this information was significant, so of course at the time people should have known this and done something about it”. If I put myself in the shoes of the “some EAs” in these examples, it’s not clear to me that I would have acted differently and it’s not clear what norm would suggest different action.
Suppose you are a random EA. Maybe you run an EA org. You have met Sam a few times, he seems fine. You hear that he is dating Caroline. You go “oh, that’s disappointing, probably bad for the organization, but I guess we’ll have to see what happens” and get on with your life.
It seems to me that you’re suggesting this was negligent, but I’m not sure what norm we would like to enforce here. Always publish (on the forum?) negative information about people you are at all associated with, even if it seems like it might not matter?
The case doesn’t seem much stronger to me even if you’re, say, on the FTX Foundation board. You hear something that sounds potentially bad, maybe you investigate a little, but it seems that you want a norm that there should be some kind of big public disclosure, and I’m not sure that really is something we could endorse in general.
To reuse your example, if you were the only person the perpetrator of the heist could con into lending their car to act as a getaway vehicle, then that would make P(Heist happens | Your actions) quite a bit higher than P(Heist happens | You acting differently), but you would still be primarily a mark or (minor) victim of the crime.
Yes, this is a good point. I notice that I don’t in fact feel very moved by arguments that P(FTX exists | EA exists) is higher, I think for this reason. So perhaps I shouldn’t have brought that argument up, since I don’t think it’s the crux (although I do think it’s true, it’s just over-determining the conclusion).
> Only ~10k/10B people are in EA, while they represent ~1/10 of history’s worst frauds, giving a risk ratio of about 10^5:1, or 10^7:1, if you focus on an early cohort of EAs.
This seems wildly off to me—I think the strength of the conclusion here should make you doubt the reasoning!
I think that the scale of the fraud seems like a random variable uncorrelated with our behaviour as a community. It seems to me like the relevant outcome is “producing someone able and willing to run a company-level fraud”; given that, whether or not it’s a big one or a small one seems like it just adds (an enormous amount of) noise.
How many people-able-and-willing-to-run-a-company-level-fraud does the world produce? I’m not sure, but I would say it has to be at least a dozen per year in finance alone, and more in crypto. So far EA has got 1. Is that above the base rate? Hard to say, especially if you control for the community’s demographics (socioeconomic class, education, etc.).
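A quick back-of-envelope in Python, using only figures already in this thread (the dozen-per-year rate and the ten-year window are my own rough guesses from above): merely reframing from “share of history’s worst frauds” to “share of fraud-capable people” cuts the naive risk ratio by about an order of magnitude, before any demographic controls.

```python
# All figures are very rough and taken from the discussion above.
ea_population = 10_000
world_population = 10_000_000_000

# The quoted claim: EA produced ~1/10 of "history's worst frauds".
naive_risk_ratio = (1 / 10) / (ea_population / world_population)
print(f"{naive_risk_ratio:.0e}")  # ~1e5, the disputed figure

# The alternative framing: count people able and willing to run a
# company-level fraud, ignoring how big the fraud turned out to be.
fraudsters_per_year = 12  # guess: finance alone, per the comment above
years = 10                # assumed observation window
world_rate = fraudsters_per_year * years / world_population
ea_rate = 1 / ea_population  # EA has produced one such person
print(ea_rate / world_rate)  # ~8e3: an order of magnitude lower
```

This still leaves a ratio well above 1, which is why the demographic controls (socioeconomic class, education, overrepresentation in finance/crypto) carry the rest of the argument.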
Okay yes, I agree that a driver of bank runs is the knowledge that the bank usually can’t cover all deposits, by design. So as long as you keep that fact secret you’re much less likely to face a run.
I am now unsure how to reason about the likelihood of a run-like scenario in this case.