so hang on, 80k Hours gave some really really good financial advice and you’re trying to critique them because giving the good advice may have subconsciously influenced some people to do the opposite???
jesus, that’s a bit harsh
dude you’ve misunderstood what the 80k Hours piece was saying. It wasn’t saying that the median EA’s individual investment portfolio was overweight FB and FTX. It was saying that EA collectively was hugely overweight FB and FTX, which yeah, duh, it absolutely was because Dustin M and SBF—the two main zillionaire funders—were massively overweight those two things. Obviously if you’re working in EA you don’t really want to be massively long the same two things that the people who ultimately pay your salary are long, because if things go south you both lose your job (potentially) AND the value of your personal investment portfolio drops like a rock. So diversifying is both good for you as an individual and good for EA collectively, because then collective EA wealth is at least very slightly more diversified.
How exactly do you disassociate SBF from EA? The guy grew up with two utilitarian philosopher parents, explicitly started Alameda on earn to give principles, used EA social connections to do Alameda’s first major trades (the yen-denominated BTC arbitrage), and gave a ton of money to EA causes. If he isn’t an effective altruist, no one is!
idk, when people explicitly endorse your ideology as why they endorse “high leverage and double-or-nothing flips” I think it’s at least worth taking a look at yourself. Now quite probably the person in question has misunderstood your ideology and doesn’t understand why EAs do in fact care about the risk of ruin and why stealing money isn’t ok, but then perhaps try to correct them?
Fwiw I think it very unlikely that the decision to use customer funds was a one-off decision made in 2022. My view is that FTX was set up from the start to use customer money as a source of cheap capital for Alameda. In 2018 Alameda was offering potential investors a 15% guaranteed return on loans. It seems fairly likely that at some point SBF figured “fuck this, why are we offering these dorks 15% when we can just set up our own exchange and access huge amounts of capital at 0%”. Never mind that privileged information from the exchange may well have opened up even more ways for Alameda to make money!
The plan, imo, was always to accrue as much wealth as possible as fast as possible with as few ethical constraints as possible. This worked for a while because Alameda’s trades were profitable and crypto was in a bull market. This plan may or may not have been EA-aligned, but if you have short enough AI/pandemic timelines (I don’t), it doesn’t seem obviously incompatible, and given the career backgrounds and interests of all the major people involved, yes, I think they were committed and sincere EAs who really believed this stuff. SBF’s own weird version of EA, at least, seems to have played a fairly large role in why they took on so much risk, as he himself explained in an overly long and boring twitter thread somewhere and Caroline also mentioned on her blog.
It also makes zero sense to compare FTX’s spending on stadiums vs the Future Fund as a sign for how much they cared about these respective things. The Future Fund would almost certainly have got way more money in subsequent years, while the stadium rights purchase was a form of advertising designed to help grow the business faster. I can’t imagine SBF is a big sports fan and was doing that sort of thing because he really enjoyed seeing the FTX logo on umpire shirts.
Not to Godwinpost, but this isn’t really “were Nietzsche and Wagner at fault for the Nazis”, it’s more “were Nietzsche and Wagner at fault for the Nazis if they’d actually lived throughout the 1930s and worked in prominent cultural education posts in the German state bureaucracy.”
Most transparency in charitable endeavours is donor-driven, right? People want to see what you’re doing with their money. But that doesn’t apply here because it’s basically all just one guy’s money and he clearly DGAF what OpenPhil do with it. So all the usual stuff about conflicts of interest etc doesn’t really apply, because Dustin knows full well he’s giving his money to a small group of closely connected people to dispense with as they see fit. So I assume no one thought there would be any problems until FTX blew up and suddenly EA was under loads of additional scrutiny.
The purchase itself seems fairly reasonable, fwiw, although I don’t know much about the volatility in the super-prime property market. But assuming it gets used a lot for conferences etc, country houses are pretty great things and very useful.
I sort of suspect that they were not, in fact, exemplary on any definition of protecting retail investors at any point. The whole point of FTX was to offer leverage to its users! It was the derivatives exchange where you could get margin! This is generally bad for retail! (and then maybe had Alameda trading against you, but hey).
This is all before their exchange suffered huge outflows and it turned out they didn’t have customer funds protected at all. So no, at no point was this good for retail, it was incredibly predatory from beginning to end!
Hypothetically, let’s just say I own a business, Andromeda Research, with $500 million of assets and about $8 billion of outstanding liabilities. How much would you pay to acquire this concern? Perhaps $1 might seem quite a lot, in context?
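The arithmetic of the joke, sketched out (the figures are the hypothetical ones above, not anyone’s actual balance sheet):

```python
# Back-of-the-envelope equity value for the hypothetical Andromeda Research
assets = 0.5e9       # $500 million of assets
liabilities = 8.0e9  # ~$8 billion of outstanding liabilities
equity = assets - liabilities

print(f"net equity: ${equity / 1e9:.1f}B")  # net equity: $-7.5B
```

Deeply negative equity: even $1 overpays unless you think the assets are badly mismarked upward.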
If a VC isn’t investing in some frauds they’re being overly conservative IMO given the extreme power laws that govern the returns to VC investing. Of course you need to do some due diligence, but ultimately it’s going to be way too much time and effort to flush out every potential fraudster that wants your money, and you only have so much time to find good investment opportunities.
Obviously that piece on Sequoia’s website about how amazing SBF is looks terrible in hindsight, but I thought a lot of the commentary surrounding it betrayed a fundamental misunderstanding of how early-stage investing actually works vs late-stage investing. Perhaps fair enough to criticize the people who invested in FTX in late 2021 or 2022 (they did their Series C only in Jan 2022!), but even then it’s probably not always easy to distinguish between a hypergrowth company and an unsustainable fraud from the outside. And ofc SBF had told a lot of lies to everyone, his investors included. The better argument might be that any VC that touches crypto is an idiot because the whole space is ultimately going to zero (I don’t actually believe that but it’s not far off, certainly most current projects will fail including BTC), but then again, through token sales crypto offers amazing ways for VCs to make money even on failed investments.
Ultimately VC is just very different to most other forms of investing, including investing the reputational capital of your movement on a high-profile billionaire. Then you really should do some DD.
I love how “the community’s second biggest funder turns out to be a scam artist” does not even prompt the slightest reconsideration of the value of people being able to call out bad actors when they see them, nope, we’re just going to double-down on this absurd and comical set of community politeness norms that do little but enable savvy predators to exploit EA as a movement and individual EAs as people.
And fwiw, there were a number of other conclusions I could have very easily drawn about Torres and signs of mental illness I could have pointed out, but I very carefully did not since I thought those might be upsetting. This was the toned-down version of my original post.
what operationalization would you accept? Perhaps at some point in the next 5 years a court finds that FTX/Alameda were doing this? Perhaps CZ/Binance publicly announcing that they’ve found this to be the case when examining FTX’s books?
“This person should have made some money betting against FTX before it collapsed and then I’d take them more seriously.”
this is naive EMH fundamentalism
not everything can be shorted, not everything can be shorted easily, not everything should be shorted, markets can be manipulated. Especially the crypto market. It both can be the case that people 100% think X is a fraud, and X collapses, and shorting X would have been a losing trade over most timeframes. “Never short” is an oversimplification but honestly not a bad one.
Yes, this. Most fraudsters don’t have such strongly held views on why the Kelly criterion for determining optimal bet size doesn’t apply to them! (SBF did a famous thread on this and Caroline’s tumblr has that line about how real EAs endorse high leverage and double-or-nothing flips).
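For anyone unfamiliar with it, the Kelly criterion gives the bankroll fraction that maximizes long-run log growth. A minimal sketch of the standard textbook formula (nothing FTX-specific here):

```python
def kelly_fraction(p, b):
    """Kelly-optimal fraction of bankroll to stake on a bet paying
    b-to-1 with win probability p. Returns 0 if the edge is negative."""
    return max(0.0, (p * b - (1 - p)) / b)

# Even a very favourable double-or-nothing coin flip (60% win, even odds)
# has a Kelly stake of only ~20% of bankroll -- never the whole thing.
print(kelly_fraction(0.6, 1.0))
```

The point being: a bettor maximizing long-run growth never goes all-in on a double-or-nothing flip, which is precisely why SBF’s stated indifference to Kelly sizing stood out.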
I think it would be wrong to blame utilitarianism per se for what happened because the vast majority of utilitarians absolutely do care about the risk of ruin—as they should—but I think SBF’s own brand of EA-aligned thinking (I assume short AI/bioengineered pandemic timelines factored in here) played a huge role in why he took such insane risks.
Not sure why this is getting negative votes or w/e, it’s basically correct. And even in the PR stakes, the cost of the Abbey on the most pessimistic assumptions is absolutely peanuts compared to FTX! No one will remember, no one will care (whereas they absolutely will remember FTX, that’s a real reputational long-term hit).
I think perhaps a public education course—as much for EAs themselves as for everyone else—that while the ends do sometimes justify the means, they often don’t, and when using extremely dodgy means you have to be unbelievably confident that your ends are worth it and that no other means will do. In short, I think EA should come to firmly reject the philosophy of the street mugger—that he (we) is justified in taking other people’s money just because he (we) thinks or even knows that he (we) can do better things with the money than the people in question. People have rights and we shouldn’t steal from them, and honesty, decency, civility are all very important things. AI risk or pandemic prevention of course might be important too, but it’s not necessarily more important than maintaining basic societal norms like “don’t steal”. In short, EA’s chronic and severe epistemic overconfidence problem should be publicly addressed.
Tremendous stuff. One thing I would add is that higher crime, by its very nature, provokes worse policing. This happens through some obvious mechanisms (more brutalized, overworked, and trigger-happy cops), but also through some more subtle mechanisms. As crime reduces urban densities, police numbers relative to the area they cover start to fall, and cops become confined to their cars, since this is the only way they can cover their expanding territories. This kills off foot patrolling, one of the best and most proven anti-crime interventions, with a pedigree hearkening back to the Victorians. Some London police stations still have 19th-century maps on their wall of their territory divided up into individual officers’ beats, which were often no more than a few streets of super-densely populated space. Solve the crime problem, and you may well find that America’s policing problem fixes itself to a surprising degree.
Duration doesn’t mean time to maturity. It’s a measure of a bond’s sensitivity to interest rates: higher duration = more sensitivity. It’s measured in years though, which is confusing. You can make your 162% in one year if interest rates move as the authors say, which is pretty mouth-watering!
(edit: just to showcase the degree of difference, a bond with 40 years to maturity can have a duration of just ~10 years if the bond’s coupon is 10% and market rates were also 10% at the time of issuance. This means the value of the bond will change less with rising rates. A 40-year bond with a 1% coupon, issued when market rates were also 1%, will have a duration of around 33 years, which in plain English just means that if interest rates go up a lot you get “FTX-linked tokens in November” returns. Bonds are tricky things and their pricing is especially weird in ZIRP environments)
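If you want to check those numbers yourself, here’s a quick sketch of Macaulay duration for an annual-pay bond (simplified: it ignores day counts and payment frequency):

```python
def macaulay_duration(coupon_rate, years, market_rate, face=100.0):
    """Macaulay duration: the present-value-weighted average time
    (in years) to a bond's cash flows, for an annual-pay bond."""
    cash_flows = [(t, face * coupon_rate + (face if t == years else 0.0))
                  for t in range(1, years + 1)]
    pv = lambda t, cf: cf / (1.0 + market_rate) ** t
    price = sum(pv(t, cf) for t, cf in cash_flows)
    return sum(t * pv(t, cf) for t, cf in cash_flows) / price

# 40-year bond, 10% coupon issued when rates were 10%: ~10.8 years
print(round(macaulay_duration(0.10, 40, 0.10), 1))
# 40-year bond, 1% coupon issued when rates were 1%: ~33.2 years
print(round(macaulay_duration(0.01, 40, 0.01), 1))
```

The low-coupon bond pays almost everything at maturity, so its cash flows sit much further out in time and its price reacts far more violently to rate moves.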
I think it’s mostly just FTX tbh. Most people, even smart well-informed people, had never heard of EA beforehand. If they’ve been following the FTX saga closely they have, and their first impression is obviously a very bad one. There’s not much anyone can do about this, it is what it is. Obviously the effects will partially dissipate over time.
An additional problem is all SBF’s donations to the Dems, so there’s probably the additional perception that EA is some weird and wacky scam the Dems are running on everyone, i.e. EA has become associated with the insane American partisan culture wars and for some people is probably now in the same bracket as Epstein and child-abusing pizza restaurants. Again, there’s not much you can do about this, although perhaps EA could try to highlight its links to the political right a bit more to help redress the balance? (Thiel, Musk)
I would not draw on that grant for quite some time, if ever: you should be worried about clawbacks.
in addition to all of this, the statement compounds EA’s already-existing trust problem. It was already extremely bad in the aftermath of FTX that people were running to journos to leak them screenshots from private EA governance channels (vide that New Yorker piece). You can’t trust people in an organization or culture who all start briefing the press against each other the minute the chips are down! Now we have CEA publicly knifing a long-term colleague and movement founder figure with this unbelievably short and brutal statement, more or less a complete disowning, when really they needed to say nothing at all, or at least nothing right now.
When your whole movement is founded on the idea of utility maximizing, trust is already impaired because you forever feel that you’re only going to be backed for as long as you’re perceived useful: virtues such as loyalty and friendship are not really important in the mainstream EA ethical framework. It’s already discomfiting enough to feel that EAs might slit your throat in exchange for the lives of a million chickens, but when they appear to metaphorically be quite prepared to slit each other’s throats for much less, it’s even worse!
Maybe hold off on this sentiment until we know exactly what they were doing with customer funds? It could age quite badly.