If you stop calling yourself an EA in public because you think doing so will give people the wrong impression, that’s one thing, I guess.
lc
What percentage of Chinese people have ever been arrested for subversion?
EA didn’t cause the FTX fraud.
The eagerness with which people rushed to condemn is frankly a warning sign for involution. We have to stop it with the pointless infighting or it’s all we will end up doing.
This comment turned out to be entirely correct.
I really, really, realllllly disagree. Saying that EA caused FTX is more like saying EA caused Facebook than the reverse. You should have a pretty firm prior that someone who becomes a billionaire does it primarily because they enjoy the immense status and prestige that being "the world's richest under-30" bestows on a person; likewise someone committing fraud to keep that status.
My primary character assessment at this point is that he was an EA who was also one of those flavors of people who become quasi-sociopaths when they become rich and powerful. Nothing in Sam's actual, concrete actions seems to indicate differently, and indeed he actually spent the greater part of that money on consumption goods like mansions for himself and his coconspirators. Maybe he really was in it for the good, at the beginning, but I just can't believe that someone making a late decision to start a Ponzi scheme "for the greater good" would act like he did.
(Also, using the resources of the EA movement how, exactly? Seems to me like his fraud would have been just as effective had he not identified as an EA. He received investment and consumer funds because of the firm’s growth rate and Alameda’s generous trades, respectively, not because people were interested in contributing to his charities.)
I don’t really understand the distinction here. If a core member of the EA community had founded Facebook...[snip]...and was acting throughout as a pretty prominent member of the EA community, I would also say that “EA had a substantial responsibility in causing Facebook”
I likewise don't understand what you're finding weird about my position. If Eliezer Yudkowsky robbed a bank, that wouldn't make LessWrong "responsible for a bank robbery", even if Eliezer Yudkowsky were in the habit of donating a proportion of his money to AI alignment organizations. Looking at the alternate-universe Eliezer (AU-EY) grabbing the money out of the brown paper bag and throwing it at strippers, you would conclude he mostly did it for his own reasons, just like you would say of a robber who happened to be a congressman.
If we could look into AU-EY's mind and see that he thought he was doing it "in the name of EA", and indeed donated the robbed funds to charity, then, sure, I'd freely grant that EA is at least highly complicit—but my point is that I don't believe that was SBF's main motivation for founding FTX, and think that absent EA he probably had a similar baseline chance of running such frauds. You can say that SBF's being a conditional sociopath is immaterial to his reducing "the point total of the group of people with the EA sticker", but it's relevant for answering the more productive question of whether EA made him more or less likely to commit massive fraud.
[unsnip]...recruiting for its leadership primarily from members of EA...[/unsnip]
Well, I guess recruiting from EA leadership is one thing, but to what extent did FTX actually benefit from an EA-affiliated talent pool? I reviewed most of the executive team during my Manifold betting and didn't actually come across anybody who I could find had a history of EA affiliation besides SBF (though you may know more than me).
I am quite confident Sam spent <$100MM on consumption, and the FTX Future Fund has given away more than $400MM in grants, so this statement is off by a factor of 4, and more likely by a full order of magnitude.
I actually didn't know that. Is this counting the Anthropic investment, or did FTXFF really give-away give-away that much money?
:(
Are you black?
I think if we had refrained from criticizing their initial statement, their final, formal statement would be a lot worse, so if anything, we did them a favour.
I don't think you have internalized the point: there was no misconduct. If their initial statement was insufficient to convince us of this, that is on us, not on them. Their job as a charity is not to manage a public persona so that you or I continue to look good by affiliation; it's to actually do good. Accusing them of secretly financing Nazis because we're weak and afraid of being tarred by association is the polar opposite of doing them a "favor".
Ok, that gives me some pause about his motivations… Probably not enough to change my opinion entirely, but, still.
It seems really quite beyond a doubt to me that FTX wouldn’t have really existed without the EA community existing. Even the early funding for Alameda was downstream of a bunch of EA funders.
Yeah, I guess I'm just wrong then. I'm confused as to why I didn't remember reading the bit about Caroline in particular—it's literally on her Wikipedia page that she was an EA at Stanford.
I mean, if Eliezer robbed a bank, I think I would definitely think the rationality community is responsible for a bank robbery (not “LessWrong”, which is a website). That seems like the only consistent position by which the rationality community can be responsible for anything, including good things. If the rationality community is not responsible for Eliezer robbing a bank, then it definitely can’t be responsible for any substantial fraction of AI Alignment research either, which is usually more indirectly downstream of the core people in the community.
FWIW I still don't understand this perspective, at all. It seems bizarre. The word "responsible" implies some sort of causal relationship between the ideology and the action; i.e., Eliezer + exposure to/existence of the rationalist community --> robbed bank. Obviously AI alignment research is downstream of rationalism, because you can make an argument, at least, that some AI alignment research wouldn't have happened if those researchers hadn't been introduced to the field by LessWrong et al. But just because Eliezer does something doesn't mean rationalism is responsible for it, any more than calculus or the scientific method was "responsible" for Isaac Newton's neuroticisms.
It sounds like the problem is you’re using the term “Rationality Community” to mean “all of the humans who make up the rationality community” and I’m using the term “Rationality Community” to refer to the social network. But I prefer my definition, because I’d rather discuss the social network and the ideology than the group of people, because the people would exist regardless, and what we really want to talk about is whether or not the social network is +EV.
I don’t think it’s reasonable to expect them to anticipate this exact scenario, nor do I think they should be spending lots of time planning for tail risk PR scenarios like these instead of being actually productive.
It seems really clear that the social network of EA played a huge role in FTX's existence, so it seems like you would agree that the community should play some role, but then for some reason you are additionally constraining things to the effect of some ill-specified ideology.
No, I agree with you now that at the very least EA is highly complicit if not genuinely entirely responsible for causing FTX.
I don’t think we actually disagree on anything at this point. I’m just pointing out that, if the community completely disbanded and LessWrong shut down and rationalists stopped talking to each other and trained themselves not to think about things in rationalist terms, and after all that AU-Yudkowsky still decided to rob a bank, then there’s a meaningful sense in which the Inevitable Robbery was never “the rationality community’s” fault even though AU-Yudkowsky is a quintessential member. At least, it implies a different sort of calculus WRT considering the alternative world without the rationality community.
After this post I started to make a private spreadsheet of the EA forum users with the silliest takes, ranked by silliness. But I couldn’t finish, because while I was using the spreadsheet app a grey van pulled up and two men ordered me into the back. They’re telling me I’m being extradited to Europe to answer for my GDPR violations. Halp.
The joke is that the form of "data collection" you suggest is illegal according to GDPR would also imply my spreadsheet was illegal. Indeed, if recording inferences about the character of people you meet were illegal, any employer making a scoring list of candidates to interview would be breaking the law. This indicts virtually every organization above a certain size.
Similarly, in the strict manner in which you cite #1, it’s probably also true that every charity in the UK is in violation of UK charity law. Most charities (for example) have employees that are paid materially for their work, that they don’t entirely give away to the charity. It’s also essentially impossible to avoid “benefitting” entirely, in some way, from successful charity efforts—for example in the form of boosted professional reputation and prestige among a certain class of people. If your reading of UK charity law were correct then charity in the UK would be largely illegal.
The obviously nonsensical legal implications of this post, combined with the gravity of the accusations, combined with the offensive suggestion that this is "whistleblowing" in any good-faith sense of the word, led me to respond with a joke instead of engaging with it seriously. I do not regret that choice, the downvotes someone applied to my entire profile notwithstanding.
It is literally impossible for a charity to follow the law as you have described it, because doing any good whatsoever under your own name opens you up to claims that you have benefited in some material sense, whether that be financially, professionally, or reputationally. Charities are well known to employ people and directly pay them for services rendered, so without being a UK lawyer I don't know what kind of additional context is required for such a case to be prosecutable in practice, but it's certainly a nonzero amount. Granting this sort of completely unpragmatic interpretation of the law, you are probably breaking similar laws yourself, because there is simply too much law to support following a layman's interpretation of all of it.
This is a reality of the world you live in—the one with dozens of countries with hundreds of thousands of pages of law on the books, many designed explicitly to give as much discretion to prosecutors as possible—and so this "attitude" you describe (where people remain willing to do ethical things that prosecutors will not actually prosecute them for) is the only way to live. That is, of course, unless you're willing to single out particularly high-profile organizations—then you can pretty much accuse anybody of breaking the law in some country or another.
I’m skeptical by default of accusations of sexual misconduct that don’t name the perpetrator, even when the source is anonymous, in 2023. That seems to include most of the accusations here.
Which is not to say definitively that the piece is untrue—everything in it could very well be accurate—just that the way the piece is now, it’s essentially set up to do maximum damage to EA while limiting EA leadership’s ability to take productive action against the offenders, and that makes me at least suspicious that events have been distorted.
As has been noted many times, EA is currently about 70% male, whilst environmentalism/animal advocacy is majority women. I would be fairly confident that a more balanced gender ratio would mean less misogyny towards women.
My guess is that EA is currently male because aggressively quantifying and measuring charitable giving is an activity that appeals primarily to men. As long as that remains true, and Effective Altruism remains Effective Altruism in that way, my prediction is that the gender ratio will remain the same, just as most hobbies and social groups maintain similar gender ratios over time even when people work really hard to change them. If this form of harassment is inherent to male-dominated activities then that would be pretty sad.
Some EAs have a kind of “anti-woke” sentiment to the point where I actually think it could be fairly damaging e.g. it causes people to think issues related to race, gender, nationality etc aren’t important at all. I think it would be pretty valuable if everyone read a few core texts on things like racism, sexism, ableism, etc. to actually understand the every-day experiences of people facing various forms of discrimination and bigotry.
I'm pretty sure the standard left-American take on everyday harassment is straightforwardly compatible with believing it's not very important in a world with existential risk and malaria and the Jalisco New Generation Cartel, and that this is a sensible position for EAs to hold even when they're not explicitly "anti-woke".
Repost from LessWrong:
Dude is going to prison for life. I can't believe how many rationalists are out here suggesting he's gonna just walk away from this "because he's a Democratic donor".
Maybe there was a galaxy-brained legal road he could have taken that limited his damage to 5–10 years, after spending $10MM on an extraordinary plea negotiation and shutting the fuck up. He passed that fork in the road a hundred miles back. Currently he is sounding and acting like a megalomaniac to the point of seeming mentally ill. One wonders if he even realizes he's going to be criminally charged regardless of the outcome of his "fundraising".