I do think it’s an interesting question whether EA is prone to generate Sams at higher than the base rate. I think it’s pretty hard to tell from a single case, though.
Michael_PJ
I am one of the people who thinks that we have reacted too much to the FTX situation. I think as a community we sometimes suffer from a surfeit of agency and we should consider the degree to which we are also victims of SBF’s fraud. We got used. It’s like someone asking to borrow your car and then using it as the getaway car for a heist. Sure, you causally contributed, but your main sin was poor character judgement. And many, many people, even very sophisticated people, get taken in by charismatic financial con men.
I also think there’s too much uncritical acceptance of what SBF said about his thoughts and motivations. That makes it look like FTX wouldn’t have existed and the fraud wouldn’t have happened without EA ideas… but that’s what Sam wants you to think.
I think if SBF had never encountered EA, he would still have been working in finance and spotted the opportunity to cash in with a crypto exchange. And he would still have been put in situations where he could commit fraud to keep his enterprise alive. And he would still have done it. Perhaps RB provided cover for what he was thinking (subconsciously even), but plenty of financial fraud happens with much flimsier cover.
That is, I think P(FTX exists | SBF never in EA) is not much lower than P(FTX exists | SBF in EA), and similarly for P(FTX is successful) and P(SBF commits major fraud) (in fact, I'd find it plausible that the EA involvement even lowered the chances of fraud).
Overall this makes me mostly feel that we should be embarrassed about the situation and not really blameworthy. I think the following people could do with some reflection:
- People who met, liked, and promoted SBF. "How could I have spotted that he was a con man?"
- People who bought SBF's RB reasoning at face value. "How could I have noticed that this was a cover for bad behaviour?"
I include myself in this a bit. I remember reading the Tyler Cowen interview where he says he’d play double or nothing with the world on a coin flip and thinking “haha, how cute, he enjoys biting some bullets in decision theory, I’m sure this mostly just applies to his investing strategy”.
That’s about it. I think the people who took money from the FTXFF are pretty blameless and it’s weird to expect them to have done massive due diligence that others didn’t manage.
It seems like we could use the new reactions for some of this. At the moment they're all positive, but there could be some negative ones. And we'd want to be able to put the reactions on top-level posts (which seems good anyway).
I wonder if the focus on “narrow EA” is a reflection of short AI timelines and/or a belief that we need to make changes sooner rather than later.
It seems to me that “global EA” looks better the longer the future we have. Gains in people compound, and other countries may be much more influential in the future. If Nigeria is a global power in 50 years, then growing a community there now might be a good investment.
I don’t think this has a clear answer though, since benefits from actually solving problems sooner can compound too.
I wholeheartedly agree with this post.
I think there has been a bit of over-reacting to recent events. I don’t think the damage is that bad, and to some degree I think we’ve just been unlucky. Maybe we need to do some things differently (e.g. try to project less of an air of certainty, which many critics seem to perceive) but we should also beware the illusion of control.
That was sad to read :(
> You say, in effect, "not that centralised", but, from your description, EA seems highly centralised.

> Your argument that it's not centralised seems to be that EA is not a single legal entity.
These are two examples, but I generally didn't feel like your reply really engaged with Will's description of the ways in which EA is decentralised, nor with his attempt to draw finer distinctions within decentralisation. It felt a bit like you just said "no, it is centralised!".
> democracy has the effect of decentralising power.
I don’t agree with this at all. IMO democracy often has the opposite effect, and many decentralised communities (e.g. the open-source community) have zero democracy. But I think this needs me to write a full post...
> If we think of centralisation just on a spectrum of ‘decision-making power’, as you define it above (how few people determine what happens to the whole), EA could hardly be more centralised!
This seems false to me. If the only kind of decision you think matters is funding decisions, then sure, those are somewhat centralised. But that’s not everything, and it’s far from clear to me why you think it’s the only thing that matters.
For example, as Will discusses in the post, even amongst the individual EA orgs:
- There are many of them, and they are small.
- They basically all do their own strategy and planning.
That sure doesn’t look like centralised decision-making to me. You could say "for any decision, OP could threaten to refuse to fund an organisation unless it made the choice that OP wants, therefore OP actually has the decision-making power". But that seems to me to just not be a good description of reality: OP doesn’t behave like that, and in practice most decisions are made in a decentralised fashion.
> Yet, de facto, if we think about where power, in fact, resides, it is concentrated in a very small group. If someone sets up an invite-only group called the ‘leaders’ forum’, it seems totally reasonable for people to say "ah, you guys run the show".
This equivocates between saying that power does reside with a small group and saying that we have created the perception that power resides with a small group. As I argued above, I think the former is false, and Will explicitly agrees with the latter and thinks we should change it.
My overall impression of your post is that you think the non-diversity of funding is bad (which I think we all agree on), but also that, for some reason, funding is the only thing that matters when it comes to whether we describe EA as centralised or not.
Whereas to me EA looks like a pretty decentralised movement that currently happens to have a dominant funder. Moreover, we’re lucky in that our funder doesn’t (AFAIK) throw their weight around too much.
I think the average EA might underestimate the extent to which being visible in EA (e.g. speaking at EAG) is seen as a burden rather than an opportunity.
Having read this I’m still unclear what the benefit of your restructuring of CEA is. It’s not a decentralising move (if anything it seems like the opposite to me); it might be a legitimising move, but is lack of legitimacy an actual problem that we have?
The main other difference I can see is that it might make CEA more populist in the sense of following the will of the members of the movement more. Maybe I’m as much of an instinctive technocrat as you are a democrat, but it seems far from clear to me that that would be good. Nor that it solves a problem we actually have.
Yes, I think there’s a lot of sliding between “decentralised” and “democratic” even though these have pretty much nothing to do with each other.
As a pretty clear example, the open source software community is extremely decentralised but has essentially zero democracy anywhere.
I think this is a place where the centralisation vs decentralisation axis is not the right thing to talk about. It sounds like you want more transparency and participation, which you might get by having more centrally controlled communication systems.
IME decentralised groups are not usually more transparent; if anything the opposite, as they often have fragmented communication, much of it person-to-person.
I would love the community to be more supportive in ways that would help with that. Things I would like:
- Accept that new projects may not be that great; encourage them to grow, and maybe even chip in as well as criticising.
- Accept and even celebrate failure.
- Even more incubator-style things. I love what CE does here.
A few thoughts:
- The level of quality and professionalism has risen since the old days, which makes it intimidating to contribute your own half-assed thing.
- Doing things usually requires time, and a lot of the early doing was done by students (and still is!). It’s much harder to be that involved when you’re older without becoming professionally involved. These days we have a lot more non-students!
- I think all Will’s points about the perceived allocation of responsibility and control have a big impact.
- I’m not super convinced that the fundraising situation is tougher? It seems much easier to me than it was, especially for small things, where we have a decent range of funders.
I also had this reaction, and I think it was mostly just the phrasing. “Break up OP” suggests that we have the power or right to do that, which we definitely don’t. I think if the post said “OP could consider breaking itself up” it wouldn’t sound like that.
I think the proposal to have significant re-grantors is a more approachable way of achieving something similar, in that it delegates control of some funds.
I think the intention wasn’t "have lots of forums where EA topics are discussed", so much as "don’t make it sound like the (in practice, single) forum is the only one there can be".
Thank you! This post says very well a lot of things I had been thinking and feeling over the last year but had not been able to articulate properly.
I think it’s very right to say that EA is a "do-ocracy", and I want to focus on that a bit. You talked about whether EA should become more or less centralised, but I think it’s also interesting to ask "should EA be a do-ocracy?"
My response is a resounding yes: this aspect of EA feels (to me) deeply linked to an underrated part of the EA spirit. Namely, that the EA community is a community of people who not only identify problems in the world, but take personal action to remedy them.
I love that we have a community where random community members who feel like an idea is neglected feel empowered to just do the research and write it up.
I love that we have a community where even those who do not devote much of their time to action take the very powerful action of giving effectively and significantly.
I love that we have a community where we fund lots of small experimental projects that people just thought should exist.
I love that most of our “big” orgs started with a couple of people in a basement because they thought it was a good idea.
Honoring the taking of action and supporting people who take action is really great and I hope it remains a core part of EA culture indefinitely.
I always feel a bit sad when I see "EA should …" posts. I want to say: "Maybe you should do it! Look at the fire within you that made you write that long critique; could you nurture it into a fire that could actually make it happen?". The idea that you might just write an angry critique and hope that some mandarin with centralised power will pick it up is very sad to me, and antithetical to (my conception of) the EA spirit.
I think this is related to an important feature of a do-ocracy which is that you don’t have any voice if you don’t do. You can persuade, but nobody has any obligation whatsoever to listen to you. It’s not a democracy (and that’s good). I think this confuses people a lot.
I definitely think there’s a "generational" thing here. For those of us who’ve been around long enough to see how everything came from nothing but people doing things they thought needed to be done, it’s perfectly obvious. But I can very much see how, if you join the community today, it looks like there are these serious, important organisations who are In Charge. I do think that’s still not really true, though.
That’s much stronger than what I read it as. I think Sjir was saying something more like “if you turn up to a local EA event you should feel welcomed and like you are ‘one of the gang’ even if you only donate”.
The purpose of EAG these days seems a bit murky to me, but it seems to me to be mostly for people who are highly engaged, and I think it’s fair to say that if you just donate you are probably not highly engaged (although you might be).
Great post, I agree with a lot of it. There is definitely a kind of large org sclerosis that can develop, but IME it’s generally associated with much larger orgs unless you have very dysfunctional management (which is a risk!).
One factor I think is missing is fungibility. It’s hard for donations to be funged between organisations, but it’s very easy for them to be funged between projects within an organisation. So we might expect donors to have some preference for separate orgs for separate projects.
This seems wildly off to me; I think the strength of the conclusion here should make you doubt the reasoning!
I think the scale of the fraud is essentially a random variable uncorrelated with our behaviour as a community. The relevant outcome seems to me to be "producing someone able and willing to run a company-level fraud"; given that, whether it turns out to be a big one or a small one just adds (an enormous amount of) noise.
How many people-able-and-willing-to-run-a-company-level-fraud does the world produce? I’m not sure, but I would say it has to be at least a dozen per year in finance alone, and more in crypto. So far EA has produced one. Is that above the base rate? Hard to say, especially if you control for the community’s demographics (socioeconomic class, education, etc.).