I was considering writing something similar, but investing time in writing a post is hard, so thank you for writing this.
While I understand the emotions we all feel, I’m under the impression that the effective altruism community is now reacting in exactly the way it tries to educate the public not to: using anecdotal examples and emotional states to guide decision-making. While it’s very human to react this way, I see that I’m growing more and more anxious about this direction as I observe the discourse.
Of course what happened is devastating, and there is a lot of constructive feedback and good questions to be asked, but I hope we can preserve our commitment to compassion and truth-seeking even in hard moments like this. What worries me can be seen in how demanding people are of the EA leaders, to the point of almost blaming them for this without enough evidence or sympathy. And even not-that-good proposals earn a lot of support, because I feel we are all incentivized to be critical of EA (and it is tempting to feel better about ourselves by distancing ourselves from the misery that FTX caused).
I’m also worried about EA doing things now (both the community here and the established orgs) to influence optics rather than for the sake of integrity. It would be worrying if true, because it may lead to recreating potential errors, like people silencing themselves instead of having good-faith arguments. I hope that accountability will prevail and that both the community and individuals will be open about where we screwed up and, if needed, that there will be consequences, instead of us protecting the EA brand at any cost.
But I want to mention that I’m also incredibly impressed by some people here and generally very happy to consider myself part of this community. I admire the courage, integrity, and sobriety of thinking of many people here. After recently spending way more time on the EA Forum than I should, I came to the conclusion that I would especially like to mention Habryka for his behavior and comments during this last period. It’s really a privilege to have such people in the EA community (and I’m really sorry for not mentioning others behaving in a similar way whom I didn’t notice).
I think the point I have been trying to make in criticising the leadership is something like this:
EA has this slightly weird, very opaque and untransparent, sort of unaccountable leadership which we trust with a lot of power. We treat them, in many ways, like “philosopher-kings” of the community.
We do this on the premise that these people are uniquely good at leading the community and uniquely good at seeing risks posed to the EA enterprise, and that if they were more accountable and public, this would reduce their ability to do so, and hence reduce EA’s ability to do good in the world.
However, the FTX scandal has shown us that these EA leaders are just humans, with the same flaws and biases as the rest of us (as I think many of us suspected), not superhuman philosopher-kings.
Thus, we probably shouldn’t centralise so much power and trust in such a small, unaccountable, opaque group of people.
It’s not that these people aren’t worthy of compassion, or that I would have done better (I wouldn’t have). It’s that we defer so much power to them on the assumption that they are so much better than us at this, and I just think they’re not.
I’m placing myself on a ban because I’m not my best self right now, but briefly:
1. “EA has this slightly weird, very opaque and untransparent, sort of unaccountable leadership which we trust with a lot of power. We treat them, in many ways, like “philosopher-kings” of the community.”
I don’t think very many people treat him like this (certainly not the majority; literally no one I know defers to him in this way, and this strikes me as a super weird interpretation of his role in EA). But even treating this as true, how is it Will’s fault? At no point do I personally remember Will standing on a podium of What We Owe The Future, screaming “I am your leader, bow to me, peasants!” I think I would have remembered that. Sounds like a hoot. Instead, the, and so help me I don’t know why I have to emphasize this, *living, breathing human being with feelings* is a philosopher and an academic who had an idea and promoted it because he thought it would make the world a better place. This resulted in a community and organizations inspired by his ideas, not governed by him. It’s a handful of organizations with distinct leaders and a handful of individuals with their own interpretations of his and other people’s work. Maybe the people who deferred to Will should be the ones pondering their own epistemics.
2. On Lantern Ventures:
There is a big difference between not wanting to work with someone because you don’t like their ethics (looking at you Kerry) and thinking they are going to commit the century’s worst fraud. Also, she’s getting death threats. Do you not reckon she’s suffering enough without her community adding their voices to the mob of people saying she should have known or that she could have prevented this? I remember when this stuff happened, it was on the periphery of EA at best. I literally don’t think I thought about it for longer than around 10 minutes and I had friends personally involved at the time. My take was “Wow! Sam sounds like a bad manager” not “Uhoh best nail down the furniture.”
3. On Dancer, and a handful of other people being kind while being critical:
Thank you. You can stay. ;)
I don’t want to get into debates around object-level criticisms this early, but I keep being puzzled by this assertion:
”This resulted in a community and organizations inspired by his ideas, not governed by him. It’s a handful of organizations with distinct leaders and a handful of individuals with their own interpretations of his and other people’s work.”
There was also a similar quote elsewhere:
”But Will is not the CEO of EA! He’s a philosopher who writes books about EA and has received a bunch of funding to do PR stuff.”
I don’t think this conception of “people loosely connected together in various ways” really captures the correct level of accountability here. There is a legal entity named Effective Ventures, which is the umbrella organisation of CEA, 80,000 Hours, GWWC, etc., and Will is the president of Effective Ventures as well as of CEA. The people in the community volunteer their time and lend their credibility by referring these organisations (and their literature) to their social circles. Many also donate money to these organisations.
I refuse to reach a verdict on FTX-related criticisms until the dust settles, and most of the non-FTX-related criticisms seem unreasonable to me, but this argument that “no one is really the leader of EA” strikes me as quite odd.
I suspect CEA might even be the official copyright owner of the “Effective Altruism” brand, as I don’t see any organisation that has “Effective Altruism” in its name despite not being approved by CEA. Please inform me if I’m wrong. EA is much more centralised than “Socialism” or “Feminism”.

Correction: No one owns “Effective Altruism” as a trademark. More detailed information here.
I like your comment; thank you. I am really confused by the opposite view. I don’t quite understand where the idea of Will as governor of EA comes from. There are influential organizations and thinkers in EA, for sure, but ownership and ultimate responsibility, particularly cosmic responsibility for bad actors, feels very different… I think something important is obviously going on here, and once things are less awful we should try to dissect it calmly, kindly, and deliberately.
I don’t think anyone asking for more information about what people knew believes that central actors knew anything about fraud. If that is what you think, then maybe therein lies the rub. It is more that strong signs of bad ethics are important signals, and your example of Kerry is perhaps a good one. Imagine if people had concerns about someone on the level of the concerns they had about Kerry (plausible in the case of SBF; that is what is important to find out) and despite that promoted that person to be one of the few faces of the movement. That would be problematic, and it is important to figure out what happened. That’s not a witch hunt.
Also, it is very surprising to me that you don’t believe many people treat e.g. Will as a central figure to defer to. Given what you’ve written, it sounds like you are quite central, so maybe you are exposed to a different set of people, ones who have more eye-to-eye contact with Will; but it is certainly my experience that most people confer star power on Will. You are right that that doesn’t make it Will’s fault. I am just saying that your claim that people don’t treat him like a philosopher-king doesn’t match my experience at all.
To be clear, this was always known, just never acted upon.
Well, yes, that I do agree with.
Having read Plato, I have no idea what you mean by this.
Who are these EA leaders? What are their names and what do they do? Who directs what? How sure are you that you aren’t just imagining that there must be some adults in charge somewhere?
EA is anarchy. No one is even a little in charge. Some people get asked for advice more than others. Some people have ideas that are more influential than others. No one knows what they are doing, but all the constructive EAs are trying to do things while also asking around to shore up their epistemics as they do so. EA is a workshop where everyone has their own projects but is free to ask the old hands for tips, or to spy on the cool new up-and-coming projects and try to emulate them.
The EA Forum, Twitter, EAG talks, etc. are not real life. Will is a mascot. At best an “influencer.” If you want to shape the future (of the world, EA, the British government) you have to ground yourself in what is actually going on, not just the chattering in places like the EA Forum. Be constructive, be nice, support others, and start building some stuff of your own.
I don’t think that’s true. I’ve worked at CEA myself, and I know that CEA wields considerable influence.
I also think your way of discussing is inappropriate.
The monarchy wields influence. That is not the same thing as control.
I will go back and edit this. I’ve clearly crossed the line by more than I realized. Like a lot of people, I’m fraying with this one. It feels like all of the worst folks in EA publicly (and permanently!) shaming all of the best. Break ups don’t hurt this much.
Not going to lie, I understand you’re emotional, but being referred to as one of “all of the worst folks” in EA, and your previous comment before you edited, which was exceptionally rude, is honestly one of the worst interactions I’ve had in this community.
I have volunteered hours of my time running seminar programmes, writing curricula, and helping organise speaker events and Q&As. I have spent literally hundreds of hours consuming EA content and trying to compile EA content for people to use. I have orientated my research around X-risk, spent an entire summer at CERI, taken a year off from university to research X-risk, spoken at EAGx Rotterdam, and tried to make sense of the world to reduce X-risk. Sure, I’m not perfect, and critiques of my approach and of my criticisms are, I am sure, valid. I understand this is emotional and hard for you, and I apologise if my critiques have made it harder; I still think it’s time to have such difficult conversations, but I do apologise if this is hard. To be called “one of the worst folks in EA” and essentially accused of “not actually doing stuff” is really upsetting when I really am trying my best, so I do hope you will apologise.
I don’t usually like responding to these sorts of comments as it is rarely worth it, but:
I truly hope your usual method of arguing is not this ad hominem, relying on emotive language rather than rational argument.
Edit: I understand emotions are running high, and I see your above comment. Personal attacks, particularly to specific individuals in the above way really aren’t appropriate though, which is why I felt I had to say something here.