The CEA, the very organization you juxtaposed with Leverage and Paradigm in this comment, has in the past been compared to a Ponzi scheme. Effective altruists who otherwise appreciated that criticism thought much of its value was lost in the comparison to a Ponzi scheme, and that without it, the criticism might have been better received. Additionally, LessWrong and the rationality community; CFAR and MIRI; and all of AI safety have for years been smeared as cults by their detractors. The rationality community isn’t perfect. There is no guarantee that interactions with a self-identified (aspiring) rationality community will go as “rationally” as the individuals or small groups interacting with that community, online or in person, hope or expect. But the vast majority of effective altruists, even those cynical about these organizations or sub-communities within EA, disagree with how these organizations have been treated, because it poisons the well of goodwill in EA for everyone.

In this comment, you stated your past experience with the Pareto Fellowship and Leverage left you feeling humiliated and manipulated. I’ve also been a vocal critic in person throughout the EA community of both Leverage Research and how Geoff Anders has led the organization. But elevating personal opposition into a public airing of opposition research, in an attempt to tarnish an event they’re supporting alongside many other parties in EA, is not something I have ever done, or will do. My contacts in EA and I have followed Leverage. I’ve refrained from making posts like this myself because, in digging for context, I found Leverage has changed from the impressions I’d previously formed of them. That’s why at first I was skeptical of attending the EA Summit. But upon reflection, I realized the evidence didn’t support concluding that Leverage is so incapable of change that anything they’re associated with should be distrusted.
But what you’re trying to do with Leverage Research is no different from what EA’s worst critics do: act not in an effort to change EA or its members, but to tarnish them. Whether from within or outside of EA, criticizing an EA organization in such a fashion is below any acceptable epistemic standard in this movement.
If the post and comments here are meant to state facts about Leverage Research, yet you’re reporting impressions, with no ability to remember specific details, that Leverage is like a cult, those are barely facts. The only fact is that some people perceived Leverage to be like a cult in the past, and those perceptions are only anecdotes. Without details, they’re only hearsay. Given the severity of the consequences if this hearsay were borne out, being unable to produce actual facts undermines the point you’re trying to make.
I’m confused: the comments on Less Wrong you’d see by “person” and “personN” that were the same person came from the import from Overcoming Bias. That wouldn’t be happening here.
They might still be the same person, but I don’t think this forum being descended from LessWrong’s code tells us things one way or the other.
Thanks. I wasn’t aware of that. I’ll redact that part of my comment.
I don’t feel comfortable sharing the reasons for remaining anonymous in public, but I would be happy to disclose my identity to a trustworthy person to prove that this is my only fake account.
Upvoted. I’m sorry for the ambiguity of my comment. I meant that the posts here under the usernames “throwaway,” “throwaway2,” and “anonymous” are each consistently being made by the same three people, respectively. I was just clarifying up front, as I addressed you, for others reading that it’s almost certainly the same anonymous individual making the comments under each account. I wouldn’t expect you to forgo your anonymity.
Your comments seem to be way longer than they need to be because you don’t trust other users here. Like, if someone comes and says they felt like it was a cult, I’m just going to think “OK, someone felt like it was a cult.” I’m not going to assume that they are doing secret blood rituals, I’m not going to assume that it’s a proven fact. I don’t need all these qualifications about the difference between cultishness and a stereotypical cult, I don’t need all these qualifications about the inherent uncertainty of the issue, that stuff is old hat. This is the EA Forum, an internal space where issues are supposed to be worked out calmly; surely here, if anywhere, is a place where frank criticism is okay, and where we can extend the benefit of the doubt. I think you’re wasting a lot of time, and implicitly signaling that the issue is more of a drama mine than it should be.
I admit I’m coming from a place of not entirely trusting all other users here. That may be a factor in why my comments are longer in this thread than they need to be. I tend to write more than is necessary in general. For what it’s worth, I treat the EA Forum not as an internal space but as how I’d ideally like to see it used: as a primary platform for EA discourse, with a level of activity on par with the ‘Effective Altruism’ Facebook group or LessWrong.
I admit I’ve been wasting time. I’ve stopped responding directly to the OP because, if I’m coming across as implicitly signaling this issue is a drama mine, I should come out and say what I actually believe. I may make a top-level post about it. I haven’t decided yet.
“Compared to a Ponzi scheme” seems like a pretty unfortunate compression of what I actually wrote. Better would be to say that I claimed that a large share of ventures, including a large subset of EA, and the US government, have substantial structural similarities to Ponzi schemes.
Maybe my criticism would have been better received if I’d left out the part that seems to be hard for people to understand; but then it would have been different and less important criticism.
[epistemic status: meta]
Summary: Reading comments in this thread, which are similar to reactions I’ve seen you and other rationality bloggers receive from effective altruists on critical posts regarding EA, I think there is a pattern in how rationalists tend to write on important topics that doesn’t gel with the typical EA mindset. Consequently, it seems the pragmatic thing for us to do would be to figure out how to alter how we write to get our message across to a broader audience.
Upvoted.
I don’t know if you’ve read some of the other comments in this thread, but some of the most upvoted ones are about how I need to change my writing style. So unfortunate compressions of what I actually write aren’t new to me, either. I’m sorry I compressed what you actually wrote. But even an accurate compression of it might make my comments too long for what most users prefer on the EA Forum, and if I’d just linked to your original post, it would have been too long for most users to read.
I spend more of my time on EA projects. If there were more promising projects coming out of the rationality community, I’d spend more time on them relative to how much time I dedicate to EA projects. But I go where the action is. Socially, I’m as involved with the rationality community as I am with EA, if not more so.
From my inside view, here is how I’d describe the common problem with my writing on the EA Forum: I came here from LessWrong. Relative to LW, I haven’t found what I write on the EA Forum, or how I write it, to be too long. But that’s because I’m anchoring on the expectation that EA discourse will look like SSC 100% of the time. Since the majority of EAs don’t self-identify as rationalists, and the movement is so intellectually diverse, the realistic expectation is that the EA Forum won’t follow any discourse style common to the rationalist diaspora.
I’ve touched upon this issue with Ray Arnold before, and Zvi has touched on it too in some of his blog posts about EA. A crude rationalist impression might be that the problem with discourse on the EA Forum is that it isn’t LW. In terms of genres of creative non-fiction writing, the EA Forum is less tolerant of diversity than LW. That’s fine. Thinking about this consequentially, I believe rationalists who want their message heard more widely in EA don’t need to learn to write better, but to write differently.