Thanks for making this post, it was long overdue.
Further facts
Connection Theory has been criticized as follows: “It is incomplete and inadequate, has flawed methodology, and conflicts with well-established science.” The key paper has been removed from Leverage’s websites and from the web archive, but is still available at the bottom of this post.
More of Geoff Anders’s early work can be seen at https://systematicphilosophy.com/ and https://philosophicalresearch.wordpress.com/. (I hope they don’t take down these websites as well.)
Former Leverage staff have launched Reserve (formerly “Flamingo”), a stablecoin cryptocurrency backed by Peter Thiel and Coinbase.
From 2012 to 2014, Leverage ran THINK.
The main person at LEAN is closely involved with Paradigm Academy and helps them recruit people.
Recruitment transparency
I have spoken with four former interns/staff who pointed out that Leverage Research (and its affiliated organizations) resembles a cult according to the criteria listed here.
The EA Summit 2018 website lists LEAN, Charity Science, and Paradigm Academy as “participating organizations,” implying they’re equally involved. However, Charity Science is merely giving a talk there. In private conversation, at least one potential attendee was told that Charity Science was more heavily involved. (Edit: This issue seems to be fixed now.)
(low confidence) I’ve heard through the grapevine that the EA Summit 2018 wasn’t coordinated with other EA organizations except for LEAN and Charity Science.
Overall, I am under the impression that a majority of EAs think that Leverage is quite culty and ineffective. Leverage staff usually respond by claiming that their unpublished research is valuable, but the insiders mentioned above seemed to disagree.
If someone has strong counterevidence to this skeptical view of Leverage, I would be very interested and open to changing my mind.
Just to add a bit of info: I helped with THINK when I was a college student. It wasn’t the most effective strategy (largely because it was founded before we knew people would coalesce so strongly around the EA identity, which we didn’t predict), but Leverage’s involvement with it was professional and thoughtful. I didn’t get any vibes of cultishness from my time with THINK, though I did find Connection Theory a bit weird and not very useful when I learned about it.
Do you mind clarifying what you mean by “recruits people”? That is, do you mean they recruit people to attend the workshops, or to join the organization’s staff?
In this comment I laid out the threat it poses to EA as a cohesive community when those within it, like the worst detractors of EA and adjacent communities, level blanket accusations that an organization is a cult. Moreover, that comment could only mention a handful of people describing Leverage as cult-like, while admitting it could not recall any specific details. I already explained that such a report does not qualify as a fact, nor even as an anecdote, but as hearsay, especially since no further details are being provided.
I’m disinclined to take seriously more hearsay about a vague impression of Leverage as cultish, given the bad faith in which my other interlocutor was acting. Since none of the former interns or staff behind this hearsay are coming forward to corroborate which features of a cult from the linked Lifehacker article Leverage shares, I’m not convinced that your report and the others of Leverage being cult-like aren’t being taken out of context from the individuals you originally heard them from, nor that this post and its comments aren’t a deliberate attempt to do nothing but tarnish Leverage.
Paradigm Academy was incubated by Leverage Research, as many organizations in and around EA are by others (e.g., MIRI incubated CFAR; CEA incubated ACE, etc.). As far as I can tell now, like with those other organizations, Paradigm and Leverage should be viewed as two distinct organizations. So that itself is not a fact about Leverage, which I also went over in this comment.
As I stated in that comment as well, there is a double standard at play here. EA Global each year is organized by the CEA. The CEA isn’t even the only organization in EA with the letters “EA” in its name, nor is it the only EA organization considered able to wield the EA brand. And yet despite all this, nobody objects on priors to the CEA as a single organization branding these events each year. Nor should we. Of course, none of this is necessary to invalidate the point you’re trying to make: Julia Wise, the CEA’s Community Liaison, has already clarified that the CEA itself supports the Summit.
So the EA Summit has already been legitimized by multiple EA organizations as a genuine EA event, including the one seen as the default legitimate representative of the whole movement.
As above, that the EA Summit wasn’t coordinated by more than one organization means nothing. There are already EA retreat- and conference-like events organized by local university groups and national foundations all over the world, which have gone well, such as the Czech EA Retreat in 2017. So the idea that EA should be so centralized that only registered non-profits with a given caliber of prestige in the movement, or those they approve of, can organize events viewed as legitimate by the community is unfounded. Not even the CEA wants that much centralization. Nobody does. So whatever point you’re trying to prove about the EA Summit using facts about Leverage Research is still invalid.
For what it’s worth, while no other organizations are officially participating, here are some effective altruists who will be speaking at the EA Summit, and the organizations they’re associated with. At EA Global, a speaker lineup like this would be sufficient to identify those organizations as welcome and included in spirit; the same standard should apply to the EA Summit.
Ben Pace, Ray Arnold, and Oliver Habryka: LessWrong isn’t an organization, but it has played a formative role in EA, and with LW’s new codebase being the kernel of the next version of the EA Forum, Ben and Oliver, as admins and architects of the new LW, are as important representatives of this online community as any in EA’s history.
Rob Mather is the executive director of the Against Malaria Foundation. AMF isn’t typically regarded as an “EA organization” because it isn’t a metacharity directly dependent on the EA movement. But it would be senseless not to give consideration to GiveWell’s top-recommended charity since EA began, which continues to receive more donations from effective altruists than any other.
Sarah Spikes runs the Berkeley REACH.
Holly Morgan is a staffer for the EA London organization.
In reviewing these speakers, and seeing so many from LEAN and Rethink Charity, with Kerry Vaughan being a director for individual outreach at CEA, I see what the EA Summit is trying to do: use speakers at the event to rally local EA group organizers from around the world toward more coordinated action and spirited projects. That is exactly what the organizers of the EA Summit have been saying the whole time. It’s also why I was invited to attend the EA Summit, both as an organizer for rationality and EA projects in Vancouver, Canada, working on a system for organizing local groups to do direct work that could scale here and in cities everywhere, and as a very involved volunteer online community organizer in EA. It’s also why one of the event organizers consulted with me, before the EA Summit was announced, on how they thought it should be presented to the EA community.
This isn’t counterevidence for being skeptical of Leverage. It is evidence against the thesis, advanced through these facts about Leverage Research, that the EA Summit is nothing but a launchpad for Leverage’s rebranding within the EA community as “Paradigm Academy.” No logical evidence has been presented that the tenuous links between Leverage and the organization of the 2018 EA Summit entail that the negative reputation Leverage has acquired over the years should be transferred onto the upcoming Summit.
See Geoff’s reply to me above: Paradigm and Leverage will at some point be separate, but right now they’re closely related (both run by Geoff, etc.). I don’t think viewing them as separate organizations, such that learning something about Leverage should not much affect your view of Paradigm, makes sense, at least not yet.
I don’t think this is accurate. (Please excuse the lack of engagement with anything else here; I’m just skimming some of it for now but I did notice this.)
[Edit: Unless you meant EA Funds (rather than Effective Altruism Foundation, as I read it)?]
I meant the EA Foundation, which I was under the impression had been incubated by CEA. Since my perhaps-hazy perception of those events might be wrong, I’ve switched the example of one of CEA’s incubees to ACE.
That one is accurate.
Also “incubees” is my new favourite word.
I could list a number of specific details, but not without violating the preferences of the people who shared their experiences with me, and not without causing even more unnecessary drama.
These details wouldn’t make for a watertight case that they’re a “cult.” I deliberately didn’t claim that Leverage is a cult. (See also this.) But the details are quite alarming for anyone who strives to have well-calibrated beliefs and an open-minded and welcoming EA community. I do think their cultishness led to unnecessary harm to well-meaning young people who wanted to do good in the world.
There’s a big difference between feeling cultlike, as in “weird”, “disorienting”, “bizarre” etc, and exhibiting the epistemic flaws of a cult, as in having people be afraid to disagree with the thought leader, a disproportionate reverence for a single idea or corpus, the excommunication of dissenters, the application of one idea or corpus to explain everything in the world, instinctively explaining away all possible counterarguments, refusal to look seriously at outside ideas, and so on.
If you could provide any sanitized, abstracted details to indicate that the latter is going on rather than merely the former, then it would go a long way towards indicating that LR is contrary to the goal of well-calibrated beliefs and open-mindedness.
(While LessWrong.com was historically run by MIRI, the new LessWrong is for most intents and purposes an independent organization (though legally under the umbrella of CFAR). We are currently filing documents to register our own 501(c)(3), and are planning to stick around as an organization for at least another five years or so. Since we don’t yet have a name that is different from “LessWrong,” it’s easy to get confused about whether we are an actual independent organization, and I figured I would comment to clarify that.)