Hi Ben,
You raise good points; thank you for taking the time. To address them:
I don’t think Envision is anywhere near as difficult a message to get across as EA. The basic idea already exists in latent form in many students, and the messaging is naturally attractive to those with ambition (who tend to have world-scale goals already), without the negative associations that often exist around the words “altruism” and “impact.” The Princeton Futurist Society (Envision’s previous name) has only been around for one semester and already has 91 members without a strong marketing effort and with an off-putting name; over 30 student groups at universities across the US (including groups in tech, engineering, entrepreneurship, and policy) have said they are planning to attend the conference. We’re not peddling a controversial message, or one that many perceive as being in opposition to their own interests (which is how, in my experience, many see altruism and EA); the way I see it, we’re giving words and tangible action paths to what people already want. I certainly could be wrong about this; it will become clearer over the next year, and if I am, the strategy will be adjusted accordingly.
I also don’t think we’re developing our message from scratch. We’re combining several existing messages into one: the massive potential of technology, and the importance of safety in realizing that potential. There are many existing resources and ideas to draw from, which makes it a lot easier to build off of what exists, especially compared to EA, which had less precedent. As a concrete example, we don’t need to write any books about our message; we just need to promote existing ones and invite speakers.
Given the above two points, I think it will be easier than you suggest to grow Envision, although ensuring the integrity of the organization and its message as it grows will certainly be a major challenge. That said, easier does not mean easy, and we fully acknowledge that it will be difficult.
The danger of weak leaders is indeed serious, and one of the most likely failure scenarios, at least at the level of individual chapters. That’s why we’ll be cautious in founding chapters and are devoting significant time and effort to figuring out how best to identify good chapter founders. Any advice on this is much appreciated, as we have little prior experience to go on.
I disagree that there’s much overlap between EA and Envision; although they may appear similar, there’s a deep distinction. The majority of those interested in Envision so far are either not EAs or only barely so, including many who have heard of EA. For various reasons, most entrepreneurs are not attracted to EA, but are attracted to Envision (our conference is co-hosted with the Princeton Entrepreneurship Club). Although I don’t want to speculate too much about the causes of this, I think there’s a strong psychological difference between a movement whose primary goal is helping all sentient beings, with technology as one of its tools (a crude but, I think, sufficiently accurate description of EA), and a movement whose primary goal is realizing technological development in a way that benefits humanity. I could be wrong here and welcome any counter-points.
So to summarize: although EA and Envision are pursuing a similar end state and there’s some similarity in the means, there’s a fairly fundamental distinction in mindset and implementation, which means Envision appeals to many who are not attracted to EA. And I think that audience will play a pivotal role in shaping humanity’s future.
I also agree with AGB’s points below; will comment separately.
I hope that addresses all your points; let me know if it didn’t or if you have any additional questions and/or counter-points.
The Princeton Futurist Society (Envision’s previous name) has only been around for one semester and already has 91 members without a strong marketing effort and with an off-putting name
I’m aware of this from the main post, but I think it’s pretty weak evidence.
I also don’t think we’re developing our message from scratch. We’re combining several existing messages into one: the massive potential of technology, and the importance of safety in realizing that potential. There are many existing resources and ideas to draw from, which makes it a lot easier to build off of what exists, especially compared to EA
You’re essentially trying to integrate concern for existential risk into tech development, which seems like a task of similar difficulty to the one EA faced.
Moreover, EA had many excellent existing resources and powerful ideas to draw on, such as the importance of global poverty, the biases literature, the evidence-based movement, and so on. I don’t see a significant difference in difficulty here.
most entrepreneurs are not attracted to EA
Entrepreneurs are perhaps EA’s best target audience. Almost all of GiveWell’s donors are from either tech or finance, and GiveWell went on to partner with Dustin Moskovitz. Reid Hoffman and the Gates Foundation endorsed Will’s book. Our blog posts are regularly on the front page of Hacker News. I could go on.
I disagree that there’s much overlap between EA and Envision
Overall, I agree there are some nice features of the messaging that are different (a more positive frame, etc.), but I think these benefits are relatively small, and don’t obviously outweigh the large costs of setting up a new org in an area that’s already extremely crowded with EA effort, and of potentially diverting attention from EA groups.
I think a more cost-effective strategy would be to try to spread these messages through existing groups, or to integrate the positive features of the messaging into EA, perhaps starting with the Princeton group. I think with some ingenuity you could get the Princeton EA group to seriously engage 5% of students and then become self-sustaining, and that would be an extremely valuable project that would only take a couple of years.