This is an impressive plan.
The main thing I want to mention is that it seems like a big undertaking. Student groups that promote novel ideas only succeed if there’s a strong leader in each location, who can drive the group and persuade others. These people are very hard to recruit, and tend to have strong alternative options (e.g. founding their own project). Moreover, there’s constant turnover, so you need to recruit one strong leader every 1-2 years in every location. If you miss one year, the group can easily end up in a negative cycle: a weak leader recruits a weaker replacement, and so on, until the group falls apart.
We’ve had plenty of experience of this in 7 years of student group organising around EA.
(Note that it’s not the same if you’re promoting an idea that’s already widely adopted, because then you have an existing pool of talented people to draw on.)
Building this network out to 10-20 locations will already be years of work, and it will require constant maintenance. Making the groups awesome enough that you get significant penetration in each location will be much harder again. You’ll need to develop your messaging from scratch, and figure out how to make the whole thing seem like “a big deal” to a generation of students. My guess is it’s a 5-10 year project with several full-time staff.
Just look at how much investment there has been in EA student groups over 7 years: there are still under ~20 successful ones (my rough guess), and these only engage a few percent of the student body, so it’s a long way from reaching the future leaders. Perhaps this idea will be a bit easier to spread than EA, but that’s not obvious. Also consider that EA student groups can piggyback off the main organisations, which have received excellent media coverage, produce all kinds of quality content, and provide lots of support and experience.
So one question is whether you want to spend the next several years of your life doing this. Of course you don’t have to decide right away, but at some point someone will have to make this kind of commitment.
My other thought is that although there are interesting differences, there’s still a lot of overlap with the strategy of EA groups (and also 80k). They’re going after basically the same audience, and one of the key messages of EA groups is the importance of shaping future tech. And the EA groups could maybe adopt some of your other differences, such as more positive messaging.
Given that both projects require a great deal of work, that it’s better to have one big success than two mediocre efforts (lognormal returns), and that similar projects cause confusion, I wonder if it would be better to put your considerable talents into promoting EA groups instead.
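To spell out the lognormal point: most of the expected value sits in tail outcomes, and a focused project is disproportionately more likely to hit one. Here’s a toy simulation of that intuition; every number in it (the spread, the “big win” threshold, and especially the assumed mapping from effort to median outcome) is an assumption I’ve picked purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
SIGMA = 1.5          # spread of log-outcomes; an illustrative assumption
BIG_WIN = np.exp(6)  # arbitrary "big success" threshold, also an assumption

# Assumed effort -> quality mapping (mine, not established): full effort buys
# a median log-outcome of 2.0, while half effort buys 1.0 per project.
one_big = rng.lognormal(mean=2.0, sigma=SIGMA, size=N)
small_a = rng.lognormal(mean=1.0, sigma=SIGMA, size=N)
small_b = rng.lognormal(mean=1.0, sigma=SIGMA, size=N)

print("mean outcome, one focused project: ", one_big.mean())
print("mean outcome, two half-effort ones:", (small_a + small_b).mean())
print("P(big win), focused:               ", (one_big > BIG_WIN).mean())
print("P(big win), either of the two:     ", ((small_a > BIG_WIN) | (small_b > BIG_WIN)).mean())
```

Under these (made-up) parameters the focused project has both the higher expected outcome and a several-fold higher chance of a tail-level success; if effort translated into quality less steeply the comparison could flip, which is really the crux of the concentration argument.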
Hi Ben,
You raise good points; thank you for taking the time. To address them:
I don’t think Envision is anywhere near as difficult a message to get across as EA. The basic idea already exists in latent form in many students, and the messaging is naturally attractive to those with ambition (who tend to have world-scale goals already), without the negative associations that often exist around the words “altruism” and “impact.” The Princeton Futurist Society (Envision’s previous name) has only been around for one semester and already has 91 members without a strong marketing effort and with an off-putting name; over 30 student groups at universities across the US (including groups in tech, engineering, entrepreneurship, and policy) have said they are planning to attend the conference. We’re not peddling a controversial message, or one that many perceive as in opposition to their own interests (which is how, in my experience, many see altruism and EA); the way I see it, we’re giving words and tangible action paths to what people already want. I certainly could be wrong about this; it will become clearer over the next year. If I’m wrong, the strategy will be adjusted accordingly.
I also don’t think we’re developing our message from scratch. We’re combining several different messages into one, i.e. the massive potential of technology, and the importance of safety in realizing that potential. There are many existing resources and ideas to draw from, which makes it a lot easier to build on what exists, especially as compared to EA, which had less precedent. As a concrete example, we don’t need to write any books about our message; we just need to promote existing books and invite speakers.
As a result of the above two points, I think it will be easier than you suggest to grow Envision, although ensuring the integrity of the organization and its message as it grows will certainly be a major challenge. That said, easier does not mean easy, and we acknowledge that it will be difficult.
The danger of weak leaders is indeed serious, and one of the failure scenarios most likely to pan out, at least at a localized level. That’s why we’ll be cautious in founding chapters and are devoting significant time and effort to figuring out how best to identify good chapter founders. Any advice on this is much appreciated, as we have little prior experience to go on.
I disagree that there’s much overlap between EA and Envision; although they may appear similar, there’s a deep distinction. The majority of those interested in Envision so far are either not EAs, or barely so, including many who have heard of EA. For various reasons, most entrepreneurs are not attracted to EA, but are attracted to Envision (our conference is co-hosted with the Princeton Entrepreneurship Club). Although I don’t want to speculate too much about the causes of this, I think there’s a strong psychological difference between a movement whose primary goal is helping all sentient beings, with technology as one of the tools (a crude but, I think, sufficiently accurate description of EA), and a movement whose primary goal is the realization of technological development, done in a way that is beneficial to humanity. I could be wrong here and welcome any counter-points.
So to summarize, although EA and Envision are pursuing a similar end state and there’s some similarity in the means, there’s a pretty fundamental distinction in the mindset and implementation, which means Envision appeals to many who are not attracted by EA. And I think that audience will play a pivotal role in shaping humanity’s future.
I also agree with AGB’s points below; will comment separately.
I hope that addresses all your points; let me know if it didn’t or if you have any additional questions and/or counter-points.
The Princeton Futurist Society (Envision’s previous name) has only been around for one semester and already has 91 members without a strong marketing effort and with an off-putting name
I’m aware of this from the main post, but I think it’s pretty weak evidence.
I also don’t think we’re developing our message from scratch. We’re combining several different messages into one, i.e. the massive potential of technology, and the importance of safety in realizing that potential. There are many existing resources and ideas to draw from, which makes it a lot easier to build on what exists, especially as compared to EA
You’re essentially trying to integrate the idea of concern for existential risk into tech development, which seems like a similarly difficult task to EA.
Moreover, EA had many excellent existing resources and powerful ideas to draw on, such as the importance of global poverty, the biases literature, the evidence-based movement, and so on. I don’t see a significant difference in difficulty here.
most entrepreneurs are not attracted to EA
Entrepreneurs are perhaps EA’s best target audience. Almost all of GiveWell’s donors are from either tech or finance, and then they partnered with Dustin Moskovitz. Reid Hoffman and the Gates Foundation endorsed Will’s book. Our blog posts are regularly on the front page of Hacker News. I could go on.
I disagree that there’s much overlap between EA and Envision
Overall I agree there are some nice features of the messaging that are different (the more positive frame etc.), but I think these benefits are relatively small, and don’t obviously outweigh the large costs of setting up a new org in an area that’s already extremely crowded with EA effort, and of potentially diverting attention from EA groups.
I think a more cost-effective strategy would be to spread these messages through existing groups, or to integrate the positive features of the messaging into EA, perhaps starting with the Princeton group. I think with some ingenuity you could get the Princeton EA group to seriously engage 5% of students and then become self-sustaining, and that would be an extremely valuable project that would only take a couple of years.
I share some of these concerns and don’t have anything like a settled opinion on what to do, but there are also arguments against simply having this idea promoted by EA groups, many of which are mentioned in the post. Notably:
EA is generally much more narrow-base/high-ask than Envision would be. We’ve done this because we seem to get the most impact out of a relatively small number of people doing relatively dramatic things, but it makes the targeting poorly suited for a broad-based, low-ask group.
EA already has a political dimension to it that I suspect ‘make technology development safe’ might be able to avoid. Again, for EA this isn’t super-problematic, because we’re only going after a fairly narrow base to start with, and it’s not obvious that the negative optics EA already has are hugely affecting that narrow base. But it would be quite sad if, e.g., the terrible reputation of Peter Singer in Germany meant that we couldn’t make headway with future German leaders on technology safety, given how far apart the actual topics are.
A related question is what kinds of percentages you really need to make Envision work, or rather at what point the value starts to flatten off. I find it fairly intuitive that 90% of an organisation working on a dangerous technology (e.g. AI) being safety-conscious to start with isn’t that much better than 70%, or probably even 50%; all you need is a critical mass that gets these ideas seriously considered and keeps them in circulation. But how low can you go here? Is a 10% starting point enough, because that 10% can then teach the others? What about 5%?
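To make the tipping-point intuition concrete, here’s a Granovetter-style threshold model in miniature. Everything in it is an assumption chosen purely for illustration, the threshold distribution especially; it’s a sketch of the dynamic, not an estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000  # people in the org; every parameter here is an illustrative assumption

# Threshold model: each person adopts the safety norm once the share of
# adopters around them exceeds a personal threshold.
thresholds = np.clip(rng.normal(0.35, 0.15, size=N), 0.0, 1.0)

def final_share(seed_fraction, steps=100):
    """Iterate adoption until it settles; the seeded group never de-adopts."""
    share = seed_fraction
    for _ in range(steps):
        share = max(seed_fraction, (thresholds <= share).mean())
    return share

for seed in (0.05, 0.10, 0.20, 0.30, 0.50):
    print(f"start at {seed:.0%} safety-conscious -> settles near {final_share(seed):.0%}")
```

In models of this shape, a seed below the unstable tipping point just sits where it started, while a seed above it cascades towards everyone; the open question is where that tipping point actually sits in a real organisation.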
Hey Alex,
I definitely agree there are some arguments against, but I’m concerned they’re not strong enough to offset the downsides of setting up a new org.
Also, my understanding is that Envision is narrow-base too: they’re explicitly aiming at future leaders in tech. EA is aiming at that group plus others, so if anything it has a wider base than Envision. Rather, Envision differs in being low-ask.
If Envision really is only aiming at tech and adjacent undergrads I’ll be disappointed, but that wasn’t my read; what I see is “leaders in tech development, policy, academia, and business”. So, for instance, I assume a high-flying Oxford PPE graduate with a side interest in tech would qualify*.
I think we might be talking past each other slightly on the base point, though: when I said EA was narrow-base/high-ask, I meant to imply that our available base is narrowed (a lot) by the high ask; it only appeals to people with a fairly strong to very strong altruistic bent. So I think I could sell Envision or something like it to a much wider cross-section of maths/comp sci types than I could EA in general (within JS, maybe 55% versus 20%, to give you some idea of the percentages).
*For non-UK readers, Oxford PPE graduates have fairly insane levels of penetration into the highest levels of UK politics.
I think we might be talking past each other slightly on the base point, though: when I said EA was narrow-base/high-ask, I meant to imply that our available base is narrowed (a lot) by the high ask; it only appeals to people with a fairly strong to very strong altruistic bent.
Ah, I got you.
Also, just to clarify: I was saying that with Envision the audience is future leaders, whereas with EA it’s future leaders plus others; so that’s a sense in which EA has a broader audience.
Alex is correct: Envision is not targeting only future tech leaders; it’s targeting future leaders in tech development, policy, academia, and business.
Hi AGB,
Great points, I completely agree. Your last question is an intriguing one. I think 10% is too low; they’ll be sidelined, unless those 10% include most of the leadership and the most socially influential individuals. Probably 50% is a good starting level, as long as it increases quite quickly.
Curious to hear everyone else’s thoughts on this!