I share some of these concerns and don’t have anything like a settled opinion on what to do, but there are also arguments against simply having this idea promoted by EA groups, many of which are mentioned in the post. Notably:
EA is generally much more narrow-base/high-ask than Envision would be. We’ve done this because we seem to get the most impact out of a relatively small number of people doing relatively dramatic things, but it makes the targeting poorly suited for a broad-based low-ask group.
EA already has a political dimension to it that I suspect ‘make technology development safe’ might be able to avoid. Again, for EA this isn’t super-problematic because we’re only going after a fairly narrow base to start with, and it’s not obvious that whatever negative optics EA already has are hugely affecting that narrow base. But it would be quite sad if, e.g., the terrible reputation of Peter Singer in Germany meant that we couldn’t make headway with future German leaders on technology safety, given how far apart the actual topics are.
A related question is what kinds of percentages you really need to make Envision work, or rather at what point the value starts to flatten off. I find it fairly intuitive that 90% of an organisation working on a dangerous technology (e.g. AI) being safety-conscious to start with isn’t that much better than 70%, or probably even 50%; all you need is a critical mass to get these ideas circulating and taken seriously. But how low can you go here? Is a 10% starting point enough, because that 10% can then teach the others? What about 5%?
I definitely agree there are some arguments against, but I’m concerned they’re not strong enough to offset the downsides of setting up a new org.
Also, my understanding is that Envision is also narrow-base. They’re explicitly aiming at future leaders in tech. EA is aiming at that group plus others, so if anything is a wider base than Envision. Rather, Envision differs in being low-ask.
If Envision really is only aiming at tech and adjacent undergrads I’ll be disappointed, but that wasn’t my read; what I see is “leaders in tech development, policy, academia, and business”. So for instance I assume a high-flying Oxford PPE graduate with a side interest in tech would qualify*.
I think we might be talking past each other slightly on the base point though, when I said EA was narrow-base/high-ask I meant to imply that our available base is narrowed (a lot) by the high ask; it only appeals to people with a fairly strong to very strong altruistic bent. So I think I could sell Envision or something like it to a much wider cross-section of maths/comp sci types than I could EA in general (within JS, maybe 55% versus 20% to give you some idea of percentages).
*For non-UK readers, Oxford PPE graduates have fairly insane levels of penetration into the highest levels of UK politics.
I think we might be talking past each other slightly on the base point though, when I said EA was narrow-base/high-ask I meant to imply that our available base is narrowed (a lot) by the high ask; it only appeals to people with a fairly strong to very strong altruistic bent.
Ah, I got you.
Also just to clarify I was saying with Envision the audience is future leaders, whereas with EA it’s future leaders plus others; so that’s a sense in which EA has a broader audience.
Alex is correct: Envision is not only targeting future tech leaders; it’s targeting future leaders in tech development, policy, academia, and business.
Great points, I completely agree. Your last question is an intriguing one. I think 10% is too low; they’ll be sidelined, unless those 10% include most of the leadership and the most socially influential individuals. Probably 50% is a good starting level, as long as it increases quite quickly.
Curious to hear everyone else’s thoughts on this!