Micah—well, it’s an interesting empirical question how much stigma would be required to slow down large-scale AI development.
In terms of ‘ethical investment’, investors might easily be scared away from investing in tech that is stigmatized, given that it faces radically increased regulatory risk, adverse PR, and might be penalized under ESG standards.
In terms of talent recruitment & retention, stigma could be very powerful in dissuading smart, capable young people from joining an industry that would make them unpopular as friends, unattractive as mates, and embarrassments to their parents and family.
Without money and people, the AI industry would starve and slow down.
Of course, terrorist cells and radical activists might still try to develop and deploy AI, but they’re not likely to make much progress without large-scale institutional support.
I think your reasoning here is sound, but we have what I believe is a strong existence proof that, when there is money to be made, weak stigma doesn’t do much:
Porn.
I think the porn industry fits your description of a weakly stigmatized industry nicely, yet it is booming and has many smart, talented people working in it despite that stigma.
If we are all correct, AI will be bigger (in terms of money) than the porn industry (which is huge), and I suspect demand will be even higher than for porn. People may use VPNs and private browsers when accessing AIs, but I don’t think that will stop them.
Micah—that’s a fascinating comparison actually. I’ll have to think about it further.
My first reaction is, well, porn’s a huge industry overall. But it’s incredibly decentralized among a lot of very small-scale producers (down to the level of individual OnlyFans producers). The capital and talent required to make porn videos seems relatively modest: a couple of performers, a crew of 2-3 people with some basic A/V training, a few thousand dollars of equipment (camera, sound, lights), a rental property for a day, and some basic video editing services. By contrast, the capital and talent required to make or modify an AI seems substantially higher. (Epistemic status: I know about the porn industry mostly from teaching human sexuality classes for 20 years, and lecturing about the academic psychology research concerning it; I’m not an expert on its economics.)
If porn were more like AI, and required significant investment capital (e.g., tens of millions of dollars rather than tens of thousands), if it required recruiting and managing several smart and skilled developers, if it required access to cloud computing resources, and if it required long-term commercial property rental, it seems like there would be a lot more chokepoints where moral stigmatization could slow down AI progress.
But it’s certainly worth doing some compare-and-contrast studies of morally stigmatized industries (which might include porn, sex work, guns, gambling, drugs, etc.).
Cybercrime probably has somewhat higher barriers to entry than porn (although less than creating an AGI) and arguably higher levels of stigma. It doesn’t take as much skill as it used to, but still needs skilled actors at the higher levels of complexity. Yet it flourishes in many jurisdictions, including with the acquiescence (if not outright support) of nation-states. So that might be another “industry” to consider.
Jason—yes, that’s another good example.
I suspect there will also be quite a bit of overlap between cybercrime and advanced AI (esp. for ‘social engineering’ attacks) in the coming years. Just as crypto’s (media-exaggerated) association with cybercrime in the early 2010s led to increased stigma against crypto, any association between advanced AI and cybercrime might increase stigma against AI.
I believe PornHub is a bigger company than most of today’s AI companies (~150 employees, half of them software engineers, according to Glassdoor). If Brave AI is to be believed, they have $100B in annual revenue and handle 15 TB of uploads per day.
If this is the benchmark for the limits of an AI company in a world where AI research is stigmatized, then I think all that stigmatization will accomplish is ensuring that the people who are OK with working in the dark get to decide what gets built. PornHub-sized companies seem big enough to produce AGI.
I agree with you that porn is a very distributed industry overall, and I suspect that is partly because of the stigmatization. However, this has resulted in a rather robust organizational arrangement in which individuals work independently and large companies (like PornHub) focus on handling the IT side of things.
In a stigmatized-AI future, perhaps individuals all over the world will work on different pieces of the AI stack while a small number of big AI companies handle bulk training or coordination. Interestingly, this sort of decentralized approach could produce a better AI outcome: instead of a small number of very powerful people deciding the trajectory, we would have a large number of individuals working independently and in competition with each other.
I do like your idea of comparing to other stigmatized industries! Gambling and drugs are, of course, other great examples of absolutely massive industries growing in the face of weak stigmatization!
Micah—very interesting points.
The PornHub example raises something a lot of people seem not to understand very well about the porn industry. PornHub and its associated sites (owned by MindGeek) are ‘content aggregators’ that basically act as free advertising for the porn content produced by independent operators and small production companies—which all make their real money through subscription services. PornHub is a huge aggregator site, but as far as I know, it doesn’t actually produce any content of its own. So it’s quite unlike Netflix in this regard—Netflix spent about $17 billion in 2022 on original content, whereas PornHub spent roughly zero on original content, as far as I can tell.
So, one could imagine ‘AI aggregator sites’ that offer a range of AI services produced by small independent AI developers. These could potentially compete with Big Tech outfits like OpenAI or DeepMind (which would be more analogous to Netflix, in terms of investing large sums in ‘original content’, i.e. original software).
But whether that would increase or decrease AI risk, I’m not sure. My hunch is that the more people and organizations are involved in AI development, the higher the risk that a few bad actors will produce truly dangerous AI systems, whether accidentally or deliberately. But, as you say, a more diverse AI ecosystem could reduce the chance that a few big AI companies acquire and abuse a lot of power.