Micah—very interesting points.
The PornHub example raises something a lot of people seem not to understand very well about the porn industry. PornHub and its associated sites (owned by MindGeek) are ‘content aggregators’ that basically act as free advertising for the porn content produced by independent operators and small production companies—which all make their real money through subscription services. PornHub is a huge aggregator site, but as far as I know, it doesn’t actually produce any content of its own. So it’s quite unlike Netflix in this regard—Netflix spent about $17 billion in 2022 on original content, whereas PornHub spent roughly zero on original content, as far as I can tell.
So, one could imagine ‘AI aggregator sites’ that offer a range of AI services produced by small independent AI developers. These could potentially compete with Big Tech outfits like OpenAI or DeepMind (which would be more analogous to Netflix, in terms of investing large sums in ‘original content’, i.e. original software).
But, whether that would increase or decrease AI risk, I’m not sure. My hunch is that the more people and organizations who are involved in AI development, the higher the risk that a few bad actors will produce truly dangerous AI systems, whether accidentally or deliberately. But, as you say, a more diverse AI ecosystem could reduce the chance that a few big AI companies acquire and abuse a lot of power.