AI art seems like a case of power becoming decentralized: before this week, few people could make Studio Ghibli art. Now everyone can.
Edit: Sincere apologies. I read through the previous chain of comments too quickly and missed the importance of AI art specifically in titotal’s comment above. This makes Lark’s comment more reasonable than I assumed. It seems like we do disagree on a bunch of this topic, but much of my comment wasn’t correct.
---
This comment makes me uncomfortable, especially with the upvotes. I have a lot of respect for you, and I agree with this specific example.
I don’t think you were meaning anything bad here. But I’m skeptical that this specific example is really representative in a meaningful sense.

Often, when someone cites one and only one example of a thing, they are making an implicit argument that this example is decently representative. See the Cooperative Principle (I’ve been paying more attention to this recently). So I assume readers might take away, “Here’s one example, it’s probably the main one that matters. People seem to agree with the example, so they probably agree with the implication from it being the only example.”

Some specifics that come to mind:
- In this specific example, this development arguably makes it very difficult for Studio Ghibli to retain control over a lot of their style. I’m sure that people at Studio Ghibli are very upset about this. Instead, OpenAI gets to make this accessible, but this is less an ideological choice than something that’s clearly commercially beneficial for OpenAI. If OpenAI wanted to stop this, it could (at least until much better open models come out). More broadly, it can be argued that a lot of power is being moved from media groups like Studio Ghibli to a few AI companies like OpenAI. You can definitely argue that this is good or bad on net, but I think it is not exactly “power is now more decentralized.”
- I think it’s easy to watch the trend lines and see where we might expect things to change. Generally, startups are heavily subsidized in the short term; then they eventually “go bad” (see enshittification). I’m absolutely sure that if/when OpenAI has serious monopoly power, it will do things that upset a whole lot of people.
- China has been moderating the ability of its LLMs to say controversial things that would look bad for China. I suspect that the US will do this shortly. I’m not feeling very optimistic about Elon Musk and X.AI, though that is a broader discussion.
On the flip side of this, I could very much see it being frustrating, along the lines of “I just wanted to leave a quick example. Can’t there be some way to offer useful insights without people complaining about a lack of context?”

I’m honestly not sure what the solution is here. I think online discussions are very hard, especially when people don’t know each other very well, for reasons like this.

But in the very short term, I just want to gently push back on the implication of this example, for this audience.
I could very much imagine a more extensive analysis suggesting that OpenAI’s image work promotes decentralization or centralization. But I think it’s clearly a complex question, at the very least. I personally think that people broadly being able to do AI art now is great, but I still find it a tricky issue.

I’m not sure why you chose to frame your comment in such an unnecessarily aggressive way, so I’m just going to ignore that and focus on the substance.
Yes, the Studio Ghibli example is representative of AI decentralizing power:
- Previously, only a small group of people had an ability (to make good art, or diagnose illnesses, or translate a document, or do contract review, or sell a car, or be a taxi driver, etc.).
- Now, thanks to a large tech company (e.g. Google, Uber, AirBnB, OpenAI), everyone who used to be able to do it still can, and ordinary people can as well. This is a decentralization of power.
The fact that this was not due to an ideological choice made by AI companies is irrelevant. Centralization and decentralization often occur for non-ideological reasons.
The fact that things might change in the future is also not relevant. Yes, maybe one day Uber will raise prices to twice the level taxis used to charge, with four times the wait time and ten times the odor. But for now, they have helped decentralize power.
The producers who are now subject to increased competition are unsurprisingly upset. For fairly nakedly self-interested reasons, they demand regulation.
Ideological leftists provide rhetorical ammunition to the rent-seekers, in classic Baptists and Bootleggers style.
These demands for regulation affect four different levels of the power hierarchy:
- The government (most powerful): increases power
- Tech platforms: reduces power
- Incumbent producers: increases power
- Ordinary people (least powerful): reduces power
Because leftists focus on the second and third bullet points, characterizing the issue as a battle between small artisans and big business, they falsely claim to be pushing for power to be decentralized.
But actually they are pushing for power to be more centralized: from tech companies towards the leviathan, and from ordinary people towards incumbent producers.
Thanks for providing more detail on your views.
Really sorry to hear that my comment above came off as aggressive. It was very much not meant like that. One mistake was that I read the comments above too quickly; that was my bad.
In terms of the specifics, I find your longer take interesting, though as I’m sure you expect, I disagree with a lot of it. It seems like we each bring a lot of important background assumptions to this topic.
I agree that there are a bunch of people on the left who are pushing for many bad regulations and ideas on this. But at the same time, I think some of them raise good points (e.g. paranoia about power consolidation).
I feel like it’s fair to say that power is complex. Things like ChatGPT’s AI art will centralize power in some ways and decentralize it in others. On one hand, it’s very much true that many people can create neat artwork that they couldn’t before. But on the other, a bunch of key decisions and influence are being put into the hands of a few corporate actors, particularly ones with histories of being shady.
I think that some forms of IP protection make sense. I think this conversation gets much messier when it comes to LLMs, for which there just aren’t good laws yet on how to handle them. I’d hope that future artists who come up with innovative techniques could get some significant ways of being compensated for their contributions. I’d hope that writers and innovators could similarly get certain kinds of credit and rewards for their work.