Should EA avoid using AI art for non-research purposes?
I voted 100% agree.
As a young person, my first impression on seeing AI art on this forum, a forum that is specifically meant to be concerned about the harms of AI, was that it seemed extremely hypocritical, and I know for a fact that many of my friends who are sympathetic to EA principles would feel the same.
Even if EAs could argue that using AI art is a net positive (which I don't think it is, given how small the benefits are in the vast majority of cases), I just don't think this is a hill EAs should be willing to die on.
An overwhelming majority of young people, leftists, and people concerned about AI (basically our target audience) strongly oppose AI art, and by plastering AI art everywhere we all but guarantee that this audience will come away from the forum with, at minimum, an initial sour taste in their mouths.
AI art is far from the biggest potential concern surrounding AI. However, it is the concern that people seem to care about most, and it's crucial to use it to get people on our side and to promote other, more pressing issues related to AI.
Do we really want to promote something that’s wildly unpopular among our target audience and make ourselves seem like hypocrites for very little apparent benefit?
"An overwhelming majority of young people, leftists, and people concerned about AI (basically our target audience) strongly oppose AI art"
Can you say why you think this?
I would also say that it would be helpful to get people who aren't currently concerned about AI to become concerned, so I don't strictly agree that the target audience is only people who already care.
Let’s suppose we agree this is so, as a working hypothesis.
How do you propose that a community which caters to the aesthetic tastes of its majority would avoid evaporative cooling of group beliefs? This is a grave concern of mine, along with O'Sullivan's Curse, which is itself related to group polarization.