Many in EA focus on preventing a future self-improving superintelligent agent that might pursue some alien goal misaligned with human values. But this podcast made me realise that such an agent already exists—not as a conscious entity, but as an emergent, decentralized system. It’s what Scott Alexander called Moloch: the dynamics of markets, algorithms, status games, and incentive structures that collectively form a kind of self-improving, misaligned intelligence.
Screen time is one of the proxy goals it optimises for—not because anyone chose it, but because attention is monetisable. And now, Moloch is building more powerful AI, which risks accelerating its agenda, including maximising screen time. A generation raised like this could bring us closer to something like Idiocracy—a society overwhelmed by problems, but cognitively unequipped to solve them. Maybe reducing harmful screen time isn't just a public health move; maybe it's part of fighting back.