The point, presumably, is that people would feel better because of the expectation that things would improve.
One in a thousand people supposedly feels better, but the other 999 will feel slightly worse, because they are presented with a scenario in which they think things may get worse, when we have the power to give them a guaranteed good scenario instead. It's just shifting expectations around in an attempt to create a free lunch.
It also requires that people in bad situations actually believe that someone is going to build an AI that does this. As far as ways of making people feel more optimistic about life go, this is perhaps the most convoluted one I have seen. There are easier ways of doing that: for instance, make them believe that someone is going to build an AI which actually solves their problem.