I agree that points 1 and 2 are unrelated, but I think most people outside EA would agree that a universe of happy bricks is bad. (As I argued in a previous post, it’s pretty indistinguishable from a universe of paperclips.) This is one problem that I (and possibly others) have with EA.
I second this! I’m one of the many people who think that maximizing happiness would be terrible. (I mean, there would be worse things you could do, but compared to what a normal, decent person would do, it’s terrible.)
The reason is simple: to maximize something is, by definition, to be willing to sacrifice everything else for its sake. Depending on your situation, you might not actually have to sacrifice anything; maximizing that one thing might even produce lots of other goods as a bonus. But in principle, maximizing means being willing to give up everything else. Justice. Beauty. Fairness. Equality. Friendship. Art. Wisdom. Knowledge. Adventure. The list goes on and on. If maximizing happiness required sacrificing all of those things, such that the world contained none of them, would you still think it was the right thing to do? I hope not.
(Moreover, based on the laws of physics as we currently understand them, maximizing happiness WILL require us to sacrifice all of the things mentioned above, except possibly Wisdom and Knowledge, and even they will be concentrated in one being or kind of being.)
This is a problem with utilitarianism, not EA, but EA is currently dominated by utilitarians.