I’m pretty happy with how this “Where should I donate, under my values?” Manifold market has been turning out. All the usual caveats about basically-fake “prediction” markets apply, of course, but given the selection effects on who spends mana on an esoteric market like this, I put non-trivial weight on the (live) outcomes.
I’d encourage people with a bit more money to donate to do something similar (or to defer, if you think I’m right about ethics!), if only as one addition to your portfolio of donation-informing considerations.
This is a really interesting idea! What are your values, so I can make an informed decision?
Thanks! Let me write them as a loss function in python (ha)
For real though:
Some flavor of hedonic utilitarianism
I guess I should say I have moral uncertainty (which I endorse as a thing) but eh I’m pretty convinced
Longtermism as explicitly defined is true
Don’t necessarily endorse the cluster of beliefs that tend to come along for the ride though
“Suffering focused total utilitarian” is the annoying phrase I made up for myself
I think many (most?) self-described total utilitarians give too little weight to suffering, and I don’t think it really matters (if there’s a fact of the matter) whether this stems from empirical or moral beliefs
Maybe my most substantive deviation from the default TU package is the following (defended here):
“Under a form of utilitarianism that places happiness and suffering on the same moral axis and allows that the former can be traded off against the latter, one might nevertheless conclude that some instantiations of suffering cannot be offset or justified by even an arbitrarily large amount of wellbeing.”
Moral realism for basically all the reasons described by Rawlette on 80k but I don’t think this really matters after conditioning on normative ethical beliefs
Nothing besides valenced qualia/hedonic tone has intrinsic value
I think that might literally be it—everything else is contingent!
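Since the reply jokes about writing these values as a loss function in Python, here’s a purely illustrative toy sketch. Everything numeric is a made-up assumption (the suffering weight, the lexical threshold, the functional form); it just mimics the structure described above: only valenced experience counts, suffering is weighted more heavily than happiness, and sufficiently intense suffering can’t be offset by any amount of wellbeing.

```python
import math

# Made-up illustrative parameters -- not values the author specified.
SUFFERING_WEIGHT = 2.0     # suffering counts more than equal-magnitude happiness
LEXICAL_THRESHOLD = 100.0  # suffering this intense cannot be offset at all


def welfare(experiences):
    """Sum valenced qualia; nothing else has intrinsic value.

    Each experience is a signed hedonic intensity:
    positive = happiness, negative = suffering.
    """
    # Lexical clause: some instantiations of suffering can't be offset
    # or justified by even an arbitrarily large amount of wellbeing.
    if any(x <= -LEXICAL_THRESHOLD for x in experiences):
        return -math.inf
    return sum(x if x >= 0 else SUFFERING_WEIGHT * x for x in experiences)


def loss(experiences):
    """Loss = negative welfare, to complete the joke."""
    return -welfare(experiences)


# A mild pleasure exactly offsets half as much suffering under these weights:
print(welfare([10.0, -5.0]))    # 10 + 2*(-5) = 0.0
# But suffering past the threshold swamps everything:
print(welfare([1e9, -100.0]))   # -inf
```

Again, just a toy: the real disagreements (what the exchange rate is, whether a lexical threshold exists and where it sits) are exactly the parameters this sketch hard-codes.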
I was inspired to create this market! I would appreciate it if you weighed in. :)
Some shrinsight (shrimpsight?) from the comments: