Thanks so much for this summary; it's one of my favorite easy-to-read pieces on the forum in recent times. It was measured, thoughtful, and super accessible. Here are a couple of things I found especially interesting.
Even if (like me) you reject hedonistic utilitarianism pretty hard, I agree it doesn't necessarily change the practical considerations much. I like the framework where even if you weaken hedonism, longtermism, impartialism, or moral realism (as I do, quite a lot for all of them), you might still be unable to dismiss invertebrates or digital minds. I don't like that I can't dismiss these things completely, but I feel I can't all the same.
I also liked the threshold point you discuss, where a low enough probability might become "Pascal's-mugging-ish". Or, more generally: what probability of disastrous suffering is enough for us to care about something? Some people can motivate themselves by pure expected value numbers; I cannot. I'm not sure whether there's been a survey among EAs checking what probabilities of suffering or success we're willing to engage with (e.g. 1%, 0.0001%, or any probability no matter how low) and at what level we're willing to engage (discuss, donate, devote our whole lives).
Even at a sentience probability for insects as high as 1 percent, I don't think I could motivate myself to dedicate my life's work to it (given the 99 percent chance that everything I do is almost useless), but at that probability I might consider never eating insects, going to a protest against insect farming, or donating to an insect welfare org. Others will have different thresholds, and I really respect folks who can motivate themselves purely on the raw expected value of the work they do.