Well, there's a huge, obvious problem with an "all generations are equal" moral theory: how do you even know whether you're talking about actual moral agents? For all we know, a giant asteroid could wipe out human life entirely in the next few years.
We can try to work with expected values and probabilities, but that only works if you can properly justify the probabilities you assign to outcomes. I have no idea how someone arrives at something like a 1-in-6 probability of extinction from causes xyz, especially when the science and technology behind several of those causes are speculative; frankly, it doesn't sound possible.
We actually do have a good probability estimate for a large asteroid striking the Earth within the next 100 years, btw. It was the product of a major investigation; I believe it was 1/150,000,000.
Probabilities don't have to be the product of a legible, objective, or formal process. It can be useful to state our subjective beliefs as probabilities so that we can use them as inputs to such a process, but more generally it's just a good mental habit to maintain a sense of your level of confidence about uncertain events.
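For concreteness, here's a minimal sketch (Python, with made-up numbers) of what feeding subjective credences into an expected-value comparison looks like. Only the asteroid figure comes from the thread above; "cause_x" and "cause_y" and their credences are invented placeholders, not claims about real risks:

```python
# Hypothetical subjective credences for extinction this century, by cause.
# Every figure except the asteroid estimate is a made-up placeholder.
credences = {
    "asteroid": 1 / 150_000_000,  # well-characterized, astronomy-backed
    "cause_x": 1 / 30,            # speculative, low-confidence guess
    "cause_y": 1 / 10,            # speculative, low-confidence guess
}

value_of_future = 1.0  # normalized stake; units don't matter for comparison

for cause, p in credences.items():
    expected_loss = p * value_of_future
    print(f"{cause}: P = {p:.2e}, expected loss = {expected_loss:.2e}")

# The speculative causes dominate the total purely because of the numbers
# we chose to write down -- which is exactly the complaint above: the
# output is only as justified as those inputs.
```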
If Ord is giving numbers like a 1/6 chance, he needs to back them up with math. Sure, the chance of asteroid extinction can be calculated by astronomers, but estimating the probability of extinction by climate change or rogue AI is a highly suspect endeavor when one of those is currently purely imaginary and the other is a complex field whose uncertain predictive models generally agree only on fairly broad features of the planet.
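To make that worry concrete, here's a small hypothetical sensitivity check (all numbers invented) showing how much an expected-value bottom line swings with the assumed probability:

```python
# Sketch of how sensitive a cost-effectiveness conclusion is to the
# assumed extinction probability. All numbers are invented placeholders.

budget = 1_000_000_000          # hypothetical mitigation spend, in dollars
lives_at_stake = 8_000_000_000  # rough current world population

# Vary the assumed extinction probability across six orders of magnitude.
for exponent in range(1, 7):
    p = 10 ** -exponent
    expected_lives_saved = p * lives_at_stake
    cost_per_expected_life = budget / expected_lives_saved
    print(f"P = 1e-{exponent}: cost per expected life saved = "
          f"${cost_per_expected_life:,.0f}")

# The bottom line moves by a factor of a million with the assumed P, so
# disagreements about these speculative probabilities do nearly all of
# the work in the argument.
```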