Well, there's a huge, obvious problem with an "all generations are equal" moral theory: how do you even know whether the future generations you're weighing will exist as actual moral agents at all? For all we know, some giant asteroid could wipe out human life entirely in the next few years.
We can try to work with expected values and probabilities, but that only works when you can properly justify the probability you assign to each outcome. I have no idea how someone arrives at something like a 1/6 probability of extinction from causes xyz, especially when the science and technology behind several of those causes are speculative; frankly, it doesn't sound possible.
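To see why the justification matters so much, here is a minimal sketch (with invented numbers, not Ord's actual figures) of how an expected-value calculation swings entirely on the assumed extinction probability, which is exactly the number under dispute:

```python
# Hypothetical illustration: expected future lives lost under different
# assumed extinction probabilities. All numbers here are made up for the
# sake of argument; none come from Ord or any published estimate.

future_lives_at_stake = 1e12  # assumed count of potential future lives

for p_extinction in (1 / 6, 1 / 100, 1 / 1_000_000):
    expected_loss = p_extinction * future_lives_at_stake
    print(f"p = {p_extinction:.6f} -> expected loss = {expected_loss:.3e}")
```

The conclusion spans five orders of magnitude depending on which probability you plug in, so without a defensible basis for the probability, the expected value tells you almost nothing.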
If Ord is going to give numbers like a 1/6 chance, he needs to back them up with actual math. Sure, astronomers can calculate the chance of asteroid extinction, but assigning a probability to extinction by climate change or rogue AI is a highly suspect endeavor when one of those things is currently purely imaginary and the other is a complex field whose uncertain predictive models generally agree only on fairly broad features of the planet's future.