My estimate was just one estimate. I could have included it in the table, but when I put the table together it seemed like such an outlier, and it was produced with a totally different method as well, so it's perhaps useful for a different purpose… It might still be worth adding to the table? I'm not sure.
Interesting consideration! If we expect humanity to technologize the Local Supercluster (LS) at some point, and extinction prevents that, don't we still lose all those lives? Extinction wouldn't eradicate all life if there were aliens, but the total amount of life lost would be the same. (I'm not endorsing any one prediction for how large the future will be.) My formulas here don't quantify how much worse it is to lose 100% of life than 99% of life.
Sure, you could set your threshold differently depending on your purpose. I could have made this clearer!
Exactly as you say: when comparing across cause areas, you might want to keep the cost you're willing to pay for an outcome (a life) consistent.
If you’ve decided on a worldview diversification strategy that gives you separate buckets for different cause areas (e.g. by credence instead of by stakes), then you’d want to set your threshold separately for each cause area, and use each threshold to compare within that cause area. If you set a threshold for what you’re willing to pay for a life within longtermist interventions, and fewer funding opportunities meet that threshold than you have money available for, you can save some of the money in that bucket and donate it later, in the hope that new opportunities meeting your threshold will arise. For an example of giving later based on a threshold: Open Philanthropy wants to give money each year only to projects that are more cost-effective than what it will spend its “last dollar” on.
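As a toy sketch of that bucket-plus-threshold rule (all project names, budgets, and cost-per-life figures below are hypothetical illustrations, not real numbers):

```python
# Toy sketch of threshold-based giving within one cause-area bucket.
# All figures are made-up illustrations.

def allocate(budget, opportunities, max_cost_per_life):
    """Fund opportunities whose cost per life meets the threshold;
    bank whatever is left over for future, better opportunities."""
    donated = {}
    # Fund the most cost-effective opportunities first.
    for name, cost_per_life, capacity in sorted(opportunities, key=lambda o: o[1]):
        if cost_per_life > max_cost_per_life or budget <= 0:
            continue  # too expensive for this bucket: skip, give later
        amount = min(capacity, budget)
        donated[name] = amount
        budget -= amount
    return donated, budget  # leftover budget is saved for later

# Hypothetical longtermist bucket: $1M, threshold of $10,000 per life.
funded, saved = allocate(
    1_000_000,
    [("Project A", 4_000, 300_000),    # meets threshold
     ("Project B", 9_000, 200_000),    # meets threshold
     ("Project C", 50_000, 500_000)],  # above threshold: not funded now
    max_cost_per_life=10_000,
)
# Project C is skipped, so $500,000 stays in the bucket to donate later.
```

The point of the sketch is just that the threshold, not the size of the bucket, decides what gets funded this year; money the threshold doesn't release is carried forward rather than spent on worse opportunities.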
Re 2 - ah yeah, I was assuming that at least one alien civilisation would aim to ‘technologize the Local Supercluster’ if humans didn’t. If they all just decided to stick to their own solar system or not spread sentience/digital minds, then of course that would be a loss of experiences.
Thanks Matt!
Thanks, me too!
Thanks for clarifying 1 and 3!