Thanks. Going back to your original impact estimate, my bigger difficulty in swallowing it and the claims related to it (e.g. “the ultimate weight of small decisions you make is measured not in dollars or relative status, but in stars”) is not the probabilities of AI or space expansion, but what seems to me to be a pretty big jump: from the potential stakes of a cause area or value possible in a future without any existential catastrophes, to the impact that researchers working on that cause area might have.
Can you be less abstract and point, quantitatively, to which numbers I gave seem vastly off to you and insert your own numbers? I definitely think my numbers are pretty fuzzy but I’d like to see different ones before just arguing verbally instead.
(Also, I think my actual original argument was a conditional claim, so it feels a little weird to be challenged on its premises! :))