taste: for some reason, EAs are able to (hopefully correctly) allocate more resources to AI alignment than to overpopulation or energy decline, for reasons not explained by the above.
Of course, in the eyes of the people warning about energy depletion, expecting energy growth to continue over decades is not the rational decision ^^
I mean, 85% of energy comes from a finite stock, and all renewables currently depend on this stock to be built and maintained, so from the outside the issue seems at least worth exploring seriously—but I feel like very few people in EA have really considered it (as said here).
Which is understandable: very few prominent figures are warning about it, and the best arguments are rarely put forward. There are a few people talking about this in France, but without them I think I'd have ignored the topic, like everybody else.
So I'd argue that exposure to a problem matters greatly as well.