Another question: Would you be worried that the impact of humanity on the world (more precisely, industrial civilization) could be net-negative if we aligned AI with human values?
One of my fears is that if we include factory farms in the equation, humanity causes more suffering than wellbeing, simply because animals are more numerous than humans and often have horrible lives. (If we include wild animals, this gets more complicated.) So if we were to align AI with human values only, this would boost factory farming and keep it running for a long time, making the overall situation much worse.
Wouldn’t interstellar travel close to the speed of light require a huge amount of energy, and a level of technological transformation that again seems much higher than most people expect?
Well, harnessing ALL of the energy produced by the sun (or even half of it) sounds pretty far away in time.
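For a sense of scale, here is a rough back-of-envelope sketch of the energy needed for near-light-speed travel. The ship mass of 1e9 kg is an arbitrary assumption for illustration, and the calculation ignores propellant (the rocket equation makes things far worse), deceleration, and inefficiencies:

```python
import math

C = 2.998e8                  # speed of light, m/s
SOLAR_LUMINOSITY = 3.828e26  # total power output of the Sun, W

def kinetic_energy(mass_kg, v_fraction_of_c):
    """Relativistic kinetic energy: KE = (gamma - 1) * m * c^2."""
    gamma = 1.0 / math.sqrt(1.0 - v_fraction_of_c ** 2)
    return (gamma - 1.0) * mass_kg * C ** 2

ship_mass = 1e9  # kg, assumed for illustration (roughly ten large cargo ships)
ke = kinetic_energy(ship_mass, 0.9)       # energy for a one-way boost to 0.9c
seconds_of_sun = ke / SOLAR_LUMINOSITY    # same energy, as seconds of solar output

print(f"KE at 0.9c: {ke:.2e} J")
print(f"Equivalent to {seconds_of_sun:.2f} s of total solar output")
```

Under these generous assumptions the one-way boost comes out to roughly 1.2e26 J, about a third of a second of the Sun's total output; a realistic mission carrying its own propellant, decelerating at the destination, and tolerating inefficiencies would need vastly more.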
I’ll make a digression: existential risk seems to increase with the amount of energy at our disposal (only a correlation, yes, but a lot of power, i.e. energy, seems necessary to destroy the conditions for life on this planet, and the more power we have, the easier it becomes). As I pointed out, in the book Power, Richard Heinberg makes the case that we are overpowered: we have so much energy that we risk wiping ourselves out by accident. Worse yet, the goal of our current economic and political structures is to get even more power—forever.
So I’d expect a society with this amount of power to face many other problems before getting to “harnessing the sun”. The Fermi paradox seems to point this way.
But even then, this doesn’t really address the point I made above about animal suffering.
It’s very interesting to have your views on this.
I’m aware that cultivated meat could help solve the issue, but this seems far from automatic—many people in animal welfare don’t seem so optimistic about that. It might not work out for quite a number of reasons:
https://www.forbes.com/sites/briankateman/2022/09/06/optimistic-longtermism-is-terrible-for-animals/?sh=328a115d2059
https://www.forbes.com/sites/briankateman/2022/12/07/if-we-dont-end-factory-farming-soon-it-might-be-here-forever/?sh=63fa11527e3e
Not really—about six hours of the energy produced by the sun.
Oh—sorry—I meant to reply to AnonymousAccount instead—it was their text that I was quoting. I’ve now put it there—should I delete this one?
Yeah, I thought it was something like that ^^
But no, let’s keep that here.