Why do you think that data poisoning, scaling and water scarcity are a distraction to issues like AI alignment and safety? Am I missing something obvious? Did conflicts over water happen too few times (or not at all)? Can we easily deal with data poisoning and model scaling? Are AI alignment and safety that much bigger issues?
To clarify, I’m mainly just sceptical that water scarcity is a significant consideration with respect to the trajectory of transformative AI. I’m not here arguing against water scarcity (or data poisoning) as an important cause to focus altruistic efforts on.
Hunches/reasons that I’m sceptical of water as a consideration for transformative AI:
I doubt water will be a bottleneck to scaling
My doubt here mainly just stems from a poorly-argued & uncertain intuition about other factors being more relevant. If I were to look into this more, I would try to find some basic numbers about:
How much water goes into running data centers, relative to the other uses those same (fungible) water sources serve?
What proportion of a data center’s total expenditures are used to purchase water?
I’m not sure how these things work, so don’t take my own scepticism as grounds to distrust your own (perhaps-better-informed) model of these things.
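The second question above lends itself to a quick Fermi estimate. A minimal sketch of how I'd structure it; every number below is an illustrative placeholder I made up, not a sourced figure, so swap in real data before drawing any conclusion:

```python
# Back-of-envelope: what share of a data center's operating costs go to water?
# ALL constants are ASSUMED placeholder values, not real figures.

WATER_USE_M3_PER_DAY = 2_000      # assumed daily cooling-water use (cubic meters)
WATER_PRICE_USD_PER_M3 = 1.0      # assumed industrial water price (USD per m^3)
TOTAL_OPEX_USD_PER_DAY = 500_000  # assumed total daily operating expenditure (USD)

water_cost_per_day = WATER_USE_M3_PER_DAY * WATER_PRICE_USD_PER_M3
water_share = water_cost_per_day / TOTAL_OPEX_USD_PER_DAY

print(f"Water cost: ${water_cost_per_day:,.0f}/day "
      f"({water_share:.2%} of assumed opex)")
```

Under these made-up numbers the water share comes out tiny, which is the shape of result that would support my intuition; if real figures put it at a large fraction of opex instead, I'd update toward water mattering for scaling.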
Assuming scaling is bottlenecked by water, I think great-power conflicts are unlikely to be caused by it
Assuming conflicts do happen due to a water bottleneck, I don’t think this will significantly influence the long-term outcome of transformative AI
Note: I’ll read if you respond, but I’m unlikely to respond in turn, since I’m trying to prioritize other things atm. Either way, thanks for an idea I hadn’t considered before! : )
Thanks a lot for your feedback!