What’s the reason this would be worse than current data centers? Does this require one to believe that sentience is limited to/substantially more likely in biological neurons than silicon? I would like to understand what the net effect of this is likely to be and why the same arguments wouldn’t apply to other data centers currently being built.
“Does this require one to believe that sentience is limited to/substantially more likely in biological neurons than silicon?”
"More likely" yes, "limited to" no.
I’d also add that this is a reasonable stance even for people who put a lot of credence in physicalist/functionalist theories. Whatever your theoretical commitments, we know with more certainty than almost anything else that human brains can support consciousness, so it makes sense to be particularly worried here.
One other point I’d add is that these concerns can be complementary. I mentioned this in a previous post: building the institutional capacity and legal frameworks to protect potentially novel forms of consciousness from commercial exploitation (even if, in this case, partly biological) could set important precedents for other forms later. Digital minds should also not be used as mere computational resources, and if framed correctly, regulation here could lay groundwork that assists in that effort as well.