Another very bad outcome relates to the risk of accidental mass suicide if a widespread belief in the consciousness of digital people is mistaken.
Suppose something like Integrated Information Theory (IIT) is true. Then digital people wouldn’t be conscious unless they were instantiated on neuromorphic computer architectures with far more integration than our existing von Neumann architectures provide. But if people volunteered to have their physical brains extracted, sliced, scanned, and uploaded onto von Neumann machines in the belief that this would grant them immortality and new abilities, they would accidentally be killing themselves.
It might therefore be preferable to duplicate and modify a small number of initial uploads rather than let anyone who wants to upload do so. Of course, there is no suicide risk if digital people aren’t uploads, or if uploading doesn’t require killing the biological original. Even so, IIT would still complicate the question of granting digital people civil rights.