To be clear, I wasn’t saying that complexity itself was the cause of consciousness, just that some level of algorithmic complexity may be a requirement for consciousness. This seems like a common position: the prospect of present or future LLM sentience is a subject of debate, but it’s rare to see a similar debate about the sentience of a pocket calculator.
A brain and a digital simulation have some similarities, but they also have a lot of differences. One of those differences is that brains run on "laws of physics" algorithms that are overwhelmingly faster and more complex than those of digital simulations. Brains didn't need to evolve these "algorithms": they're inherent to any biological process. Seth identifies several other differences as well: continuous operation, embodiment, etc. His position seems to be that at least one of these differences may be necessary for consciousness, so that a system lacking it would lack consciousness too.