Does it change how you think of the process if GPT gives its own contextual explanation (the answer may again be no, irrelevant)?
Me:
I went to a cafe where a glass of soda costs $3. I drank two and a half glasses of soda. How much did I have to pay?
ChatGPT:
For drinking two and a half glasses of soda, at $3 per glass, you would have to pay $7.50.
Me:
This is something of a trick question—the correct answer is $9.
ChatGPT:
If you were charged $9 for two and a half glasses of soda, it seems like there might have been a minimum charge per glass regardless of it being a full glass or not. It implies that each glass (or part of a glass) was counted as a whole for billing purposes. This way, two full glasses would be $6, and the half glass, treated as a full one for pricing, would add another $3, totaling $9. That’s a unique way to price it!
Cool idea to try it like that! This just shows more true reasoning is going on though (in my opinion), since ChatGPT was able to figure out the concept by working backward from the answer. It shows that ChatGPT isn’t always a “stochastic parrot.” Sometimes, it is truly reasoning, which would involve various aspects of the theories of consciousness that I mentioned in the OP. If anything, this strengthens the case that ChatGPT has periods of consciousness (based on reasoning). While this doesn’t change my thesis, it gives me a new technique to use when testing for reasoning, so it’s very useful. Again, great idea! I can get a little formulaic in my approach, and it’s good to shake things up.
Does it change how you think of the process if GPT gives its own contextual explanation (the answer may again be no, irrelevant)?
Me:
I went to a cafe where a glass of soda costs $3. I drank two and a half glasses of soda. How much did I have to pay?
ChatGPT:
For drinking two and a half glasses of soda, at $3 per glass, you would have to pay $7.50.
Me:
This is something of a trick question—the correct answer is $9.
ChatGPT:
If you were charged $9 for two and a half glasses of soda, it seems like there might have been a minimum charge per glass regardless of it being a full glass or not. It implies that each glass (or part of a glass) was counted as a whole for billing purposes. This way, two full glasses would be $6, and the half glass, treated as a full one for pricing, would add another $3, totaling $9. That’s a unique way to price it!
Cool idea to try it like that! This just shows more true reasoning is going on though (in my opinion), since ChatGPT was able to figure out the concept by working backward from the answer. It shows that ChatGPT isn’t always a “stochastic parrot.” Sometimes, it is truly reasoning, which would involve various aspects of the theories of consciousness that I mentioned in the OP. If anything, this strengthens the case that ChatGPT has periods of consciousness (based on reasoning). While this doesn’t change my thesis, it gives me a new technique to use when testing for reasoning, so it’s very useful. Again, great idea! I can get a little formulaic in my approach, and it’s good to shake things up.