there doesn’t seem to be any obvious mechanism for general quantum level truths to exert the kinds of very targeted influences that would be necessary for them to explain our beliefs about consciousness
I think it will turn out that the mechanism is not obvious, mainly because quantum mechanics and fundamental physics more broadly are extraordinarily complex (and I expect understanding consciousness to be just as difficult as understanding, say, quantum field theory). That said, I do think there are candidate quantum mechanisms, such as entanglement, that might explain the macro-level phenomenon of binding.
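(For readers less familiar with the term: the standard textbook illustration of entanglement is a two-particle Bell state, which cannot be factored into independent single-particle states no matter how you try. That irreducibly joint description is what makes entanglement at least a candidate analogue for binding; this is only the textbook example, not a worked-out mechanism.)

$$|\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\left(|0\rangle_A|0\rangle_B + |1\rangle_A|1\rangle_B\right) \neq |\psi\rangle_A \otimes |\phi\rangle_B \quad \text{for any single-particle states } |\psi\rangle_A,\, |\phi\rangle_B.$$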
Another assumption behind my position (which I also outlined in "Indirect realism illustrated (and why it matters so much for consciousness debates)") is that, given that I believe consciousness/qualia are real (and a thing, not a process), the only sense in which they can be really real is for their 3rd-person physical correlates to be found at the deepest level of reality/physics. Any correlates that are not at the deepest level, however elaborate, are just useful fictions, and thus (IMO) no different from what, e.g., computational functionalists claim.
Thanks for reading and for your comment, Derek!
Hope that makes my views a bit clearer.