I am far from an unbiased party, since I briefly worked on the moral weight project as a (paid) intern for a couple of months, but for what it's worth, as a philosophy of consciousness PhD, it's not just that I, personally, from an inside point of view, think weighting by neuron count is a bad idea; it's that I can't think of any philosopher or scientist who maintains that "more neurons make for more intense experiences", or any philosophical or scientific theory of consciousness that clearly supports this. The only place I've ever encountered the view is among EAs, usually without a formal background in philosophy of mind or cognitive science, defending focusing near-termist EA money on humans. (Neuron count might correlate with other stuff we care about apart from experience intensity, of course, and I'm not a pure hedonist.)
For one thing, unless you are a mind-body dualist (and the majority of philosophers are not), it doesn't make sense to think of pain as some sort of stuff/substance, like water or air, that the brain can produce more or less of. And I think that sort of picture lies behind the intuitive appeal of "more neurons = more intense experiences".
I'm sure that's true, but is anyone (or more than 1 or 2 people) outside of EA even asking and publishing on the question of comparing the intensity of pain between species, and how much neuron count might matter for this?
I tried to read the articles referenced in the moral weights article about neuron count, and the ones I could read (that weren't behind a paywall) didn't discuss neuron count or comparing the intensity of pain between species. I'd be interested to read anything which discusses this directly outside of the EA realm.
I kind of thought this was part of the reason why the moral weights project is so important: groups of researchers are deeply considering these questions in ways that perhaps others haven't before.
Yeah, that's a fair point; maybe I haven't seen it because no one has considered how to do the weighting at all outside EA. But my sense is that, at the very least, many theories are unfriendly to weighting by neuron count (though probably not all).
My inclination was that they were more cognitive-pathway theories of how pain works, which might help answer the question of how likely animals are to feel pain, rather than saying much about quantifying it (which is where neuron count might come in). But I skim-read and didn't understand about half of it well, so I could easily have missed things.
Oh, this is a very interesting data point; I didn't know neuron counts weren't even proposed as a serious option in the literature.
I can't be sure that they aren't somewhere, as philosophy of consciousness, let alone cognitive science, is a surprisingly big field, and this is not what I specialised in directly. But I have never seen it proposed in a paper (though I haven't deliberately searched).
I had a bit of a scan and I couldn't find it outside the EA sphere.