Insect farming might cause more suffering than other animal farming
The edible insect market has been growing and is projected to continue expanding rapidly. There is a significant body of research suggesting that insects show signs of consciousness, and I think it’s important to seriously compare insect welfare interventions with other animal welfare interventions.
Brian Tomasik’s “How Much Direct Suffering Is Caused by Various Animal Foods?” compares the days of suffering per kilogram for different animals. I’d like to build on that post with a Guesstimate model of where insect suffering may lie on this list. I used cricket suffering to create the model, and it’s important to note that the numbers differ depending on which insects are farmed: mealworms and black soldier flies likely have different weights and sentience multipliers.
https://www.getguesstimate.com/models/21398
My reasoning for the numbers I used is given in the sheet: if you hover over the fields, you will see a detailed description.
The model shows a mean value of 1300 days of suffering per kg of insects consumed, which is twice as high as the highest animal on Brian Tomasik’s list (farmed catfish). The median is considerably lower (35), and the 5th and 95th percentiles are approximately 0.32 and 2500.
It’s interesting to note that the insect sentience multiplier would have to be around 10^-4 for the days of suffering per kg to be roughly the same as a broiler chicken’s.
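For readers who want to poke at the model’s structure without opening Guesstimate, here is a rough Monte Carlo sketch in Python. Every distribution and parameter below (crickets per kg, lifespan, suffering per day, pain-of-death day-equivalents, sentience multiplier) is an illustrative stand-in chosen to land in roughly the same ballpark as the headline numbers, not the exact input from the sheet:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Illustrative stand-ins, NOT the exact inputs of the Guesstimate sheet
# (hover over its fields for the actual reasoning behind each number).
crickets_per_kg   = rng.lognormal(np.log(5500), 0.3, N)  # assumed crickets per kg consumed
lifespan_days     = rng.lognormal(np.log(42), 0.2, N)    # assumed ~6-week farmed lifespan
suffering_per_day = rng.beta(7, 3, N)                    # mean ~0.7, bounded in [0, 1]
death_equiv_days  = rng.lognormal(np.log(1.0), 1.0, N)   # pain of death, in day-equivalents
sentience_mult    = rng.lognormal(np.log(2e-4), 2.5, N)  # cricket vs. human, hugely uncertain

days_per_kg = (crickets_per_kg * sentience_mult
               * (lifespan_days * suffering_per_day + death_equiv_days))

print(f"mean:     {days_per_kg.mean():,.0f}")
print(f"median:   {np.median(days_per_kg):,.1f}")
print(f"5th/95th: {np.percentile(days_per_kg, [5, 95])}")

# Rough check of the broiler comparison: holding everything else at central
# values, what sentience multiplier yields broiler-level days of suffering/kg?
broiler_days_per_kg = 15  # placeholder; see Tomasik's post for the actual figure
central = 5500 * (42 * 0.7 + 1.0)
print(f"multiplier for broiler parity: {broiler_days_per_kg / central:.1e}")  # ~1e-4
```

The outputs won’t match the sheet exactly, but the heavy right skew (mean far above the median) falls out of the wide lognormal on the sentience multiplier.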
If correct, these estimates suggest that insect welfare deserves a high priority compared to other farmed animal welfare.
Here are some possible interventions. In addition to legislation restricting insect farming (such as bans on using insects as feed for livestock), I think there could be a lot of value in farming insects with fewer neurons than crickets (like mealworms), which could therefore have a smaller sentience multiplier. Interventions to ban likely more painful killing methods, such as boiling, could also be impactful.
I’d very much appreciate discussion and feedback on the model and where you think it could be improved.
I made this model during a one-day hackathon at the Atlas Fellowship.
I don’t have time to write up an argument for the claim now, but as a philosophy of mind PhD, I am extremely dubious that fewer neurons on its own means “less sentience” or less intense pains in any sense. Of course, lots of people think otherwise, and you should definitely give their views some weight in expectation, but I thought it was worth flagging that at least one person with some relevant expertise is highly skeptical of the assumption.
Agreed.
Also, those views are surprisingly common even though there doesn’t seem to be much justification for being certain of them (or even, say, >90% confident), which a lot of people seem to be.
Humans making the claim that there’s “less sentience” or less intense pain in insects only have their mental experiences, combined with knowledge of their own nervous systems, as a reference.
Some relevant posts:
https://reducing-suffering.org/is-brain-size-morally-relevant/
https://forum.effectivealtruism.org/posts/H7KMqMtqNifGYMDft/differences-in-the-intensity-of-valenced-experience-across
RP has some posts coming out on this soon that I imagine will cover some of your arguments: https://forum.effectivealtruism.org/s/y5n47MfgrKvTLE3pw
I also lend significant (~25%) credence to a similar view, which I’d summarize as “animals with fewer neurons aren’t less sentient; their sensations are just less complex”.
I used to think that babies had fewer neurons than adults, and yet people would not consider babies lower in moral patienthood; but I was mistaken: apparently neurogenesis ends early in development (and whether adult neurogenesis occurs at all is still in question).
However, perhaps we don’t just want to go by neuron count: the number of synaptic connections seems perhaps just as important.
Speaking of synaptic connections, there’s another problem: human adults have fewer of them than human infants do. Peter Huttenlocher “showed that synaptic density in the human cerebral cortex increases rapidly after birth, peaking at 1 to 2 years of age, at about 50% above adult levels. It drops sharply during adolescence then stabilizes in adulthood, with a slight possible decline late in life.”
But that apparently didn’t make most people think that human babies are more sentient or have more complex experiences. Actually, we pretty much thought the reverse: until the mid-1980s, most doctors did not use anesthesia when operating on newborn infants.
Yeah, I agree that it’s very uncertain. However, I was unsure how to quantitatively measure sentience with anything other than neuron count, although my range should probably be wider. Do you have any other ideas on how to quantitatively approximate sentience? What do you think the range should be?
So if by measuring sentience you mean to ask ‘is X sentient?’, in the sense of ‘does X experience at least something, anything at all?’, then one view is that it’s a binary property. All or nothing, 0 or 1: the lights are either on or off, so to speak. I think this is why David scare-quoted “less sentience”.
In this case, we might be able to speak of probabilities of X being sentient or not. I find Brian Tomasik’s idea of a ‘sentience classifier’ (replace the moral value output with P(X is sentient)) a useful way of thinking about this.
There is a view that there could be fractional experiences. I myself don’t understand how that would be possible but there is speculation about it.
However, if by sentience you instead mean intensity of experiences etc. (given that at least something is experienced) then the 2020 moral weight series is a good starting point. I personally disagree with some of their conclusions but it’s a really good framework for thinking about it.
My own view, which is similar if not identical to Tomasik’s, is that when thinking about any of these, it’s ultimately ‘up to you to decide’. I would add that human intuitions weren’t evolved to find the ground truth about what is sentient and the content of such experiences. Instead, they developed due to the need to analyze the behavior of other agents. Our intuitions could be exactly right or entirely wrong or somewhere in between but there’s no way to really know because we aren’t other minds.
FWIW, I think black soldier fly larvae will be more representative of farmed insects. For example, this farm is anticipated to produce 80,000 tonnes of BSF larvae (dried weight, as 60,000 tonnes of protein meal and 20,000 tonnes of oil), and I think that comes out to around 1 to 2 trillion larvae per year: https://www.reuters.com/article/us-archer-daniels-innovafeed-insects-idUSKBN28001C
Ynsect also has a very large mealworm farm under construction, but most other insect farmers (by production and investment; Protix is another big one) seem to be focused on BSF larvae. I haven’t checked whether Ynsect alone is enough to skew them towards mealworms, though.
I think these projections for industry growth are pretty reasonable, with about 500,000 tonnes of insect protein (for farmed animals and pets) in 2030: https://insectfeed.nl/2021/02/24/no-longer-crawling-insect-protein-to-come-of-age-in-the-2020s/
I think that comes out to around 10 to 20 trillion insects in 2030.
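A quick back-of-envelope check on both counts. The per-larva dried weight (roughly 40 to 80 mg) is my assumption, and I’m reading the 500,000 tonnes as protein meal, with meal being ~75% of dried weight (the 60k/80k split from the farm above):

```python
GRAMS_PER_TONNE = 1e6

def larvae_count(dried_tonnes: float, grams_per_larva: float) -> float:
    """Number of larvae implied by a dried-weight tonnage."""
    return dried_tonnes * GRAMS_PER_TONNE / grams_per_larva

# InnovaFeed farm: 80,000 tonnes of dried larvae per year.
print([f"{larvae_count(80_000, w):.1e}" for w in (0.08, 0.04)])
# -> ['1.0e+12', '2.0e+12'], i.e. ~1-2 trillion larvae/year

# 2030 projection: 500,000 tonnes of protein meal; assuming meal is ~75%
# of dried weight, that's roughly 667,000 tonnes of dried larvae.
print([f"{larvae_count(500_000 / 0.75, w):.1e}" for w in (0.08, 0.04)])
# -> ['8.3e+12', '1.7e+13'], i.e. roughly 10-20 trillion insects
```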
The idea that keeping 5500 crickets in farmed-insect conditions and then killing them is equivalent to ~1500 days of a human’s suffering (the expected value from your analysis) seems very high, and wildly incompatible with the behaviors and beliefs of everyone I’ve ever met; but that doesn’t mean it’s not true.
My guess is that your suffering/day of life is too high (an expected value of 0.7! They seem to have a chill life, just bouncing around and eating), and that the sentience multiplier is too high (I think most people would give up a lot more than the torture of 6000 crickets for a day to prevent the torture of a human for a day).
I think the big problem with the suffering/day-of-life estimate is that it assumes suffering can’t go negative. If you think suffering can go as low as 0.015 suffering-units, it doesn’t seem like much of a stretch to think their lives could be net positive.
(this is a general problem with reducing-suffering derived estimates imo)
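To make the sign point concrete, here’s a tiny simulation with purely made-up bounds: a positive suffering floor forces every simulated life to sum to net suffering by construction, while letting welfare per day dip below zero makes a meaningful share of lives come out net positive.

```python
import numpy as np

rng = np.random.default_rng(0)
days = 42  # assumed farmed-cricket lifespan

# Floor at 0.015 suffering-units: every life sums to positive suffering.
floored = rng.uniform(0.015, 0.7, 100_000) * days
print((floored > 0).all())   # True by construction

# Allow net-pleasant days (the -0.35 lower bound is purely illustrative):
signed = rng.uniform(-0.35, 0.7, 100_000) * days
print((signed < 0).mean())   # ~1/3 of simulated lives come out net positive
```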
I think that’s a really interesting point. Curious what you think the lower bound for suffering would be if it could go negative (basically, if they’re living net-positive lives)?
What do you think would be most people’s cutoff? My guess would be that most people see the two types of suffering as qualitatively different such that no amount of insect suffering is comparable to human suffering.
I think another reason insect farming should be given more attention is its long-term implications. It seems the most likely form of animal farming to be done on other planets or space stations, and the least likely to be replaced by alternative proteins (and, incidentally, some people view insects themselves as an alternative protein).
Could you explain the “Number of days of life equivalent to pain of suffering” parameter in your Guesstimate model? I don’t get that one.
Sorry, I meant number of days equivalent to pain of death (trying to convert it into the same units).