Regarding whether to focus on numbers or biomass, I think the following articles could be relevant:
Brian Tomasik’s Is Brain Size Morally Relevant? explores arguments for brain-size/complexity weighting and for “equality weighting” (giving equal consideration to systems regardless of size). Despite the title, the article discusses not just brain size but sometimes also mental complexity (“Note: In this piece, I unfortunately conflate “brain size” with “mental complexity” in a messy way. Some of the arguments I discuss apply primarily to size, while some apply primarily to complexity.”).
The question of whether complexity matters is related to the intensity of experiences. Jason Schukraft’s Differences in the Intensity of Valenced Experience across Species explores this and seems to conclude that, for less familiar systems (well, animals; I think only animals are explored in that piece), it’s really unclear whether the “intensity range” is larger or smaller: they could potentially suffer more intensely, or they could potentially suffer less intensely.
Another article by Brian Tomasik, Fuzzy, Nested Minds Problematize Utilitarian Aggregation, raises the question of how to divide physical reality into minds. Like Brian, I find it most intuitive to “Sum over all sufficiently individuated “objects”” and to count subsystems (e.g., suffering subroutines) too, although I think the question is still super confusing. The article also notes that “[p]erforming any exact computations seem intractable for now”, although I don’t know enough math to say.
I think it’s ultimately all quite subjective, though (which doesn’t mean it’s not important!). I don’t see how we can definitively show that some approach is correct. (Brian Tomasik describes this problem as a “moral question” and frames it as having “[n]o objective answer”, but I find his language slightly confusing. Rather, I think there is an answer regarding suffering/sentience; it’s just that we might never have the tools to know that we’ve reached it. In the absence of those tools, much of what we’re doing when saying “there’s more suffering here” or “there’s more suffering there” might be, or might be compared to, a discussion of morality.) We’re also relying on a bunch of intuitions that were shaped by factors that don’t necessarily guide us to truths about suffering/sentience (I explore this in the Factors affecting how we attribute suffering section in my recent microorganism post).
I think that even if we had a robust scientific understanding of, e.g., the human brain, we would still think that human suffering exists. I don’t think understanding the physical mechanisms behind a particular system means that it can’t be associated with a first-person experience.
So I’m not looking at the topic only because it’s “important, if true”. As I explain in the piece, I think there are signs pointing toward “there’s at least a small chance”, based on reflecting on our concept of what we take to be evidence of suffering, and on aspects of microbes that fit possible extended criteria.
I think there’s a lot of scientific evidence against astrology. On the other hand, how we should attribute mental experiences to physical systems seems to be an open question because of the problem of other minds.
I am actually concerned that there’s some chance various nonliving structures suffer; digital minds are one example. I don’t know much about astronomy/astrophysics, but I doubt that stars exhibit functioning that strongly matches, e.g., the damage responses exhibited by animals. This isn’t to say there’s absolutely zero chance we can attribute negative experiences to such a system, but I’d consider it much more unlikely than for microorganisms, which better match the damage responses of animals. It might be valuable for someone to look into whether some astronomical objects match some interpretations of what counts as evidence of suffering, and to what extent. I think there’s quite a lot of uncertainty surrounding questions in the philosophy of mind (e.g., some philosophers endorse panpsychism while others might be more skeptical even about the sentience of non-human animals).