“if you are only considering the impact on beings alive today...factory farming”
The interventions you are discussing don’t help any beings alive at the time, but only affect the conditions (or existence) of future ones. In particular, cage-free campaigns, and campaigns for slower-growth genetics and lower crowding among chickens raised for meat, are all about changing the conditions into which future chickens will be born, and don’t involve moving any particular chickens from the old systems to the new ones.
I.e. the case for those interventions already involves rejecting a strong presentist view.
“That’s reasonable, though if the aim is just “benefits over the next 50 years” I think that campaigns against factory farming seem like the stronger comparison:”
Suppose there’s an intelligence explosion in 30 years (not wildly unlikely in expert surveys), followed by an expansion of population by 3-12 orders of magnitude over the subsequent 10 years (with AI life of various kinds outnumbering both the humans and the non-human animals alive today, and with vastly more total computation). Then almost all the well-being of the next 50 years lies in that period.
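To make the arithmetic concrete, here is a back-of-envelope sketch (the 10^3 multiplier and the 30/20 year split are illustrative assumptions, taking the low end of the range above, and treating aggregate well-being per year as proportional to population):

\[
\frac{20 \times 10^{3}}{30 + 20 \times 10^{3}} \approx 0.9985,
\]

i.e. roughly 99.9% of the well-being in the 50-year window would fall after the explosion, and the share only grows with larger multipliers.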
Also, in that scenario, existing beings could enjoy an accelerated subjective speed of thought and greatly enhanced well-being, so most of the QALY-equivalents for long-lived existing beings could lie in that period as well.
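A similar sketch for existing beings (the 100x subjective speedup is again an illustrative assumption, not a figure from the surveys): a being that lives 30 calendar years before the explosion at ordinary speed and 20 calendar years after it at 100x subjective speed accrues

\[
\frac{20 \times 100}{30 + 20 \times 100} \approx 0.985
\]

of its subjective life-years after the explosion, so most of its QALY-equivalents would indeed fall in that period.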
Mea culpa: I switched from “impact on beings alive today” to “benefits over the next 50 years” without noticing.
Agree with the above, but wanted to ask: what do you mean by a ‘strong presentist’ view? I’ve not heard/seen the term and am unsure what it is contrasted with.
Is ‘weak presentism’ that you give some weight to non-presently existing people, ‘strong presentism’ that you give none?
“Is ‘weak presentism’ that you give some weight to non-presently existing people, ‘strong presentism’ that you give none?”
In my comment, yes.