I’m not sure what your “a claim about the material world or a claim about anthropics” distinction means; my instinct is that, e.g., “we are not simulated” is an empirical proposition, and the reasons we have for assigning it a particular probability are anthropic.
If I know with P~=1 certainty that there are 1000 observers-like-me and 999 of them are in a simulation (or Boltzmann brains, etc), then there’s at least two reasonable interpretations of probability.
1. The algorithm that instantiates me has, with ~100% certainty, at least one representation outside the simulation; therefore the “I” that matters is not in a simulation, P~=1.
2. Materially, the vast majority of observers like me are in a simulation, so the probability that I happen to be the one instance outside the simulation is P~=0.001 (0.1%).
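The two interpretations can be made concrete with a toy calculation (the 1000/999 split is the illustrative number from above, not anything canonical):

```python
# Toy sketch: 1000 observers-like-me, of whom 999 are simulated.
total_observers = 1000
simulated = 999
outside = total_observers - simulated

# Interpretation 1: "some copy of me exists outside a simulation".
# This is true as long as at least one instance is outside.
p_copy_exists_outside = 1.0 if outside >= 1 else 0.0

# Interpretation 2: self-locating credence that *this* instance is
# the one outside the simulation (uniform over observers-like-me).
p_this_instance_outside = outside / total_observers

print(p_copy_exists_outside)    # 1.0
print(p_this_instance_outside)  # 0.001
```

The gap between P~=1 and P~=0.001 is exactly the gap between the two operationalizations, not a disagreement about any material fact.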
Put another way, the philosophical question is whether P(we’re in a simulation) is most naturally operationalized as “does there exist a copy of me outside of simulation” or as “of the copies of me that exist, what fraction are in a simulation.”