Executive summary: The post discusses different types of uncertainty in cost-effectiveness analyses (CEAs), specifically epistemic uncertainty due to lack of knowledge vs. statistical uncertainty inherent to the data.
Key points:
Epistemic uncertainty arises when generalizing results to new contexts, while statistical uncertainty reflects irreducible randomness in the data.
Monte Carlo simulation is used in CEAs to quantify uncertainty by sampling input variables.
Confidence intervals express statistical uncertainty, while credible intervals represent epistemic uncertainty.
Care is needed when using summary statistics like effect sizes in models, as they contain both types of uncertainty.
Modeling external validity as epistemic uncertainty is an open challenge.
Overall, epistemic uncertainty about an average aggregates the statistical uncertainty of the underlying data.
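The Monte Carlo approach described above can be sketched as follows. This is a minimal illustration, not the post's actual model: all parameter values (the effect-size distribution, the cost distribution) are hypothetical, and the input draws here represent epistemic uncertainty about the average inputs, per the key points above.

```python
import random

random.seed(0)  # for reproducibility of this illustration

def simulate_cost_effectiveness(n_draws=10_000):
    """Monte Carlo sketch of a CEA: sample uncertain inputs,
    propagate them through a simple model, and summarize the
    resulting distribution. All numbers are hypothetical."""
    results = []
    for _ in range(n_draws):
        # Epistemic uncertainty about the true average effect per person
        # (e.g. a belief distribution around a published effect size).
        effect_per_person = random.gauss(0.5, 0.1)
        # Epistemic uncertainty about the cost per person reached.
        cost_per_person = random.lognormvariate(1.0, 0.3)
        results.append(effect_per_person / cost_per_person)
    results.sort()
    # Summarize with the median and a central 90% interval.
    return {
        "median": results[n_draws // 2],
        "p5": results[int(0.05 * n_draws)],
        "p95": results[int(0.95 * n_draws)],
    }

summary = simulate_cost_effectiveness()
```

Note that the resulting interval quantifies only the uncertainty fed into the inputs; as the post's key points warn, an effect size taken from a study already mixes statistical and epistemic uncertainty, so what each input distribution represents must be chosen with care.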
This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.