EA Undervalues Unseen Data

Introduction

You rarely have access to all the information that would be useful when making a decision. When I talk about ‘unseen data’, this kind of missing information is what I mean.

For now, let’s keep the idea of ‘data’ as broad as possible: it can include facts and scientific studies, but also life experiences, files on a server somewhere, and an understanding of useful analytic processes to apply to a problem, among many other things.

As you have probably guessed, I would like to see EA attribute more value to this kind of unseen data. In particular, I would like to see EAs attribute more value to data they have not seen but that others might have.

I think this would be a good way both to increase EA’s chances of identifying problem areas it might have missed, and to generally increase the tractability of priority EA problem areas.

How I am estimating EA’s valuation of unseen data

I have a few hypothetical actions in mind that EAs can take (or not take) to demonstrate higher or lower valuations of unseen data. Taking more of these actions means a higher valuation, and vice versa:

Promotion of useful and undertaught decision-making tools. We can promote our values and suggest concrete actions, but we can also promote more general tools. An agent who is somewhat aligned with EA and who has data that lets them see a great opportunity others can’t will likely do a better job of noticing it and executing on it if they understand basic statistics, for example.

Dedicated funding of speculative bets. I have something like Tyler Cowen’s Emergent Ventures in mind. An EA organisation could try to get good at identifying exceptional individuals whose values overlap with EA and who have ideas for interventions outside of EA’s recommended problem areas. Among other things, this would act as a hedge on existing problem-area analysis.

Grassroots politics. Support the kinds of groups that Momentum might incubate. Diversity of experience means more unseen data, and activism is a place where otherwise underrepresented people are likely to show up. I suspect this is my most controversial suggestion, since politics is the mind-killer, but done in a careful and evidence-based manner I think it would, like SBF donating to Biden, avoid creating a discourse problem.

How much I would like to see EA value unseen data

I don’t know how to identify an optimal amount here, but I do feel confident about specifying an approximate lower bound of action that would move EA’s valuation in the right direction.

I would like to see EA study examples of initiatives comparable to the ones above. For example, very good things happened when people in medicine became more scientifically minded; in what ways might this phenomenon generalise to, or be reproducible within, other industries? What can we learn so far from Emergent Ventures’ outcomes, from IDinsight’s endorsement of the Sunrise Movement, or from the impact of historical protest movements?

This seems like a relatively low-cost bet with a plausible shot at uncovering excellent interventions.

One step up from this would be experimenting with low-cost forms of these kinds of interventions directly. Cambridge University runs an AGI Safety Fundamentals course; what about something comparable for gifted teens on Bayesian statistics?
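To make this concrete, here is a minimal sketch of the kind of exercise such a course might open with: applying Bayes’ rule to update a belief after seeing evidence. The scenario and numbers below are hypothetical, chosen only for illustration.

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """P(H | E) via Bayes' rule, given P(H), P(E|H), and P(E|~H)."""
    numerator = p_evidence_given_h * prior
    total_evidence = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / total_evidence

# A hypothesis with a 1% prior, a test with 90% sensitivity and a
# 5% false-positive rate: how likely is the hypothesis after one
# positive result?
print(posterior(0.01, 0.9, 0.05))  # ~0.154, not 0.9
```

The counterintuitive gap between the test’s accuracy and the posterior (base-rate neglect) is exactly the sort of general-purpose lesson that helps someone with unseen data evaluate it well.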

I see almost no interest in these sorts of initiatives at present, which I think represents an undervaluation. (And if I am simply not aware of existing work which fits these criteria, I look forward to learning about it in the comments!)

How this is different from outreach (which EA already does)

I first encountered effective altruism when I met 80K at a careers fair, and I have received 80K 1:1 coaching, which is an amazing service. I also recently learned of Peter McIntyre’s https://non-trivial.org/.

These are all good initiatives, but I would also like to see EA trying to harness more people’s unseen data before or even without trying to convince them of the EA worldview.

Final Thoughts

This problem has an inverse, which is overvaluing seen data, but if I end up writing about it I’ll do so in a separate post. For now, I hope I have encouraged you to consider your own valuation of unseen data.