Someone shared a project idea with me and, after I indicated that I didn’t feel very enthusiastic about it at first glance, asked me what reservations I had. Their project idea was focused on reducing political polarization and was framed as motivated by longtermism. I wrote the following and thought it might be useful for other people too, since I have similar thoughts in reaction to a large fraction of project ideas.
“My main ‘reservations’ at first glance aren’t so much specific concerns or downside risks as just ‘I tentatively think this doesn’t sound like the kind of thing that will be very well-targeted or high-leverage for affecting the very most important things that happen this century and shape the course of the long-term future.’
Or to come at it from a different angle: I tentatively think that this doesn’t sound like something that was arrived at, or would be arrived at, by a search process that:
Started with the long-term future and our best understanding of the key risks, risk factors, security factors, points for intervention, etc. in mind,
Worked backwards from there, thinking about what a given person or set of people can most impactfully do to affect the most important stories, and
Considered many options.
It sounds more like a project that was arrived at either:
before becoming focused on improving the long-term future, or
without forcing oneself to come up with and red-team a theory of change for how this makes a major difference to the long-term future, or
without considering at least 10 other options.
A similar but more concrete framing: If I imagine that an existential catastrophe has occurred by 2100, and I ask myself what the top 20 things were that contributed to it and the top 20 things that could’ve tractably been done to prevent it, is the level of polarization in liberal democracies on those lists? How high up?
Possibly you’d find these slides from a workshop I gave recently on theory of change useful, though that’s just sort of overviewing the topic as a whole (rather than being at all tailored to your project) and is somewhat focused on research projects.
But these views really are just ‘tentative’ and ‘at first glance’.
I do in fact think there’s a plausible case for reducing polarization being on that ‘top 20 list’, and being one of the things I’d land on if ‘backchaining’ from what matters most.
And things that aren’t at the very top of the list can still be worth doing if there’s a team who’d be damn good at them, and better at them than at other things.
So I think if I were evaluating this for a grant, I wouldn’t just quickly reject it, but rather try to:
(a) assess to what extent you really are motivated by longtermism/x-risk-reduction and hence will make lots of strategic and tactical decisions in ways subtly tailored to that
(b) hear your case for this being a top longtermist priority, and think further about that myself
(c) see whether you seem to have a great team and plan
And based on the tiny amount I currently know about your project, it’s probably 10-40% likely that after doing (a), (b), and (c), I’d ultimately recommend funding of ~$10-80k (if you thought that was an amount you could usefully use).”
Notes to readers:
This might sound either weirdly blunt or weirdly vague, but please bear in mind I’m lifting it out of context here!
If you can think of a good title to give this shortform to make it clearer who and what it’d be useful for, please let me know!
Feel free to share this with people or suggest I do something else with it, if that seems useful.