I think this is an excellent idea, but one thing I didn't understand: you said "catastrophic" risks and then mentioned foot and mouth disease, which doesn't seem very catastrophic to me.
Are you proposing this for what the EA community would call "existential" risks (e.g. unfriendly AI)? Or just things on the order of a few billion dollars of damage?
This is really aimed at things which could cause damages in perhaps the $100 million to $1 trillion range. I think this would have a broadly positive effect on larger risks through two routes:
First, some larger risks come with associated smaller-scale risks, and you'd do similar things to reduce each of them. I think this is the case with potential pandemic pathogen research. Requiring liability insurance won't get people to fully internalise the externalities associated with the tail risk, but it should make them take substantial steps in the right direction.
Second, a society which takes seriously a wider class of risks of unprecedented low-probability, high-stakes events will probably be better at responding to existential risks as well.