Hi Carl,
Do you have any thoughts on how the expected impact of the few hundred people working most directly on AGI safety compares with that of the rest of the world (on mitigating the risks from advanced AI)? I suppose a random person from those few hundred will have a much greater (positive/​negative) impact than a random person from the rest of the world, but it is not obvious to me that we can round the rest of the world's (positive/​negative) impact on mitigating the risks from advanced AI to zero, and growing awareness among the rest of the world will tend to increase its expected impact (for better or worse).
In terms of bio risk, even if covid did not result in more people working to fight existential pandemics, it may still have increased the number of people working on pandemics generally, which arguably has positive flow-through effects on mitigating the existential ones. In addition, covid may have increased the amount of resources that would be directed towards fighting a pandemic if one happens. It is unclear to me how large these effects are relative to a given increase in the number of people working most directly on reducing the risk from existential pandemics.
I think there are whole categories of activity that are not being tried by the broader world, but that people focused on the problem attend to, with big impacts in both bio and AI. That work has its own diminishing returns curve.