It looks to me like you are trying to estimate the difficulty of automating tasks by comparing to the size of brains of animals that perform the task (and in particular human brains). And you are saying that you expect it to take about 1e7 flops for each synapse in a human brain, and then define a probability distribution around there. Am I misunderstanding what’s going on here or is that a fair summary?
Pretty fair summary. 1e6, though, not 1e7. And honestly I could be pretty easily persuaded to go a bit lower by arguments such as:
- Max firing rate of 100 Hz is not the informational content of the channel (that buys maybe 1 OOM)
- Maybe a smaller DNN could be found, but wasn't
- It might take a lot of computational neurons to simulate the I/O of a single synapse, but it also probably takes a lot of synapses to simulate the I/O of a single computational neuron
Dropping our estimate by 1-2 OOMs would increase step 3 by 10-20 percentage points (absolute). It wouldn't have much effect on later estimates, as those are already conditional on success in step 3.
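To make the scaling concrete, here is a minimal back-of-envelope sketch of how the per-synapse factor feeds into a total-compute figure. The synapse count (~1e14) and the reading of 1e6 as FLOP per synaptic event are illustrative assumptions, not figures from the discussion above; only the 100 Hz rate and the 1e6 factor come from it.

```python
# Back-of-envelope: total FLOP/s implied by a per-synapse compute factor.
# Assumptions (illustrative, not from the discussion above):
#   SYNAPSES ~ 1e14  - rough human synapse count
#   FLOP_PER_SYNAPSE read as FLOP per synaptic event
SYNAPSES = 1e14
FIRING_RATE_HZ = 100      # max firing rate used as an upper bound
FLOP_PER_SYNAPSE = 1e6    # the factor under discussion

def brain_flops(flop_per_synapse: float) -> float:
    """Total FLOP/s implied by a given per-synapse factor."""
    return SYNAPSES * FIRING_RATE_HZ * flop_per_synapse

# The estimate is linear in the per-synapse factor, so dropping it
# by 1-2 OOMs drops the total by the same 1-2 OOMs.
for oom_drop in (0, 1, 2):
    total = brain_flops(FLOP_PER_SYNAPSE / 10**oom_drop)
    print(f"-{oom_drop} OOM: {total:.0e} FLOP/s")
```

Because the relationship is linear, arguments like the firing-rate point above (worth maybe 1 OOM) translate directly into the same OOM shift in the final figure.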