To defend the critique of Jeff’s reasoning a little, I do think Eliezer has a point with the 1a fallacy when he says that it’s hard to properly condition on the fact that you’ve been surprised. For example, according to Jeff’s estimates there’s a ~3% chance that getting and keeping you frozen goes well. If this does happen, you’d be hugely surprised at how well things have gone for cryonicists, and there should be some explanation for that other than pure chance. (The problem is you can’t search the space of explanations for each of the ~30 probabilities and adjust them all appropriately.) Here’s one simple explanation: cryonics gets big and successful. Perhaps that’s unlikely a priori, but given that something very weird happened, it becomes plausible. This would strongly shift the probabilities that determine whether something goes wrong with reviving. The biggest one, ‘The technology is never developed to extract the information’, would certainly be lower. In fact, nine out of ten of the probabilities would go down, though a few could also go up.
I doubt that Jeff managed to take all of these possibilities into account. Properly conditioning each of the ~30 events on all of the preceding ones going well seems like a pretty daunting task. That doesn’t mean Jeff is horrendously wrong, but he probably does make mistake 1a, because that’s just hard to avoid with this type of reasoning.
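To make the point concrete, here’s a minimal sketch (all numbers are invented for illustration, not Jeff’s actual estimates) of why multiplying marginal per-step probabilities understates the overall chance when the steps share a common cause like “cryonics gets big”:

```python
# Hypothetical illustration: a latent event ("cryonics gets big") that
# correlates the success probabilities of the individual steps.

p_big = 0.05  # invented prior that cryonics becomes big and successful

# Per-step success probabilities, conditional on the latent event.
# Three stand-in steps here; Jeff's chain has ~30.
steps_if_big = [0.9, 0.95, 0.9]
steps_if_small = [0.5, 0.6, 0.4]

def product(ps):
    out = 1.0
    for p in ps:
        out *= p
    return out

# Naive estimate: compute each step's marginal probability, then
# multiply them as if the steps were independent.
marginals = [p_big * b + (1 - p_big) * s
             for b, s in zip(steps_if_big, steps_if_small)]
naive = product(marginals)

# Proper estimate: multiply within each scenario first, then average
# over the latent event.
proper = (p_big * product(steps_if_big)
          + (1 - p_big) * product(steps_if_small))

print(f"naive = {naive:.3f}, proper = {proper:.3f}")
# The proper estimate comes out higher: once the first few steps go
# well, "cryonics got big" becomes likely, which pulls the remaining
# steps' probabilities up together.
```

The gap between the two numbers is exactly the effect of failing to condition on being surprised: the naive product treats each upstream success as uninformative about the downstream steps.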