You are right, I mistook which Metaculus question you linked to. However, it seems that even that question is somewhat ill-defined due to referencing the “weak AGI” definition of the other question. For that reason alone, I wouldn’t bet large sums of money on it. But it is not as problematic as the weak AGI question itself.
My bet does not depend on Metaculus’ definition of “weak AGI”. It relies only on Metaculus’ definition of SAI, which is given in a question about the time from “weak AGI” until SAI; the bet I suggested concerns only the date of SAI itself.
Your bet proposal talks about the Metaculus question “resolving non-ambiguously”. Since that question asks about the duration between “weak AGI” and “superintelligent AI”, it could fail to resolve non-ambiguously even if SAI is invented, because the definition of weak AGI is itself ambiguous. This might discourage people who believe in short SAI timelines from accepting the bet.
The bet is neutral for both parties if the Metaculus question resolves ambiguously: in that case, no money changes hands. A higher probability of ambiguous resolution therefore shrinks the expected value of the bet for both parties, but this could be mitigated by scaling up the stakes.
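To make the scaling argument concrete, here is a minimal sketch of the expected-value calculation. The function name, probabilities, and dollar amounts are all hypothetical illustrations, not terms of the actual proposed bet; the only assumption carried over from the discussion is that an ambiguous resolution pays out nothing.

```python
def expected_value(p_ambiguous, p_win_given_resolved, win_amount, loss_amount):
    # With probability p_ambiguous the question resolves ambiguously and
    # no money changes hands, contributing 0 to the expected value.
    p_resolved = 1 - p_ambiguous
    ev_if_resolved = (p_win_given_resolved * win_amount
                      - (1 - p_win_given_resolved) * loss_amount)
    return p_resolved * ev_if_resolved

# Baseline: no ambiguity risk, 60% chance of winning $100 vs. losing $100.
base = expected_value(0.0, 0.6, 100, 100)    # 20.0

# A 25% chance of ambiguous resolution shrinks the EV proportionally.
risky = expected_value(0.25, 0.6, 100, 100)  # 15.0

# Scaling both stakes by 1 / (1 - p_ambiguous) restores the baseline EV.
scaled = expected_value(0.25, 0.6, 100 / 0.75, 100 / 0.75)  # 20.0
```

The last line illustrates the mitigation: multiplying both sides’ stakes by the reciprocal of the resolution probability exactly offsets the EV lost to the ambiguity risk.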