I think one clear disanalogy with startups is that startups are eventually judged by reality, whereas we aren't, because doing good and getting more money are not strongly correlated. If we just eat the risk of being wrong about something, the worst case isn't failing, as it is for a startup, but rather sucking all the resources into the wrong thing.
Also, small point, but I don’t think Bayesian decision theory is particularly important for EA.
Anyway, this might eventually be worth considering, but as it stands we've done several orders of magnitude too little analysis to start conceding.