Bostrom defines a “crucial consideration” as one that would overturn a conclusion or reveal the need for a major change of direction. By this definition, whether something counts as a “crucial consideration” depends on our current set of conclusions and our current direction. The definition also sneaks in a connotation that important new insights will tend to reveal the need for a major change of direction. But important new insights can just as well reaffirm our current direction. See conservation of expected evidence.
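Conservation of expected evidence is the theorem that, before you see a piece of evidence, the expected value of your posterior equals your prior — so you can't anticipate that new insight will, on average, push you in any particular direction. A minimal numerical sketch, with arbitrary illustrative probabilities (not from the essay):

```python
# Conservation of expected evidence: averaged over the possible
# outcomes of evidence E, the posterior P(H|E) equals the prior P(H).
# All numbers below are arbitrary illustrations.

p_h = 0.3            # prior probability of hypothesis H
p_e_given_h = 0.8    # likelihood of observing E if H is true
p_e_given_not_h = 0.2  # likelihood of observing E if H is false

# Total probability of observing E:
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior in each branch, by Bayes' rule:
post_if_e = p_e_given_h * p_h / p_e
post_if_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)

# Expected posterior, weighted by how likely each branch is:
expected_posterior = p_e * post_if_e + (1 - p_e) * post_if_not_e
print(expected_posterior)  # 0.3 — exactly the prior
```

The same cancellation goes through for any prior and any likelihoods, which is why we shouldn't expect crucial considerations to systematically favor "change direction" over "stay the course."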
Regarding the precautionary principle, consider a reversibility test: Suppose there is some parameter of the world, p, which is gradually increasing, and you have the opportunity to interfere and stop this increase at no cost. By the precautionary principle, you should not interfere. Now suppose p is currently static, and you have the opportunity to interfere and trigger a gradual increase at no cost. Again, by the precautionary principle, you should not interfere. In both cases, the principle defends whatever the status quo happens to be.
For someone like me, who does not believe in the act/omission distinction and believes in fighting status quo bias, this seems a little silly. I think the best arguments for a policy of non-interference in both scenarios are:
1. In the real world, actions typically have costs.
2. Our interference may be irreversible, and by thinking more, we can better determine whether interference is the correct course of action. In other words, the value of information is high. But this argument depends on cluelessness being tractable! If our current guess is as good as our guess will ever be, we might as well act on it.
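The value-of-information argument can be made concrete with a toy calculation. Assume (hypothetically — these payoffs are mine, not from any source) two equally likely states of the world, where interfering is good in one and bad in the other:

```python
# Hypothetical value-of-information sketch. Interference pays off in
# one state of the world and backfires in the other; payoffs and
# probabilities are illustrative assumptions.

p_good = 0.5                       # chance that interference helps
payoff_if_good = 10.0
payoff_if_bad = -10.0

# Option A: act now on the current best guess.
ev_interfere = p_good * payoff_if_good + (1 - p_good) * payoff_if_bad
ev_act_now = max(ev_interfere, 0.0)  # 0.0 = the "do nothing" option

# Option B: first learn which state we're in, then choose.
ev_after_learning = (p_good * max(payoff_if_good, 0.0)
                     + (1 - p_good) * max(payoff_if_bad, 0.0))

value_of_information = ev_after_learning - ev_act_now
print(value_of_information)  # 5.0 — learning first is worth a lot here
```

The catch is in "first learn which state we're in": if cluelessness is intractable and no amount of thinking will move p_good off 0.5, then ev_after_learning collapses to ev_act_now and the value of information is zero — which is exactly the case where we might as well act on our current guess.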
I’m sympathetic to the idea that the value of information is high, and I think cluelessness is tractable. I support EA organizations like the Future of Humanity Institute, which are trying to work out the best course of action. But at a certain point, the low-hanging information fruit will have been picked, and then it’s likely time to act. If we aren’t going to take action under any circumstances, gathering information is a waste of time.