I think truth-seeking to the extent that you’re constantly trying to falsify your existing beliefs and find out whether they should be different is too high a bar for this, but the first two conditions do entail some lesser degree of truth-seeking. For example, if you’re an eco-terrorist who bombs nuclear plants, and someone informs you that, unbeknownst to you, coal plants are worse for the environment than nuclear plants, you’d at a minimum switch to bombing coal plants rather than ignoring the new information and continuing with your existing intervention. Actively seeking better opportunities and questioning your current plans is admirable, and something many EAs do, but I don’t think it’s part of the minimum requirement for value alignment. I can think of a certain field where a lot of EAs work who don’t meet such a standard.