Against value drift

The idea of “value drift” strikes me as resting on a naive assumption that people have altruistic values.

I don’t think that’s how people work. I think people follow their local incentives, and a line can always be traced from their actions to some kind of personal benefit.

This is not a cynical idea. It is a tremendously hopeful idea. It all adds up to normality. The fact that we see people act altruistically all the time means that it is possible to align selfish interests with the “good”, and that our environments are already shaped in such a way. It suggests that good outcomes are simply a matter of creating and upholding the right incentive structures. Heaven will follow by itself.

So given that people follow their selfish incentives, why do they drift away from the EA community?

Here’s another idea: motivation is always relative. People aren’t propelled towards something in proportion to its objective value. They’re motivated in proportion to how bad the next best alternative is. If your plan B becomes better, your plan A is suddenly not as interesting anymore, even if its “objective value” is still the same. People might try to explain their behavior, lamenting that they’ve changed. Maybe they just got better options.

How does this relate to value drift? The naive model might be that you have some variables in your head. Each one of them gives some numeric value to some virtuous and lofty good, like “global health” and “the long term future” and “freedom of speech” and whatnot.

I’d like to propose that the model is more like this: the variables are there, but they don’t point to virtuous and lofty goods. They point to things about you. Power. Survival. Prestige. Odds of procreation. The values are highly stable, and motivation only really changes as the environment does.

And there’s absolutely positively nothing bad about this whatsoever. It all adds up to normality. In fact these “degenerate” values even add up to something as noble as EA. Greed is good, as long as it’s properly channeled.

Value drift isn’t some kind of unexplainable shift in attitude. It’s a shift in perceived incentives. Correctly perceived or not.

One might have gotten involved in EA on the heuristic that maximizing impact will lead to the highest prestige. They might have learned in EA that this heuristic doesn’t always work. Maybe they weren’t praised enough. Maybe they found that there was too much competition to be noticed. Maybe they found that the world doesn’t necessarily reward good intentions, so they dropped them.

Maybe they left because there wasn’t much more to learn. Maybe they left because they felt threatened by weird political ideas. Maybe they felt censored. Maybe they found a different social environment that was much more rewarding. Maybe they felt like the establishment of EA wasn’t taking them seriously.

I don’t have a simple answer, but the current concept of “value drift” is very much a pet peeve. We have to account for people’s selfish sides. As long as we’re in denial about that, we won’t get as much out of their altruistic sides either.

So I propose we call it incentive drift instead.