(Linkposted with permission from the author, Logan Strohl. Below, my (Will's) excerpted summary of the post precedes the full text. The first-person speaker is Logan.)
Summary
I think that EA burnout usually results from prolonged dedication to satisfying the values you think you should have, while neglecting the values you actually have.
...
Perhaps your true values just happen to exactly match the central set of EA values, and that is why you are an EA.
However, I think it’s much more common for people to be EAs because their true values have some overlap with the EA values; and I think it’s also common for EAs to dramatically overestimate the magnitude of that overlap. According to my model, this is why “EA burnout” is a thing.
...
If I am wrong about what I value, then I will mismanage my motivational resources. Chronic mismanagement of motivational resources results in some really bad stuff.
...
Over a couple of years, I change my career, my friend group, and my hobbies to reflect my new values. I spend as little time as possible on Things That Don’t Matter, because now I care about Impact.
...
I’ve oriented my whole life around The Should Values for my longtermist EA strategy [...] while neglecting my True Values. As a result, my engines of motivation are hardly ever receiving any fuel.
...
It seems awfully important to me that EAs put fuel into their gas tanks, rather than dumping that fuel onto the pavement where fictional cars sit in their imaginations.
...
It is probably possible to recover even from severe cases of EA burnout. I think I’ve done a decent job of it myself, though there’s certainly room for improvement. But it takes years.
...
My advice to my past self would be: First, know who you are. If you’re in this for the long haul, build a life in which the real you can thrive. And then, from the abundance of that thriving, put the excess toward Impact.
Full Text
(Probably somebody else has said most of this. But I personally haven’t read it, and felt like writing it down myself, so here we go.)
I think that EA burnout usually results from prolonged dedication to satisfying the values you think you should have, while neglecting the values you actually have.
Setting aside for the moment what “values” are and what it means to “actually” have one, suppose that I actually value these things (among others):
True Values
Abundance
Power
Novelty
Social Harmony
Beauty
Growth
Comfort
The Wellbeing Of Others
Excitement
Personal Longevity
Accuracy
One day I learn about “global catastrophic risk”: Perhaps we’ll all die in a nuclear war, or an AI apocalypse, or a bioengineered global pandemic, and perhaps one of these things will happen quite soon.
I recognize that GCR is a direct threat to The Wellbeing Of Others and to Personal Longevity, and as I do, I get scared. I get scared in a way I have never been scared before, because I’ve never before taken seriously the possibility that everyone might die, leaving nobody to continue the species or even to remember that we ever existed—and because this new perspective on the future of humanity has caused my own personal mortality to hit me harder than the lingering perspective of my Christian upbringing ever allowed. For the first time in my life, I’m really aware that I, and everyone I will ever care about, may die.
My fear has me very focused on just two of my values: The Wellbeing Of Others and Personal Longevity. But as I read, think, and process, I realize that pretty much regardless of what my other values might be, they cannot possibly be satisfied if the entire species—or the planet, or the lightcone—is destroyed.
[This is, of course, a version of EA that’s especially focused on the far future; but I think it’s common for a very similar thing to happen when someone transitions from “soup kitchens” to “global poverty and animal welfare”. There’s an exponential increase in stakes, accompanied by a corresponding increase in the fear of lost value.]
So I reason that a new life strategy is called for.
Over time, under the influence of my “Accuracy” value as well as my “Social Harmony” value (since I’m now surrounded by people who are thinking about this stuff), I come to believe that I should value the following:
Should Values
Impact*
Calibration
Openness
Collaboration*
Empiricism*
The Wellbeing Of Others
Personal Longevity
(The values on this new list with an asterisk beside them have a correlate on the original list (impact→power, collaboration→social harmony, empiricism→accuracy), but these new values are routed through The New Strategy, and are not necessarily plugged into their correlates from the first list.)
Over a couple of years, I change my career, my friend group, and my hobbies to reflect my new values. I spend as little time as possible on Things That Don’t Matter, because now I care about Impact, and designing computer games has very little Impact since it takes a lot of time and definitely doesn’t save the world (even though it’s pretty good on novelty, beauty, growth, and excitement).
Ok, so let’s talk now about what “values” are.
I think that in humans at least, values are drives to action. They are things that motivate a person to choose one possible action over another. If I value loyalty over honesty, I’ll readily lie to help my friend save face; if I value both about equally, I may be a little paralyzed in some situations while I consult the overall balance of my whole value system and try to figure out what to do. When I go for a hike with my field kit of watercolor paints, I tend to feel really good about that decision as I make it, as I hike and paint, and also as I look back on the experience, because it satisfies several of my values (such as novelty, growth, and beauty). When I choose to stay in and watch a movie rather than run errands out in the cold rain, that’s my comfort value expressing itself. Values are the engines of motivation.
It is one thing to recognize that a version of you who strategically prioritizes “collaboration” will be more effective at accomplishing goals that you really do care about. But it’s another to incorrectly believe that “collaboration” directly motivates your actions.
Perhaps “collaboration” really is one of your true values. Indeed, perhaps your true values just happen to exactly match the central set of EA values, and that is why you are an EA.
However, I think it’s much more common for people to be EAs because their true values have some overlap with the EA values; and I think it’s also common for EAs to dramatically overestimate the magnitude of that overlap. According to my model, this is why “EA burnout” is a thing.
[ETA: My working model is incomplete. I think there are probably other reasons also that EA burnout is a thing. But I’m nowhere near as satisfied with my understanding of the other reasons.]
If I am wrong about what I value, then I will mismanage my motivational resources. Chronic mismanagement of motivational resources results in some really bad stuff.
Recall that in my hypothetical, I’ve oriented my whole life around The Should Values for my longtermist EA strategy—and I’ve done so by fiat, in a way that does not converse much with the values that drove me before. My career, my social connections, and my daily habits and routines all aim to satisfy my Should Values, while neglecting my True Values. As a result, my engines of motivation are hardly ever receiving any fuel.
Gradually, I find myself less and less able to take any actions whatsoever. Not at work, not with friends, not even when I’m by myself and could theoretically do anything I want. I can’t even think about my work without panicking. I am just so exhausted all of the time. Even when apparently excellent opportunities are right in front of me, I just cannot bring myself to care.
I think there are ways to prioritize world-saving or EA-type strategies without deceiving ourselves about what motivates us. I think it is possible to put skill points into calibration, for example, even when you’re not directly motivated by a drive to be well calibrated. It is often possible to choose a job that satisfices for your true values while also accomplishing instrumental goals. In fact, I think it’s crucial that many of us do this kind of thing a bunch of the time.
I also think it is devastatingly dangerous for most of us to be incorrect about what really drives us to act.
It is probably possible to recover even from severe cases of EA burnout. I think I’ve done a decent job of it myself, though there’s certainly room for improvement. But it takes years. Perhaps several of them, perhaps a whole decade. And that is time I am not at all confident our species has.
I am a bit wary of telling EAs what I think they Should do. It seems to me that as a movement, EAs are awfully tangled up about Shoulds, especially when it comes to the thoughts of other EAs.
Still, it seems awfully important to me that EAs put fuel into their gas tanks (or electricity into their batteries, if you prefer), rather than dumping that fuel onto the pavement where fictional cars sit in their imaginations.
And not just a little bit of fuel! Not just when you’re too exhausted to go on without a little hit. I think that no matter what you hope to accomplish, it is wise to act from your true values ALL of the time—to recognize instrumental principles as instrumental, and to coordinate with allies without allowing them to overwrite your self concept.
My advice to my past self would be: First, know who you are. If you’re in this for the long haul, build a life in which the real you can thrive. And then, from the abundance of that thriving, put the excess toward Impact (or wherever else you would like for it to go).
Maybe you think that you lack the time to read fiction, or to go rock climbing, or to spend the whole weekend playing board games with friends.
I think that you may lack the time not to.
Comments
This really spoke to me. Thanks for (re)posting this!
I agree with some points of the post, but I don't like at all how it defines the things that cost you the least as “True values” and recommends that people follow them.
I'm agnostic about the best way to define values, and about how good it is for people to do the things they are most motivated to do. But I want to remind people that you could instead define your “True values” as the things you keep doing even though they cost you more energy, and that doing those things could be better.
So I think we agree that it seems good to find the right balance between following costly and less costly values, but calling only some of them the True ones seems to imply that you should focus on, or act on, those alone.