I’m really glad you wrote this post. Hearing critiques from prominent EAs promotes a valuable community norm of self-reflection and not just accepting EA as is, in my opinion.
A few thoughts:
It’s important to emphasize how much maximization can be normalized in EA subpockets. You touch on this in your post: “And I’m nervous about what I perceive as dynamics in some circles where people seem to ‘show off’ how little moderation they accept—how self-sacrificing, ‘weird,’ extreme, etc. they’re willing to be in the pursuit of EA goals.” I agree, and I think this is relevant to growing EA hubs and cause-area silos. If you move to an EA hub that predominantly maximizes along one belief (e.g., AI safety in Berkeley), very natural human tendencies will draw you to also maximize along that belief. Maximizing will win you social approval and dissenting is hard, especially if you’re still young and impressionable (like meee). If you agree with this post’s reasoning, I think you should take active steps to correct for a social bias toward hard-core maximizing (see 2).
If you’re going to maximize along some belief, you should seriously engage with the best arguments for why you’re wrong. Scout mindset baby. Forming true beliefs about a complicated world is hard and motivated reasoning is easy.
Maximizing some things is still pretty cool. I think some readers (of the post and my comment) could come away with a mistaken impression that more moderation in all aspects of EA is always a good thing. I think it’s more nuanced than that: most people who have done great things in the past have maximized much harder than their peers. I agree one should be cautious of maximizing things we are “conceptually confused about, can’t reliably define or measure, and have massive disagreements about even within EA.” But maximizing some things that are good across a variety of plausibly true beliefs can be pretty awesome for making progress on your goals (e.g., maximizing early-career success and learning). And even if the extreme of maximization is bad, more maximization might be directionally good, depending on how much you’re currently maximizing. We also live in a world that may end within the next 100 years, so you have permission to be desperate.
Upvoted for the first two points, but the third seems not entirely true to me. It’s good to want to increase some things a lot. But maximizing means treating them as more important than everything else, which is no longer good.
Example: I’ve been “maximizing for exploration” lately by trying to find new jobs/projects/environments which I can experience for a short period of time and gain information from. But I’m not actually maximizing, which would look more like taking random actions, or doing projects completely unrelated to anything I’ve done before.
This especially seems like bad advice to me:
We also live in a world that may end within the next 100 years, so you have permission to be desperate.
Maybe the linked post makes it mean something other than what I understand—but most people aren’t going to read it.
I think I still stand behind the sentiment in (3), I’m just not sure how to best express it.
I agree that 100% (naively) maximizing for something can quickly become counterproductive in practice. It is really hard to know what actions one should take if one is fully maximizing for X, so even if one wants to maximize X it makes sense to take into account things like optics, burnout, systemic cascading effects, epistemic uncertainty, and whatever else gives you pause before maximizing.
This is the type of considerate maximization I was gesturing at when I said directionally more maximizing might be a good thing for some people (to the extent they genuinely endorse doing the most good), but I recognize that ‘maximizing’ can be understood differently here.
Caveat 1: I think there are lots of things it doesn’t make sense to 100% maximize for, and you shouldn’t tell yourself you are 100% maximizing for them. “Maximizing for exploration” might be such a thing. And even if you were 100% maximizing for exploration, it’s not like you wouldn’t take into account the cost of random actions, venturing into domains you have no context in, and the cost of spending a lot of time thinking about how to best maximize.
Caveat 2: it should be possible to maximize within just one of multiple goals. I care a great deal about doing the most good I can, but I also care about feeling alive in this world. I’m lying to myself if I say that something like climbing is only instrumental towards more impact. When I’m working, I’ll maximize for impact (taking into account the uncertainty around how to maximize). When I’m not, I won’t.
[meta note: I have little experience in decision theory, formal consequentialist theory, or whatever else is relevant here, so might be overlooking concepts].
Maybe the linked post makes it mean something other than what I understand—but most people aren’t going to read it.
Fair. Nate Soares talks about desperation as a ‘dubious virtue’ in this post, a quality that can “easily turn into a vice if used incorrectly or excessively.” He argues, though, that you should give yourself permission to go all out for something, at least in theory. And then look around, and see if anything you care about – anything you’re fighting for – is “worthy of a little desperation.”