Upvoted for the first two points, but the third seems not entirely true to me. It’s good to want to increase some things a lot. But maximizing means treating them as more important than everything else, which is no longer good.
Example: I’ve been “maximizing for exploration” lately by trying to find new jobs/projects/environments which I can experience for a short period of time and gain information from. But I’m not actually maximizing, which would look more like taking random actions, or doing projects completely unrelated to anything I’ve done before.
This especially seems like bad advice to me:
We also live in a world that may end within the next 100 years, so you have permission to be desperate.
Maybe the linked post makes it mean something other than what I understand—but most people aren’t going to read it.
I think I still stand behind the sentiment in (3), I’m just not sure how to best express it.
I agree that 100% (naively) maximizing for something can quickly become counterproductive in practice. It is really hard to know which actions to take if one is fully maximizing for X, so even if one wants to maximize X it makes sense to take into account things like optics, burnout, systemic cascading effects, epistemic uncertainty, and whatever else gives you pause before maximizing.
This is the type of considerate maximization I was gesturing at when I said directionally more maximizing might be a good thing for some people (to the extent they genuinely endorse doing the most good), but I recognize that ‘maximizing’ can be understood differently here.
Caveat 1: I think there are lots of things it doesn’t make sense to 100% maximize for, and you shouldn’t tell yourself you are 100% maximizing for them. “Maximizing for exploration” might be such a thing. And even if you were 100% maximizing for exploration, it’s not like you wouldn’t take into account the cost of random actions, of venturing into domains you have no context in, and of spending a lot of time thinking about how best to maximize.
Caveat 2: it should be possible to maximize within one of several goals. I care a great deal about doing the most good I can, but I also care about feeling alive in this world. I’d be lying to myself if I said that something like climbing is only instrumental towards more impact. When I’m working, I’ll maximize for impact (taking into account the uncertainty around how to maximize); when I’m not, I won’t.
[meta note: I have little experience in decision theory, formal consequentialist theory, or whatever else is relevant here, so I might be overlooking concepts].
Maybe the linked post makes it mean something other than what I understand—but most people aren’t going to read it.
Fair. Nate Soares talks about desperation as a ‘dubious virtue’ in this post, a quality that can “easily turn into a vice if used incorrectly or excessively.” He argues, though, that you should give yourself permission to go all out for something, at least in theory. And then look around and see if anything you care about – anything you’re fighting for – is “worthy of a little desperation.”