For a while, I’ve been thinking about the following problem: as you get better models of the world (or a better ability to build them), you start noticing things that are inconvenient for others. Some of those inconvenient truths can break the coordination games people are playing and leave them with worse alternatives.
Some examples:
The reason organization X is doing weird things is that its director is weirdly incompetent.
XYZ is explained by people jockeying for influence inside some organization.
Y is the case, but would be super inconvenient to the ideology du jour.
Z is the case, but our major funder is committed to believing that it is not the case.
E.g., AI is important, but a big funder thinks that it will be important in a different way and there is no bandwidth to communicate.
Jobs in some area are very well paid, which creates incentives for people to justify that area.
Someone builds their identity on a linchpin which is ultimately false (“my pet area is important”).
“If I stopped believing in [my religion] my wife would leave me”—true story.
Such and such a cluster of people systematically overestimate how altruistic they are, which has a bunch of bad effects on themselves and others when they interact with organizations focused on effectiveness.
Poetically, if you stare into the abyss, the abyss then later stares at others through your eyes, and people don’t like that.
I don’t really have many conclusions here. So far, when I notice a situation like the above, I tend to just leave, but this doesn’t seem like a great solution, or sometimes like a solution at all. I’m wondering whether you’ve thought about this, about whether and how some parts of what EA does are premised on things that are false.
Perhaps relatedly, or perhaps as a non sequitur, I’m also curious about what changed since your post a year ago talking about how EA doesn’t bring out the best in you.
Perhaps relatedly, or perhaps as a non sequitur, I’m also curious about what changed since your post a year ago talking about how EA doesn’t bring out the best in you.
This seems related to me, and I don’t have a full answer here, but here are some things that come to mind:
For me personally, I feel a lot happier engaging with EA than I did previously. I don’t quite know why this is; I think it’s some combination of: being more selective about what I engage with and how, having a more realistic view of what EA is and what to expect from it, being part of other social environments which I get value from and which make me feel less ‘attached’ to EA, and my mental health improving. And also perhaps having a stronger view of what I want to be different about EA, and feeling more willing to stand behind that.
I still feel pretty wary of the things I feel EA ‘brings out of me’ (envy, dismissiveness, self-centredness, etc.) which I don’t like, and it can still feel like a struggle to avoid the pull of those things.
as you get better models of the world (or a better ability to build them), you start noticing things that are inconvenient for others. Some of those inconvenient truths can break the coordination games people are playing and leave them with worse alternatives.
I haven’t thought about this particular framing before, and it’s interesting to me to think about; I don’t quite have an opinion on it at the moment. Here are some of the things on my mind which feel related to this.