I agree that it’s well worth acknowledging that many of us have parts of ourselves that want social validation, and that the action that gets you the most social approval in the EA community is often not the same as the action that is best for the world.
I also think it’s very possible to believe that your main motivation for doing something is impact, when your true motivation is actually that people in the community will think more highly of you. [1]
Here are some quick ideas about how we might try to prevent our desire for social validation from reducing our impact:
We could acknowledge our need for social validation, and try to meet it in other areas of our lives, so that we care less about getting it from people in the EA community through appearing to have an impact, freeing us up to focus on actually having an impact.
We could strike a compromise between the part of us that wants social validation from the EA community, and the part of us that wants to have an impact. For example, we might allow ourselves to spend some effort trying to get validation (e.g. writing forum posts, building our networks, achieving positions of status in the community), in the full knowledge that these activities are mainly useful for satisfying our need for validation, so that our main efforts (e.g. our full-time work) can be focused on what we think is directly most impactful.
We could spend time emotionally connecting with whatever drives us to help others, reflecting on the reality that others’ wellbeing really does depend on our actions, proactively noticing situations where we have a choice between more validation or more impact, and being intentional in choosing what we think is overall best.
We might try to align our parts by noticing that although short-term efforts to impress people might earn us some approval, in the long run we will probably gain even more social status in the EA community if we skill up and achieve something that is genuinely valuable.
Relatedly, we might think about who we would ideally like to impress. Perhaps we could impress some people by simply doing whatever is cool in the EA movement right now. But the people whose approval we might value most will be less impressed by that, and more impressed by us actually being strategic about how best to have an impact. In other words, we might stop regarding people who pursue today's hot topic as necessarily cool, and start thinking of people who genuinely pursue impact as the cool group we aspire to join.
I haven’t read it, but I think the premise of The Elephant in the Brain is that self-deception like this is in our own interests, because it lets us sincerely claim to have a virtuous motivation even when that isn’t the case.