Props for writing! Some things I think strengthen this point even more:

1. “I want to take people at their word – if they agree to something I ask, then I’ll believe them.”

If I ask three people for their time, they don’t know whether they’re helping me get from 0 to 1 person helping with this, from 1 to 2, or from 9 to 10. I may not know that either, but it’s a different ask, and people can reasonably set their bar at being the one who gets something from 0 to 1, or at the asker having already put in a certain amount of effort, which they’ll have assumed has happened.
2. “We all (probably) have goals that do not feel like hustling and striving, for example: have meaningful and emotionally close relationships; call my mum regularly; be a caring friend.”

I think people who care about EA and/or their impact should probably be willing to take a bunch of steps, and take on a bunch of costs, to avoid resenting EA in the medium to long term.
3. A frame I’ve found useful is to model EA as a sort of single agent, so that trades of one person’s time for another’s can be weighed as a whole (made explicit in the sketch below). If ten minutes of someone else’s time can save me two hours of mine, that can be a very reasonable trade. But if I could spend 30 minutes figuring something out myself to save someone else 20, I might value their time enough more than mine that I wouldn’t want to ask them to give it up.
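To make the arithmetic in that frame explicit (this formalisation, and the weight w, are my own illustrative devices rather than anything from the comment): if w is how much the single-agent view weights the other person’s time relative to mine, an ask is worth making when the time it saves me exceeds the weighted time it costs them.

```latex
% A minimal decision rule, assuming a relative weight w on their time:
%   ask iff  t_saved > w * t_cost
\[
  t_{\text{saved}} \;>\; w \cdot t_{\text{cost}}
\]
% First example:  120 > w * 10 holds for any w < 12, so the ask is easy.
% Second example: 30 > w * 20 fails once w > 1.5, so if I value their
% time more than 1.5x mine, I shouldn't make the ask.
```

On those numbers, the first trade goes through under almost any weighting, while the second flips as soon as the other person’s time is weighted more than 1.5× mine.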
“If I ask three people for their time, they don’t know whether they’re helping me get from 0 to 1 person helping with this, from 1 to 2, or from 9 to 10.”
Nice, yup, agreed that the information asymmetry is a big part of the problem in this case. I wonder if it could be a good norm to give someone this information when making requests, e.g. explicitly saying in a message, “I’ve asked one other person, and I think they will be about as well placed as you to help with this.”
I do also think there’s a separate cost to making requests, in that the request itself imposes something on the person: saying no takes time, energy, and decision-power. Obviously this is often small, and in many cases it’s worth asking, but it’s a cost worth considering.
(Now that I’ve written this out, I realise you weren’t claiming that the info asymmetry is the only problem, but I’m going to leave the last paragraph in.)
“to avoid resenting EA in the medium to long term”
This is great, and something I’ve only started modelling recently. I’m curious what you think this looks like in practice: is it more a mindset of “don’t beat yourself up when you fall short of your altruistic ideals”, or does it also inform real-world decisions for you?
Nice! I got a request just last night and was told that the person was asking three people; while this isn’t perfect for them, I think it was a great thing to know from my perspective.
I don’t think it’s massively relevant for me right now, beyond vaguely paying attention to my mental health and wellbeing, but I think it’s super relevant for new-to-EA and/or young people who are deciding very quickly how invested to be.