If I ask three people for their time, they don’t know whether they’re helping me go from 0 to 1 person helping with this, from 1 to 2, or from 9 to 10.
Nice, yup, agreed that the information asymmetry is a big part of the problem in this case. I wonder if it could be a good norm to give someone this information when making requests. Like, explicitly say in a message “I’ve asked one other person and think that they will be about as well placed as you to help with this” etc
I do also think that there’s a separate cost to making requests, in that it actually does impose a cost on the person. Like, saying no takes time/energy/decision-power. Obviously this is often small, and in many cases it’s worth asking. But it’s a cost worth considering.
(Now I’ve written this out, I realise that you weren’t claiming that the info asymmetry is the only problem, but I’m going to leave the last paragraph in).
to avoid resenting EA in the medium to long term
This is great and only something I’ve started modelling recently. Curious about what you think this looks like in practice. Like, is it more getting at a mindset of “don’t beat yourself up when you fall short of your altruistic ideals”? Or does it also inform real world decisions for you?
I got a request just last night and was told that the person was asking three people, and while this isn’t perfect for them, I think it was a great thing from my perspective to know.
I don’t think it’s massively relevant for me right now, beyond vaguely paying attention to my mental health and well-being, but I think it’s super relevant for new-to-EA and/or young people who are deciding very quickly how invested to be.