Aspiring rationalist. Trying to do my best to pay the world back for all the good it gives me.
EA Finland Executive Director starting in September 2025.
I’m curious about problems of consciousness, moral status, and technical AI safety, but I’m currently focused on community building.
In general, if we agree to a ballpark of “donating 10% is enough to satisfice some goodness threshold”, and also that “it would be good for social pressure to exist for everyone to do at least a threshold amount of good”, I think this raises various considerations.
10% makes sense to me as a Schelling point (and I think the tables that scale by income bracket are also sensible).
But if the threshold amount of good is “donate 10%, aim for an impactful career, become vegan” (which is where I feel the social pressure inside EA is pointing), that is already a significant ask for many people.
It is also important to note that some people are more motivated by maximizing impact and offsetting harm, while others are more motivated by minimizing harm and satisficing on impact. (Of course, a standard total utilitarian model would output that whatever maximizes your net impact is best, but human value systems aren’t perfectly utilitarian.)
How do “donate 10%, become vegan, aim for an impactful career”, “donate 30%”, and “donate 20%, aim for an impactful career” compare in effectiveness as norms? I think this is pretty hard to estimate.
What kind of social pressure are you pointing at here? Is it more in the direction of “donate 30%” or “donate as much as you can and aim for an impactful career”? Or do you mean social pressure in wider society, rather than within the EA community?
(Fwiw, I think people underestimate the value of effective marginal spending on themselves in areas where there is room for significant extra value, like purchasing more free time. People plausibly overestimate the value of some other kinds of spending, especially if they don’t introspect on their spending.)