Great post, Michael! The more I have realised how uncertain the world is, the more I have come to appreciate this post.
I think y ≤ 100 should be y ≤ 50.
I think the sum of the 2nd component should be 200 + 50 + 50 = 300 (not 200 + 200 + 50 = 450).
I personally doubt that we have fundamental reasons to decide as a community (coordination and cooperation give us only instrumental reasons). Our (moral) reasons are either agent-relative or agent-neutral/universal; they are not relative to a specific, somewhat arbitrarily defined group like the EA community.
Good point. Personally, I think our reasons are agent-neutral, i.e. that we should think about how to improve the portfolio of the universe, not our own portfolio or that of the EA community.
Thank you for the kind words and the corrections!