Hey Will,
Before I dive into your new book, I have one question:
What impact do you hope WWOTF will have on philanthropic work?
Since hearing about your book in May, my reading and giving have shifted toward long-term matters (I went from tossing cash like the world’s on fire to vetting business managers on their approaches to Patient Philanthropy).
The “holy shit, x-risk” worry is that approximately 25 people die if I read WWOTF and reallocate the $111,100 I had earmarked for AMF through my EA-inspired giving fund.
I know you are pushing LEEP, which seems to align with near-termist methodology. But if the shift in my focus is an example of how your work impacts giving, what is the best-case impact WWOTF could have on philanthropy?
PS- Edited. See comments. Thanks Pablo!
This isn’t an answer to your question, but I’m curious why you consider this to be a question worth raising for longtermism specifically, given that the same question could be raised for any claim of the form ‘x has priority over y’ and the potential to influence how EA resources are allocated. For example, you could ask someone claiming that farmed animal welfare has priority over global health and development whether they worry about the human beings who will predictably die—or be “killed”, as you put it—if donors reallocate funds in accordance with this ranking.
(Separately, note that the form of longtermism defended in WWOTF does not in fact imply that benefitting future beings has priority over helping people now.)
I raise the question because this thread is an AMA with Will on his current book. Will’s writing impacts my philanthropy, so I am curious about the type of impact he expected on philanthropic work before I dive in. I’m editing the question to speak more to that point.
As far as ‘x has priority over y’ is concerned, I agree that that type of calculation can be applied to any cause. My EA-inspired philanthropy is largely allocated to organizations that can save more lives more quickly than others. (Grants to folks who can’t afford drugs for multiple myeloma are a current favorite of mine.)
Re: “Killing”. Point taken.
If you haven’t already seen it, you might be interested in this comment from Rohin on the attitude of longtermists.