Searching online, I believe he gave the talk at EA Summit 2013, back when EA community-building was much more volunteer-based and didn’t have much in the way of formal organization.
As for Torres, my secondhand impression was a combination of a) believing that EA-types don’t give social justice-style concerns enough weight compared to the overwhelming importance of the far future, and b) personally feeling upset/jilted that he was rejected from a number of EA jobs.
He also gave a talk at EA Summit 2014.
That feels very uncharitable.

I understand you probably have insider knowledge, but in the linked article he mentions strong disagreements with ideas like:

Saving lives in poor countries may have significantly smaller ripple effects than saving and improving lives in rich countries. Why? Richer countries have substantially more innovation, and their workers are much more economically productive. [Consequently,] it now seems more plausible to me that saving a life in a rich country is substantially more important than saving a life in a poor country, other things being equal.

This is something I can see many people having problems with. There are plenty of people who dislike (parts of) longtermism for reasons similar to those in the article, and I don’t think most of them are bitter because they were rejected from EA jobs or are SJWs.

Edit: since this is getting a lot of downvotes, I just want to clarify that I do think that quote is a strawman of some longtermist ideas. But I do think we should be charitable about critics’ motivations and at least mention the ones they would agree with.
Edit2: Reading through https://forum.effectivealtruism.org/search?terms=torres , it seems there is indeed some extra information and a lot of prior history about Torres’ motivations.
Especially in light of that, my tone should have been much less strong. I should have written something like this: https://forum.effectivealtruism.org/posts/xtKRPkoMSLTiPNXhM/response-to-phil-torres-the-case-against-longtermism?commentId=LQhs9jJ3qx7x6Gfiv
https://forum.effectivealtruism.org/posts/xtKRPkoMSLTiPNXhM/response-to-phil-torres-the-case-against-longtermism?commentId=YSRyHbA2vmwMu9ZKo
OP asked a question about Torres specifically. I gave them my personal subjective impression of the best account I have about Torres’ motivations. I’m not going to add a “and criticizing EA is often a virtuous activity and we can learn a lot from our critics and some of our critics may well be pure in heart and soul even if this particular one may not be” caveat to every one of my comments discussing specific criticisms of EA.
Phil isn’t an unknown internet critic whose motivations are opaque; he is/was a well-known person whose motivations and behaviour are known first-hand by many in the community. Perhaps other people have other motivations for disliking longtermism, but the question OP asked was about Phil specifically, and Linch gave the Phil-specific answer.
Yeah, but who is speaking here? Beckstead? I don’t know any “Beckstead”s. Phil Torres is claiming that The Longtermist Stance is “we should prioritise the lives of people in rich countries over those in poor countries”, even though I’ve never heard EAs say that. At most Beckstead thinks so, though that’s not what Beckstead said. What Beckstead said was provisional (“now seems more plausible to me”) and not a call to action. Torres is trying to drag down discourse by killing nuance and saying misleading things.
Torres’ article is filled with misleading statements, and I have made longer and stronger remarks about it here. (Even so I’m upvoting you, because −6 is too harsh IMO)
Yes, the article is indeed full of strawmen and misleading statements.
But (not knowing anything about Torres) I felt the top comment was strongly violating the principle of charity when trying to understand the author’s motivations.
I think the principle of charity is very important (especially when posting on a public forum), and saying that someone’s true motivations are not the ones they claim should require extraordinary proof (which maybe is the case! I don’t know anything about the history of this particular case).
Extraordinary proof? This seems too high to me. You need to strike the right balance between diagnosing dishonesty when it doesn’t exist and failing to diagnose it when it does. Both types of errors have serious costs. Given the relatively high prevalence of deception among humans (see e.g. this book), I would be very surprised if requiring “extraordinary proof” of dishonesty produced the best consequences on balance.