EA (via discussion of SBF and FTX) came up briefly on The Rest is Politics podcast today (the 3rd of April), and… I’m really irritated by what was said. This is one of the largest politics podcasts in the world at the moment, and has a seriously influential listener base.
Rory Stewart said that after 15 minutes, someone at FTXFF cut the call short because that person wanted to go have lunch. The person reportedly also said “I don’t care about poverty”.
Rory Stewart (the ex-President of GiveDirectly, and ex-MP) now seems to think that we are weird futurists who care more about “asteroids and killer robots” than we care about the 700M people currently in poverty.
Great work, whoever that FTX person was...
I think Rory Stewart is lying… he has had problems with this recently:
https://www.samharris.org/podcasts/making-sense-episodes/356-islam-freedom
(not endorsing Sam Harris here and not saying Stewart is not directionally correct).
I doubt that Nick Beckstead literally said ‘I don’t care about poverty’.
He seems bitter that his EA org was unable to raise funds from the Future Fund even though it had a different focus area and risk profile. Now he’s shoehorning his peeves into the FTX fraud story.
Was that lying or misremembering though? Lying is a fairly big accusation to make.
It’s just my inside view that he carelessly, and to some extent intentionally, plays fast and loose with the truth to the point of libel by saying that Beckstead said ‘To be honest I don’t care that much about poverty’ and then ended the call and went off to have lunch. Stewart then framed it as if Beckstead, in a very unreflective way, just cares about ‘asteroid strikes and robot overlords’; you could also call it hyperbole. I think he just couldn’t bear that someone younger, ‘sitting in California in his hoodie’, didn’t want to give him, Rory Stewart OBE, a grant for a charity whose effectiveness he probably understands less well than Beckstead does. I have a strong prior that he misrepresented Beckstead’s view on this (Beckstead used to work for GiveWell), partly because of the Sam Harris incident (which I only came across incidentally, because I sometimes hate-read Sam Harris on this topic). I thought it was worth coming out strong with my inside view, and on the spectrum from misremembering to lying, I’m more inclined to call it lying.
Are you sure it’s not the other possible candidate? I have only heard negative things about one of their personalities.
He did mention the head of the FTX Foundation, which was Nick Beckstead. I’m not sure about the others, but it would still seem weird for them to say it like that; maybe one of the younger staff members said something like ‘I care more about the far future’ or something along the lines of ‘GiveDirectly is too risk-averse’. But I would still think he’s painting quite the stereotype of EA here.
Wasn’t the Future Fund quite explicitly about longtermist projects?
I mean, if you worked for an animal foundation and were on a call about GiveDirectly, I can understand that somebody might say: “Look, we are an animal fund; global poverty is outside our scope”.
Obviously saying “I don’t care about poverty” or something sufficiently close that your counterpart remembers it as that, is not ideal, especially not when you’re speaking to an ex-minister of the United Kingdom.
But before we get mad at those who ran the Future Fund, please consider that there’s much context we don’t have. Why did this call get set up in the first place? I would expect there to be screening mechanisms in place to prevent this kind of mismatch. What Rory remembers might not match what the Future Fund grantmaker remembers, and there might have been a mismatch between the very blunt ‘SF culture’ the Future Fund operated by and what an ex-minister expects.
That said, I have a very positive impression of Rory Stewart, and it saddens me to hear our community gave him this perception. Had I been in his shoes, I’m not sure I would have thought any differently.
There’s some truth here, but I think it’s part of your job as the head of any EA org to present the best side of all aspects of effective altruism. Even if you disagree with near-term causes, speaking with grace and understanding about those who work to alleviate poverty will help the PR of your longtermist org.
I think we should get the context from the Future Fund people, but really they should probably just have already commented here to explain if they were misrepresented, and called Rory Stewart to apologise and clear things up.
I think it’s part of your job as the head of any EA org to present the best side of all aspects of effective autism
Can’t tell if joke or typo, but I enjoyed it either way
Sorry mistake, corrected lol
Can’t tell if the correction is joke or genuine misunderstanding, but I enjoyed this even more
Why? We should hire leaders based on how well suited they are to running the organization in question. There is no requirement that to work in EA you have to agree with all EA causes, or that you should pretend that you do.
I’d argue that an important part of running a new philanthropic organisation is stakeholder engagement and relationship management, and this was not a good example of fostering a good relationship with someone who is highly influential and a likely source of valuable connections with respect to FF’s goals.
I’m somewhere in the middle: we should not expect org leaders to be true believers in everything other EAs do, but we should, at least to some extent, score against making other orgs’ or EAs’ work more difficult without good cause. An EA movement in which each cause area or org optimizes solely for its own work gets less good done than possible.
I don’t know what actually happened between Rory and Nick, of course. There are plausible versions of what happened in which Nick’s actions and comments deserve criticism, and others in which they do not.
My deductions were here; there are two main candidates given the information available (if it is reliable).
On (2), while it’s obviously rhetorically slanted, isn’t that a fair framing of longtermism? They do care more about the gazillions of future lives than the smaller number of present ones and they seem to understand that this is not aligned with popular intuitions on the subject.
I am not sure longtermism is compatible with good PR or having ordinary people immediately grok its conclusions on intuition alone…
(Or was your problem more that this misrepresents the actual funding allocations, in which case I wholeheartedly agree 💙)
Maybe I’m naive, but I think “care more about ‘asteroids and killer robots’ than we care about the 700M people currently in poverty” is a terrible framing of longtermism. Maybe it’s technically correct on one level, but all compassion and nuance is lost.
Longtermists care deeply about the 700 million people in poverty, and many of them have already given much of their lives and money towards ending that poverty. They also see the huge future potential of humanity and see value in building and safeguarding a better future for their children and their children’s children. They’ve decided to devote their lives to that better future, while many other effective altruists continue to raise funds and forge solutions to alleviate current global poverty.
Longtermism is compatible with good PR; I think we just often do it badly.
Overall, this might be another tragic loss of a high-profile smart guy who used to be at least a soft EA supporter.
I think I agree with the general point you’re making, but I specifically disagree that the longtermist project is incompatible with good PR, or that it doesn’t appeal to common moral intuition (e.g. people do care about climate change, nuclear war, rogue AI, deadly pandemics).
But those things are also important without longtermism. So you can make non-longtermist PR for things that longtermists like, but it would feel dishonest if you hide the part with large numbers of people millions of years into the future.