Thanks for sharing. It looks like this article is less of a good-faith effort than I had thought.
Phil Torres’ article: “The Dangerous Ideas of ‘Longtermism’ and ‘Existential Risk’”
Everything you have matches my understanding. For me, the key commonality between longtermist EA and Progress Studies is valuing the far future; in economists' terms, a zero discount rate. The difference is the time frame: Progress Studies implicitly assumes a shorter-lived civilization. If civilization is going to last for millions of years, what does it matter if we accelerate progress by a few hundred or even a few thousand years? Much better to minimize existential risk.

Tyler Cowen outlines this well in a talk he gave at Stanford. In his view, “probably we’ll have advanced civilization for something like another 6, 700 years… [It] means if we got 600 years of a higher growth rate, that’s much, much better for the world, but it’s not so much value out there that we should just play it safe across all margins [to avoid existential risk.]” He is fundamentally pessimistic about our ability to mitigate existential risks. Now, I don’t think most people in Progress Studies think this way, but it’s the only way I see to square a zero discount rate with any priority other than minimizing existential risk.
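A rough back-of-the-envelope sketch makes the trade-off concrete (the numbers below are purely illustrative assumptions, not figures from Cowen's talk): with a zero discount rate, total value scales with how long civilization lasts, so a fixed speed-up of progress is worth roughly its length in extra years of value, while a reduction in existential risk is worth a fraction of the whole remaining horizon.

```python
# Toy comparison under a zero discount rate: total value ~ annual value x years remaining.
# All numbers here are illustrative assumptions, not figures from the sources quoted above.

def total_value(annual_value: float, duration_years: float) -> float:
    """Undiscounted value of a civilization that lasts `duration_years`."""
    return annual_value * duration_years

annual_value = 1.0         # normalize the value of one year of advanced civilization
acceleration_years = 300   # assume speeding up progress buys ~300 extra years of value
risk_reduction = 0.01      # hypothetical 1% better chance civilization survives at all

# Cowen-style short horizon vs. a "millions of years" horizon
for horizon in (700, 10_000_000):
    gain_from_speedup = total_value(annual_value, acceleration_years)
    gain_from_safety = risk_reduction * total_value(annual_value, horizon)
    print(f"horizon {horizon:>10,} yrs: "
          f"speed-up gain = {gain_from_speedup:,.0f}, "
          f"1% x-risk reduction gain = {gain_from_safety:,.0f}")
```

With a horizon of a few hundred years the speed-up dominates; once the horizon runs to millions of years, even a 1% improvement in survival odds swamps it, which is why a zero discount rate plus a long horizon pushes toward minimizing existential risk.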
Giving What We Can now has over 6,000 members (the accomplishments page still says 5,000).