I think that if a new donor appeared and increased the amount of funding available to longtermism by $100B, this would maybe increase the total value of longtermist EA by 20%.
I’m curious how many $s you and others think longtermist EA has access to right now / will have access to in the near future. The 20% number seems like a noticeably weaker claim if longtermist EA currently has access to $100B than if we currently have access to $100M.
I actually think this is surprisingly non-straightforward. Any estimate of the net present value of total longtermist $$ will have considerable uncertainty because it’s a combination of several things, many of which are highly uncertain:
How much longtermist $$ is there now?
This is the least uncertain one. It’s not super straightforward and requires nonpublic knowledge about the wealth and goals of some large individual donors, but I’d be surprised if my estimate on this was off by 10x.
What will the financial returns on current longtermist $$ be before they’re spent?
Over long timescales, for some of that capital, this might be ‘only’ as volatile as the stock market or some other ‘broad’ index.
But for some share of that capital (as well as on shorter time scales) this will be absurdly volatile. Cf. the recent fortunes some EAs have made in crypto.
How much new longtermist $$ will come in at which times in the future?
This seems highly uncertain because it’s probably very heavy-tailed. E.g., there may well be a single source that increases total capital by 2x or 10x. Naturally, predicting the timing of such a single event will be quite uncertain on a time scale of years or even decades.
What should the discount rate for longtermist $$ be?
Over the last year, someone who has thought about this quite a bit told me first that they had updated from 10% per year to 6%, and then a few months later back again. For $$ coming in 50 years from now, that’s roughly a 6x difference in present value.
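To see why the choice of discount rate matters so much at long horizons, here is a quick check (the $1 amount is just illustrative):

```python
# Present value of $1 arriving in 50 years under the two annual discount
# rates mentioned above (6%/yr vs. 10%/yr).
def present_value(amount, rate, years):
    return amount / (1 + rate) ** years

pv_at_6 = present_value(1.0, 0.06, 50)   # ~0.054
pv_at_10 = present_value(1.0, 0.10, 50)  # ~0.0085
ratio = pv_at_6 / pv_at_10               # ~6.4x difference
```

So switching between the two rates moves the valuation of money arriving in 50 years by a factor of about 6.4.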
What counts as longtermist $$? If, e.g., the US government started spending billions on AI safety or biosecurity, most of which went to things that are somewhat, but not especially, useful from a longtermist EA perspective, how would that count?
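The components above could in principle be combined in a Monte Carlo estimate. A toy sketch follows; every distribution and parameter here is an illustrative assumption, not anyone’s actual model:

```python
import math
import random

random.seed(1)

def npv_draw():
    """One draw of the net present value of longtermist capital (toy model)."""
    # Current capital: lognormal around an assumed $30B (illustrative).
    current = random.lognormvariate(math.log(30e9), 0.7)
    # A possible single large future inflow (heavy-tailed: adds 0x, 1x, or
    # 9x of current capital), arriving after an uncertain delay.
    multiplier = random.choice([0, 1, 9])
    delay_years = random.expovariate(1 / 15)  # mean delay of 15 years
    # Discount rate: the two values discussed above.
    discount = random.choice([0.06, 0.10])
    inflow_pv = current * multiplier / (1 + discount) ** delay_years
    return current + inflow_pv

draws = sorted(npv_draw() for _ in range(100_000))
p10 = draws[int(0.10 * len(draws))]
p90 = draws[int(0.90 * len(draws))]
```

The point isn’t the output numbers but the shape of the exercise: the heavy-tailed inflow and the discount-rate choice dominate the spread between low and high percentiles.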
I think for some narrow notion of roughly “longtermist $$ as ‘aligned’ as Open Phil’s longtermist pot” my 80% credence interval for the net present value is $30B - $1 trillion. I’m super confused how to think about the upper end because the 90th percentile case is some super weird transformative AI future. Maybe I should instead say that my 50% credence interval is $20B - $200B.
Generally, my view on this isn’t that well considered and probably isn’t that resilient.
‘… my 80% credence interval for the net present value is $30B - $1 trillion. I’m super confused how to think about the upper end because the 90th percentile case is some super weird transformative AI future. Maybe I should instead say that my 50% credence interval is $20B - $200B.’ [emphases added]
Shouldn’t your lower bound for the 50% interval be higher than for the 80% interval? Or is the second interval based on different assumptions, e.g. including/ruling out some AI stuff?
(Not sure this is an important question, given how much uncertainty there is in these numbers anyway.)
‘Shouldn’t your lower bound for the 50% interval be higher than for the 80% interval?’
If the intervals were centered—i.e., spanning the 10th to 90th and the 25th to 75th percentile, respectively—then it should be, yes.
I could now claim that I wasn’t giving centered intervals, but I think what is really going on is that my estimates are not diachronically consistent even if I make them within 1 minute of each other.
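The nesting claim can be checked numerically. A minimal sketch, using a lognormal purely as an illustrative distribution:

```python
# For any single distribution, centered credence intervals nest: the 50%
# interval (25th-75th percentile) sits inside the 80% interval (10th-90th
# percentile). The lognormal below is illustrative, not a real model.
import random

random.seed(0)
samples = sorted(random.lognormvariate(0, 1) for _ in range(100_000))

def percentile(p):
    """Empirical p-th percentile of the sorted sample."""
    return samples[int(p / 100 * (len(samples) - 1))]

lo80, hi80 = percentile(10), percentile(90)
lo50, hi50 = percentile(25), percentile(75)
# lo80 < lo50 and hi50 < hi80: the 50% interval is strictly inside the 80%.
```

Quoting non-nested intervals is therefore only consistent if the two intervals aren’t both centered, or (as conceded above) if they come from inconsistent snap judgments.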
Interesting, thanks.
I also now think that the lower end of the 80% interval should probably be more like $5-15B.