I agree with Rob’s points.
This article makes several statements that I interpret as somewhere between intentionally misleading and outright incorrect.
For instance, this reply says the 2.19% Fed Funds rate was the prevailing rate in 2019, and isn’t intended to be a forecast of the future. But since we know the Fed Funds rate is 0% today, why would that rate be used to argue for a forward-looking change in behavior?
Similarly, the 2019 return of a short-term bond fund gives literally no information about what returns a similar fund would offer today. Indeed the beauty of the bond market is that it does offer guaranteed future returns if you hold a bond to maturity. As of today, US government bonds (which carry no risk of default) offer a guaranteed annualized return of:
0.08% if held for 3 months,
0.10% if held for 12 months,
0.13% if held for 2 years,
0.94% if held for 10 years
So assuming an organization needed to make a grant within 12 months, they could currently expect to make 0.10% on their cash balance. Needless to say, this is 1/10th of 1%, or <1/20th of the figure cited in the post. And it seems a stretch to assume organizations hold cash for 12 months before granting it. So the likely benefit is perhaps at best 1/10th and potentially <1/100th of what’s cited here.
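To make that arithmetic explicit, here is a minimal sketch. The $10m balance and the two-month holding period are my own hypothetical inputs; the 0.10% and 2.19% rates are the ones discussed above.

```python
# Rough scaling of the claimed benefit, using the yields quoted above.
# The $10m balance and the two-month holding period are hypothetical
# illustrations, not figures from the post or from GiveWell's filings.

cited_rate = 0.0219        # the 2.19% rate discussed in the post
current_12m_rate = 0.0010  # the 12-month Treasury yield quoted above

balance = 10_000_000       # hypothetical cash balance

# If the cash really sat for a full 12 months:
benefit_12m = balance * current_12m_rate
print(f"12-month hold at 0.10%: ${benefit_12m:,.0f}")                   # ~$10,000
print(f"Ratio to the cited rate: {current_12m_rate / cited_rate:.3f}")  # ~0.046, i.e. <1/20

# If the cash is actually granted out after ~2 months, the benefit shrinks further:
benefit_2m = balance * current_12m_rate * (2 / 12)
print(f"2-month hold: ${benefit_2m:,.0f}")                              # ~$1,700
print(f"Ratio to the cited benefit: {benefit_2m / (balance * cited_rate):.4f}")  # ~0.0076, i.e. <1/100
```

The dollar figures scale with whatever balance you assume; the two ratios at the end don’t, and they are what drive the 1/20th and 1/100th comparisons above.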
Granted, this is not necessarily obvious stuff. However, it is and should be easily understood by anyone holding themselves out as giving financial advice to other people or institutions.
Sure:
Cash balance methodology
You say:
Incorrect and misleading statements in these four paragraphs:
First paragraph: Dividing interest earned over the year by end-of-year cash is not an upper bound estimate of interest. It’s actually more like a lower bound estimate, because it implicitly assumes that the cash level at the end of the year was the same as the average cash level throughout the year (see the numeric sketch after this list).
Since GiveWell is a grant-making organization, that’s a very dangerous assumption to make (and their 990 form shows them making ~$30m in grants over the course of the year).
You acknowledge this in the next section, but then propose an equally questionable workaround:
In your reply to me you suggest:
Even this approach seems incorrect or misleading:
The page for their Maximum Impact Fund shows the grants they make (which would come out of the cash balance shown in the 990), and it seems to show a lag of a couple of months between donations and grants.
The same page shows that funds received in the final three months of 2019 were substantially greater than those received in the first nine months of 2019, which is another reason that the end-of-2019 cash balance paints a misleading picture.
Second paragraph: The year you’ve chosen for your analysis happens to be the year with the highest cash rates in the past 12 years. This isn’t acknowledged, and you don’t backtest your strategy in other years (e.g. all the years where rates were ~0%).
Third & fourth paragraphs: Same analytical error as the first paragraph.
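Here is the numeric sketch referenced above for the first-paragraph point. Every number in it is hypothetical, chosen only to mimic a grant-maker that pays out grants during the year and receives most of its donations in Q4:

```python
# Hypothetical cash profile for a grant-maker: a modest balance for most of
# the year, then a surge of donations in Q4. None of these numbers come from
# GiveWell's 990; they only illustrate the direction of the bias.
monthly_balances = [15e6] * 9 + [40e6, 55e6, 60e6]  # Jan..Dec, in dollars
interest_earned = 250_000                           # hypothetical

average_balance = sum(monthly_balances) / 12        # ~$24.2m
end_of_year_balance = monthly_balances[-1]          # $60m

rate_actually_earned = interest_earned / average_balance     # ~1.03%
rate_implied_by_eoy = interest_earned / end_of_year_balance  # ~0.42%

print(f"Rate actually earned:        {rate_actually_earned:.2%}")
print(f"Interest / end-of-year cash: {rate_implied_by_eoy:.2%}")
```

Dividing by the year-end balance understates the rate actually earned by more than a factor of two in this example, which is why the calculation behaves like a lower bound rather than an upper bound.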
Potential returns of other investment options
You say:
Citing the one-year, backward-looking return of JPST is misleading, and citing one year of drawdown data is naive:
There’s nothing about that asset that makes its one-year, backward-looking return a valid basis for extrapolation. Indeed, this is basically equivalent to picking a different asset that you now know performed well over a one-year period and using that to estimate opportunity cost. Why not Bitcoin in 2017 or Tesla stock in 2020?
And as for citing a year of drawdown data: the fund had a drawdown in March 2020! If you look at a similar index with a longer time series, you’ll see there have been drawdowns, even against the backdrop of a period of secularly falling interest rates that juiced bond returns.
What’s more, that particular fund tracks a bond benchmark but is actively managed, meaning its managers can take more risk than the benchmark implies (and charge a management fee for doing so).
This analysis suffers from the same problem. At least there’s an attempt to look at a window longer than one year, but again it’s a cherry-picked 20-year period, and it’s backward-looking returns data implied to be forward-looking. You also mention the drawdown in 2018. What was the maximum drawdown in 2008–2009?
Given that the organization grants funds a couple of months after receiving them, it has little ability to tolerate drawdowns, making comparisons to a higher-risk fund somewhere between incorrect and misleading.
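To put rough numbers on that, here is a minimal sketch. The 0.5% yield pickup and the 2% stress loss are hypothetical assumptions, not JPST’s actual figures:

```python
# Hypothetical comparison of yield pickup vs. drawdown risk over a short
# holding period. The 0.5% pickup and the 2% stress-scenario loss are
# illustrative assumptions, not JPST's actual figures.
extra_yield_per_year = 0.005   # hypothetical extra yield over cash
holding_period_months = 2      # funds granted a couple of months after receipt
stress_drawdown = 0.02         # hypothetical loss in a bad month

expected_pickup = extra_yield_per_year * holding_period_months / 12
print(f"Extra return over {holding_period_months} months: {expected_pickup:.3%}")       # ~0.083%
print(f"Hypothetical stress loss: {stress_drawdown:.1%}")                               # 2.0%
print(f"Years of extra yield wiped out: {stress_drawdown / extra_yield_per_year:.0f}")  # 4
```

On those assumptions, a single bad month erases roughly four years’ worth of the extra yield, which is the sense in which a two-month holder can’t tolerate drawdowns.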
Conclusion and staff time
You say:
As shown above, getting a counterfactual impact number in the millions of dollars is not “very likely… regardless of the estimation method”. That statement is somewhere between incorrect and misleading. For instance, if the cash balance is 10% of what you cite and the interest rate is more like 0.50%, it doesn’t hold up.
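A quick sanity check on that, using only the 0.50% rate just mentioned:

```python
# How large a balance, held for a full year, is needed to generate $1m of
# counterfactual interest at a 0.50% rate? Simple division, no other inputs.
target_impact = 1_000_000
rate = 0.0050

required_balance = target_impact / rate
print(f"Balance required for $1m at 0.50%: ${required_balance:,.0f}")  # $200,000,000
```

A $200m balance sitting idle for a full year is hard to square with an organization that granted out roughly $30m over the course of the year.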
The staff time estimate is also somewhere between incorrect and misleading:
If it’s just opening bank accounts, then the impact is GiveWell earning, on average, roughly the prevailing bank interest rate (close to zero for much of the relevant period).
If instead you want them (without the benefit of the hindsight you employ) picking actively managed JP Morgan bond funds, or creating and balancing a bond + stock fund, then I’d hazard it’s more than two hours of work per year!
So either it’s not much effort and not much return, or there can be higher returns but clearly more than two hours a year of effort.