I’m a bit surprised that there’s ~200M in longtermist grantmaking in the first 8 months of 2022 alone! Where is most of that money going?
(I feel a bit dense in asking this, but where are you getting ~200M in the first 8 months from? In the sheet it’s 139.5M; the ~200M is a projected estimate.)
The 139.5M for LTXR splits into
97M from FTX: 30M to biosecurity, 20M to AI, 16M to “other”, 10M to “empowering exceptional people”, 8M to epistemic institutions, 7M to econ growth, 2M to great power relations—all figures from this post
42.5M from Open Phil (raw data): 26.2M to “longtermism” broadly construed, 14.4M to biosecurity (10M being regrants), 1.8M to AI. The longtermism category has some interesting grants, like 3M to Kurzgesagt for making short-form video content
What jumps out to me: (1) FTX’s LTXR grants seem broader than OPP’s, and (2) FTX has so far granted 10x more to AI stuff than OPP. Looking into the latter a bit, OPP’s 1.8M went to a single grant (Open Phil AI Fellowship — 2022 Class, supporting 11 ML researchers over 5 years), while FTX’s 20M went to 76 grants, both big (e.g. 5M to Ought to build a language-model-based research assistant) and small (e.g. 50k to one person for 6 months of AI safety research). My sense is this is driven by some combination of a VoI-maxing orientation and fast grant decisions (inspired by Fast Grants; see this comment for more commentary on this parallel)
I was just eyeballing the graph! Thanks, your notes made sense. Cool stuff.
“FTX has so far granted 10x more to AI stuff than OPP”

This is not true; sorry, the Open Phil database labels are a bit misleading.
It appears that a couple of the Focus Areas are nested, e.g. ‘Potential Risks from Advanced AI’ is a subset of ‘Longtermism’, and the database download includes only one tag per grant. So, for example, this one grant alone from March ’22 was over $13M with both tags applied, but shows up in the .csv as only ‘Longtermism’. Edit: this is now flagged more prominently in the spreadsheet.
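For anyone re-running these per-tag totals from the downloaded .csv, here is a minimal sketch. The column names (‘Focus Area’, ‘Amount’) and the dollar formatting are assumptions about the export and may need adjusting; note that because of the single-tag issue above, a tag like ‘Longtermism’ will absorb amounts that also belong under ‘Potential Risks from Advanced AI’.

```python
import csv
from collections import defaultdict

def focus_area_totals(path):
    """Sum grant amounts per Focus Area tag in a grants CSV export.

    Assumes columns named 'Focus Area' and 'Amount' (formatted like
    '$13,000,000') — hypothetical names; check them against the actual file.
    Each grant carries only one tag in the export, so nested Focus Areas
    are undercounted relative to their parent category.
    """
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Strip currency formatting before converting to a number.
            amount = float(row["Amount"].replace("$", "").replace(",", ""))
            totals[row["Focus Area"]] += amount
    return dict(totals)
```

This only reproduces what the .csv says; untangling the nested tags would require the per-grant pages, where both tags are visible.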