As it happens, I did a quick-and-dirty version of this analysis yesterday—see this spreadsheet. It looks to me like Goodreads is actually helpfully aggregating ratings/reviews across editions (if you click on any one of the editions here, it shows you the same figures), and the rate at which the aggregate numbers have been going up appears to have increased meaningfully since the relaunch, which does seem to provide additional (encouraging!) evidence regarding its impact.
We recently re-opened the biosecurity scholarship program for applications (deadline: January 1st 2022) - see here.
The main reason is simply that most of the very top universities happen to be based in either the UK or the US. (The fact that ETH Zurich is the only non-UK/US university in the top 20 of both the QS and Times Higher Education rankings partly reflects this, although my sense is that these rankings have some pretty serious limitations and should be taken with a major pinch of salt.) I also think there are additional benefits associated with attending university in the UK/US, including in terms of opening up career opportunities in the English-speaking world.
I agree that ETH has some things going for it and including it might well have been a reasonable choice, although my impression is that its teaching language at the undergraduate level is German, which means that it’s not really a relevant option for the vast majority of potential applicants.
In general, the decisions about which universities to include involved a number of debatable judgement calls, so I think there is a decent amount of room for reasonable disagreement on this topic.
We did consider asking for less detailed information in the financial information section for the exact reason you point out, but we ultimately felt that the current approach struck the best balance between a number of countervailing considerations. (For example, having to ask all of the most promising candidates to provide additional information at a second stage would have added to turnaround times, which would in turn have required us to set earlier application deadlines.)
Note that the financial section of our application form already requires applicants to provide meaningfully less granular information than the CSS/FAFSA forms and most of the university-specific financial aid forms I have seen, so I’m hoping that this won’t be too onerous on candidates.
Thanks—I’ve now added something along those lines to the description.
See this section of the program page linked in the post:
If you meet the application criteria for our program for people looking to pursue careers related to global catastrophic biological risks, please apply only to that program. Our plans for that program are currently somewhat in flux, but we expect to start accepting applications sometime in the fall of 2021.
The main reason has to do with capacity/turnaround times. Our experience is that a lot of candidates apply very close to the deadline, and prospective grad students typically have to accept their offers in mid-April, so if we had set our deadline in, say, mid-March instead, this would have given us only c.4 weeks to process these applications (which as it happens is already going to be a busy period for the relevant team members for other reasons). The earlier deadline gives us more wiggle room, although it does come at the cost you highlight. Candidates who don’t apply in time for our deadline and find out in February/March that they’ll require funding may want to consider applying to the Long-Term Future Fund.
I think something along those lines could be pretty promising. I’m not sure it’d be the best fit for Open Phil in particular (given that we generally focus on somewhat larger-scale types of grantmaking), but I know of some other folks active in this space who have expressed an interest in this idea/closely related ideas.
Another thing which I think could potentially be really valuable would be for someone to pull together in one place the most important information regarding the practicalities of applying to these and other universities as an international student (including e.g. information about how likely one is to get admitted to such-and-such a university with such-and-such an academic background, which was mentioned in another comment). My sense from skimming some of the existing resources is that they often aren’t great, although I haven’t tried very hard to look for better ones and it’s possible that something like what I have in mind here already exists—in which case just sharing a pointer to this could be equally valuable.
That’s a good point. Unfortunately, it’s difficult for me to say anything particularly informative to address this, given that most of the criteria we have in mind are qualitative in nature and don’t really lend themselves to being operationalised in a way that would allow candidates to better assess their odds of success.
A couple of things I can say that may or may not still be somewhat helpful:
There is neither a maximum nor a minimum number of applications we intend to fund. Instead, we intend to fund all of the applications that are above a certain bar in terms of our funding criteria. Given my current (very uncertain) assumptions about the program’s reach and the likely composition of the candidate pool, my best guess is that we will only end up awarding a handful of grants, but in principle, there is nothing preventing us from giving out a significantly greater number of scholarships if the candidate pool ends up being correspondingly larger and/or stronger than anticipated.
If an applicant’s academic record is comparable to that of other folks who get admitted to these universities, they are unusually thoughtful about how to do good in the world, and all of this comes through in their application materials, then their chances of success should be good. (I appreciate that the second criterion in particular is somewhat unhelpfully vague!)
One way to address this issue would be to open the program for applications earlier in the year and evaluate applicants well ahead of the university application deadlines, so that candidates already have their scholarship offers in hand before they need to decide whether to apply to the relevant universities. That would make the decision easy for those who do end up receiving scholarship offers. (As BrianTan notes in another comment, we do plan to inform successful applicants for scholarships in the US before the relevant university application deadlines, in order to allow them to include information about the scholarships they have been awarded in their university applications. However, as noted in my post, they will have to get started with the required advance preparation - taking the required standardised tests, collecting references, etc. - much earlier than this.)
Given the timing of when we started working on launching this program, the fact that we will need some time to process applications, and the fact that applications to the relevant universities require advance preparation, this sort of format wasn’t really an available option for us this year, but it is something we may or may not consider doing next time around (assuming we decide to run another iteration of the program).
Another thing I’ll note is that obviously, folks with sufficiently strong academic backgrounds who apply to these universities generally also have a non-zero chance of getting admitted to and receiving funding from the universities themselves (even if for one reason or another they don’t end up receiving a scholarship from us), and they should factor this into their decisions about whether to apply.
Thanks for the suggestion—I thought I had addressed this by asking for people’s “GPA/aggregate score”, but I’ve now added a few sentences to the question description to further clarify that we’re just looking for people to use whatever metric of aggregate performance is commonly used wherever it is they go to school (not necessarily a GPA).
I looked into this recently, using Goodreads data as a proxy for sales. My takeaway was that sales of these books have been surprisingly linear over time, rather than being concentrated early on: Superintelligence; Doing Good Better; TLYCS
Presumably, the trends in Goodreads ratings/reviews need to be interpreted in the context of the (considerable) growth in Goodreads’ active user base over time. For that reason, wouldn’t linear-ish trends in the Goodreads data actually point towards more frontloaded growth profiles for sales (and for the number of people who have read the books)?
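For what it's worth, a toy calculation (with entirely invented numbers, not real Goodreads figures) illustrates the mechanism: if cumulative ratings grow linearly while the platform's active user base grows, the per-active-user rating rate must be falling over time, which is the pattern a frontloaded readership would produce.

```python
# Hypothetical yearly figures, purely for illustration:
active_users = [20, 30, 45, 68, 100]    # millions of active users, growing ~50%/yr
cumulative_ratings = [2, 4, 6, 8, 10]   # thousands of ratings, perfectly linear

# New ratings added each year (constant, by construction)
new_ratings = [cumulative_ratings[0]] + [
    b - a for a, b in zip(cumulative_ratings, cumulative_ratings[1:])
]

# Ratings per active user each year: steadily declining,
# i.e. the book's readership is shrinking as a share of the platform
per_user_rate = [r / u for r, u in zip(new_ratings, active_users)]
print(per_user_rate)  # [0.1, 0.0667, 0.0444, 0.0294, 0.02] (approx.)
```

If actual readership were flat relative to the user base, cumulative ratings would instead grow superlinearly, so a linear series is consistent with an early peak followed by decline.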