However, this approach is a bit silly because it does not model the acceleration of research: If there are no other donors in the field, then our donation is futile because £10,000 will not fund the entire effort required.
Could you explain this more clearly to me, please? A worked example with some numbers would likely make it much clearer. Looking at the development of the Impossible Burger seems a fair phenomenon to base GFI’s model on, at least for now, and at least insofar as it is being used to model a GFI donation’s counterfactual impact in supporting similar products GFI is trying to push to market. I don’t understand why the approach is silly just because £10,000 wouldn’t fund the entire effort, or how that is tied to the acceleration of research.
There are two ways donations to GFI could be beneficial: speeding up a paradigm change that would have happened anyway, and increasing the odds that the change happens at all. I think it’s not unreasonable to focus on the former, since there are no fundamental barriers to developing vat meat and there are some long-term drivers for it (energy/land efficiency, demand).
However, in that case it helps to have some kind of model for the dynamics of the process. Say you think it’ll take $100 million and 10 years to develop affordable vat burgers; $1 million now probably represents more than 0.1 years of speedup, since investors will pile on as the technology gets closer to being viable. But how much more does it represent? (And how much is that speedup worth?) Plus, in practice we might want to decide between different methods and target meats, but then we need a decent sense of the funding response for each of those.
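To make that intuition concrete, here’s a minimal toy model (my own illustrative numbers, not anything GFI has published): baseline funding ramps up exponentially, and the follow-on cascade is assumed to be triggered by progress rather than by calendar time, so early money shifts the whole schedule.

```python
import math

# Toy model with illustrative assumptions, not real forecasts.
R = 100.0   # total cost to develop affordable vat burgers, $M
T = 10.0    # years to get there on the baseline path
D = 1.0     # donation today, $M
g = 0.5     # assumed exponential growth rate of the baseline funding rate

# Baseline funding rate f(t) = f0 * exp(g*t); solve f0 so that
# cumulative funding f0*(exp(g*T) - 1)/g equals R at time T.
f0 = g * R / (math.exp(g * T) - 1)

# Naive speedup: the donation buys a proportional slice of the effort.
naive = D / R * T  # = 0.1 years

# If the follow-on cascade is milestone-triggered, the donation shifts the
# whole schedule earlier by the time the baseline would have needed to
# raise D at today's (low) funding rate.
shift = math.log(1 + g * D / f0) / g

print(f"baseline initial funding rate: ${f0:.2f}M/yr")
print(f"naive speedup:           {naive:.2f} years")
print(f"milestone-shift speedup: {shift:.2f} years")
```

On these made-up numbers the $1 million buys roughly 1.8 years rather than 0.1, which is the whole force of the “investors pile on” point; the answer is obviously very sensitive to the assumed ramp rate and to whether follow-on funding really is milestone-triggered.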
I agree that this is possible. I’d say the way to go is to generate a few possible development paths (paired $/time and progress/$ curves) based on the development of historical technologies and domain experts’ prognostications, and then look at the marginal effect of a donation under each path.
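As a sketch of what that could look like (the paths and weights below are placeholders, not real forecasts), you can reuse the ramp model above across a handful of candidate development paths and probability-weight the marginal effects:

```python
import math

def speedup(R, T, g, D):
    """Years of speedup from a donation D ($M) under an exponential
    funding ramp calibrated to total cost R ($M) over T years."""
    f0 = g * R / (math.exp(g * T) - 1)
    return math.log(1 + g * D / f0) / g

# Hypothetical paths: (label, total cost $M, years, ramp rate, probability)
paths = [
    ("cheap/fast",     50.0,  7.0, 0.6, 0.2),
    ("central",       100.0, 10.0, 0.5, 0.5),
    ("expensive/slow", 500.0, 20.0, 0.3, 0.3),
]

D = 1.0  # donation, $M
expected = 0.0
for label, R, T, g, w in paths:
    s = speedup(R, T, g, D)
    expected += w * s
    print(f"{label:>14}: {s:.2f} years of speedup")
print(f"probability-weighted speedup: {expected:.2f} years")
```

Converting years of speedup into welfare terms would then be a separate multiplier, and comparing methods or target meats just means giving each its own set of paths.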
I haven’t looked into this in depth, but it seems doable, if not straightforward. Note that the Impossible Burger isn’t a great model for full-on synthetic meat. Their burgers are mostly plant-based, and they use yeast to synthesize soy leghemoglobin, a single protein, which is very much within the purview of existing biotech. This contrasts with New Harvest’s and Memphis Meats’ efforts to synthesize muscle fibers to make ground beef, to say nothing of the eventual goal of synthesizing large-scale muscle structure to replicate steak, etc.
And we have a lot less to go on there. Mark Post at Maastricht University made a $325,000 burger in 2013. Memphis Meats claimed to be making meat at $40,000/kg in 2016.* Post also claims that scaling up his current methods could get to ~$80/kg (~$10/burger) within a few years. That’s still about an order of magnitude above mainstream beef prices, and I think you’d need someone unbiased with domain expertise to give you a better sense of how much tougher closing that last gap would be.
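A quick sanity check on those figures (the patty weight and retail beef price are my own ballpark assumptions):

```python
# Rough sanity check on the cost figures above; patty weight and
# retail ground-beef price are ballpark assumptions, not sourced.
patty_kg = 0.113          # ~1/4 lb patty
post_target = 80.0        # Mark Post's projected $/kg
memphis_2016 = 40_000.0   # Memphis Meats' claimed $/kg in 2016
retail_beef = 8.0         # assumed mainstream ground beef, $/kg

print(f"projected cost per burger: ${post_target * patty_kg:.2f}")        # ~$9
print(f"2016 -> projection: {memphis_2016 / post_target:.0f}x cost drop needed")
print(f"projection -> mainstream: {post_target / retail_beef:.0f}x gap")  # ~10x
```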
*Note: according to Sentience Politics’ report on vat meat. I haven’t listened to the interview yet.