“AI will have less effect on field X over the next 5-10 years than AI proponents suggest” seems to have a good track record as a prediction for most X. Why should we think biology, and this time, is different?
Autonomous vehicles stand out as an example. Are there others?
However, I feel like “AI capabilities will advance slower than most people expect,” a similar prediction, has had a poor track record over the past 10 years.
Pretty much all fields of computational science have been using machine learning for years. A lot of cool stuff has been achieved and it’s allowed us to do a few extra things, but I wouldn’t say it’s drastically sped up research or anything.
The main speed-up from LLMs will probably come from freeing up researchers’ time: faster grant applications and other assorted paperwork, and help with writing abstracts and certain sections of papers. I don’t see them drastically speeding up research itself either, at least in the near term.
Flagging that I approve of this post; I do believe that the relevant biosecurity actors within EA are thinking about this (though I’d love a more public write-up of the topic). Get in touch if you are thinking about this!
Agreed, and it’s something biosecurity folks (including some focused on GCBR mitigation) are increasingly thinking about. It’s a longstanding (and evolving) concern, but by no means a solved problem.