I think it has a large chance of accelerating timelines by a small amount, and a small chance of accelerating timelines by a large amount. Such an organization can increase capabilities even if it isn't directly researching how to scale up language models. Figuring out how to milk language models for all the capabilities they have, probing the limits of such milking, and making highly capable APIs easy for language models to use are all things which shorten timelines. You go from needing a super-duper AGI to take over the world to a barely superhuman AGI, if all it can do is output text.
Adjacently, contributing to the AGI hype shortens timelines too.
I also think the above assumes monetary or prestige pressures won't cause organizational value drift. It's quite likely that whoever starts this will face pressure from funders, staff, and others to turn it into an AGI firm. You need good reason to believe your firm won't cave to that pressure, and I see nothing addressing this concern in the original post.