Thanks for the post! My quick thoughts:
> I’m not hugely worried about today’s LLMs causing x-risk.
> I do think they could cause catastrophic harm in the hands of bioterrorists, but that’s about it
> I am going to basically shit my pants when an AI agent can,
> 1/ take a brief from me for a brand new tv,
> 2/ have it be delivered to my home on time, on spec and on budget,
> 3/ have also organised installation by a technician,
> 4/ all while I’m out of the loop after step 1
Seems doable most of the time in the near future, but the failure rate will likely stay high enough that people won’t want to rely on it for a while.