Wonderful questions, Madhav, thank you. The primary issue is simply a lack of knowledge and wisdom: the people who allocate military funding have never encountered AGI x-risk as a serious concern outside of the movies. The person laughing in the video I linked was the head of AI R&D for the entire Pentagon.
I will endeavor to answer all of your questions when I write the next post on this topic. Thank you very much for the kind advice and interest.
It would be helpful to hear more details (including sources) about the problem you've found:
- What has the NSA publicly announced about its position on AGI?
- What have the external academic community or relevant nonprofits assessed its likely plans to be?
- Which decision-makers are involved in setting the NSA's policies on AGI development and/or safety?
Also, please add a more specific call to action describing:
- The action you want taken
- Which kinds of people are best suited to take it