Yeah, I think that's a good point. I could see how you might send a civilisation the blueprints for atomic weapons and hope they wipe themselves out, or something like that; that does seem feasible.
I guess I'm a bit more skeptical when it comes to AI. It's hard to get code to run at all, and it has to be tailored to the hardware. And if you were going to teach them enough to build advanced AIs themselves, there'd be a lot of uncertainty about what they'd end up making; there would certainly be bugs in the code.
It’s an interesting argument though and I can really see your perspective on it.
Nice writeup.
“The primary theory is that alien civilizations could continuously broadcast a highly optimized message intended to hijack or destroy any other civilizations unlucky enough to tune in.”
One question I have is whether this is even possible, and how difficult it would be.
I mean, if I took a professional programmer and told them to write a message that would hijack another computer when sent to it, isn't that extremely hard to do? Even if you already know all about human programming, you have no idea what system the target machine is running or how it is structured.
Isn't that problem then something like 10,000x harder when dealing with a totally different species and their computing equipment? The sender of the message would have no idea what computational structures the receivers were using, and absolutely no idea of their syntax or coding conventions. Even using binary rather than trinary or decimal is a choice.
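To make that last point concrete: even a bare string of digits is ambiguous until sender and receiver agree on a radix. A minimal Python sketch (the string "101" is just a toy example of my own, not anything from the post):

```python
# The same digit string decodes to different numbers depending on
# which base the receiver happens to assume.
message = "101"

as_binary = int(message, 2)    # read "101" in base 2
as_trinary = int(message, 3)   # read "101" in base 3
as_decimal = int(message, 10)  # read "101" in base 10

print(as_binary, as_trinary, as_decimal)  # -> 5 10 101
```

And that's with both parties already agreeing on digit symbols, digit order, and positional notation, which are themselves conventions an alien receiver might not share.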