The Navy's stance on this is that no robot that is autonomous or thinks for itself can have the authority to fire a weapon or detonate anything. There must always be a human in the loop.
The day a robot is less likely than a human to cause friendly fire or accidentally target civilians is the day it'll be considered immoral to let humans make the decision.

Then again, if humans incorrectly fire the weapon 5% of the time, and the robot only 1%, can you blame the robot (or the company that made it) for that 1%?
I especially love this part:
"disclaimer: I am neither condoning nor condemning the weaponization of robots, just stating the facts that I am aware of."
lol.
I often work for the US military, so I need to be careful about what I say.
