I’m all for surgical robots, the emerging field of ‘drone journalism’ for data gathering, and even exoskeletons. But could other uses go too far?
Two types of robots worth considering this week:
Exhibit A: Robots on the battlefield. The Guardian reported that AI experts have called for a boycott of the Korea Advanced Institute of Science and Technology (KAIST) in South Korea over its work on autonomous weapons. The arms race for autonomous robots as machines of war is real: the US military, together with Lockheed Martin, has been developing autonomous armored vehicles.
Exhibit B: Then there’s the more benign use of a robot: in a coffee shop! The da Vinci surgical robot (which I have written about) was used in a ‘demo’ of sorts in Cullman, Alabama, to give people a chance to see its capabilities in a friendly setting. This robot typically handles gall bladder and hernia procedures. (No fear, it’s not an autonomous bot.) Nice touch, humanizing this strange-looking, refrigerator-sized, four-armed robot.
The point being, teaching robotics ought to come with a layer of ethics. It’s not enough to develop breakthrough robots just because we can. There is such a thing as the four Laws of Robotics, written by science fiction author Isaac Asimov. They are:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
- A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
The fourth, known as the Zeroth Law, was added later by Asimov, and it takes precedence over the other three. We may have begun crossing that line, and ignoring these laws altogether.
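What makes the laws interesting is their strict precedence ordering: a lower law never overrides a higher one, and the later Zeroth Law outranks the original three. As a thought experiment only, here is a minimal Python sketch of that ordering. The predicates (`harms_humanity`, `harms_human`, and so on) are hypothetical stand-ins I've invented for illustration; deciding them in the real world is precisely the hard part of the ethics debate.

```python
# Illustrative only: Asimov's laws modeled as a strict priority ordering.
# The boolean fields are hypothetical stand-ins; no real system can
# actually evaluate "harms humanity" as a simple flag.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class Action:
    harms_humanity: bool = False
    harms_human: bool = False
    disobeys_human_order: bool = False
    endangers_robot: bool = False


# Listed in precedence order: the Zeroth Law (added last by Asimov)
# outranks the First, which outranks the Second, and so on.
LAWS = [
    ("Zeroth Law", lambda a: not a.harms_humanity),
    ("First Law", lambda a: not a.harms_human),
    ("Second Law", lambda a: not a.disobeys_human_order),
    ("Third Law", lambda a: not a.endangers_robot),
]


def permitted(action: Action) -> tuple[bool, str]:
    """Check laws in precedence order; the first violation blocks the action."""
    for name, check in LAWS:
        if not check(action):
            return False, f"violates the {name}"
    return True, "permitted by all four laws"


if __name__ == "__main__":
    # An order whose execution would hurt someone is refused:
    # the First Law outranks the Second.
    print(permitted(Action(harms_human=True)))      # (False, 'violates the First Law')
    print(permitted(Action(endangers_robot=True)))  # (False, 'violates the Third Law')
    print(permitted(Action()))                      # (True, 'permitted by all four laws')
```

The point of the toy is structural, not practical: each law acts as a filter, and any conflict resolves in favor of the higher-priority law. The real debate, as the UN discussions below show, is about who or what gets to evaluate those predicates.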
Interestingly, the UN this week has been addressing the pace of robotics through the Convention on Certain Conventional Weapons (CCW) and the UN Institute for Disarmament Research. Much of the debate comes down to semantics: what separates ‘autonomous’ from ‘automated’, and what constitutes ‘human control’ of these devices.