You probably know that America’s nuclear-tipped ICBMs cannot be launched without the simultaneous turning of two keys by two physically separated silo launch-control officers.
That dramatically reduces the possibility—or opportunity—for an unauthorized launch.
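The two-key protocol amounts to a simple rule: no single party can authorize the action alone. A minimal sketch, with entirely hypothetical names (`TwoKeyControl`, `turn_key`), might look like this: authorization succeeds only when two distinct operators turn their keys within the same short time window.

```python
# Hypothetical sketch of the "two-man rule": an action is authorized only
# when two *distinct* operators turn their keys within a narrow time window.
# All class and method names here are illustrative, not a real API.
import time

class TwoKeyControl:
    """Requires two separate operators to authorize within a short window."""

    def __init__(self, window_seconds=2.0):
        self.window = window_seconds
        self.turned = {}  # operator id -> timestamp of most recent key turn

    def turn_key(self, operator_id):
        self.turned[operator_id] = time.monotonic()

    def authorized(self):
        # Authorization requires at least two distinct operators,
        # both of whose key turns are still within the window.
        now = time.monotonic()
        recent = [op for op, t in self.turned.items() if now - t <= self.window]
        return len(set(recent)) >= 2

control = TwoKeyControl()
control.turn_key("officer_a")
print(control.authorized())  # one key alone is never enough -> False
control.turn_key("officer_b")
print(control.authorized())  # two distinct keys inside the window -> True
```

The point of the design is that the same operator turning the same key twice still counts as only one authorizer; unilateral action remains impossible by construction.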
That’s a very smart idea: requiring collaboration to prevent catastrophe. It also requires employing two officers instead of one, which, on the face of it, benefits the economy by sustaining employment.
Now imagine that one of those two keys dangles from the neck of a robot, while the other remains in the possession of a human.
That makes the robot a “cobot”—a “collaborative” or “cooperative” robot, but with a mission somewhat different from how Carnegie Mellon University, a world leader in advanced technological engineering, defines the primary purpose of these robots (on its website):
“Our CoBot robots follow a novel symbiotic autonomy, in which the robots are aware of their perceptual, physical, and reasoning limitations and proactively ask for help from humans.”
In the Carnegie Mellon formulation of the mission, the emphasis is on requesting assistance; on the ICBM launch-control model, the mission is, at least in part, if not primarily, to prevent one’s counterpart from doing something unauthorized. That is to say, the collaborative mission is as much a matter of prevention as of enabling.
Note that Carnegie Mellon is conceptualizing and marketing its cobots under the comforting, reassuring rubric of harmless, occasionally helpless “helpers” (as do most inventors of things that have potentially dark uses, e.g., the recently developed “spiderman”/“gecko” gloves for vertical scaling of smooth surfaces, including those of commercial offices, government facilities or your home).
Before you scream “oxymoron!” because you think symbiosis and autonomy are incompatible, reflect on the fact that symbiosis is defined in terms of mutual benefit, not mutual dependency. That raises the question of whether the Carnegie Mellon engineered symbiosis, looking forward to future models with limbs (which Carnegie Mellon’s current cobots do not have) and even more advanced artificial intelligence, also includes interdependency, or, more to the point, excludes full robot autonomy.
That’s important to ask, since, after all, a robot’s asking for help is logically consistent with full autonomy in the sense of its not needing that help, just as a child’s asking to have its shoelaces tied is.
Cobot Servant, or Master?
Other conceptualizations of “cobot” seem to veer toward cobot as master, rather than servant: “a computer-controlled robotic apparatus that assists a human worker, as on an assembly line, by guiding or redirecting motions initiated by the worker who provides the motive power.” (Source: http://dictionary.reference.com/browse/cobot. Boldface is mine. How is that different from a cotton plantation overseer’s role in antebellum Georgia, circa 1860?)
Despite the “collaborative” nature of the human-cobot interactions in both formulations, this conceptual difference in mission has the potential to evolve into diametrically opposed operations, no different from the master-versus-slave opposition. So we must remain vigilant, lest the innocent-sounding cobot conceptual frameworks lead us down a road to robotic enslavement (or worse) rather than benign assistance.
Mandatory Man-Cobot Co-employment
To help ensure that neither our species nor our jobs are reduced to slavery or, worse, extinction, perhaps we should consider the engineering of cobots as a policy, rather than merely as an opportunity. That would mean making “cobotic engineering” mandatory for robots that, if fully autonomous, would make the employment of even one human unnecessary and/or threaten the quality or quantity of human life.
Such a limitation on the autonomy of robots would be akin in intent and consequence to the democratic “one man, one vote” principle, in the form of a “one robot, one man” rule, to apply even if and when robots achieve complete vertically and horizontally integrated self-manufacturing capabilities, or at least to postpone that day.
As a job protection strategy, this “co-employment” proposal has a merit that conventional over-staffing does not: When Asian department stores employ more retail staff than are necessary, the objective is primarily to sustain the economy through robust employment and its associated spending, reduce the volume of unemployment and welfare claims, and perhaps even to promote social stability by preventing the jobless from resorting to criminal activity in economic desperation.
In the human-robot co-employment scenario, literal, as well as economic and social survival—and of more than merely one displaced human—may be at stake.
Be Careful About What You Wish and Design For
Hence, it is critical that our robot designers and engineers (meaning, for now, humans) carefully frame their own mission. Is it to create robots whose prime directives require (and not merely allow, at their “discretion”) them to
- amplify the power and skills of a human
- supplement the skills and missions of a human through a division of labor
- “guide or redirect” the actions of a human (as helpers, not as controlling overseers)
- control the actions of a human (e.g., robotic prison guards and riot-control units)
- request and depend upon human assistance
- collaborate with humans hunting other humans
- obey a human in “fail-safe” mode (to prevent catastrophes, including human-induced as well as robot-induced ones)?
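The last of these, obedience in “fail-safe” mode, is the easiest to make concrete. Here is a minimal, purely hypothetical sketch (the `Cobot` class and its methods are illustrative, not any vendor’s API) of a prime directive that requires, rather than merely allows, deference to a human once fail-safe mode is engaged:

```python
# Illustrative sketch: in fail-safe mode, the human command is mandatory
# and overriding; the cobot may not fall back on its own plan.
class Cobot:
    def __init__(self):
        self.fail_safe = False

    def enter_fail_safe(self):
        self.fail_safe = True

    def act(self, planned_action, human_command=None):
        if self.fail_safe:
            if human_command is None:
                return "halt"        # no human input -> do nothing at all
            return human_command     # obey the human, not the internal plan
        return planned_action        # normal autonomous operation

bot = Cobot()
print(bot.act("advance"))                              # normal mode -> advance
bot.enter_fail_safe()
print(bot.act("advance"))                              # no human command -> halt
print(bot.act("advance", human_command="stand down"))  # human overrides -> stand down
```

The design choice worth noticing is that obedience is not left to the cobot’s “discretion”: once in fail-safe mode, its own plan is structurally unreachable without a human in the loop.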
It is imperative to distinguish and adopt the missions and designs that protect us from engineering the extinction of our jobs and/or our species. One way to achieve this is to set and legislate design standards that require and retain some form of human control in cobots that might otherwise be created with dangerous autonomy.
If that is not feasible, cost-effective, practical or efficient, then devising some equivalent control scheme should be given top priority, now. To do less would be tantamount to yet another technological innovation that, through sheer stupidity and folly, will, if it comes to pass, do us no good.
A frontal cobotomy.