GWS Robotics creative director David Graves, Alan Winfield, professor of robot ethics at the University of the West of England, and Joanna Bryson, associate professor in the department of computer science at the University of Bath, are among those who believe anthropomorphic bots spark fear as well as fascination.
David, who has worked as a computer programmer for nearly two decades, said: “People tend to prefer social robots that have human-like qualities of behaviour and personality but do not look particularly human.
“I think this is because there are many science fiction dramas in which robots that look and move like humans end up taking over the world. It plays into our fear that robots which are indistinguishable from humans are dangerous.”
Japanese researcher Masahiro Mori’s “uncanny valley” theory, which he developed in the 1970s, states that we react positively to robots if they have physical features familiar to us, but they disturb us if they start looking too much like us.
GWS Robotics in Knowle has invested in a humanoid robot called Pepper, which has been trained to ‘understand’ emotions, as well as use words and gestures.
The 4ft-tall robot, which can be customised to individual businesses, moves around on wheels. While cute and somewhat childlike, it is clearly a robot.
David, a Cambridge University graduate who has worked for multinational businesses, said: “I don’t think many people would find Pepper threatening. He is able to engage with people through conversation and his touch screen and is a regular visitor to schools, hospitals and businesses.”
Professor Winfield said unscrupulous manufacturers could use human-like robots to exploit vulnerable users.
He said: “Robots are no more alive than your toaster. If people are led to think otherwise, they might become afraid of robots and could be exploited by unscrupulous manufacturers.
“Even if people are not directly exploited for financial gain, they might be prompted to ‘care for’ a robot, and hence neglect other duties of care – recalling the Tamagotchi effect.”
But creators of human-like robots, such as David Hanson of Hanson Robotics, which developed Sophia, argue that human-like robots are an exploration of humanity. Sophia is the first robot to be granted honorary citizenship, by Saudi Arabia.
Hanson Robotics’ work has been used successfully in classrooms and in medical training. But Hanson recognises there are ethical concerns and the firm has disclosed the details of Sophia’s technology – while making it clear the robot is definitely not alive.
Professor Bryson has observed that some people are defending Sophia’s human rights, even though it is a robot.
She said: “Robots are simply not people. They are made and bought by humans, who are the responsible agents.”
Professor Bryson believes robots which are designed to look like humans present a greater risk to people.
She said: “People tend to think they can rally their knowledge and experience of human interactions in situations where they cannot.”
How people interact with humanoid robots
AI researchers at the University of Bath have been awarded €250,000 to conduct a series of unique experiments on how people interact with humanoid robots.
Pepper robots are being used in a variety of scenarios to see whether a humanoid design makes people see robots as more human.
The Engineering and Physical Sciences Research Council (EPSRC) Principles of Robotics, a national-level document on AI ethics which Bryson and Winfield helped develop, says it is unethical to design robots in a deceptive way and that their machine nature should be made transparent.
Philip Graves, of GWS Robotics, who has been programming since the 1980s, has previously called for a halt to granting ‘electronic personhood’ to robots.
He believes programmers and operators should retain responsibility for the machines, even as artificial intelligence develops.