Major advances in artificial intelligence and sensor technology mean that robots could soon be a reality in our everyday life. Vision looks at the ethical, legal and technological implications of creating artificial beings built to do our bidding
What comes to mind when you hear the word robot? It will most likely be C-3PO, the Terminator, Johnny Five or the Jetsons’ robotic maid, Rosie. These smart, agile humanoids have enchanted us over the past few decades. Sadly, they are all fictional.
One of the most advanced humanoid robots in existence is Honda’s ASIMO, a child-sized “astronaut”, which can run on two legs, climb stairs and respond to voice commands. But ASIMO requires at least one handler to operate it, takes a day to configure and has a battery that drains within an hour.
The areas where robots have made inroads into our lives are manufacturing, where simple, repetitive tasks can be performed more efficiently by machines, and the military, where the high cost of drones is offset by lives saved. When it comes to domestic robots, Roomba rules: a small, disc-shaped, autonomous vacuum cleaner that inhales the dust and dirt in its path. It is a nod to the robobutlers postulated in the 1960s, but it is unlikely to be ironing your clothes any time soon.
Russian internet entrepreneur Dmitry Grishin believes that this is soon to change. In his mind, we are entering an era of “personal robotics” like the era of personal computing in the 1980s. Thanks to cheap components, the proliferation of software developers and the increasing ease of outsourcing to places such as China via the web, building robots is now much less costly. “The big problem in the 1970s, 1980s and 1990s was only a few people with large funds could build robots,” he says. “You needed a lot of capital and had to take on a lot of risk. The evolution of the internet, batteries and chips makes robots cheaper. I believe we will start to see a lot more robots in the home and the street.”
Grishin is launching a US$25m venture fund that will invest in robotics startups. His aim is to find 10-20 early-stage companies working in areas such as healthcare, entertainment and transport. “What robotics needs is to shift from a tech focus to a problem-solving focus. There is lots of cool technology, but people don’t know what to do with it.”
For law professor and robot expert Ryan Calo, we are at a junction between an era of mostly industrial and military robotics and one of personal robotics, or “co-robotics”, where robots and people work alongside one another, particularly in areas such as healthcare, where robotic surgeons and orderlies can reduce pressure on staff. He also predicts that drones will become a common sight in our streets, performing civic duties such as policing and firefighting.
He agrees with Grishin that this is down to technological progress. “We have made huge gains in artificial intelligence and sensor technology, particularly robot vision, which is very complex,” he says. He cites the Kinect as a key example of how the barriers to entry to robotics have been lowered. “Kinect has motion-capture sensors and costs US$200 for what used to cost US$3,000,” he says. He believes the open nature of the platform and others like it serves to increase the rate of innovation.
One of the driving forces for the era of personal robotics is an ageing population. UN figures suggest that by 2050, the number of people over the age of 60 will exceed the number of people under the age of 15 for the first time in history – around 21 per cent of the global population. This affects the economic growth of countries by straining healthcare systems and changing the shape of labour markets. In Japan, the impact is more profound: by 2050, almost 40 per cent of its population will be over 65. A number of companies including Toyota have developed robot “nurses”, which can perform simple tasks such as lifting people out of bed and helping them walk. Other robots focus on information and communication, particularly through telepresence, a form of video-conferencing.
Telepresence robots tend to be roughly human-sized and often feature screens instead of a head, which can be occupied by the face of a caller – be it a friend or family member, or even a doctor performing a remote consultation.
One such robot is Barcelona-based PAL Robotics’ REEM humanoid. It has a head with vision, face-tracking and face-recognition capabilities, as well as a multimedia touch-screen. REEM is customisable, allowing the buyer to adapt the robot to suit their needs using a range of applications.
Most of the robots selling on any notable scale do not look anything like the stereotypical humanoid. The Roomba, made by US-based iRobot, has sold some 7.5 million units. Likewise, thousands of PackBots – a tank-like machine used for bomb disposal and surveillance – have been sold to the US military. iRobot’s CEO, Colin Angle, explains that although legged robots are fascinating, they present a “fabulously complex research problem” and are very expensive because “biology doesn’t have the same cost matrix as we do”.
Despite this, humans have long been obsessed with creating robots in our own image. In many cases, humanoid robots have acted as valuable subjects for biomechanical research: by simulating human movement, we can better understand it. Professor Henrik Schärfe, Director of the Centre for Computer-Mediated Epistemology at Aalborg University in Denmark, has taken this to the extreme with the creation of a robotic doppelgänger. Working with Professor Hiroshi Ishiguro at Osaka University, he has developed the Geminoids, hyper-lifelike androids designed to look like their creators.
He and his team are researching the use of robots as instruments of communication. He explains: “The basic mode of human communication is one human body in front of another human body. That is how our sensory systems work. Everything else is derived from that.”
As robots become more prevalent, complex ethical challenges arise. For example, if a robot harms a human, who is responsible? The robot’s manufacturer? The robot itself? The human operator? As Schärfe explains, though, for the foreseeable future robots are likely to be controlled by a mixture of autonomous systems and human controls, much like the modern car.