
Meet the new generation of robots. They're almost human…

In Bristol, Molly the robot helps the elderly; in Lyon, iCub plays children's games. And, globally, some extraordinary developments are under way in artificial intelligence that could have a profound effect on the way we live

In a darkened robotics laboratory in Lyon, Peter Dominey and Stéphane Lallée are playing a game with a cute-looking humanoid called iCub. The game is Toybox and the object is to uncover a toy trumpet that Lallée has just hidden under a box to iCub's right. For a human three-year-old such a game is child's play, but until now it has been beyond the scope of most machine intelligences. Not for iCub, however.

"I will put box on the right," says iCub, making sure it has understood Lallée's instructions. "You will put the toy on the left. I will put the box on the middle."

Staring intently at the box, iCub reaches out with its left hand, grasps the box and moves it to the centre of the table, uncovering the trumpet in the process. Next, instead of telling iCub to pick up the trumpet, Lallée gestures with his finger, indicating different positions on the table where he has moved it. Much like a real child interacting with its parent, iCub's eyes swivel from side to side, its luminous pink eyebrows and mouth glowing with excitement. "You have moved the trumpet. Found it. You have moved the trumpet. Found it."

"Game over," says Lallée abruptly. iCub tilts its head towards Lallée, fixing him with its large black eyes. If you did not know better you would think iCub was disappointed. "Check on my internal state. That was pretty fun. We keep playing this game."

Part of the Chris project – short for Co-operative Human Robot Interactive Systems – iCub is at the vanguard of a new generation of social robots that is fast changing perceptions of what human-robot interactions will look like in the future. For iCub isn't just any old robot. Standing 93cm tall, it is a fully fledged humanoid "child" robot equipped with sophisticated motor skills and sensory abilities, including vision, hearing, touch, balance and proprioception – the ability to sense the position of its arms and body in space. These faculties enable iCub to crawl on all fours, grasp and manipulate balls and other objects, and turn its head so as to follow gestures or direct its gaze.

Unlike conventional robots familiar from assembly lines, iCub isn't programmed to perform a specific set of actions or tasks. Instead it acquires skills naturally by using its body to explore the world and gather data about its relation to objects and people in that world, much as a two-year-old learns by interacting with his or her environment. Through its ability to direct its gaze, grasp and manipulate objects, and "read" gestures as it co-operates with human tutors on shared tasks like the Toybox game, iCub can learn words and skills and develop co-operative strategies. There are even indications that, given time and practice, iCub may be able to develop more sophisticated cognitive skills, such as the ability to imagine the mental states of others [as explained in the box on the Sally Anne task, below].

"We've got used to seeing robots in the factory but in the 21st century robots will increasingly be living among us," says Dominey, whose work in Lyon is jointly funded by Inserm, the French national medical research agency, and CNRS, France's national scientific research foundation. "These robots must be able to take our perspective and co-operate with us and, if our plans change, they must be able to adjust their behaviour accordingly. Most important of all, they must be safe."

These days you can hardly open a newspaper or switch on the TV without being confronted with the latest robotic advance. From self-steering vacuum-cleaning robots such as Roomba (£379.95 from John Lewis) to cyborg-style robot suits (such as HAL) and the cruise control in your BMW, suddenly robots are everywhere, invading our offices and homes and, it seems, making increasing demands on our emotional lives.

Take Paro, a plush toy version of a baby harp seal. Paro does little more than coo and wag its head and tail, yet more than 1,000 have been sold since its creation in 2003, making it one of the most popular therapeutic robots. And where Paro leads, other "socibots" are sure to follow. For £55,000, Engineered Arts, a company based in Cornwall, will supply you with Robothespian, a life-sized humanoid, controlled from an interactive touchscreen, that can deliver greetings, sing songs and converse in several languages.

Then there's Simon, an upper-torso humanoid robot with a "socially expressive" head. Developed by the Georgia Institute of Technology, Simon can grasp and release objects, clean up a workspace and swivel its eyes so as to interact with humans in ways that feel emotionally and socially authentic. In tests, using only its cameras as a guide, Simon could tell with close to 80% accuracy whether someone was paying attention or ignoring it.
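
Simon's actual classifier is not described here, but a crude flavour of camera-based attention sensing can be sketched in a few lines of Python with OpenCV: treat a detected frontal face as a rough proxy for "paying attention", since someone looking away rarely registers as a frontal face. The webcam index and the frame budget below are arbitrary assumptions for illustration, not details taken from Simon.

```python
# A minimal sketch of camera-based attention sensing (not Simon's method):
# a frontal face in view is counted as a moment of attention.
import cv2

# Haar cascade for frontal faces, shipped with opencv-python
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)          # default webcam (assumed device index)
attending, total = 0, 0
while total < 300:                 # sample ~300 frames, then stop
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    attending += len(faces) > 0    # at least one frontal face => "attending"
    total += 1
cap.release()
print(f"attention detected in {attending}/{total} frames")
```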

Now, in a development straight out of a Ridley Scott film, the Japanese have sent the first talking humanoid robot into outer space. A joint project between Tokyo University and the carmaker Toyota, Kirobo – from the Japanese words for "hope" and "robot" – is designed to provide companionship for astronaut Koichi Wakata when he journeys to the International Space Station this year.

These developments make some people uneasy. Ever since Isaac Asimov published I, Robot in 1950, writers and philosophers have been warning us about the dangers of becoming over-reliant on robots. At the same time, as populations age and it becomes increasingly expensive to provide 24/7 care to the elderly, scientists and commercial companies are convinced that large rewards await the first team that succeeds in engineering a fully autonomous humanoid.

"Assisted living is going to be a big industry, and whoever cracks the technology first will be able to export it to everyone else," says Chris Melhuish, director of the Bristol Robotics Laboratory (BRL), the largest academic centre for multidisciplinary robotics research in the UK. "If we don't invest in social robotics there's a risk we will be left behind."

Housed in a vast, hangar-like building on the edge of Bristol, the BRL, a joint venture between Bristol University and the University of the West of England, is one of more than 20 labs participating in the European Chris consortium. Inside, researchers from around the world tinker with circuit boards and wires alongside robotic arms and ghostly looking androids draped in plastic sheets. There are robots that can manipulate packages and read food labels; plastic heads that gurn and gurgle when you sit in front of them; and tiny "swarm" robots that can combine spontaneously to solve tasks. BRL even has a robot called Bert 2 that can help someone assemble a four-legged table simply by interpreting gestures and responding to verbal commands. It has yet to master an Ikea flat pack, however. "At the moment it's pretty weak what we can do, but these are the first steps in the right direction," says Melhuish. "Human-robot co-operation is not an impossible dream."

To understand how iCub may be bringing that dream a step closer, it is necessary to know something of the history of social robotics and how iCub represents a significant methodological and technological advance. In the past it was thought that the solution to more lifelike robots lay in ever more complex algorithms and codes. The problem is that it is not easy to write a computer code to enable a robot to distinguish a cup from a saucer, or walk around an obstruction, or any of the other myriad daily tasks humans take for granted. But what if, through sophisticated motors and gears, robots could be given the equivalent of muscles? And what if they could also be given touch-sensitive fingers and other sensory equipment that would allow them to explore the world and process information via their bodies?

This was the "embodied intelligence" approach pioneered at the Massachusetts Institute of Technology's computer science and artificial intelligence laboratory in the 1990s by Rodney Brooks, a roboticist and entrepreneur who went on to found the company Rethink Robotics. Brooks's first effort was a stationary robot named Cog with arms that spanned 6.5ft when extended. Brooks gave Cog motors and oscillators so that it could grasp and weigh objects, and microphones for ears. It also had basic speech recognition software and an artificial voicebox. Equipped with only these basic faculties, Cog learnt to manipulate a cube and a Slinky toy through repetitive interactions with students. Cog could also find and name objects it had never seen before. But perhaps the most interesting development was how people responded to it: although Cog had no face, and a lens where its eyeballs should have been, students treated it as if it were human.

Yet Cog was completely lacking in social skills. That began to change with the next robot to emerge from Brooks's lab: Kismet. Designed by Brooks's graduate student Cynthia Breazeal, now director of the personal robots group at MIT, Kismet was primed with the same basic motivations as a six-month-old child, with built-in drives for stimulation, social interaction and periodic rest. Breazeal also endowed Kismet with abilities such as being able to look for a brightly coloured ball or recognise a person's face, plus a repertoire of facial expressions that changed according to whether Kismet's drives for arousal or rest were being met, and that simulated human moods such as happiness and boredom.

Kismet was a revelation. Even though it consisted of only a hinged metal head with cartoonish "stuck-on" red lips and eyebrows for expressing emotions, its repertoire was so socially appropriate that people were drawn to play with it, and got the sense that Kismet was reciprocating their emotions.

As Breazeal's work on Kismet was coming to an end, a young Italian roboticist, Giorgio Metta, arrived at Brooks's lab to work on his PhD. Metta's focus was not so much on software as mechanics – improving Cog's ability to manipulate objects and interact with humans on simple shared tasks. By 2004 he had finished his PhD and gained enough experience to propose a more ambitious project: building a humanoid robot with a much greater range of motion, using open-source design, allowing other roboticists to bolt on their own engineering and software applications. The proposal caught the eye of the European Union's cognition unit, which agreed to fund the project to the tune of €8.5m – and iCub was born.

Since then, 25 iCubs have been rolled out to collaborating centres across Europe, the US and Japan via the RobotCub platform. Some, such as Dominey's facility in Lyon, are focusing on iCub's ability to interact with humans on shared tasks through language and action; others are more interested in teaching iCub to manipulate objects and acquire new motor skills.

What unites these approaches is the insight that human intelligence develops through interactions with objects in the environment and is shaped profoundly by interactions with other human beings. By 18 months, for instance, a toddler can already understand a gesture asking it to pick up a pen. Young children also seem to be primed to explore their environment and interact and co-operate with their carers long before they acquire language. These social and exploratory drives are built into iCub's operating system. Then it's simply a matter of interacting with iCub and letting its body guide it.

"In the past if you wanted to teach a robot to recognise an object it was very difficult to write a program to that effect," says Metta, who is based at the Italian Institute of Technology in Genoa. "It was a matter of trial and error. Now, with iCub, you can be the teacher and say, 'OK, let's play with this object or do this task together' in a more or less natural way."

For instance, simply through repetitive play, Metta's team have taught iCub to distinguish a stuffed toy octopus from a purple car, despite iCub never having seen the objects before. By grasping iCub's arm and rotating it in a certain way, they can also teach iCub new gestures. These are then recorded in its autobiographical memory, meaning that next time it can make the gesture without being prompted. Similarly, through interacting with its human tutor, iCub can be taught new words and concepts – such as the fact that "left" corresponds to a fixed position on the table in Toybox.
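
To make the record-and-replay idea concrete, here is a minimal, hypothetical sketch in Python. Nothing in it is iCub's real API: the Arm class stands in for a compliant robot arm that a tutor can physically guide, and a plain dictionary plays the part of the autobiographical memory.

```python
# A sketch of "teaching by guiding": record joint angles while a tutor
# moves the arm, store the trajectory under a name, replay it on demand.
import time

class Arm:
    """Hypothetical stand-in for a compliant robot arm: while the tutor
    guides it, read_joints() reports the externally imposed angles;
    during replay, move_joints() drives the joints to commanded angles."""
    def __init__(self):
        self.joints = [0.0, 0.0, 0.0]   # three joint angles, in radians

    def read_joints(self):
        return list(self.joints)

    def move_joints(self, target):
        self.joints = list(target)

memory = {}                             # gesture name -> joint trajectory

def record_gesture(arm, name, duration=2.0, rate_hz=20):
    """Sample joint angles while the tutor physically guides the arm."""
    trajectory = []
    for _ in range(int(duration * rate_hz)):
        trajectory.append(arm.read_joints())
        time.sleep(1.0 / rate_hz)
    memory[name] = trajectory

def replay_gesture(arm, name, rate_hz=20):
    """Re-enact a stored gesture without the tutor's prompting."""
    for pose in memory[name]:
        arm.move_joints(pose)
        time.sleep(1.0 / rate_hz)

arm = Arm()
record_gesture(arm, "wave")   # tutor rotates the arm during these 2 seconds
replay_gesture(arm, "wave")   # later, the gesture is repeated unprompted
```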

At the same time, by confronting us with questions about what it means to be human, iCub, and robotic systems like it, are fast becoming testbeds for theories about the development of similar cognitive functions in humans.

"Until now, robots have been little more than fancy machines," says Dominey. "What makes iCub so exciting is that it allows us to test our theories about the evolution of social cognition and then feed this knowledge back into the design of ever more intelligent machines."

Dominey's next goal as part of the Experimental Functional Android Assistant project is to build on the Toybox and Sally Anne studies to see if iCub can be taught to read more complex mental states and develop higher cognitive functions. "The really new thing would be if we could give iCub a sense of self so it could reflect on itself as an agent acting on the world. Then we would have the beginning of some true notion of intention and agency." Metta, meanwhile, has developed a touch-sensitive skin to enhance iCub's ability to gauge when it is getting too close to an object and is in danger of hitting it – a prerequisite for persuading people that it is safe to interact with robots at close quarters.

With safety in mind, Melhuish and his team are also experimenting with softer, more lightweight materials, in the hope of building a robot with lower body mass.

But what if robots really did become capable of reading our intentions and interacting with us safely? They are being touted as the solution to social isolation and the rising costs of health care, but would we really be prepared to invite them into our homes to look after our parents?

To get an idea of how living with a robot might work in practice, before leaving BRL I asked one of Melhuish's assistants to show me round an assisted living space patrolled by a robot nicknamed Molly. Compared to iCub, Molly – which looks like an upright vacuum cleaner with a bowling ball for a head and comes equipped with a touchscreen display and webcam – is a bit disappointing.

Molly, which is part of the Kompai range manufactured by the French company Robosoft, does not walk or crawl. Instead it trundles about on training wheels. However, it is capable of a surprising range of tasks, including monitoring your vital signs and navigating to wherever in the room you happen to be. Its display also comes with a range of interactive games and tools for testing memory and recall. All this data can be recorded and stored for later study by healthcare professionals. Molly can also transmit images and sound in real time, allowing family members to check on their loved ones remotely.

I had only 10 minutes with Molly but was surprised at how quickly I got used to her jerky movements and electronic voicebox. And although she could hardly be mistaken for a real person, I soon found myself talking to her as if she were.

According to Dan Davies, whose job it is to take Molly into nursing homes in the Bristol area as part of a Europe-wide study into the viability of assisted living with robots, my reaction is fairly typical. "Everywhere we have taken Molly the response has been great," he says. "Even nursing staff have been reassured, saying it would be very useful to have a record of how active someone has been when they are not around to monitor them."

Of course, this is a long way from the scenario painted in films such as Robot and Frank, in which a robot is sent to care for an ageing jewel thief (Frank Langella) only to become his companion and partner in crime. And it is even further from the robotic futures imagined by Dominey and Metta. But isn't that a good thing? After all, wouldn't it be a little dehumanising, a little sad even, if robots were to become a substitute for family and friends?

Not necessarily, says Dominey. "If the alternative is to be in a depersonalised institution, it's probably better for people to stay autonomously in their home with a robot. We already interact with televisions, iPads and iPhones – all sorts of machines – and we don't consider them dehumanising. Robots are no different."
