
= Feelix Growing =

media type="youtube" key="96_h79ffiJA?fs=1" height="505" width="640"

FEELIX GROWING takes a highly interdisciplinary approach, combining theories, methods, and technology from developmental and comparative psychology, neuroimaging, ethology, and autonomous and developmental robotics, to investigate how socially situated development can be brought to robots that grow up and adapt to humans in everyday environments. We expect the project to have a significant impact on the scientific community on two grounds. On the one hand, our research focus poses an important and as-yet largely unexplored scientific question that is increasingly recognized as a keystone in the development of human-oriented social technology and in the understanding of humans, and that can contribute to the advancement of entertainment, developmental, service, and rehabilitation robotics. On the other hand, our strongly interdisciplinary effort could make important contributions to a number of disciplines and lay the groundwork for long-term collaborations among them.
**Project summary**

If robots are to be truly integrated into humans' everyday environments in order to provide services such as company, caregiving, entertainment, patient monitoring, and aids in therapy, they cannot simply be designed, taken off the shelf, and directly embedded into a real-life setting. Adaptation to incompletely known and changing environments, and personalization to their human users and partners, are necessary features for successful long-term integration. This integration requires that, like children (but on a shorter time-scale), robots develop embedded in the social environment in which they will fulfil their roles. The overall goal of this project is the interdisciplinary investigation of socially situated development from an integrated or global perspective, as a key paradigm towards achieving robots that interact with humans in their everyday environments in a rich, flexible, autonomous, and user-centred way. To achieve this general goal we set the following specific objectives:
 1) Identification of scenarios presenting key issues and typologies of problems in the investigation of global socially situated development of autonomous (biological and robotic) agents.
 2) Investigation of the roles of emotion, interaction, and expression, and their interplay, in bootstrapping and driving socially situated development; this includes implementation of robotic systems that improve on existing work in each of those aspects, and their testing in the identified key scenarios.
 3) Integration of (a) the above capabilities in at least two different robotic systems, and (b) feedback across the disciplines involved.
 4) Identification of needs and key steps towards achieving standards in: (a) the design of scenarios and problem typologies, (b) evaluation metrics, and (c) the design of robotic platforms and related technology that can realistically be integrated into people's everyday lives.






= Emotion robots learn from people =


**Making robots that interact with people emotionally is the goal of a European project led by British scientists.**

Feelix Growing is a research project involving six countries and 25 roboticists, developmental psychologists and neuroscientists. Co-ordinator Dr Lola Canamero said the aim was to build robots that "learn from humans and respond in a socially and emotionally appropriate manner". The 2.3m-euro scheme will last for three years. "The human emotional world is very complex but we respond to simple cues, things we don't notice or don't pay attention to, such as how someone moves," said Dr Canamero, who is based at the University of Hertfordshire.


**Sensory input**

The project involves building a series of robots that can take sensory input from the humans they are interacting with and then adapt their behaviour accordingly. Dr Canamero likens the robots to babies, which learn their behaviour from the patterns of movement and emotional states of the world around them.

The robots themselves are simple machines - in some cases off-the-shelf machines. The most interesting aspect of the project is the software. Dr Canamero said: "We will use very simple robots as the hardware, and for some of the machines we will build expressive heads ourselves. We are most interested in programming and developing behavioural capabilities, particularly in social and emotional interactions with humans."

The robots will learn from the feedback they receive from humans. "It's mostly behavioural and contact feedback: tactile feedback and emotional feedback through positive reinforcement, such as kind words, nice behaviour or helping the robot do something if it is stuck." The university's partners are building different robots focusing on different emotional interactions.
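The idea of shaping behaviour through positive reinforcement from humans can be illustrated with a toy sketch. Everything here is invented for illustration - the class name, the behaviour labels, and the numeric reward values are assumptions, not the project's actual software:

```python
class FeedbackLearner:
    """Toy sketch of learning from human feedback (hypothetical, not the
    project's real system). Each behaviour accumulates a value estimate;
    positive feedback (kind words, helpful contact) raises it, negative
    feedback lowers it."""

    def __init__(self, behaviours, learning_rate=0.2):
        self.values = {b: 0.0 for b in behaviours}
        self.lr = learning_rate

    def choose(self):
        # Prefer the behaviour with the highest learned value.
        return max(self.values, key=self.values.get)

    def update(self, behaviour, reward):
        # Move the value estimate toward the observed reward signal.
        self.values[behaviour] += self.lr * (reward - self.values[behaviour])

learner = FeedbackLearner(["approach", "retreat", "wave"])
learner.update("wave", 1.0)      # human responds with kind words
learner.update("retreat", -0.5)  # human seems displeased
print(learner.choose())          # prints "wave"
```

The exponential-moving-average update keeps recent feedback influential, which loosely mirrors the article's point that the robots adapt continuously to the people around them.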


**'Detect expressions'**

The robots will get feedback from simple vision cameras, audio, contact sensors, and sensors that can work out the distance between the machine and the humans. "One of the things we are going to use to detect expressions in faces and patterns in motion is an (artificial) neural network." Artificial neural networks are being used because they are very good at adapting to changing inputs - in this case, detecting patterns in behaviour, voice, movement and so on. "Neural networks learn patterns from examples of observation," said Dr Canamero.

One of the things the robots will be learning from is human movement. "Motion tells you a lot about your emotional state. The physical proximity between human and robot, and the frequency of human contact - through those things we hope to detect the emotional states we need." The robots will not be trying to detect emotional states such as disgust, but rather will focus on states such as anger, happiness and loneliness - emotions which affect how the robot should behave.
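"Learning patterns from examples" can be shown with the simplest possible neural unit, a perceptron. This is a minimal sketch, not the project's actual network: the feature values (movement speed, contact frequency) and the "agitated"/"calm" labels are invented stand-ins for the cues described above:

```python
def train_perceptron(examples, epochs=20, lr=0.1):
    """Tiny perceptron: learns a linear decision boundary from labelled
    (features, label) examples, with label in {0, 1}."""
    n = len(examples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in examples:
            # Predict, then nudge weights in proportion to the error.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Hypothetical cues: (movement speed, contact frequency); 1 = "agitated", 0 = "calm"
data = [((0.9, 0.1), 1), ((0.8, 0.2), 1), ((0.1, 0.9), 0), ((0.2, 0.8), 0)]
w, b = train_perceptron(data)
print(classify(w, b, (0.95, 0.05)))  # prints 1 ("agitated")
```

Real networks for face and motion analysis are far larger, but the principle is the same one Dr Canamero describes: the weights are adjusted from observed examples rather than hand-coded rules.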


**'Imprinted behaviour'**

"It is very important to detect when the human user is angry and the robot has done something wrong, or if the human is lonely and the robot needs to cheer him or her up. We are focusing on emotions relevant to a baby robot that has to grow up and help humans with everyday life."

One of the first robots built in the project exhibits imprinted behaviour, which is found among birds and some mammals at birth. "They get attached to the first object they see when born. It is usually the mother, and that's what makes them follow the mother around. We have a prototype of a robot that follows people around and can adapt to the way humans interact with it. It follows closer or further away depending on how the human feels about it."

Dr Canamero says robots that can adapt to people's behaviours are needed if the machines are to play a part in human society. At the end of the project, two robots will be built which integrate the different aspects of the machines being developed across Europe. The other partners in the project are the Centre National de la Recherche Scientifique; Universite de Cergy Pontoise; Ecole Polytechnique Federale de Lausanne; the University of Portsmouth; the Institute of Communication and Computer Systems, Greece; Entertainment Robotics, Denmark; and SAS Aldebaran Robotics, France.
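The follower prototype's distance adaptation - "closer or further away depending on how the human feels about it" - can be sketched as a preferred following distance that shifts with a comfort signal. The class, the comfort convention, and all thresholds below are invented for illustration, not taken from the prototype:

```python
class FollowerRobot:
    """Sketch of the 'imprinted' follower (hypothetical): it maintains a
    preferred distance to the person it attaches to, and adapts that
    distance from the person's reactions. All limits are illustrative."""

    def __init__(self, preferred_distance=1.0, step=0.1):
        self.preferred = preferred_distance  # metres
        self.step = step

    def react_to_human(self, comfort):
        # comfort > 0: human welcomes closeness; comfort < 0: human backs away.
        if comfort > 0:
            self.preferred = max(0.3, self.preferred - self.step)  # follow closer
        elif comfort < 0:
            self.preferred = min(3.0, self.preferred + self.step)  # keep more distance

    def move(self, current_distance):
        # Close the gap toward the preferred following distance.
        if current_distance > self.preferred:
            return "approach"
        if current_distance < self.preferred:
            return "back off"
        return "hold"

robot = FollowerRobot()
robot.react_to_human(-1)   # human steps back: give more space
print(robot.preferred)     # prints 1.1
print(robot.move(2.0))     # prints "approach"
```

Clamping the preferred distance to a fixed range keeps the sketch stable: the robot never crowds the person or drifts out of interaction range, however the comfort signal fluctuates.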


Francesco and Shubham