[{"command":"openDialog","selector":"#drupal-modal","settings":null,"data":"\u003Cdiv id=\u0022republish_modal_form\u0022\u003E\u003Cform class=\u0022modal-form-example-modal-form ecl-form\u0022 data-drupal-selector=\u0022modal-form-example-modal-form\u0022 action=\u0022\/en\/article\/modal\/7223\u0022 method=\u0022post\u0022 id=\u0022modal-form-example-modal-form\u0022 accept-charset=\u0022UTF-8\u0022\u003E\u003Cp\u003EHorizon articles can be republished for free under the Creative Commons Attribution 4.0 International (CC BY 4.0) licence.\u003C\/p\u003E\n \u003Cp\u003EYou must give appropriate credit. We ask you to do this by:\u003Cbr \/\u003E\n 1) Using the original journalist\u0027s byline\u003Cbr \/\u003E\n 2) Linking back to our original story\u003Cbr \/\u003E\n 3) Using the following text in the footer: This article was originally published in \u003Ca href=\u0027#\u0027\u003EHorizon, the EU Research and Innovation magazine\u003C\/a\u003E\u003C\/p\u003E\n \u003Cp\u003ESee our full republication guidelines \u003Ca href=\u0027\/horizon-magazine\/republish-our-stories\u0027\u003Ehere\u003C\/a\u003E\u003C\/p\u003E\n \u003Cp\u003EHTML for this article, including the attribution and page view counter, is below:\u003C\/p\u003E\u003Cdiv class=\u0022js-form-item form-item js-form-type-textarea form-item-body-content js-form-item-body-content ecl-form-group ecl-form-group--text-area form-no-label ecl-u-mv-m\u0022\u003E\n \n\u003Cdiv\u003E\n \u003Ctextarea data-drupal-selector=\u0022edit-body-content\u0022 aria-describedby=\u0022edit-body-content--description\u0022 id=\u0022edit-body-content\u0022 name=\u0022body_content\u0022 rows=\u00225\u0022 cols=\u002260\u0022 class=\u0022form-textarea ecl-text-area\u0022\u003E\u003Ch2\u003EWhy robots are being trained in self-awareness\u003C\/h2\u003E\u003Cp\u003EIn 2016, for the first time ever, the number of\u003Ca 
href=\u0022https:\/\/www.technative.io\/non-industrial-robots-overtake-industrial-robots-market-size-first-time\/\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003E robots in homes, the military, shops and hospitals\u003C\/a\u003E surpassed that used in industry. Instead of being concentrated in factories, robots are a growing presence in people\u2019s homes and lives \u2013 a trend that is likely going to increase as they become more sophisticated and \u2018sentient\u2019.\u003C\/p\u003E\u003Cp\u003E\u2018If we take out the robot from a factory and into a house, we want safety,\u2019 said \u003Ca href=\u0022https:\/\/www.ru.nl\/english\/people\/lanillos-p\/\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003EDr Pablo Lanillos\u003C\/a\u003E, an assistant professor at Radboud University in the Netherlands.\u003C\/p\u003E\u003Cp\u003EAnd for machines to safely interact with people, they need to be more like humans, experts like Dr Lanillos say.\u003C\/p\u003E\u003Cp\u003EHe has designed an algorithm that enables robots to recognise themselves, in a similar way to humans.\u003C\/p\u003E\u003Cp\u003EA major distinction between humans and robots is that our senses are faulty, feeding misleading information into our brains. \u2018We have really imprecise proprioception (awareness of our body\u2019s position and movement). For example, our muscles have sensors that are not precise versus robots, which have very precise sensors,\u2019 he said.\u003C\/p\u003E\u003Cp\u003EThe human brain takes this imprecise information to guide our movements and understanding of the world.\u003C\/p\u003E\u003Cp\u003ERobots are not used to dealing with uncertainty in the same way.\u003C\/p\u003E\u003Cp\u003E\u2018In real situations, there are errors, differences between the world and the model of the world that the robot has,\u2019 Dr Lanillos said. 
\u2018The problem we have in robots is that when you change any condition, the robot starts to fail.\u2019\u003C\/p\u003E\u003Cp\u003EAt age two, humans can tell the difference between their bodies and other objects in the world. But this computation that a two-year-old human brain can do is very complicated for a machine and makes it difficult for them to navigate the world.\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ERecognise\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EThe algorithm that Dr Lanillos and colleagues developed in a project called \u003Ca href=\u0022https:\/\/cordis.europa.eu\/project\/rcn\/209516\/results\/en\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003ESELFCEPTION\u003C\/a\u003E, enables three different robots to distinguish their \u2018bodies\u2019 from other objects.\u003C\/p\u003E\u003Cp\u003ETheir test robots included one composed of arms covered with tactile skin, another with known sensory inaccuracies, and a commercial model. They wanted to see how the robots would respond, given their different ways of collecting \u2018sensory\u2019 information.\u003C\/p\u003E\u003Cp\u003EOne test the algorithm-aided robots passed was the\u003Ca href=\u0022https:\/\/www.jove.com\/science-education\/10291\/the-rubber-hand-illusion\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003E\u0026nbsp;rubber hand illusion\u003C\/a\u003E, originally used on humans. \u2018We put a plastic hand in front of you, cover your real hand, and then start to stimulate your covered hand and the fake hand that you can see,\u2019 Dr Lanillos said.\u003C\/p\u003E\u003Cp\u003EWithin minutes, people begin to think that the fake hand is their hand.\u003C\/p\u003E\u003Cp\u003EThe goal was to deceive a robot with the same illusion that confuses humans. This is a measure of how well multiple sensors are integrated and how the robot is able to adapt to situations. 
Dr Lanillos and his colleagues made a robot experience the fake hand as its hand, similar to the way a human brain would.\u003C\/p\u003E\u003Cp\u003EThe second test was the \u003Ca href=\u0022https:\/\/www.pnas.org\/content\/114\/13\/3281\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003Emirror test\u003C\/a\u003E, which was originally proposed by primatologists. In this exercise, a red dot is put on an animal or person\u2019s forehead, then they look at themselves in a mirror. Humans, and some animal subjects like monkeys, try to rub the red dot off of their face rather than off the mirror.\u003C\/p\u003E\u003Cp\u003E\u003Cblockquote class=\u0022tw-text-center tw-text-blue tw-font-bold tw-text-2xl lg:tw-w-1\/2 tw-border-2 tw-border-blue tw-p-12 tw-my-8 lg:tw-m-12 lg:tw--ml-16 tw-float-left\u0022\u003E\n \u003Cspan class=\u0022tw-text-5xl tw-rotate-180\u0022\u003E\u201c\u003C\/span\u003E\n \u003Cp class=\u0022tw-font-serif tw-italic\u0022\u003E\u2018In real situations, there are errors, differences between the world and the model of the world that the robot has.\u2019\u003C\/p\u003E\n \u003Cfooter\u003E\n \u003Ccite class=\u0022tw-not-italic tw-font-normal tw-text-sm tw-text-black\u0022\u003EDr Pablo Lanillos, Radboud University, the Netherlands \u003C\/cite\u003E\n \u003C\/footer\u003E\n\u003C\/blockquote\u003E\n\u003C\/p\u003E\u003Cp\u003EThe test is a way to determine how self-aware an animal or person is. Human children are usually able to pass the test by their second birthday.\u003C\/p\u003E\u003Cp\u003EThe team trained a robot to \u2018recognise\u2019 itself in the mirror by connecting the movement of limbs in the reflection with its own limbs. Now they are trying to get a robot to rub off the red dot.\u003C\/p\u003E\u003Cp\u003EThe next step in this research is to integrate more sensors in the robot \u2013 and increase the information it computes \u2013 to improve its perception of the world. 
A human has about 130 million receptors in their retina alone, and 3,000 touch receptors in each fingertip, says Dr Lanillos. Dealing with large quantities of data is one of the crucial challenges in robotics. \u2018Solving how to combine all this information in a meaningful way will improve body awareness and world understanding,\u2019 he said.\u003C\/p\u003E\u003Cp\u003EImproving the way robots perceive time can also help them operate in a more human way, allowing them to integrate more easily into people\u2019s lives. This is particularly important for assistance robots, which will interact with people and have to co-operate with them to achieve tasks. \u003Ca href=\u0022https:\/\/qz.com\/1367213\/robots-could-save-the-world-from-its-aging-problem\/\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003EThese include service robots which have been suggested as a way to help care for the elderly\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003E\u2018(Humans\u2019) behaviour, our interaction with the world, depends on our perception of time,\u2019 said \u003Ca href=\u0022https:\/\/www.anilseth.com\/\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003EAnil Seth\u003C\/a\u003E, co-director of the Sackler Centre for Consciousness Science at the University of Sussex, UK. \u2018Having a good sense of time is important for any complex behaviour.\u2019\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ESense of time\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EProf. Seth collaborated on a project called \u003Ca href=\u0022https:\/\/cordis.europa.eu\/project\/rcn\/193781\/en\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003ETimeStorm\u003C\/a\u003E which examined how humans perceive time, and how to use this knowledge to give machines a sense of time, too.\u003C\/p\u003E\u003Cp\u003EInserting a clock into a robot would not give them temporal awareness, according to Prof. Seth. 
\u2018Humans \u2013 or animals \u2013 don\u2019t perceive time by having a clock in our heads,\u2019 he said. There are biases and distortions to how humans perceive time, he says.\u003C\/p\u003E\u003Cp\u003E\u003Ca href=\u0022http:\/\/www.warrickroseboom.com\/\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003EWarrick Roseboom\u003C\/a\u003E, a cognitive scientist also at the University of Sussex who spearheaded the university\u2019s TimeStorm efforts, created a series of experiments to quantify how people experienced the passage of time.\u003C\/p\u003E\u003Cp\u003E\u2018We asked humans to watch different videos of a few seconds up to about a minute and tell us how long they thought the video was,\u2019 Roseboom said. The videos were first-person perspectives of everyday tasks, such as walking around campus or sitting in a cafe. Subjects experienced time differently from the actual duration, depending on how busy the scene was.\u003C\/p\u003E\u003Cp\u003EUsing this information, the researchers built a\u0026nbsp;system based on deep learning that could mimic the human subjects\u2019 perception of the video durations. \u2018It worked really well,\u2019 said Prof. Seth. 
\u2018And we were able to predict quite accurately how humans would perceive duration in our system.\u2019\u003C\/p\u003E\u003Cp\u003EA major focus of the project was to investigate and demonstrate machines and humans working alongside each other with the same expectations of time.\u003C\/p\u003E\u003Cp\u003EThe\u0026nbsp;researchers were able to do this by demonstrating \u003Ca href=\u0022https:\/\/cordis.europa.eu\/article\/id\/123799-robots-in-search-of-lost-time-\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003Erobots assisting in meal preparation\u003C\/a\u003E, \u003Ca href=\u0022http:\/\/timestorm.eu\/repository\/videos\/\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003Esuch as serving food\u003C\/a\u003E according to people\u2019s preferences, something which requires an understanding of human time perception, planning and remembering what has already been done.\u003C\/p\u003E\u003Cp\u003ETimeStorm\u2019s follow-up project, \u003Ca href=\u0022http:\/\/www.entiment.eu\/#section_objectives\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003EEntiment\u003C\/a\u003E, created software that companies can use to programme robots with a sense of time for applications such as \u003Ca href=\u0022http:\/\/www.entiment.eu\/#section_repository\u0022 target=\u0022_blank\u0022 rel=\u0022noopener noreferrer\u0022\u003Emeal preparation and\u003C\/a\u003E\u003Ca href=\u0022http:\/\/www.entiment.eu\/#section_repository\u0022\u003E wiping down tables\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003EIn the last 10 years, the field of robot awareness has made significant progress, Dr Lanillos says, and the\u0026nbsp;next decade will see even more advances, with robots becoming increasingly self-aware.\u003C\/p\u003E\u003Cp\u003E\u2018I\u2019m not saying that the robot will be as aware as a human is aware, in a reflexive way, but it will be able to adapt its body to the 
world.\u2019\u003C\/p\u003E\u003Cp\u003E\u003Cem\u003EThe research in this article was funded by the EU. If you liked this article, please consider sharing it on social media.\u003C\/em\u003E\u003C\/p\u003E\u003C\/textarea\u003E\n\u003C\/div\u003E\n\n \u003Cdiv id=\u0022edit-body-content--description\u0022 class=\u0022ecl-help-block description\u0022\u003E\n Please copy the above code and embed it onto your website to republish.\n \u003C\/div\u003E\n \u003C\/div\u003E\n\u003Cinput autocomplete=\u0022off\u0022 data-drupal-selector=\u0022form-ut-kdtverxzl51nwd01hcfaybix3nct60zrwmlyuize\u0022 type=\u0022hidden\u0022 name=\u0022form_build_id\u0022 value=\u0022form-uT-kDtveRxZL51nWd01hCfAYBIx3NCt60ZRwMlyuIzE\u0022 \/\u003E\n\u003Cinput data-drupal-selector=\u0022edit-modal-form-example-modal-form\u0022 type=\u0022hidden\u0022 name=\u0022form_id\u0022 value=\u0022modal_form_example_modal_form\u0022 \/\u003E\n\u003C\/form\u003E\n\u003C\/div\u003E","dialogOptions":{"width":"800","modal":true,"title":"Republish this content"}}]
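The rubber hand illusion described above is commonly explained by precision-weighted multisensory integration: each cue is trusted in proportion to its precision, so a sharp visual cue can pull the body estimate away from imprecise proprioception. A minimal sketch of that idea in Python – the `fuse` helper and all numbers are invented for illustration and are not taken from the SELFCEPTION code:

```python
# Illustrative sketch only: maximum-likelihood fusion of two Gaussian cues,
# a textbook model of multisensory integration. Positions are in cm.

def fuse(x_vision, var_vision, x_proprio, var_proprio):
    """Combine two cues, weighting each by its precision (inverse variance)."""
    w_v = 1.0 / var_vision
    w_p = 1.0 / var_proprio
    return (w_v * x_vision + w_p * x_proprio) / (w_v + w_p)

# Proprioception says the hand is at 0 cm; vision shows a (fake) hand at 15 cm.
# Because proprioception is imprecise (large variance) and vision is sharp,
# the fused estimate is pulled strongly towards the seen hand.
estimate = fuse(x_vision=15.0, var_vision=1.0, x_proprio=0.0, var_proprio=9.0)
print(round(estimate, 2))  # 13.5 – the body estimate drifts towards the fake hand
```

With equal variances the estimate would sit halfway between the two cues; the drift towards the seen hand falls out of the weighting alone.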
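One simple way a robot could ‘connect the movement of limbs in the reflection with its own limbs’, as in the mirror test above, is to check whether the motion it observes correlates with the motion it has just commanded. This is a hypothetical sketch of that principle, not the team’s actual algorithm:

```python
# Illustrative sketch only: tag an observed figure as "self" when its motion
# correlates strongly with the robot's own commanded limb velocities.
import random

def correlation(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

random.seed(0)
commands = [random.uniform(-1, 1) for _ in range(200)]   # self-generated limb velocities
mirror   = [c + random.gauss(0, 0.2) for c in commands]  # reflection tracks the commands
stranger = [random.uniform(-1, 1) for _ in range(200)]   # another agent moves independently

def is_self(observed):
    return correlation(commands, observed) > 0.8

print(is_self(mirror), is_self(stranger))  # True False
```

The reflection follows the commands (up to sensor noise), so its motion correlates almost perfectly with them, while an independently moving agent does not.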
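The duration finding above – that busier scenes change the sense of elapsed time – suggests estimating duration from accumulated salient change rather than from a clock. A hedged sketch of that idea; the `perceived_duration` helper, its threshold, and the event-to-seconds scaling are all invented for illustration, not the researchers’ deep-learning model:

```python
# Illustrative sketch only: count salient frame-to-frame changes and map the
# event count to an estimated duration. Real versions of this idea track
# changes in deep-network activations; here a "frame" is a plain feature vector.

def perceived_duration(frames, threshold=0.5, seconds_per_event=0.4):
    """Accumulate frame-to-frame changes above a salience threshold,
    then convert the event count into an estimated duration in seconds."""
    events = 0
    for prev, cur in zip(frames, frames[1:]):
        change = sum(abs(a - b) for a, b in zip(prev, cur))
        if change > threshold:
            events += 1
    return events * seconds_per_event

quiet_scene = [[0.0, 0.0]] * 50                          # near-static: little change
busy_scene  = [[i % 2, (i + 1) % 2] for i in range(50)]  # constant flicker

# Both clips have 50 frames, but the busy clip accumulates far more "events"
# and so "feels" much longer than the quiet one.
print(perceived_duration(quiet_scene), perceived_duration(busy_scene))
```

Two clips of identical clock length get very different estimates, mirroring how the human subjects’ reports depended on how busy the scene was.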