{"id":2755042,"date":"2022-06-16T13:21:25","date_gmt":"2022-06-16T17:21:25","guid":{"rendered":"https:\/\/www.futurity.org\/?p=2755042"},"modified":"2022-06-16T13:21:25","modified_gmt":"2022-06-16T17:21:25","slug":"robot-human-interaction-motion-algorithms-2755042-2","status":"publish","type":"post","link":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/","title":{"rendered":"Method lets humans help robots ‘see’ to get around stuff"},"content":{"rendered":"

Researchers have come up with a new strategy that allows humans to help robots “see” their environments and carry out tasks.<\/p>\n

Just like us, robots can’t see through walls. Sometimes they need a little help to get where they’re going.<\/p>\n

The strategy, Bayesian Learning IN the Dark (BLIND for short), is a new solution to the long-standing problem of motion planning for robots that work in environments where not everything is clearly visible all the time.<\/p>\n

The algorithm keeps a human in the loop to “augment robot perception and, importantly, prevent the execution of unsafe motion<\/a>,” according to the study.<\/p>\n

To do so, researchers at Rice University combined Bayesian<\/a> inverse reinforcement learning (by which a system learns from continually updated information and experience) with established motion planning techniques to assist robots that have “high degrees of freedom”\u2014that is, a lot of moving parts.<\/p>\n
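
As a rough illustration of the Bayesian ingredient (a toy sketch, not the authors' implementation): the robot can maintain a belief over a handful of candidate trajectories and update it with Bayes' rule each time the human labels a segment. The candidate set and the likelihood numbers below are invented for the example.

```python
# Toy sketch of Bayesian updating from binary human feedback.
# Not the BLIND implementation: candidates and likelihoods are invented.

def update_posterior(prior, likelihoods):
    """One Bayes step: posterior is prior times likelihood, normalized."""
    unnorm = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Belief over three hypothetical candidate trajectories, initially uniform.
belief = [1 / 3, 1 / 3, 1 / 3]

# The human approves a segment shared by candidates 0 and 1 but not 2:
# these are assumed likelihoods of seeing "approve" under each candidate.
belief = update_posterior(belief, [0.9, 0.9, 0.1])

# The human then rejects a segment that only candidate 1 uses.
belief = update_posterior(belief, [0.9, 0.1, 0.5])

# Index of the trajectory the belief now favors.
best = max(range(len(belief)), key=lambda i: belief[i])
```

After just two pieces of feedback the belief concentrates on candidate 0; a planner would then execute, or keep refining, the most probable trajectory.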

To test BLIND, the researchers directed a Fetch<\/a> robot, an articulated arm with seven joints, to grab a small cylinder from one table and move it to another table, maneuvering past a barrier along the way.<\/p>\n

“If you have more joints, instructions to the robot are complicated,” says Carlos Quintero-Pe\u00f1a of the George R. Brown School of Engineering. “If you’re directing a human, you can just say, ‘Lift up your hand.'”<\/p>\n

But a robot’s programmers have to be specific about the movement of each joint at each point in its trajectory, especially when obstacles block the machine’s “view” of its target.<\/p>\n
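
Concretely, a trajectory for a seven-joint arm is a sequence of joint-angle vectors, one angle per joint at each point along the path. A minimal sketch with made-up angles (not the Fetch robot's actual joints or limits):

```python
# Illustration only: a trajectory as a list of 7-joint configurations,
# built by straight-line interpolation in joint space. Angle values invented.

def interpolate(start, goal, steps):
    """Linearly interpolate each joint angle between two configurations."""
    return [
        [s + (g - s) * t / steps for s, g in zip(start, goal)]
        for t in range(steps + 1)
    ]

start = [0.0] * 7                                 # all joints at zero
goal = [0.5, -0.3, 0.8, 0.0, 0.2, -0.1, 0.4]      # target angles (radians)

trajectory = interpolate(start, goal, steps=4)    # 5 configurations total
```

Even this obstacle-free straight line specifies 5 × 7 = 35 joint values; once obstacles force detours, the planner must pick many such intermediate configurations, which is why high-degree-of-freedom motion is so hard to specify by hand.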

Rather than programming a trajectory up front, BLIND inserts a human mid-process to refine the choreographed options\u2014or best guesses\u2014suggested by the robot’s algorithm.<\/p>\n

“BLIND allows us to take information in the human’s head and compute our trajectories in this high-degree-of-freedom space,” Quintero-Pe\u00f1a says.<\/p>\n

“We use a specific way of feedback called critique, basically a binary form of feedback where the human is given labels on pieces of the trajectory,” he says.<\/p>\n

These labels appear as connected green dots that represent possible paths. As BLIND steps from dot to dot, the human approves or rejects each movement to refine the path, avoiding obstacles as efficiently as possible.<\/p>\n
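
The dot-to-dot approval loop can be sketched as below; the waypoints, the human "oracle," and the replanning rule are all stand-ins invented for the example, not BLIND's actual planner, which works in the robot's joint space.

```python
# Hedged sketch of a human-in-the-loop critique loop: step through proposed
# waypoints, and on a rejection ask the planner for an alternative step.

def refine_path(waypoints, approve, replan):
    """Walk the waypoints; replace any rejected step with a replanned one."""
    path = [waypoints[0]]          # the start is taken as given
    i = 1
    while i < len(waypoints):
        step = waypoints[i]
        if approve(path[-1], step):
            path.append(step)      # human approved this movement
            i += 1
        else:
            # Human rejected it: substitute a detour and re-ask.
            waypoints[i] = replan(path[-1], step)
    return path

# Toy 2D example: a straight path that would pass through an obstacle.
obstacle = (2, 0)
proposed = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)]

approve = lambda prev, step: step != obstacle      # reject the collision
replan = lambda prev, bad: (bad[0], bad[1] + 1)    # detour: one unit up

print(refine_path(proposed, approve, replan))
# → [(0, 0), (1, 0), (2, 1), (3, 0), (4, 0)]
```

The binary "I like this / I don't like that" signal is enough to steer the path around the obstacle without the human ever specifying coordinates.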

“It’s an easy interface for people to use, because we can say, ‘I like this’ or ‘I don’t like that,’ and the robot uses this information to plan,” says Constantinos Chamzas of the George R. Brown School of Engineering. Once rewarded with an approved set of movements, the robot can carry out its task, he says.<\/p>\n

“One of the most important things here is that human preferences are hard to describe with a mathematical formula,” Quintero-Pe\u00f1a says. “Our work simplifies human-robot relationships by incorporating human preferences. That’s how I think applications will get the most benefit from this work.”<\/p>\n

“This work wonderfully exemplifies how a little, but targeted, human intervention can significantly enhance the capabilities of robots to execute complex tasks<\/a> in environments where some parts are completely unknown to the robot but known to the human,” says computer scientist Lydia Kavraki, a robotics pioneer whose resume includes advanced programming for NASA’s humanoid Robonaut aboard the International Space Station.<\/p>\n

The researchers presented the work at the Institute of Electrical and Electronics Engineers’ International Conference on Robotics and Automation<\/a>.<\/p>\n

The National Science Foundation supported the research.<\/p>\n

Source: <\/em>Rice University<\/em><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"

Researchers have come up with a new strategy that allows humans to help robots “see” […]<\/p>\n","protected":false},"author":138,"featured_media":2755372,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"video","meta":{"_acf_changed":false,"_exactmetrics_skip_tracking":false,"_exactmetrics_sitenote_active":false,"_exactmetrics_sitenote_note":"","_exactmetrics_sitenote_category":0,"footnotes":""},"categories":[3],"tags":[316,6753],"class_list":["post-2755042","post","type-post","status-publish","format-video","has-post-thumbnail","hentry","category-science-technology","tag-algorithms","tag-robots","post_format-post-format-video","university-rice-university"],"acf":[],"yoast_head":"\nMethod lets humans help robots 'see' to get around stuff - 糖心视频<\/title>\n<meta name=\"description\" content=\"Just like us, robots can't see through walls. A new method helps humans help them "see" where they're going.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Method lets humans help robots 'see' to get around stuff\" \/>\n<meta property=\"og:description\" content=\"Just like us, robots can't see through walls. 
A new method helps humans help them "see" where they're going.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/\" \/>\n<meta property=\"og:site_name\" content=\"糖心视频\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/pages\/糖心视频\/143906865226\" \/>\n<meta property=\"article:published_time\" content=\"2022-06-16T17:21:25+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2022\/06\/robot-human-interaction-vision-1600.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1600\" \/>\n\t<meta property=\"og:image:height\" content=\"915\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Mike Williams-Rice\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@糖心视频News\" \/>\n<meta name=\"twitter:site\" content=\"@糖心视频News\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Mike Williams-Rice\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/\"},\"author\":{\"name\":\"Mike Williams-Rice\",\"@id\":\"https:\/\/www.futurity.org\/#\/schema\/person\/45ce6e32539156f7b57cf4e5c3eaa4c3\"},\"headline\":\"Method lets humans help robots ‘see’ to get around stuff\",\"datePublished\":\"2022-06-16T17:21:25+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/\"},\"wordCount\":569,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/www.futurity.org\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2022\/06\/robot-human-interaction-vision-1600.jpg\",\"keywords\":[\"algorithms\",\"robots\"],\"articleSection\":[\"糖心视频 and Technology\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/\",\"url\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/\",\"name\":\"Method lets humans help robots 'see' to get around stuff - 
糖心视频\",\"isPartOf\":{\"@id\":\"https:\/\/www.futurity.org\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2022\/06\/robot-human-interaction-vision-1600.jpg\",\"datePublished\":\"2022-06-16T17:21:25+00:00\",\"description\":\"Just like us, robots can't see through walls. A new method helps humans help them \\\"see\\\" where they're going.\",\"breadcrumb\":{\"@id\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#primaryimage\",\"url\":\"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2022\/06\/robot-human-interaction-vision-1600.jpg\",\"contentUrl\":\"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2022\/06\/robot-human-interaction-vision-1600.jpg\",\"width\":1600,\"height\":915,\"caption\":\"The task set for this Fetch robot by Rice University computer scientists is made easier by their BLIND software, which allows for human intervention when the robot\u2019s path is blocked by an obstacle. Keeping a human in the loop augments robot perception and prevents the execution of unsafe motion, according to the researchers. 
(Credit: Kavraki Lab\/Rice)\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.futurity.org\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Method lets humans help robots ‘see’ to get around stuff\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.futurity.org\/#website\",\"url\":\"https:\/\/www.futurity.org\/\",\"name\":\"糖心视频\",\"description\":\"Research news from top universities.\",\"publisher\":{\"@id\":\"https:\/\/www.futurity.org\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.futurity.org\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.futurity.org\/#organization\",\"name\":\"糖心视频\",\"url\":\"https:\/\/www.futurity.org\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.futurity.org\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2019\/05\/logo-2018.png\",\"contentUrl\":\"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2019\/05\/logo-2018.png\",\"width\":332,\"height\":105,\"caption\":\"糖心视频\"},\"image\":{\"@id\":\"https:\/\/www.futurity.org\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/pages\/糖心视频\/143906865226\",\"https:\/\/x.com\/糖心视频News\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.futurity.org\/#\/schema\/person\/45ce6e32539156f7b57cf4e5c3eaa4c3\",\"name\":\"Mike 
Williams-Rice\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.futurity.org\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/7da17c289104a78fb47a61ccf2b3c96b?s=96&d=mm&r=pg\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/7da17c289104a78fb47a61ccf2b3c96b?s=96&d=mm&r=pg\",\"caption\":\"Mike Williams-Rice\"},\"url\":\"https:\/\/www.futurity.org\/author\/rice-williams\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Method lets humans help robots 'see' to get around stuff - 糖心视频","description":"Just like us, robots can't see through walls. A new method helps humans help them \"see\" where they're going.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/","og_locale":"en_US","og_type":"article","og_title":"Method lets humans help robots 'see' to get around stuff","og_description":"Just like us, robots can't see through walls. A new method helps humans help them \"see\" where they're going.","og_url":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/","og_site_name":"糖心视频","article_publisher":"https:\/\/www.facebook.com\/pages\/糖心视频\/143906865226","article_published_time":"2022-06-16T17:21:25+00:00","og_image":[{"width":1600,"height":915,"url":"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2022\/06\/robot-human-interaction-vision-1600.jpg","type":"image\/jpeg"}],"author":"Mike Williams-Rice","twitter_card":"summary_large_image","twitter_creator":"@糖心视频News","twitter_site":"@糖心视频News","twitter_misc":{"Written by":"Mike Williams-Rice","Est. 
reading time":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#article","isPartOf":{"@id":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/"},"author":{"name":"Mike Williams-Rice","@id":"https:\/\/www.futurity.org\/#\/schema\/person\/45ce6e32539156f7b57cf4e5c3eaa4c3"},"headline":"Method lets humans help robots ‘see’ to get around stuff","datePublished":"2022-06-16T17:21:25+00:00","mainEntityOfPage":{"@id":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/"},"wordCount":569,"commentCount":0,"publisher":{"@id":"https:\/\/www.futurity.org\/#organization"},"image":{"@id":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#primaryimage"},"thumbnailUrl":"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2022\/06\/robot-human-interaction-vision-1600.jpg","keywords":["algorithms","robots"],"articleSection":["糖心视频 and Technology"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/","url":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/","name":"Method lets humans help robots 'see' to get around stuff - 糖心视频","isPartOf":{"@id":"https:\/\/www.futurity.org\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#primaryimage"},"image":{"@id":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#primaryimage"},"thumbnailUrl":"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2022\/06\/robot-human-interaction-vision-1600.jpg","datePublished":"2022-06-16T17:21:25+00:00","description":"Just 
like us, robots can't see through walls. A new method helps humans help them \"see\" where they're going.","breadcrumb":{"@id":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#primaryimage","url":"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2022\/06\/robot-human-interaction-vision-1600.jpg","contentUrl":"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2022\/06\/robot-human-interaction-vision-1600.jpg","width":1600,"height":915,"caption":"The task set for this Fetch robot by Rice University computer scientists is made easier by their BLIND software, which allows for human intervention when the robot\u2019s path is blocked by an obstacle. Keeping a human in the loop augments robot perception and prevents the execution of unsafe motion, according to the researchers. 
(Credit: Kavraki Lab\/Rice)"},{"@type":"BreadcrumbList","@id":"https:\/\/www.futurity.org\/robot-human-interaction-motion-algorithms-2755042-2\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.futurity.org\/"},{"@type":"ListItem","position":2,"name":"Method lets humans help robots ‘see’ to get around stuff"}]},{"@type":"WebSite","@id":"https:\/\/www.futurity.org\/#website","url":"https:\/\/www.futurity.org\/","name":"糖心视频","description":"Research news from top universities.","publisher":{"@id":"https:\/\/www.futurity.org\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.futurity.org\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.futurity.org\/#organization","name":"糖心视频","url":"https:\/\/www.futurity.org\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.futurity.org\/#\/schema\/logo\/image\/","url":"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2019\/05\/logo-2018.png","contentUrl":"https:\/\/www.futurity.org\/wp\/wp-content\/uploads\/2019\/05\/logo-2018.png","width":332,"height":105,"caption":"糖心视频"},"image":{"@id":"https:\/\/www.futurity.org\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/pages\/糖心视频\/143906865226","https:\/\/x.com\/糖心视频News"]},{"@type":"Person","@id":"https:\/\/www.futurity.org\/#\/schema\/person\/45ce6e32539156f7b57cf4e5c3eaa4c3","name":"Mike Williams-Rice","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.futurity.org\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/7da17c289104a78fb47a61ccf2b3c96b?s=96&d=mm&r=pg","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/7da17c289104a78fb47a61ccf2b3c96b?s=96&d=mm&r=pg","caption":"Mike 
Williams-Rice"},"url":"https:\/\/www.futurity.org\/author\/rice-williams\/"}]}},"_links":{"self":[{"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/posts\/2755042","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/users\/138"}],"replies":[{"embeddable":true,"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/comments?post=2755042"}],"version-history":[{"count":7,"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/posts\/2755042\/revisions"}],"predecessor-version":[{"id":2755392,"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/posts\/2755042\/revisions\/2755392"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/media\/2755372"}],"wp:attachment":[{"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/media?parent=2755042"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/categories?post=2755042"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.futurity.org\/wp-json\/wp\/v2\/tags?post=2755042"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}