Research News /program/robotics/ en Tiny insects could lead to big changes in robot design /program/robotics/2025/02/24/tiny-insects-could-lead-big-changes-robot-design <span>Tiny insects could lead to big changes in robot design</span> <span><span>Jeff Zehnder</span></span> <span><time datetime="2025-02-24T10:10:49-07:00" title="Monday, February 24, 2025 - 10:10">Mon, 02/24/2025 - 10:10</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/2025-02/AdobeStock_112865445.jpeg?h=cae4551a&amp;itok=LnuC8Y5D" width="1200" height="800" alt="Closeup of a fly on wood."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/12" hreflang="en">Sean Humbert News</a> </div> <a href="/program/robotics/jeff-zehnder">Jeff Zehnder</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div> <div class="align-right image_style-wide_image_style"> <div class="imageMediaStyle wide_image_style"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/wide_image_style/public/2025-02/Robotics_SystemsDesign_20231115_JMP_027.JPG?h=fbf5a9c6&amp;itok=Mbts7Z0g" width="1500" height="563" alt="Sean Humbert and Leopold Beuken inspecting sensors on the underside of a fixed wing UAS."> </div> <span 
class="media-image-caption"> <p>Sean Humbert and Leopold Beuken inspecting sensors on a fixed wing UAS.</p> </span> </div> <p dir="ltr"><a href="/program/robotics/j-sean-humbert" data-entity-type="node" data-entity-uuid="726108fa-0749-46c5-85db-a11f2ca413e5" data-entity-substitution="canonical" rel="nofollow" title="J. Sean Humbert"><span>Sean Humbert</span></a><span> is unlocking the biological secrets of the common housefly to make major advances in robotics and uncrewed aerial systems (UAS).</span></p><p dir="ltr"><span>A professor in the Paul M. Rady Department of Mechanical Engineering and director of the Robotics Program at the University of Colorado Boulder, Humbert is working to understand how tiny biological systems process sensory information as they move through the world.</span></p><p dir="ltr"><span>This basic-sounding concept involves extremely complex science and engineering.</span></p><p dir="ltr"><span>"Insects aren't built like robots," Humbert said. "If I have a robot and I want it to perceive the environment, I tend to put a larger, high-fidelity lidar system on it. Flies instead have small, low-quality sensors throughout their bodies. Due to the way that the measurements are processed by the nervous system, you can extract similar levels of information as bulky robotic sensors. We want to take advantage of what nature does."</span></p><p dir="ltr"><span>The research has drawn the interest of the U.S. 
Air Force Research Laboratory, which awarded Humbert a five-year, $909,000 grant to advance the work.</span></p><p dir="ltr"><span>He also&nbsp;</span><a href="https://ieeexplore.ieee.org/document/10836695" rel="nofollow"><span>recently&nbsp;published a journal article</span></a><span> in the Institute of Electrical and Electronics Engineers (IEEE) Access journal, proposing a mathematical framework for understanding the connections between the flight physics and visual physiology of flies and applying them to robotics.</span></p><p dir="ltr"><span>Flies may seem unlikely subjects of study for enhancing robots, but Humbert's co-author and postdoctoral researcher Zoe Turin says the insects have a lot to offer roboticists.</span></p><p dir="ltr"><span>"If you've ever tried to catch or swat a fly, you know that they can be quite capable fliers, despite a lack of computational power," Turin says. "If we apply principles from how insects operate, then we may be able to develop robots that have similar capabilities at a much smaller size than traditional robots. This has potential applications across a wide variety of industries."</span></p><p dir="ltr"><span>Although flies are only 6-7 millimeters long and have brains the size of a poppy seed, Humbert said their abilities have eluded researchers until now. 
A key element of how flies work is distributed sensing.</span></p> <div class="align-left image_style-wide_image_style"> <div class="imageMediaStyle wide_image_style"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/wide_image_style/public/2025-02/Robotics_SystemsDesign_20231115_JMP_107.JPG?h=2b4dd1de&amp;itok=Qv5klcw-" width="1500" height="563" alt="Zoe Turin and Eugene Rush in front of a white board with a small UAS."> </div> <span class="media-image-caption"> <p>Zoe Turin and Eugene Rush in front of a white board with a small UAS.</p> </span> </div> <p dir="ltr"><span>"It's taken years to arrive at a model of how the fly's sensory structure is set up the way it is and to be able to figure out the math behind it," Humbert said. "This has so much potential going forward. An F-22 fighter jet has a small number of high-fidelity, big, expensive sensors that require a lot of backend processing and computation to generate quality measurements. Nature is the exact opposite. It's small, low fidelity, lightweight, and distributed."</span></p><p dir="ltr"><span>Turin said that by unlocking the principles and mathematical optimizations at work in flies, researchers will be able to explore similar techniques for robots.</span></p><p dir="ltr"><span>"Understanding more about how insects are able to do what they do has only made them more amazing to me. This framework will hopefully help our engineered systems react more quickly to unexpected disturbances, such as a sudden gust of wind, while reducing the computational power required," Turin said.</span></p><p dir="ltr"><span>Over the course of the grant, Humbert will use the models he has developed of fly sensing to conduct proof-of-concept demonstrations, followed by experimental research using robotic sensor technology.</span></p><p dir="ltr"><span>"This is a wonderful, cool biological principle, and we now have the model to explore what nature has constructed," Humbert said. 
"It will revolutionize how we think about the design cycle of robotic systems."</span></p></div> </div> </div> </div> </div> <div>Sean Humbert is unlocking the biological secrets of the common housefly to make major advances in robotics and uncrewed aerial...</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/large_image_style/public/2025-02/AdobeStock_112865445.jpeg?itok=bkCA_vlL" width="1500" height="996" alt="Closeup of a fly on wood."> </div> </div> <div>On</div> <div>White</div> Mon, 24 Feb 2025 17:10:49 +0000 Jeff Zehnder 136 at /program/robotics Rentschler, Aspero Medical awarded $4.5M for endoscopy advancement /program/robotics/2025/02/11/rentschler-aspero-medical-awarded-45m-endoscopy-advancement <span>Rentschler, Aspero Medical awarded $4.5M for endoscopy advancement</span> <span><span>Jeff Zehnder</span></span> <span><time datetime="2025-02-11T15:31:15-07:00" title="Tuesday, February 11, 2025 - 15:31">Tue, 02/11/2025 - 15:31</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/2025-02/2025_2_3_Mark_Rentschler_Endoscopy_Devices_PC0044_jpg.jpg?h=b939560f&amp;itok=IJadLoYV" width="1200" height="800" alt="Professor Mark Rentschler holding Aspero Medical's patented Ancora-SB balloon overtube."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span 
class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/28" hreflang="en">Mark Rentschler News</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><div><div><div><div><div><div><p dir="ltr"><span>It's been six years since the launch of startup company&nbsp;</span><a href="https://www.asperomedical.com/" rel="nofollow"><span>Aspero Medical</span></a><span>, co-founded by&nbsp;</span><a href="/mechanical/mark-rentschler" rel="nofollow"><span>Professor Mark Rentschler</span></a><span> of the&nbsp;</span><a href="/mechanical" rel="nofollow"><span>Paul M. Rady Department of Mechanical Engineering</span></a><span>. The company has seen great success, including the development of a medical device designed to enable more efficient procedures in the small bowel region.</span></p><p dir="ltr"><span>Today, with the help of a $4.5 million award through the Anschutz Acceleration Initiative (AAI), Rentschler and his colleagues are working to bring two new products to the market that will transform these types of procedures even further.</span></p><p dir="ltr"><span>"We brought our first product out on the market in 2024," said Rentschler, also a faculty member in&nbsp;</span><a href="/bme/" rel="nofollow"><span>biomedical engineering (BME)</span></a><span> and&nbsp;</span><a href="/program/robotics/" rel="nofollow"><span>robotics</span></a><span>. 
"We are planning to bring a second and third product to the market in 12-18 months, and we are extremely excited to get these devices in the hands of interventional endoscopists."</span></p><div class="ucb-box ucb-box-title-hidden ucb-box-alignment-right ucb-box-style-fill ucb-box-theme-lightgray"><div class="ucb-box-inner"><div class="ucb-box-content"><div><div>&nbsp;</div></div><p>Professor Mark Rentschler holding Aspero Medical's patented Ancora-SB balloon overtube.</p></div></div></div><p dir="ltr"><span>In 2023, Aspero received clearance from the Food and Drug Administration (FDA)&nbsp;</span><a href="/bme/2023/09/11/rentschlers-startup-company-improves-endoscopy-procedures-patented-balloon-technology" rel="nofollow"><span>to market and sell the Ancora-SB device</span></a><span>. The product is used during endoscopy procedures to diagnose and treat small bowel diseases.</span></p><p dir="ltr"><span>According to Rentschler, operating within the small intestine can be time-consuming and technically challenging. Equipped with a patented micro-textured balloon, the Ancora-SB overtube is designed to provide more traction and anchoring consistency than competing smooth latex or smooth silicone balloon overtubes.</span></p><p dir="ltr"><span>"Balloon overtubes for small bowel procedures have been around for about a decade," said Rentschler. "We're not looking to change the small bowel enteroscopy procedure, but instead improve balloon anchoring performance during these procedures in the small bowel."</span></p><p dir="ltr"><span>Ancora-SB has allowed Aspero to prove its worth in hospitals. The company's next products expand on this concept with additional features that enable less invasive interventional procedures than traditional open surgery.</span></p><p dir="ltr"><span>The next-generation balloon overtube will be used to remove cancerous lesions in the large bowel region. 
It features an extra working channel that allows an additional tool to be used alongside the visualization scope. This offers physicians more control, access, and stabilization when maneuvering through the colon and performing advanced interventional procedures.</span></p><p dir="ltr"><span>"Conceptually, these devices will enable triangulated surgery with two tools and centralized visualization so that physicians can more efficiently perform surgery from inside the lumen," Rentschler said. "Instead of historically invasive procedures, where the patient is cut open and the cancerous bowel region is removed, we're assisting physicians as they remove the cancer from the inside of the lumen during an outpatient procedure.</span></p><p dir="ltr"><span>"It's much less invasive, with potentially tremendous cost savings, and numerous benefits for the patient."</span></p><p dir="ltr"><span>Aspero's third product will be another balloon overtube, this time with a working channel that enables minimally invasive cancer removal in the esophagus and stomach regions of the gastrointestinal tract.</span></p><div class="ucb-box ucb-box-title-hidden ucb-box-alignment-left ucb-box-style-fill ucb-box-theme-lightgray"><div class="ucb-box-inner"><div class="ucb-box-content"><div><div>&nbsp;</div></div><p>Rentschler showcasing all three of the medical devices in Aspero Medical's multi-product platform, including their two new highly anticipated devices.</p></div></div></div><p dir="ltr"><span>Rentschler and his team say the two upcoming devices have the potential to replace a large and growing number of today's conventional surgical procedures in the gastrointestinal region by enhancing safety and efficiency while reducing patient recovery time. Moving procedures from inpatient surgery to outpatient endoscopy can generate potential cost savings of up to 50 percent.</span></p><p dir="ltr"><span>"Everyone knows this is the direction we need to go. 
Clinical outcomes from these types of procedures are incredibly strong, but the techniques and devices aren't widely available yet," said Rentschler. "We are creating products that help physicians and patients feel safe and comfortable without overcomplicating things. The paradigm is rapidly shifting, and we endeavor to push endoscopy forward."</span></p><p dir="ltr"><span>The company is currently finalizing the design of the second product. It's about six months further along in development than the third product, but Rentschler says they are looking to have both devices FDA cleared by the end of 2026.</span></p><p dir="ltr"><span>Once all three devices are on the market, Aspero will offer a portfolio of products rather than a single tool. But further innovation is on the horizon, this time incorporating the Ancora balloon technology with a robotic element.</span></p><p dir="ltr"><span>"Ancora is a multi-product platform focusing on the small bowel, large bowel, stomach and esophageal regions," Rentschler said. "Our next potential venture will be in flexible robots. We'll continue with our balloon overtubes, but as anchoring platforms to be used with flexible robotic endoscopy systems."</span></p><p dir="ltr"><span>Until then, Rentschler and company are moving full steam ahead on these next products. The $4.5 million AAI grant is being offered over a four-year span, but they anticipate spending that money much sooner so they can get the devices out on the market and begin positively impacting patients and physicians everywhere.</span></p><p dir="ltr"><span>But that's not their only goal. With so much of Colorado involved in the company's technology, Rentschler also hopes to tell another story.</span></p><p dir="ltr"><span>"I started Aspero Medical with Dr. Steven Edmundowicz at CU Anschutz. We've received a number of grants from the state of Colorado and everyone involved is invested in our vision," said Rentschler. 
"We believe that a rising tide raises all boats, and when we think of Aspero, we want it to be a successful Colorado story."</span></p></div></div></div></div></div></div></div> </div> </div> </div> </div> <script> window.location.href = `/mechanical/rentschler-aspero-awarded-45m-endoscopy-advancement`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 11 Feb 2025 22:31:15 +0000 Jeff Zehnder 135 at /program/robotics CS robotics research to help strengthen domestic battery supply chain /program/robotics/2024/12/05/cs-robotics-research-help-strengthen-domestic-battery-supply-chain <span>CS robotics research to help strengthen domestic battery supply chain</span> <span><span>Jeff Zehnder</span></span> <span><time datetime="2024-12-05T11:28:06-07:00" title="Thursday, December 5, 2024 - 11:28">Thu, 12/05/2024 - 11:28</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/2024-12/Screenshot%202024-12-05%20at%2011-33-18%20image.png%20%28PNG%20Image%201446%20%C3%97%20890%20pixels%29%20%E2%80%94%20Scaled%20%2883%25%29.png?h=02da8a9e&amp;itok=KfTf0XwP" width="1200" height="800" alt="Visualization of robotics disassembly."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/5" 
hreflang="en">Nikolaus Correll News</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><p dir="ltr"><span>Computer science professor&nbsp;</span><a href="/lab/correll" rel="nofollow"><span>Nikolaus Correll and his lab at CU Boulder&nbsp;</span></a><span>have been awarded $1.8 million by the U.S. Department of Energy Advanced Research Projects Agency-Energy (ARPA-E) to help establish a circular supply chain for domestic electric vehicle (EV) batteries.</span></p><p dir="ltr"><span>The percentage of EV passenger vehicles on the road is&nbsp;</span><a href="https://about.bnef.com/electric-vehicle-outlook/" rel="nofollow"><span>expected to rise</span></a><span> to 28% by 2030 and 58% by 2040, globally.</span></p><p dir="ltr"><span>The existing supply chain for EV batteries relies mostly on recycling to recover critical minerals such as cobalt, nickel, or copper.</span></p><p dir="ltr"><span>However, conventional battery recycling methods are energy-intensive, produce significant quantities of greenhouse gases, and lead to large volumes of waste deposited in landfills.</span></p><p dir="ltr"><span>CU Boulder joins 12 other projects around the country working to change this dynamic through ARPA-E's&nbsp;</span><a href="https://arpa-e.energy.gov/technologies/projects/robust-robotic-disassembly-ev-battery-packs-using-open-world-vision-language" rel="nofollow"><span>Catalyzing Innovative Research for Circular Use of Long-lived Advanced Rechargeables (CIRCULAR)</span></a><span> program.</span></p><p dir="ltr"><span>Correll's project focuses on autonomous robotic disassembly of EV lithium-ion battery packs. 
Humanoid robots will work together with robotic arms to manipulate wire harnesses and remove screws and other components before dismantling commercial battery packs with a heavy-duty industrial arm.</span></p><p dir="ltr"><span>Correll explained that people are interested in using robots for the task due to the hazardous nature of the work.</span></p><p dir="ltr"><span>"The batteries are quite dangerous to handle due to the risk of electrocution and spontaneous ignition," Correll said.</span></p><p dir="ltr"><span>The Correll Lab's project will use state-of-the-art perception models and large language models to consider the physics and context of each battery.</span></p><p dir="ltr"><span>By advancing the efficiency and ability of battery disassembly systems, component recycling could be done at a commercial scale more safely and cost-effectively, leading to less waste in landfills and more material available for new EV batteries.</span></p><p dir="ltr"><span>The director of ARPA-E, Evelyn N. Wang, said, "I look forward to seeing how these CIRCULAR projects develop regeneration, repair, reuse, and remanufacture technologies to create a sustainable EV battery supply chain."&nbsp;</span></p></div> </div> </div> </div> </div> <script> window.location.href = `/cs/2024/12/02/cs-robotics-research-help-strengthen-domestic-battery-supply-chain`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Thu, 05 Dec 2024 18:28:06 +0000 Jeff Zehnder 134 at /program/robotics Robots can't outrun animals (yet). A new study explores why /program/robotics/2024/04/29/robots-can%E2%80%99t-outrun-animals-yet-new-study-explores-why <span>Robots can't outrun animals (yet). 
A new study explores why </span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2024-04-29T12:58:02-06:00" title="Monday, April 29, 2024 - 12:58">Mon, 04/29/2024 - 12:58</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/mclari_spider_close_1_jpg.jpg?h=4164a200&amp;itok=dg78MoyF" width="1200" height="800" alt="A robot called mCLARI designed by engineers at CU Boulder poses next to a spider."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/13" hreflang="en">Kaushik Jayaram News</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>The question may be the 21st century's version of the fable of the tortoise and the hare: Who would win in a foot race between a robot and an animal?</p><p>In a new perspective article, a team of engineers from the United States and Canada, including CU&nbsp;Boulder roboticist Kaushik Jayaram, set out to answer that riddle. 
The group analyzed data from dozens of studies and came to a resounding "no." In almost all cases, biological organisms, such as cheetahs, cockroaches and even humans, seem to be able to outrun their robot counterparts.&nbsp;</p><p>The researchers, led by <a href="https://faculty.washington.edu/sburden/2024-04-24-avm/" rel="nofollow">Samuel Burden at the University of Washington</a> and <a href="https://www.sfu.ca/sfunews/stories/2024/04/why-can-t-robots-outrun-animals-.html" rel="nofollow">Maxwell Donelan at Simon Fraser University</a>, published their findings <a href="http://www.science.org/doi/10.1126/scirobotics.adi9754" rel="nofollow">last week in the journal Science Robotics</a>.</p><p>"As an engineer, it is kind of upsetting," said Jayaram, an assistant professor in the Paul M. Rady Department of Mechanical Engineering at CU Boulder. "Over 200 years of intense engineering, we've been able to send spacecraft to the moon and Mars and so much more. But it's confounding that we do not yet have robots that are significantly better than biological systems at locomotion in natural environments."</p><p>He hopes the study will inspire engineers to learn how to build more adaptable, nimble robots. The researchers concluded that the failure of robots to outrun animals doesn't come down to shortfalls in any one piece of machinery, such as batteries or actuators. Instead, where engineers might falter is in making those parts work together efficiently.</p><p>This pursuit is one of Jayaram's chief passions. 
His lab on the CU Boulder campus is home to a lot of creepy crawlies, including several furry wolf spiders that are about the size of a half dollar.</p><p>"Wolf spiders are natural hunters," Jayaram said. "They live under rocks and can run over complex terrain with incredible speed to catch prey."</p><p>He envisions a world in which engineers build robots that work a bit more like these extraordinary arachnids.</p><p>"Animals are, in some sense, the embodiment of this ultimate design principle: a system that functions really well together," he said.</p><div class="image-caption image-caption-none"><p>A cockroach alongside the HAMR-Jr robot. (Credit: Kaushik Jayaram)</p></div><h2>Cockroach energy</h2><p>The question of "who can run better, animals or robots?" is complicated because running itself is complicated.&nbsp;</p><p>In previous research, Jayaram and his colleagues at Harvard University designed a line of robots that seek to <a href="/today/2020/06/03/cockroach-inspired-robot-among-smallest-fastest-ever" rel="nofollow">mimic the behavior of the oft-reviled cockroach</a>. The team's <a href="https://ieeexplore.ieee.org/abstract/document/9197436" rel="nofollow">HAMR-Jr model</a> fits on top of a penny and sprints at speeds equivalent to those of a cheetah. But, Jayaram noted, while HAMR-Jr can bust a move forward and backward, it doesn't move as well side-to-side or over bumpy terrain. Humble cockroaches, in contrast, have no trouble running over surfaces from porcelain to dirt and gravel. 
They can also <a href="https://royalsocietypublishing.org/doi/full/10.1098/rsif.2017.0664" rel="nofollow">dash up walls</a> and <a href="https://www.pnas.org/doi/abs/10.1073/pnas.1514591113" rel="nofollow">squeeze through tiny cracks</a>.</p><p>To understand why such versatility remains a challenge for robots, the authors of the new study broke these machines down into five subsystems: power, frame, actuation, sensing, and control. To the group's surprise, few of those subsystems seemed to fall short of their equivalents in animals.&nbsp;</p><div class="feature-layout-callout feature-layout-callout-large feature-layout-callout-float-right clearfix"><div class="feature-layout-callout-inner element-max-width-padding"><div class="image-caption image-caption-none"><p>Kaushik Jayaram, right, with graduate student Heiko Kabutz, left, in Jayaram's lab on the CU Boulder campus. (Credit: Casey Cass/CU Boulder)</p></div></div></div><p>High-quality lithium-ion batteries, for example, can deliver as much as 10 kilowatts of power for every kilogram (2.2 pounds) they weigh. Animal tissue, in contrast, produces around one-tenth that. Muscles, meanwhile, can't come close to matching the absolute torque of many motors.&nbsp;</p><p>"But at the system level, robots are not as good," Jayaram said. "We run into inherent design trade-offs. If we try to optimize for one thing, like forward speed, we might lose out on something else, like turning ability."</p><h2>Spider senses</h2><p>So, how can engineers build robots that, like animals, are more than just the sum of their parts?&nbsp;</p><p>Animals, Jayaram noted, aren't split into separate subsystems in the same way as robots. Your quadriceps, for example, propel your legs like HAMR-Jr's actuators move their limbs. 
But quads also produce their own power by breaking down fats and sugars, and they incorporate neurons that can sense pain and pressure.</p><p>Jayaram thinks the future of robotics may come down to "functional subunits" that do the same thing: Rather than keeping power sources separate from your motors and circuit boards, why not integrate them all into a single part? In a 2015 paper, CU Boulder computer scientist Nikolaus Correll, who wasn't involved in the current study, proposed such theoretical "robotic materials" that work more like your quads.&nbsp;</p><p>Engineers are still a long way from achieving that goal. Some, like Jayaram, are taking steps in this direction, such as through his lab's Compliant Legged Articulated Robotic Insect (CLARI) robot, <a href="/today/2023/08/30/tiny-shape-shifting-robot-can-squish-itself-tight-spaces" rel="nofollow">a multi-legged robot that moves a little like a spider</a>. Jayaram explained that CLARI relies on a modular design, in which each of its legs acts like a self-contained robot with its own motor, sensors and controlling circuitry. 
The team's <a href="https://ieeexplore.ieee.org/abstract/document/10341588" rel="nofollow">new and improved version called&nbsp;mCLARI</a>&nbsp;can move in all directions in confined spaces, a first for four-legged robots.</p><p>It's one more thing that engineers like Jayaram can learn from those perfect hunters, wolf spiders.</p><p>"Nature is a really useful teacher."</p></div></div></div></div> </div> </div> </div> </div> <script> window.location.href = `/today/2024/04/29/robots-cant-outrun-animals-yet-new-study-explores-why`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Mon, 29 Apr 2024 18:58:02 +0000 Anonymous 122 at /program/robotics A delicate touch: teaching robots to handle the unknown /program/robotics/2024/04/02/delicate-touch-teaching-robots-handle-unknown <span>A delicate touch: teaching robots to handle the unknown </span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2024-04-02T15:30:06-06:00" title="Tuesday, April 2, 2024 - 15:30">Tue, 04/02/2024 - 15:30</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/screenshot_2024-04-02_at_1.40.49_pm_png.png?h=62a55699&amp;itok=6PRfYtN9" width="1200" height="800" alt="A robot grasping objects"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i 
class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/5" hreflang="en">Nikolaus Correll News</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-row-subrow row"> <div class="ucb-article-text col-lg d-flex align-items-center" itemprop="articleBody"> <div><p>William Xie, a first-year PhD student in computer science, is teaching a robot to reason about how gently it should grasp previously unknown&nbsp;objects by using large language models (LLMs).&nbsp;</p><p><a href="https://deligrasp.github.io/" rel="nofollow">DeliGrasp</a>, Xie's project, is an intriguing step beyond the custom, piecemeal solutions currently used to avoid pinching or crushing novel objects.&nbsp;</p><p>In addition, DeliGrasp helps the robot translate what it can 'touch' into meaningful information for people.&nbsp;</p><p>"William has gotten some neat results by leveraging common sense information from large language models. For example, the robot can estimate and explain the ripeness of various fruits after touching them," said his advisor, <a href="/lab/correll" rel="nofollow">Professor Nikolaus Correll</a>.&nbsp;</p><p>Let's learn more about DeliGrasp, Xie's journey to robotics, and his plans for the conference in Japan and beyond.&nbsp;</p><p>&nbsp;</p><div class="video-filter"><div class="fluid-width-video-wrapper"></div></div><h2>How would you describe this research?&nbsp;</h2><p>As humans, we're able to quickly intuit how exactly we need to pick up a variety of objects, including delicate produce or unwieldy, heavy objects. 
We're informed by the visual appearance of an object, what prior knowledge we may have about it, and most importantly, how it feels to the touch when we initially grasp it.&nbsp;</p><p>Robots don't have this all-encompassing intuition, though, and they don't have end-effectors (grippers/hands) as effective as human hands. So solutions are piecemeal: the community has researched "hands" across the spectrum of mechanical construction, sensing capabilities (tactile, force, vibration, velocity) and material (soft, rigid, hybrid, woven, etc.). The corresponding machine learning models and/or control methods that enable "appropriately forceful" gripping are then bespoke to each of these architectures.</p><p>Embedded in LLMs, which are trained on an internet's worth of data, is common-sense physical reasoning that crudely approximates a human's (as the saying goes: "all models are wrong, some are useful"). We use the LLM-estimated mass and friction to simplify the grasp controller and deploy it on a two-finger gripper, a prevalent and relatively simple architecture. Key to the controller working is the force feedback sensed by the gripper as it grasps an object, and knowing at what force threshold to stop; the LLM-estimated values directly determine this threshold for any arbitrary object, and our initial results are quite promising.</p><h2>How did you get inspired to pursue this research?</h2><p>I wouldn't say that I was inspired to pursue this specific project. I think, like a lot of robotics research, I had been working away at a big problem for a while and stumbled into a solution for a much smaller problem. My goal since I arrived here has been to research techniques for assistive robots and devices that restore agency for the elderly and/or mobility-impaired in their everyday lives. 
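The grasp-control idea described above (close the gripper until the sensed contact force crosses a threshold derived from LLM-estimated mass and friction) can be sketched in a few lines of Python. This is an illustrative sketch only, not the DeliGrasp code; the function names, the two-finger friction model and all numbers are invented for the example.

```python
# Hedged sketch of a force-threshold grasp loop (hypothetical, not DeliGrasp):
# stop squeezing once the measured force reaches the minimum needed to hold
# the object, computed from an estimated mass and friction coefficient.

G = 9.81  # gravitational acceleration, m/s^2


def grip_force_threshold(mass_kg: float, friction_coeff: float) -> float:
    """Minimum normal force per the two-opposing-finger friction model:
    2 * mu * F_normal >= m * g, so F_normal >= m * g / (2 * mu)."""
    return mass_kg * G / (2.0 * friction_coeff)


def grasp(readings, mass_kg, friction_coeff, step_mm=0.5):
    """Close the gripper in small steps until the contact force crosses
    the threshold. `readings` yields force-sensor values (N), one per step."""
    threshold = grip_force_threshold(mass_kg, friction_coeff)
    aperture = 100.0  # starting opening in mm (arbitrary for the sketch)
    for force in readings:
        if force >= threshold:
            return aperture, force  # grasp achieved: stop squeezing here
        aperture -= step_mm  # no firm contact yet: keep closing
    return aperture, None  # threshold never reached


# Example: an object of ~0.15 kg with mu ~ 0.5 needs roughly 1.47 N of grip.
```

The stopping force itself is what doubles as a crude ripeness signal: a softer fruit deforms and builds up force differently as the fingers close.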
I'm particularly interested in shopping (but eventually generalist) robots. One problem we found is that it is really hard to determine, let alone pick, ripe fruits and produce with a typical robot gripper and just a camera. In early February, I took a day to try out picking up variably sized objects by hand-tuning our MAGPIE gripper's force sensing (an affordable, open-source gripper developed by the Correll Lab). It worked well; I let ChatGPT calibrate the gripper, which worked even better, and it evolved very quickly into DeliGrasp.</p><h2>What would you say is one of your most interesting findings so far?</h2><p>LLMs do a reasonable job of estimating an arbitrary object's mass (friction, not as well) from just a text description. This isn't in the paper, but when paired with a picture, they can extend this reasoning to oddballs: gigantic paper airplanes, or miniature (plastic) fruits and vegetables.</p><p>With our grasping method, we can sense the contact forces on the gripper as it closes around an object; this is a really good measure of ripeness, it turns out. We can then further employ LLMs to reason about these contact forces to pick out ripe fruit and vegetables!</p><h2>What does the day-to-day of this research look like?</h2><p>Leading up to submission, I was running experiments on the robot and picking up different objects with different strategies pretty much every day. A little repetitive, but also exciting. Prior to that, and now that I'm trying to improve the project for the next conference, I spend most of my time reading papers, thinking up ideas, and setting up small, one-off experiments to try out those ideas.</p><h2>How did you come to study at CU Boulder?&nbsp;</h2><p>For a few years, I've known that I really wanted to build robots that could directly, immediately help my loved ones and community. 
I had a very positive first research experience in my last year of undergrad and learned what it felt like to have true personal agency in pursuing work that I cared about. At the same time, I knew I'd be relocating to Boulder after graduation. I was very fortunate that Nikolaus accepted me and let me keep pursuing this goal of mine.</p><p>It would be almost too good to be true if I could keep doing this research in academia or industry, though of course that would be ideal. But I'm biased toward academia, particularly teaching. I've been teaching high school robotics for five years now, and I'm now teaching and mentoring undergrads at CU; each day is as fulfilling as the first. I have great mentors across the robotics faculty and senior PhD students. We work in ECES 111, a giant, well-equipped space that three robotics labs share, and it's great for collaboration and brainstorming.&nbsp;</p><h2>What are your hopes for this international conference (and what conference is it)?</h2><p>The venue is a workshop at the 2024 International Conference on Robotics and Automation (ICRA 2024), happening in Yokohama, Japan, from May 13-17. The name of the workshop is a mouthful: Vision-Language Models for Navigation and Manipulation (VLMNM).</p><p>A workshop is detached from the main conference and is kind of its own little bubble (if the conference is a big supermarket, the workshop is a pop-up food-tasting event inside it). I'm really excited to meet other researchers and pick their brains. As a first-year, I've spent the past year reading papers from practically everyone on the workshop panel, and from their students. 
I'll probably also spend half my time exploring (eating) around the Tokyo area.</p></div> </div> <div class="ucb-article-content-media ucb-article-content-media-right col-lg"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> </div> </div> </div> </div> <script> window.location.href = `/cs/2024/04/02/delicate-touch-teaching-robots-handle-unknown`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 02 Apr 2024 21:30:06 +0000 Anonymous 120 at /program/robotics CU Boulder robotics research showcased in Advanced Intelligent Systems /program/robotics/2024/01/09/cu-boulder-robotics-research-showcased-advanced-intelligent-systems <span>CU Boulder robotics research showcased in Advanced Intelligent Systems</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2024-01-09T09:19:21-07:00" title="Tuesday, January 9, 2024 - 09:19">Tue, 01/09/2024 - 09:19</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/aisy202370057-blkfxd-0001-m.jpg?h=1b1dd7b3&amp;itok=IlRauDrG" width="1200" height="800" alt="Advanced Intelligent Systems cover with a tiny robot."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a 
href="/program/robotics/taxonomy/term/13" hreflang="en">Kaushik Jayaram News</a> </div> <a href="/program/robotics/jeff-zehnder">Jeff Zehnder</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><p> </p><div class="align-right image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/medium_750px_50_display_size_/public/article-image/aisy202370057-blkfxd-0001-m.jpg?itok=3_6X3KC4" width="750" height="985" alt="Advanced Intelligent Systems cover with a tiny robot."> </div> </div> <a href="/program/robotics/node/64" rel="nofollow">Kaushik Jayaram's </a>bioinspired robotics research is on the cover of the latest issue of the journal Advanced Intelligent Systems.<p>The article, "Design of CLARI: A Miniature Modular Origami Passive Shape-Morphing Robot," discusses the design and creation of Jayaram's compliant legged articulated robotic insect.</p><p>Jayaram is an assistant professor in the Robotics Program and the Paul M. Rady Department of Mechanical Engineering. 
He is an expert in robotics, systems design, materials, and engineering at the micro- and nanoscale.</p><p>The cover shows a 2.59-gram, 3.4-cm-long modular origami robot capable of passive shape morphing.</p><p>These tiny robots provide unique abilities to access confined environments and have potential for applications such as search-and-rescue and high-value asset inspection.</p><p class="lead"><a href="https://onlinelibrary.wiley.com/doi/full/10.1002/aisy.202300181" rel="nofollow">Read the full journal article at Advanced Intelligent Systems...</a></p></div> </div> </div> </div> </div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 09 Jan 2024 16:19:21 +0000 Anonymous 119 at /program/robotics Xu's 'cyborg jellyfish' highlighted in Nature /program/robotics/2023/12/12/xus-cyborg-jellyfish-highlighted-nature <span>Xu's 'cyborg jellyfish' highlighted in Nature </span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-12-12T13:35:15-07:00" title="Tuesday, December 12, 2023 - 13:35">Tue, 12/12/2023 - 13:35</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/nicole_xu.jpg?h=b0b513fd&amp;itok=f-CAYe98" width="1200" height="800" alt="Nicole Xu"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> 
</div> <a href="/program/robotics/taxonomy/term/68" hreflang="en">Nicole Xu News</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-row-subrow row"> <div class="ucb-article-text col-lg d-flex align-items-center" itemprop="articleBody"> <div><p>Assistant Professor Nicole Xu recently spoke with <em>Nature </em>for a feature about&nbsp;biohybrid robots and their real-world applications.</p><p>Xu and her collaborators have been working on a jellyfish-inspired robot that can help monitor climate change and ecological shifts in the Earth's oceans.&nbsp;</p><blockquote><p>"Jellyfish have a number of appealing characteristics for roboticists. They are energy-efficient swimmers, and are able to descend to great depths. Compared with current mechanical submersibles, Xu says, jellyfish are less likely to cause damage to marine environments. Their natural appearance and quietness also make them unremarkable – during the ocean tests, fish swam right up to them."</p></blockquote><p><a href="https://nicolexulab.com/" rel="nofollow">Xu's research lab</a>&nbsp;works at the intersection of robotics, fluid dynamics&nbsp;and biology. 
Their methods include laboratory experiments, theoretical modeling&nbsp;and field work.</p></div> </div> <div class="ucb-article-content-media ucb-article-content-media-right col-lg"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> </div> </div> </div> </div> <script> window.location.href = `/mechanical/2023/12/07/xus-cyborg-jellyfish-highlighted-nature`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 12 Dec 2023 20:35:15 +0000 Anonymous 112 at /program/robotics Is the World Ready for Self-Driving Cars? /program/robotics/2023/11/27/world-ready-self-driving-cars <span>Is the World Ready for Self-Driving Cars? </span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-11-27T10:41:51-07:00" title="Monday, November 27, 2023 - 10:41">Mon, 11/27/2023 - 10:41</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/sm-coloradan_fall23_half2_jpg.jpg?h=011a176b&amp;itok=QSm8MUIb" width="1200" height="800" alt="Drawing of a person in a car with no steering wheel."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/6" hreflang="en">Christoffer Heckman News</a> <a 
href="/program/robotics/taxonomy/term/67" hreflang="en">Majid Zamani News</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><div><div><div><p>In August, the California Public Utilities Commission <a href="https://www.wsj.com/articles/cruise-waymo-get-approval-to-expand-driverless-vehicles-in-san-francisco-923fe89d" rel="nofollow">made history</a> when it voted to allow two self-driving car companies, Waymo and Cruise, to commercially operate their "robotaxis" around the clock in San Francisco.</p><p>Within hours, Cruise reported at least 10 incidents where vehicles stopped short of their destination, blocking city streets. The commission demanded they recall 50% of their fleet.&nbsp;</p><p>Despite these challenges, other cities – including Las Vegas, Miami, Austin and Phoenix – <a href="https://www.axios.com/2023/08/29/cities-testing-self-driving-driverless-taxis-robotaxi-waymo" rel="nofollow">are allowing</a> autonomous vehicle startups to conduct tests on public roads.&nbsp;</p><p>&nbsp;</p><div><div><blockquote><p><strong>"Self-driving car proponents see the jump from laboratories to real-world testing as a necessary step that has been a long time coming."</strong></p></blockquote><p>&nbsp;</p></div></div><p>Self-driving car proponents see the jump from laboratories to real-world testing as a necessary step that has been a long time coming. The first autonomous vehicle was tested on the Autobahn in Germany in 1986, but advances stalled in the 1990s due to technology limitations.&nbsp;</p><p>After a 2007 Defense Advanced Research Projects Agency (DARPA) <a href="https://www.darpa.mil/about-us/timeline/darpa-urban-challenge" rel="nofollow">competition featuring autonomous driving capabilities</a>, it seemed like the era of driverless cars had finally arrived. 
The competition kickstarted a Silicon Valley race to develop the first commercial driverless car. Optimism abounded, with engineers, investors and automakers predicting there would be as many as 10 million self-driving cars on the road by 2020.&nbsp;</p><p>"The question for the last 30 years is: How long is this going to take?" said <strong>Javier von Stecher</strong> (PhDPhys'08), senior software engineer at Nvidia, who has worked on self-driving car technology at companies including Uber and Mercedes-Benz. "I think a lot of people were oversold on the idea that we could get this working fast. The biggest shift I've seen over the past decade is people realizing how hard this problem really is."&nbsp;</p><p>The stakes may be high, but that's not deterring CU Boulder researchers. From creating systems and models to studying human-machine interactions, university teams are working to advance the field safely and responsibly as self-driving cars become a fixture in our society.&nbsp;</p><p>Their next big question: Can we learn to trust these vehicles?</p><h2>Cruise Control</h2><p>The idea behind autonomous vehicles is simple. An artificial intelligence system pulls in data from an array of sensors, including radar, high-resolution cameras and GPS, and uses this data to navigate from point A to point B while avoiding obstacles and obeying traffic laws. Sounds simple? It's not.</p><p>When a self-driving car encounters an unexpected obstacle, it makes split-second judgment calls – should it brake or swerve around it? 
– that develop naturally in humans but are still beyond even the most sophisticated AI systems.&nbsp;</p><p>Moreover, there will always be an edge case that the AI-powered car hasn't seen before, which means the key to safe autonomous vehicles is building systems that can correctly favor safe choices in unfamiliar situations.</p><p>Majid Zamani, associate professor and director of <a href="https://www.hyconsys.com/" rel="nofollow">CU Boulder's Hybrid Systems Control Lab</a>, studies how to create software for autonomous systems such as cars, drones and airplanes. In autonomous vehicles' AI systems, data flows into the AI and helps it make decisions. But how the AI arrives at those decisions is a mystery. This, said Zamani, makes it difficult to trust the AI system – and yet trust is critically important in high-stakes applications like autonomous driving.</p><p>"These are what we call safety-critical applications because system failure can cause loss of life or damage to property, so it's really important that the way those systems are making decisions is provably correct," Zamani said.&nbsp;</p><p>In contrast to AI systems that use data to create models that are not intelligible to humans, Zamani advocates for a bottom-up approach where the AI's models are derived from fundamental physical laws, such as acceleration or friction, which are well understood and unchanging.</p><p>"If you derive a model using data, you have to be able to quantify how much error there is between that model and the actual system that uses it," Zamani said.</p><p>Mathematically demonstrating the safety of the models used by autonomous vehicles is important for engineers and policymakers who need to guarantee safety before they're deployed in the real world. 
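As a toy illustration of the physics-first approach described above (rules derived from well-understood laws rather than opaque learned models), a braking-distance check can be stated and verified directly from kinematics. This sketch is hypothetical, not from the Hybrid Systems Control Lab; the friction coefficient and safety margin are invented values.

```python
# Hedged sketch of a physics-derived safety rule (illustrative only):
# the stopping distance follows from the kinematic law v^2 = 2*mu*g*d,
# so every quantity in the check traces back to first principles.

G = 9.81  # gravitational acceleration, m/s^2


def min_stopping_distance_m(speed_mps: float, friction: float) -> float:
    """Worst-case braking distance: d = v^2 / (2 * mu * g)."""
    return speed_mps ** 2 / (2.0 * friction * G)


def safe_to_proceed(speed_mps: float, gap_m: float,
                    friction: float = 0.7, margin: float = 1.5) -> bool:
    """Conservative check: the gap to the obstacle must exceed the
    physics-derived stopping distance times a safety margin."""
    return gap_m >= margin * min_stopping_distance_m(speed_mps, friction)
```

Because the rule is derived rather than learned, the error between model and reality can be bounded explicitly, which is the property data-driven black boxes lack.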
But this raises some thorny questions: How safe is "safe enough," and how can autonomous vehicles communicate these risks to drivers?&nbsp;</p><h2>Computer, Take the Wheel&nbsp;</h2><p>Each year, more than 40,000 Americans die in car accidents, and <a href="https://one.nhtsa.gov/people/injury/research/udashortrpt/background.html" rel="nofollow">according to the National Highway Traffic Safety Administration (NHTSA)</a>, about 90% of U.S. auto deaths and serious crashes are attributable to driver error. The great promise of autonomous vehicles is to make auto deaths a relic of history by eliminating human errors with computers that never get tired or distracted.&nbsp;&nbsp;</p><p>The NHTSA designates six levels of "autonomy" for self-driving cars, which range from Level 0 (full driver control) to Level 5 (fully autonomous). For most of us, Level 5 is what we think of when we think of self-driving cars: a vehicle so autonomous that it might not even have a steering wheel and driver's seat because the computer handles everything. For now, this remains a distant dream, with many automakers pursuing Level 3 or 4 autonomy as stepping stones.&nbsp;</p><p>"Most modern cars are Level 2, with partial autonomous driving," said Chris Heckman, associate professor and director of the Autonomous Robotics and Perception Group in CU Boulder's computer science department. "Usually that means there's a human at the wheel, but they can relegate some functions to the car's software such as automatic braking or adaptive cruise control."</p><p>While these hybrid AI-human systems can improve safety by assisting a driver with braking, acceleration and collision avoidance, limitations remain. 
Several fatal accidents, for example, have resulted from drivers' overreliance on autopilot, which stems from issues of human psychology and AI understanding.</p><h2>Fostering Trust&nbsp;</h2><p>This problem is deeply familiar to Leanne Hirschfield, associate research professor at the Institute of Cognitive Science and the director of the System-Human Interaction with NIRS and EEG (<a href="https://www.shinelaboratory.com/" rel="nofollow">SHINE</a>) Lab at CU Boulder. Hirschfield's research focuses on using brain measurements to study the ways humans interact with autonomous systems, like self-driving cars and AI systems deployed in elementary school classrooms.&nbsp;</p><p>&nbsp;</p><div><div><blockquote><p><strong>"When an autonomous vehicle can show the driver information about how it's making decisions or its level of confidence in its decisions, the driver is better equipped to determine when they need to grab the wheel."</strong></p></blockquote><p>&nbsp;</p></div></div><p>Trust, Hirschfield said, is defined as a willingness to be vulnerable and take on risks, and for decades the dominant engineering paradigm for autonomous systems has been focused on ways to foster total trust in autonomous systems.&nbsp;</p><p>"We're realizing that's not always the best approach," Hirschfield said. "Now, we're looking at trust calibration, where users often trust the system but also have enough information to know when they shouldn't rely on it."</p><p>The key to trust calibration, she said, is transparency. When an autonomous vehicle can show the driver information about how it's making decisions or its level of confidence in its decisions, the driver is better equipped to determine when they need to grab the wheel.&nbsp;</p><p>Studying user responses is challenging in a laboratory setting, where it's difficult to expose drivers to real risks. So Hirschfield and researchers at the U.S. 
Air Force Academy have been using a Tesla modified with a variety of internal sensors to study user trust in autonomous vehicles.&nbsp;</p><p>"Part of what we're trying to do is measure someone's level of trust, their workload and emotional states while they're driving," Hirschfield said. "They'll have the car whipping around hills, which is how you need to study trust because it involves a sense of true risk compared to a study in a lab setting."&nbsp;</p><p>Although Hirschfield said that researchers have made a lot of progress in understanding how to design autonomous vehicles to foster driver trust, there is still a lot of work to be done.</p><h2>Human-Centered Design&nbsp;</h2><p>Sidney D'Mello, a professor at the Institute of Cognitive Science, studies how human-computer interactions shift the way we think and feel. For D'Mello, it's unclear whether the current crop of self-driving cars can shift from today's engineering-first approach to a new driver-focused paradigm.</p><p>"I think we need an entirely new methodology for the self-driving car context," D'Mello said. "If you really want something you can trust, then you need to design these systems with users starting from day one. But every single car company is kind of stuck in this engineering mindset from 50 years ago, where they build the tech and then they present it to the user."</p><p>The good news, D'Mello said, is that automakers are starting to take this challenge seriously. A collaboration between Toyota and the Institute of Cognitive Science focused on designing autonomous vehicles that foster trust in the user.</p><p>"The autonomous model typically implies the AI is in the center with the human hovering around it," said D'Mello. "But this needs to be a model with the human in the center."&nbsp;</p><p>Even when users learn to trust autonomous vehicles, living with driverless cars and reconceptualizing how they relate to them is complex. 
But there's a lot we can apply from research on prosthetics, said Cara Welker, assistant professor in biomechanics, robotics and systems design.</p><p>Much like autonomous vehicles analyze surroundings to make navigation and control decisions, robotic prostheses monitor a wearer's movements to understand appropriate behavior. And just as teaching users to trust prosthetics requires strong feedback loops and predictable prosthetic behavior, teaching drivers to trust autonomous vehicles means providing drivers with information about what the AI is doing – and it requires drivers to reconceptualize vehicles as extensions of themselves.&nbsp;</p><p>"There's a difference between users being able to predict the behavior of an assistive device versus having some kind of sensory feedback," Welker said. "And this difference has been shown to affect whether the people think of it as 'me and my prosthesis' instead of just 'me, which includes my prosthesis.' And that's incredibly important in terms of how users will trust that device."&nbsp;</p><p>How, then, will drivers evolve to experience cars as extensions of themselves?&nbsp;</p><h2>Next Exit</h2><p>In 2018, a <a href="https://www.nytimes.com/interactive/2018/03/20/us/self-driving-uber-pedestrian-killed.html" rel="nofollow">pedestrian was killed</a> by a self-driving Uber in Arizona, which marked the first fatality attributed to an autonomous vehicle. Although the driver pleaded guilty in the case, the question of who is responsible when autonomous vehicles kill is far from settled.&nbsp;</p><p>Today, there is limited regulation dictating autonomous vehicle safety and liability. 
One problem is that vehicles are regulated at the federal level while drivers are regulated at the state level – a division of responsibility that doesn't account for a future where the driver and vehicle are more closely aligned.&nbsp;</p><p>Researchers and automakers have voiced frustration with existing autonomous driving regulations, agreeing that updated regulations are necessary. Ideally, regulations would ensure driver, passenger and pedestrian safety without quashing innovation. But what these policies might look like is still unclear.&nbsp;</p><p>The challenge, said Heckman, is that the engineers don't have complete control over how autonomous systems behave in every circumstance. He believes it's critical for regulations to account for this without insisting on impossibly high safety standards.&nbsp;</p><p>"Many of us work in this field because automotive deaths seem avoidable and we want to build technologies that solve that problem," Heckman said. "But I think we hold these systems [to] too high of a standard – because yes, we want to have safe systems, but right now we have no safety frameworks, and automakers aren't comfortable building these systems because they may be held to an extremely high liability."&nbsp;</p><p>Other industries may offer a vision for how to regulate the autonomous driving industry while providing acceptable safety standards and enabling technological development, Heckman said. The aviation industry, for example, adopted rigorous engineering standards and fostered trust in engineers, pilots, passengers and policymakers.&nbsp;</p><p>"There's an engineering principle that trust is a perception of humans," Heckman said. "Trust is usually built through experience with a system, and that experience confers trust on the engineering paradigms that build safe systems.&nbsp;</p><p>"With airplanes, it took decades for us to come up with designs and engineering paradigms that we feel comfortable with. 
I think we'll see the same in autonomous vehicles, and regulation will follow once we've really defined what it means for them to be trustworthy."&nbsp;</p></div></div></div></div> </div> </div> </div> </div> <script> window.location.href = `/coloradan/2023/11/06/world-ready-self-driving-cars`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Mon, 27 Nov 2023 17:41:51 +0000 Anonymous 109 at /program/robotics Building next generation autonomous robots to serve humanity /program/robotics/2023/11/17/building-next-generation-autonomous-robots-serve-humanity <span>Building next generation autonomous robots to serve humanity </span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-11-17T16:19:59-07:00" title="Friday, November 17, 2023 - 16:19">Fri, 11/17/2023 - 16:19</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/Edgar_Mines_Lab_2023_094.JPG?h=c48d9d91&amp;itok=ekILKiys" width="1200" height="800" alt="A SPOT robot navigating autonomously."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/program/robotics/taxonomy/term/6" hreflang="en">Christoffer Heckman News</a> <a href="/program/robotics/taxonomy/term/66" hreflang="en">Eric Frew News</a> <a 
href="/program/robotics/taxonomy/term/12" hreflang="en">Sean Humbert News</a> </div> <a href="/program/robotics/jeff-zehnder">Jeff Zehnder</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-row-subrow row"> <div class="ucb-article-text col-lg d-flex align-items-center" itemprop="articleBody"> <div><p>One thousand feet underground, a four-legged creature scavenges through tunnels in pitch darkness. With vision that cuts through the blackness, it explores a spider web of paths, remembering its every step and navigating with precision. The sound of its movements echoes eerily off the walls, but it is not to be feared – this is no wild animal; it is an autonomous rescue robot.</p><p>It was initially designed to find survivors in collapsed mines, caves and damaged buildings, but that is only part of what it can do.</p><p>Created by a team of CU Boulder researchers and students, the robots placed third as the top U.S. entry and <a href="/today/2021/09/24/engineers-take-home-500000-international-underground-robotics-competition" rel="nofollow">earned $500,000 in prize money</a> at the Defense Advanced Research Projects Agency (DARPA) Subterranean Challenge competition in 2021.</p><h2>Going Further</h2><p>Two years later, they are pushing the technology even further, earning new research grants to expand the technology and create new applications in the rapidly growing world of autonomous systems.</p><p>"Ideally you don't want to put humans in harm's way in disaster situations like mines or buildings after earthquakes; the walls or ceilings could collapse and maybe some already have," said <a href="/mechanical/j-sean-humbert" rel="nofollow">Sean Humbert,</a> a professor of mechanical engineering and director of the <a href="/program/robotics/2023/09/20/cu-boulder-offers-new-graduate-program-robotics" rel="nofollow">Robotics Program at CU Boulder.</a> "These 
robots can be disposable while still providing situational awareness.”</p><p>The team developed an advanced system of sensors and algorithms that allows the robots to function on their own – once given an assignment, they decide autonomously how best to complete it.</p><h2>Advanced Communication</h2><p>A major goal is to get the robots from engineers directly into the hands of first responders. Success requires simplifying the way the robots transmit data into something approximating plain English, according to Kyle Harlow, a computer science PhD student.</p><p>“The robots communicate in pure math. We do a lot of work on top of that to interpret the data right now, but a firefighter doesn’t have that kind of time,” Harlow said.</p><p>To make that happen, Humbert is collaborating with <a href="/cs/christoffer-heckman" rel="nofollow">Chris Heckman,</a> an associate professor of computer science, to change both how the robots communicate and how they represent the world. The robots’ eyes – a LiDAR sensor – create highly detailed 3D maps of an environment, 15 cm at a time. That’s a problem when they try to relay information – the sheer amount of data clogs up the network.</p><p>“Humans don’t interpret the environment in 15 cm blocks,” Humbert said. “We’re now working on what’s called semantic mapping, which is a way to combine contextual and spatial information. This is closer to how the human brain represents the world and is much less memory intensive.”</p><h2>High Tech Mapping</h2><p>The team is also integrating new sensors to make the robots more effective in challenging environments. The robots excel in clear conditions but struggle with visual obstacles like dust, fog, and snow.
Harlow is leading an effort to incorporate millimeter wave radar to change that.</p><p>“We have all these sensors that work well in the lab and in clean environments, but we need to be able to go out in places such as Colorado where it snows sometimes,” Harlow said.</p><p>Where some researchers are forced to suspend work when a grant ends, members of the subterranean robotics team keep finding new partners to push the technology further.</p><h2>Autonomous Flight</h2><p><a href="/aerospace/eric-frew" rel="nofollow">Eric Frew,</a> a professor of aerospace at CU Boulder, is using the technology in a new National Institute of Standards and Technology competition to develop aerial robots – drones – rather than ground robots to autonomously map disaster areas indoors and out.</p><p>“Our entry is based directly on the Subterranean Challenge experience and the systems developed there,” Frew said.</p><p>Some teams in the competition will rely on drones navigated by human operators, but Frew said CU Boulder’s project is aiming for an autonomous solution that frees humans to focus on more critical tasks.</p><p>Although numerous universities and private businesses are advancing autonomous robotic systems, Humbert said other organizations often focus on individual aspects of the technology. The students and faculty at CU Boulder are working on every aspect of these systems, for use in environments that present extreme challenges.</p><p>“We’ve built world-class platforms that incorporate mapping, localization, planning, coordination – all the high-level stuff, the autonomy, that’s all us,” Humbert said. “There are only a handful of teams across the world that can do that.
It’s a huge advantage that CU Boulder has.”</p></div> </div> <div class="ucb-article-content-media ucb-article-content-media-right col-lg"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> </div> </div> </div> </div> <script> window.location.href = `/engineering/2023/11/17/building-next-generation-autonomous-robots-serve-humanity`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Fri, 17 Nov 2023 23:19:59 +0000 Anonymous 107 at /program/robotics Jayaram and team win IROS Best Paper Award on Safety, Security, and Rescue Robotics /program/robotics/2023/10/31/jayaram-and-team-win-iros-best-paper-award-safety-security-and-rescue-robotics <span>Jayaram and team win IROS Best Paper Award on Safety, Security, and Rescue Robotics</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-10-31T13:28:03-06:00" title="Tuesday, October 31, 2023 - 13:28">Tue, 10/31/2023 - 13:28</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/program/robotics/sites/default/files/styles/focal_image_wide/public/article-thumbnail/v2mclari_coin_vertical_crop.jpeg.jpg?h=05780fb3&amp;itok=zdRpeN4Y" width="1200" height="800" alt="A robot the size of a penny."> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/program/robotics/taxonomy/term/1"> Research News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a
href="/program/robotics/taxonomy/term/13" hreflang="en">Kaushik Jayaram News</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-row-subrow row"> <div class="ucb-article-text col-lg d-flex align-items-center" itemprop="articleBody"> <div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>Assistant Professor Kaushik Jayaram’s <a href="/lab/jayaram/" rel="nofollow">Animal Inspired Movement and Robotics Laboratory</a> recently won the <a href="https://ieee-iros.org/iros-2023-award-winners/" rel="nofollow">IROS Best Paper Award on Safety, Security, and Rescue Robotics</a>, rising above roughly 3,000 other academic papers submitted to the IEEE/RSJ International Conference on Intelligent Robots and Systems. Jayaram is the lab’s principal investigator; PhD student Heiko Kabutz was the paper’s lead researcher, and PhD students Alex Hedrick and Parker McDonnell were coauthors.</p><p>Their paper, <a href="https://arxiv.org/abs/2310.04538" rel="nofollow"><em>mCLARI: a shape-morphing insect-scale robot capable of omnidirectional terrain-adaptive locomotion in laterally confined spaces</em></a>, improves upon their <a href="https://onlinelibrary.wiley.com/doi/full/10.1002/aisy.202300181" rel="nofollow">previous miniature shape-morphing robot</a>, demonstrating the ability to passively change shape to squeeze through narrow gaps in multiple directions.
This is a new capability for legged robots, let alone insect-scale systems; it enables significantly enhanced maneuverability in cluttered environments and has the potential to aid first responders after major disasters.</p><p>Kabutz and Jayaram’s <a href="/lab/jayaram/research/mclari" rel="nofollow">latest version</a> is scaled down 60% in length and 38% in mass while maintaining 80% of the actuation power. The robot weighs less than a gram but can support over three times its body weight as an additional payload. It is also over three times as fast as its predecessor, reaching running speeds of 60 millimeters per second, or three of its body lengths per second.</p><p>Check out their video of mCLARI here: <a href="https://www.youtube.com/watch?v=KbMi6ezXf-Y" rel="nofollow">https://www.youtube.com/watch?v=KbMi6ezXf-Y</a>.</p><p>With this latest breakthrough, Jayaram and Kabutz can scale their design down (or up) without sacrificing design integrity, bringing such robots closer in size to real-world application needs.</p><p>“Since these robots can deform, you can still have slightly larger sizes,” Jayaram said. “If you have a slightly larger size, you can carry more weight, you can have more sensors, you'll have a longer lifetime and be more stable. But when you need to be, you can squish through and go through those specific gaps.”</p><p>Kabutz, who leads the design of mCLARI, has surgeon-like hands that allow him to build and fold the robot’s tiny legs. He grew up fascinated by robots and competed in robotics competitions in high school.</p><p>“Initially, I was interested in building bigger robots,” said Kabutz, “but when I came to Jayaram’s lab, he really got me interested in building bioinspired robots at the insect scale.”</p><p>Jayaram’s research team studies concepts from biology and applies them to the design of real-world engineered systems.
In his lab, you can find robots modeled after the body morphologies of various arthropods, including cockroaches and spiders.&nbsp;</p><p>“We are fundamentally interested in understanding why animals are the way they are and move the way they do,” said Jayaram, “and how we can build bioinspired robots that can address social needs, like search and rescue, environmental monitoring, or even use them during surgery.”</p></div></div></div></div> </div> <div class="ucb-article-content-media ucb-article-content-media-right col-lg"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> </div> </div> </div> </div> <script> window.location.href = `/mechanical/2023/10/31/jayaram-and-team-win-iros-best-paper-award-safety-security-and-rescue-robotics`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 31 Oct 2023 19:28:03 +0000 Anonymous 105 at /program/robotics