Research

Liquid Robotics' Wave Glider USVs selected to conduct scientific research in Arctic and Southern Oceans

Liquid Robotics’ Wave Glider USV has been selected as the sensor platform to conduct advanced scientific research in the “most inhospitable and remote regions” of the Arctic and Southern Oceans by top oceanographers from Scripps Institution of Oceanography (Scripps) and the Applied Physics Laboratory of the University of Washington (APL-UW). The oceanographers leading these missions are Dr. Eric Terrill and Dr. Sophia Merrifield of the Coastal Observing Research and Development Center (CORDC) at Scripps; Dr. Ken Melville and his team at the Air-Sea Interaction Laboratory at Scripps; and Dr. Jim Thomson and his team in the Stratified Ocean Dynamics of the Arctic (SODA) program at APL-UW. The oceanographers will use the Wave Glider USVs, which are proven in extreme ocean conditions, to collect real-time data and rare insights into the “dynamic conditions that drive the world’s weather and climate.” This data is important to scientists in their quest to understand and improve global ocean weather modeling and climate prediction. “The reliability of the platform, modular payloads, and proven navigation capabilities led to our decision to select the Wave Glider for our upcoming science program,” says Dr. Terrill. “Tackling at-sea science questions has plenty of challenges and we needed a platform we could trust and adapt. The modularity allows us to deploy our own sensors and adapt autonomy algorithms so that the vehicle will optimally sample the ocean.” Each team will equip the Wave Gliders with “sophisticated oceanographic and atmospheric sensors” to measure extreme wave states, winds, temperature, and salinity in the upper layers of the ocean. These regions have historically been undersampled because of the dangers and risks associated with operating in these turbulent oceans. The USVs will give oceanographers the opportunity to observe real-time weather and climate conditions safely from shore.

DriveOhio and Ohio UAS Center partner to support UAS research and development

DriveOhio and the Ohio UAS Center have announced a new strategic plan to support UAS research and development. Through the strategic plan, three main initiatives—FlyOhio, Ohio UAS Center Operations, and workforce development—will be addressed. In an effort to facilitate the widescale use of UAS for delivery and transport, FlyOhio will seek to develop an unmanned traffic management (UTM) system; Ohio UAS Center Operations will facilitate the use of UAS for a variety of business services; and workforce development will educate workers and build the skills needed for smart mobility jobs around this technology. “UAS technology is advancing just as quickly as autonomous and connected vehicle technology, and Ohio understands how both can work together across multiple smart mobility initiatives,” says Jim Barna, executive director of DriveOhio. “Companies operating new UAS technologies need opportunities to test and deploy them, and the nation needs a traffic management system that can make drone package delivery and transportation safe and commercially viable. We aim to do all of this in Ohio.” Along with the Air Force Research Laboratory (AFRL), FlyOhio is researching a $5 million ground-based detect-and-avoid radar system called SkyVision at the Springfield-Beckley Municipal Airport in Clark County, Ohio. FlyOhio will also explore a second UTM platform along the 33 Smart Mobility Corridor, a 35-mile stretch of highway between Dublin and East Liberty, Ohio. The Smart Mobility Corridor is already a testing ground for autonomous and connected vehicles and communications. Lastly, FlyOhio will identify locations that can accommodate aircraft that take off and land vertically. Known as vertiports, these locations are becoming increasingly important with the advancement of unmanned package delivery, the entities note. As for Ohio UAS Center Operations, the state of Ohio is already using UAS to support and improve state and local government operations. This technology is also being used for a variety of other services such as project and environmental surveying, infrastructure inspection, and police and firefighting support. Through this new strategic plan, statewide data collection operations will include bridge inspections, aerial photography and mapping, and emergency management, to name a few. The entities say that by expanding these operations and use case development services, the UAS Center will “continue growing Ohio’s UAS ecosystem and accelerate technology adoption in the state.” Finally, regarding UAS workforce development, DriveOhio and the UAS Center are actively working with smart mobility workforce development stakeholders from government, industry, education, and local communities across Ohio to identify pilot programs involving UAS technologies. These programs seek to deliver benefits for today’s workforce, tomorrow’s workforce, and the emerging workforce. “As workforce development is a core part of DriveOhio’s mission to support and advance Ohio’s smart mobility future, initiatives like these go hand-in-hand with new projects, helping to prepare workers for jobs in this new UAS sector,” the entities say. For Fred Judson, managing director of the UAS Center, this strategic plan provides a path for Ohio to continue playing a role in advancing the country’s smart mobility operations to the sky.
“We’re excited to continue working with companies, government entities and local communities to develop unmanned traffic systems, promote UAS technologies and use cases and develop the workforce here in Ohio to fill the jobs these new technologies will present,” Judson says.

Virginia Tech engineers hope to redefine search and rescue protocols by teaming up human searchers with UAS

With the help of a $1.5 million grant from the National Science Foundation, a group of Virginia Tech engineers will pair human searchers with UAS, in hopes of redefining search and rescue protocols. Utilizing autonomous algorithms and machine learning, the UAS will complement search and rescue efforts from the air. Additionally, they will suggest tasks and send updated information to human searchers on the ground. The researchers hope to make searches more effective by using mathematical models that combine historical data on what lost people actually do with typical searcher behavior, balancing autonomy with human collaboration. Having received support from the Virginia Department of Emergency Management, the researchers will also work closely with the local Black Diamond Search and Rescue Council throughout the project. “Human searchers are excellent at what they do. Drones are unlikely to replace people on the ground because searchers are too good at their jobs,” explains the leader of these efforts, Ryan Williams, an assistant professor in the Bradley Department of Electrical and Computer Engineering within the College of Engineering. Williams’ family has also been involved with the Black Diamond Search and Rescue Council since its founding more than three decades ago. “However, what drones can do is address these niche problems of the search process by providing large-scale data that can help a search team make better decisions.” UAS could prove beneficial, for instance, in exploring treacherous terrain that is difficult for human searchers to reach. They could also collect information about areas of the search environment that are relatively unknown, which would save time. To build the mathematical model that will help the UAS decide where to go and how to search, Nicole Abaid, an assistant professor in the Department of Biomedical Engineering and Mechanics, is using historical data gathered from more than 50,000 documented lost person scenarios. The data comes from the work of Robert Koester, a search and rescue expert who will consult on the project. “From the historical search data, we know that certain types of people tend to do certain things when they’re lost,” Abaid says. “For example, people with cell phones tend to move up in elevation as they try to get service, while an elderly person might not travel very far.” Abaid plans to build more than 30 lost person profiles into the model that incorporate various pieces of information, including age, mental status, and activity (e.g., hiking, horseback riding, or hunting). Abaid will also use topographical data from ArcGIS maps that can provide insight into how people usually move through terrain. “The overall goal is to create an autonomous, scalable system that can make search and rescue processes here and elsewhere more effective simply by intelligently incorporating existing technology,” Abaid says. For Williams, making the UAS as unobtrusive as possible to searchers on the ground is of great importance. “Most of the data a drone will pick up from sensors, whether it be from cameras, thermal imaging, or lidar surveys, will be really uninteresting,” Williams says. “We don’t want to interrupt human searchers to show them video feed of a piece of trash on the ground, for instance.
That wastes precious time.” The team also includes Nathan Lau, an assistant professor in the Grado Department of Industrial and Systems Engineering, and James McClure, a computational scientist in Advanced Research Computing. Specializing in human-computer interaction, Lau will seek to address the issue of when and how the UAS will interrupt searchers to provide new information, while McClure will design a wearable, backpack-based server to provide on-the-ground processing and communications support.
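The article does not describe the Virginia Tech model in detail, but a rough sketch can illustrate the general idea of combining a lost-person profile with terrain data to produce a search-priority map that a UAS planner could sample. The function, parameter names, and behavioral numbers below are illustrative assumptions, not the team's implementation:

```python
import numpy as np

def search_priority_grid(elevation, last_seen, profile):
    """Illustrative sketch: combine a distance-traveled statistic for a
    lost-person profile with an elevation preference to rank terrain cells.
    `elevation` is a 2D array of terrain heights, `last_seen` is a (row, col)
    cell index, and `profile` holds assumed behavioral parameters."""
    rows, cols = np.indices(elevation.shape)
    dist = np.hypot(rows - last_seen[0], cols - last_seen[1])

    # Likelihood of travel distance, peaked around the profile's typical range
    # (a stand-in for the historical lost-person statistics).
    travel = np.exp(-0.5 * ((dist - profile["typical_range"]) / profile["range_spread"]) ** 2)

    # Elevation preference, e.g. cell-phone carriers tend to move uphill.
    uphill = elevation - elevation[last_seen]
    terrain = np.exp(profile["uphill_bias"] * np.clip(uphill, -50, 50) / 50.0)

    grid = travel * terrain
    return grid / grid.sum()   # normalized priority map for the UAS planner

# Example usage with a toy 100x100 terrain and a hypothetical "hiker with phone" profile
elev = np.cumsum(np.random.default_rng(0).normal(size=(100, 100)), axis=0)
profile = {"typical_range": 20.0, "range_spread": 10.0, "uphill_bias": 1.0}
priority = search_priority_grid(elev, last_seen=(50, 50), profile=profile)
print("highest-priority cell:", np.unravel_index(np.argmax(priority), priority.shape))
```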

Oklahoma State University drone program awarded $1.49 million grant

As part of its National Robotics Initiative 2.0 program, the National Science Foundation (NSF) has awarded Oklahoma State University’s (OSU) drone program a $1.49 million grant. Awarded to OSU researchers He Bai, Rushikesh Kamalapurkar, Jamey Jacob and Matt Vance and former faculty member Balaji Jayaraman, the grant, which is the only grant of its kind awarded in Oklahoma, will fund research that uses UAS technology to estimate wind conditions in differing environments. The work is part of OSU’s Tier 1 Research Initiatives involving UAS in public health and safety. According to the researchers, the idea is that a UAS on a set flight path can report wind conditions using the magnitude and direction of its deviation from that preset flight path. The project will also look into how the safety and efficiency of UAS operations can be improved based on wind estimates. Having worked on research projects for the NSF, NASA and the Oklahoma Established Program to Stimulate Competitive Research (EPSCoR), the research team is keenly aware of the need for better methods to model and predict wind and estimate UAS trajectories in different wind environments. A number of companies have experimented with delivering packages via UAS, so these methods are considered extremely important to the continued progression of UAS traffic in urban environments. While entities such as NASA are working on an unmanned traffic management (UTM) system to provide virtual drone highways, weather data from those environments must be collected and interpreted efficiently and relayed to pilots and traffic management in real time, so that multiple vehicles can safely occupy the small spaces of urban environments. The system could also be beneficial for urban air mobility and air taxi applications. The team plans to focus on information that will be useful to UAS pilots, and on how to use the vehicle, in collaboration with other nearby vehicles, to analyze and predict the wind phenomena it is experiencing and communicate that information back to the pilot for adjustments. The team will use several resources, such as the new Excelsior laboratory and indoor drone aviaries, as well as simulation facilities and outdoor flight facilities, all of which are available through OSU and the Unmanned Systems Research Institute. The OSU team is hopeful that this four-year project will serve as a foundation to better understand the wind and weather phenomena that UAS encounter in urban environments.
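The grant announcement stops at that high-level description, but a minimal sketch of the underlying idea, inferring the wind vector from the difference between the planned (still-air) trajectory and the trajectory actually flown, might look like the following. The function names, sample rate, and numbers are assumptions for illustration, not the OSU team's method:

```python
import numpy as np

def estimate_wind(planned_positions, actual_positions, dt):
    """Toy wind estimate: the aircraft's observed ground velocity minus its
    planned still-air velocity approximates the wind pushing it off course.
    Positions are Nx2 arrays of (east, north) coordinates in meters."""
    planned_vel = np.diff(planned_positions, axis=0) / dt
    actual_vel = np.diff(actual_positions, axis=0) / dt
    wind = (actual_vel - planned_vel).mean(axis=0)       # average wind vector
    speed = np.linalg.norm(wind)
    bearing_deg = np.degrees(np.arctan2(wind[0], wind[1])) % 360  # direction wind blows toward
    return speed, bearing_deg

# Example: commanded to fly due north at 15 m/s, but drifting east in a 3 m/s crosswind
t = np.arange(0, 10, 0.5)
planned = np.column_stack([np.zeros_like(t), 15.0 * t])
actual = np.column_stack([3.0 * t, 15.0 * t])
print(estimate_wind(planned, actual, dt=0.5))   # roughly (3.0, 90.0)
```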

UC Riverside engineers awarded grant to develop new generation of GPUs for autonomous systems

The National Science Foundation has awarded three University of California, Riverside (UC Riverside) engineers a $1.2 million grant to develop a new generation of energy-efficient, energy-elastic, and real-time-aware graphics processing units (GPUs) that can be used in resource-constrained environments such as emerging embedded and autonomous systems, including UAS and autonomous vehicles. Daniel Wong, Hyoseung Kim, and Nael Abu-Ghazaleh are the recipients of the grant. All three are Marlan and Rosemary Bourns College of Engineering faculty members and professors of electrical and computer engineering. GPUs commonly provide the computational power needed to enable autonomous systems. They are widely used in supercomputing and cloud computing to significantly speed up applications such as image processing, deep learning, and other computationally intensive workloads. That added speed comes at a cost, though, as GPUs used in parallel computing consume large amounts of energy, which limits their use in self-contained, often battery-operated environments such as vehicles and drones. Effective GPUs for autonomous systems need to be energy efficient and capable of executing workloads in real time. For example, for an autonomous vehicle to safely navigate the road, it has to process a variety of sensor information, such as camera and lidar data, and make a decision within milliseconds to prevent the vehicle from crashing. But modern embedded GPUs have several limitations when used in autonomous system settings. GPUs tend to be energy inefficient, which leads to insufficient computational power and limited autonomous system capability. To successfully perform real-time operations and meet the workload deadlines required for correct and safe operation, GPU hardware and software need to be timing aware, so the UC Riverside project will address these issues with solutions that span both software and hardware in order to enable real-time embedded GPUs for autonomous systems. “Current GPUs consume almost the same amount of power when actively processing a workload and when idle, wasting energy,” Wong says. The UC Riverside team will design “energy-elastic” hardware, which lets the GPU consume power based on the amount of work it has to do. “If it is doing little work, it will consume less power; if it needs to do more work, it will consume more power,” Wong explains. GPU hardware contains many schedulers, which are unaware of the timing requirements of workloads. As a result, if multiple workloads are running on the GPU, some may miss their deadlines due to competition for hardware resources. The UCR group will create timing-aware hardware and software, which allows the various hardware schedulers to prioritize workloads to make sure that deadlines are met. The researchers will design real-time scheduling software that coordinates with hardware schedulers. Having hardware that keeps the software updated on a workload’s progress will allow the software to make better scheduling decisions and improve real-time operations. “Today’s software and hardware do not coordinate to enforce workload deadlines,” Kim says. “This project would enable multiple workloads to run safely together and increase the capabilities provided by embedded GPUs in autonomous systems.”
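The announcement does not detail the group's scheduling algorithms, but the notion of timing-aware scheduling can be illustrated with a simple earliest-deadline-first policy applied to a single queue of GPU kernels. The kernel names, runtimes, and deadlines below are hypothetical, and real GPU schedulers operate in hardware across many queues; this is only a conceptual sketch:

```python
import heapq

def edf_schedule(kernels, start_time=0.0):
    """Earliest-deadline-first on one queue (illustrative only): each kernel is
    (name, runtime_ms, deadline_ms). Returns the launch order and flags any
    kernel that would finish after its deadline."""
    queue = [(deadline, name, runtime) for name, runtime, deadline in kernels]
    heapq.heapify(queue)            # pop the most urgent deadline first

    t, order = start_time, []
    while queue:
        deadline, name, runtime = heapq.heappop(queue)
        t += runtime
        order.append((name, t, "OK" if t <= deadline else "MISSED"))
    return order

# Toy autonomous-driving workload mix (names and numbers are made up)
kernels = [("lidar_cluster", 4.0, 20.0), ("camera_cnn", 12.0, 16.0), ("planner", 3.0, 33.0)]
for name, finish, status in edf_schedule(kernels):
    print(f"{name:14s} finishes at {finish:5.1f} ms  {status}")
```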

Teams of "swarm systems integrators" to develop UAS swarm infrastructure for US military

In an effort to help the U.S. military in urban combat, two teams of “swarm systems integrators” will look to develop a UAS swarm infrastructure, using funding from a multi-million-dollar contract that is part of the Defense Advanced Research Projects Agency’s (DARPA) Offensive Swarm-Enabled Tactics (OFFSET) program. The goal of the program is “to empower … troops with technology to control scores of unmanned air and ground vehicles at a time.” The two teams will be responsible for developing the system infrastructure and integrating the work of the “sprint” teams, which will focus on swarm tactics, swarm autonomy, human-swarm teaming, physical experimentation and virtual environments. The teams will be led by Northrop Grumman and Raytheon BBN, a key research and development arm of the Raytheon Company. Julie A. Adams of Oregon State University’s College of Engineering is a member of the team led by Raytheon BBN, and she is the only university-based principal investigator on either team. “I specifically will work on swarm interaction grammar – how we take things like flanking or establishing a perimeter and create a system of translations that will allow someone to use those tactics,” Adams explains. “We want to be able to identify algorithms to go with the tactics and tie those things together, and also identify how operators interact with the use of a particular tactic.” Adams continues, “Our focus is on the individuals who will be deployed with the swarms, and our intent is to develop enhanced interactive capabilities: speech, gestures, a head tilt, tactile interaction. If a person is receiving information from a swarm, he might have a belt that vibrates. We want to make the interaction immersive and more understandable for humans and enable them to interact with the swarm.” Adams, who is the associate director for deployed systems and policy at the college’s Collaborative Robotics and Intelligent Systems Institute, points out that last summer, China launched a record swarm of 119 fixed-wing UAS, but she also notes that researchers currently “don’t have the infrastructure available for testing the capabilities of large swarms.” “Advances have been made with indoor systems, including accurately tracking individual swarm members and by using simulations,” Adams says. “Those are good first steps but they don’t match what will happen in the real world. Those approaches allow for testing and validation of some system aspects but they don’t allow for full system validation.” The goal of the integrators is to make sure that operators can interact with the swarm as a whole, or subgroups of the swarm, as opposed to individual agents. Adams says that what the agents do as a whole is “much more interesting” than what they do individually, comparing a UAS swarm to a school of fish acting in unison in response to a predator. “We’ve got these ‘primitives’” – basic actions a swarm can execute – “and we’ll map these primitives to algorithms for the individual agents in the swarm, and determine how humans can interact with the swarm based on all of these things,” Adams says. “We want to advance and accelerate enabling swarm technologies that focus on swarm autonomy and how humans can interact and team with the swarm.” Researchers envision swarms of more than 250 autonomous vehicles that can work together to gather information and help troops in “concrete canyon” surroundings, where buildings impair line-of-sight and satellite-based communications. Those vehicles would include multi-rotor UAS and ground rovers.
The information collected by the swarms could help keep U.S. troops safer, as well as civilians in the battle areas.
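OFFSET's actual tactic-to-algorithm mappings are not described in the article, but the flavor of a swarm "primitive" can be sketched with a toy example: translating the single tactic of establishing a perimeter into individual waypoints for every agent. The function and geometry below are illustrative assumptions, not the program's implementation:

```python
import math

def perimeter_primitive(center, radius, num_agents):
    """Map the high-level tactic 'establish a perimeter' to per-agent goals:
    space the agents evenly around a circle about `center`."""
    cx, cy = center
    goals = []
    for i in range(num_agents):
        angle = 2 * math.pi * i / num_agents
        goals.append((cx + radius * math.cos(angle), cy + radius * math.sin(angle)))
    return goals

# An operator issues one tactic; every agent receives its own waypoint.
for agent_id, goal in enumerate(perimeter_primitive(center=(0, 0), radius=50, num_agents=8)):
    print(f"agent {agent_id}: fly to ({goal[0]:.1f}, {goal[1]:.1f})")
```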

University of Colorado Boulder and others collect data on monster storm using UAS

A team of aerospace engineers from the University of Colorado Boulder (CU) recently spent the first half of June traveling across the Midwest in search of monster storms. During this period, the CU team, which was made up of 16 CU employees and students, encountered a storm on June 8 outside Norris, South Dakota, and used one of its three “TTwistor” UAS to fly through the dark skies to collect data from the storm. “It is amazing to me how you're driving along for several hours and it's nice, clear, sunny skies, and all of sudden you're under these clouds and it gets dark pretty fast,” says Eric Frew, CU associate professor, via the Daily Camera. Frew, who is also the associate director of technology at CU's Integrated Remote and In Situ Sensing (IRISS) program, drove one of the vehicles in a three-vehicle convoy straight toward the supercell thunderstorm outside Norris, and the TTwistor UAS flew above the convoy. Equipped with the specialized UAS that they have spent years building and honing, the CU team met meteorologists from the University of Nebraska-Lincoln and Texas Tech University, who plan to use the UAS data to improve understanding and forecasting of tornadoes. “It's still very much an open question of: Why does this type of storm become a tornado?” Frew says. “Most of the strong, violent tornadoes are created from supercell thunderstorms, but very few supercell thunderstorms create tornadoes. You know what storms to go study, but you still don't know why they do or don't produce tornadoes.” The three yellow TTwistor UAS that the CU team brought for these flights were built to collect the data requested by the meteorologists, including the pressure, temperature and humidity of the air and the speed and direction of the wind. The UAS, which look like small airplanes and have propellers on each wing, have sensors embedded in the nose and a video camera embedded in the tail. The UAS are built to take off from the roof of Ford Explorers equipped with special racks, and are programmed to follow the vehicles into storms. The UAS are FAA-compliant, and the FAA requires a sightline on the UAS at all times. The storms are capable of moving faster than 30 miles per hour, so the only way to keep the UAS in sight is by driving with them. According to Frew, UAS are the only way to safely collect the data that the meteorologists desire. “We're in these environments because nobody in their right mind would pilot an aircraft in these environments,” Frew says. This was the first campaign in which a UAS was flown head-on into a storm. The CU team drove each day—sometimes for as many as six hours—in areas between North Dakota and southern Oklahoma, and in the afternoon, they searched for brewing storms. The June 8 storm in South Dakota was the most successful “good” weather mission of their two-week trip. On June 8, the CU team launched its UAS from the Ford Explorer. A small contingent drove in that vehicle, while the rest of the team stayed behind with a meteorologist and waited for the mission to end so they could help pack up the UAS and avoid the worst of the storm. The Ford Explorer followed two other cars: one carried meteorologists who tracked the conditions, and the other acted as a scout car, driving about a mile ahead to keep an eye out for conditions that could damage the UAS. “We just drove straight to the storm, and the storm came right to us,” Frew says.
Some of the footage picked up by the UAS' tail camera shows massive, dark clouds; a glowing blue patch where hail formed; and flashes of lightning. The group traveling toward the storm turned around when the scout car encountered hail, as the UAS cannot fly through sustained hail. Now, the CU team will compile the data collected by the UAS and share it with the meteorologists for further study. This initiative was the culmination of a three-year National Science Foundation grant, but Frew notes that the team's work and partnerships with the meteorologists are ongoing, and he anticipates continuing to learn and chase storms with them. “These are measurements that the meteorology world wants, and you cannot get without being in the storm,” Frew says. “That's the key.”

The University of Illinois Urbana-Champaign's new Center for Autonomy will focus on autonomous tech

According to the Associated Press, the University of Illinois Urbana-Champaign is launching a new center called the Center for Autonomy, which will focus on autonomous technology such as self-driving cars and robotic assistants. The university has allocated $2.1 million for the center, while the College of Engineering is providing an additional $2.1 million to recruit new robotics faculty. Professor Geir Dullerud will serve as the director of the center. Dullerud says that the center will play an integral role in designing systems that work without human intervention in a way that is both reliable and safe. With this in mind, Dullerud points out that “there’s a difference between a self-driving car that works most of the time and a self-driving car that works all of the time.” The center will also offer more experimental space for autonomy and robotics research.

Black Swift Technologies' S2 UAS to be deployed in Greenland for atmospheric research studies

Black Swift Technologies (BST) has announced that its Black Swift S2 UAS will be used to conduct high-altitude, high-latitude atmospheric research studies in Greenland as part of the international East Greenland Ice-Core Project (EastGRIP). With support from the National Science Foundation (NSF), the work will be conducted by the Institute of Arctic and Alpine Research (INSTAAR), which is part of the University of Colorado Boulder. The Black Swift S2 will operate at temperatures of -20 degrees Celsius and colder, and will fly at altitudes up to 14,000 feet—right on the edge of commercial airspace, Black Swift notes—to make routine atmospheric measurements. To get a better understanding of how climate conditions are affecting Greenland’s mass as a result of sublimation, or evaporation, directly into the atmosphere, Black Swift’s S2 will perform transects or vertical profiles of the arctic atmosphere to analyze the water vapor above the ice sheet. As Black Swift explains, the isotopes of the snow and ice represent a “fingerprint of the temperature” when that water condensed out of a cloud, providing researchers with a fairly reliable historical temperature record. This can be scrutinized along with many other variables measured in the ice core, including dust, volcanic debris, chemical makeup, and trapped atmospheric gases. Researchers hope that by analyzing past climate conditions, they can obtain new knowledge on the timing and response of the ice sheet to changes in a variety of climate drivers, like sea ice extent and changes in ocean circulation. “Ice sheets are melting and glaciers are retreating. Greenland is no exception,” says Bruce Vaughn, INSTAAR’s lead research fellow on the project at CU Boulder, who has been conducting isotope research in arctic environments for nearly 30 years. “Measuring the amount of water vapor above the ice sheet and its isotopes can tell us about its origins, whether it’s coming from the ice sheet or the atmosphere. We have been, during the course of our fieldwork over the last couple of years, taking measurements using a small three- or four-meter tall tower. The data we have captured lets us look at the gradients, in terms of both water vapor concentration and its isotopic signature above the ice sheet. That can tell us a whole lot. But what it doesn’t tell us much about is what’s happening above the boundary layer in the atmosphere and the upper troposphere and how air masses in the stratosphere might be mixing in. This is where the S2 fits into our strategy.” When scientists take measurements in the ice core, they are essentially capturing measurements of precipitation that represent past climate. Black Swift explains that “as the ice sheet responds to the atmosphere between precipitation events, it will exchange isotopically with the atmosphere—meaning water molecules from the ice sheet will exchange with water molecules in the atmosphere.” “It’s really a two-way conversation between the atmosphere and the ice sheet,” the company says. Black Swift goes on to explain that “scientists have learned that between precipitation events, the micro-layer on the surface of the ice will slowly start to exchange and isotopically mirror what’s in the water vapor above it.” Therefore, understanding that relationship provides scientists with more insight into what the isotopes in the ice core are actually telling them.
These observations close to the surface can be made relatively easily, but they are labor intensive. The air mass above the ice sheet (into and above the boundary layer) is still a bit of a mystery to scientists, which is another reason the S2 is being deployed. “The S2 has become an integral part of what we’re trying to accomplish in Greenland,” Vaughn says.
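The article refers to isotopic "signatures" without defining them; in practice such measurements are commonly reported in delta notation, the ratio of heavy to light isotopes in a sample relative to a reference standard such as Vienna Standard Mean Ocean Water (VSMOW). A minimal sketch of that calculation follows; the sample value is made up for illustration:

```python
def delta_per_mil(r_sample, r_standard):
    """Standard delta notation for stable isotopes: the sample's heavy-to-light
    isotope ratio (e.g. 18O/16O) relative to a reference standard, expressed in
    parts per thousand (per mil)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Example: a depleted polar snow sample against VSMOW (18O/16O ratio ~ 2.0052e-3)
print(delta_per_mil(r_sample=1.9500e-3, r_standard=2.0052e-3))   # about -27.5 per mil
```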

MIT researchers enable soft robotic arm to understand its configuration in 3D space using "sensorized" skin

MIT has announced that for the first time, its researchers have leveraged just motion and position data from the “sensorized” skin of a soft robotic arm to enable it to understand its configuration in 3D space. MIT notes that soft robots made from highly compliant materials—similar to those found in living organisms—are being championed as safer and more adaptable, resilient, and bioinspired alternatives to traditional rigid robots. Giving these deformable robots autonomous control is considered a “monumental task,” though, because at any given moment they can move in a virtually infinite number of directions, which makes it hard to train planning and control models that drive automation. Large systems of multiple motion-capture cameras are traditionally used to achieve autonomous control, MIT says. These cameras provide the robots with feedback about 3D movement and positions, but such large systems are considered impractical for soft robots in real-world applications. In a paper being published in the journal IEEE Robotics and Automation Letters, MIT researchers describe a system of soft sensors that cover a robot’s body to provide “proprioception,” meaning awareness of the motion and position of its body. That feedback runs through a novel deep-learning model that sifts through the noise and captures clear signals to estimate the robot’s 3D configuration. The system was validated on a soft robotic arm resembling an elephant trunk that can predict its own position as it autonomously swings around and extends. Off-the-shelf materials can be used to fabricate the sensors, so any lab can develop its own system, according to Ryan Truby, a postdoc in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and co-first author on the paper along with CSAIL postdoc Cosimo Della Santina. “We’re sensorizing soft robots to get feedback for control from sensors, not vision systems, using a very easy, rapid method for fabrication,” Truby explains. “We want to use these soft robotic trunks, for instance, to orient and control themselves automatically, to pick things up and interact with the world. This is a first step toward that type of more sophisticated automated control.” A future goal is to help make artificial limbs that can more dexterously handle and manipulate objects in the environment. “Think of your own body: You can close your eyes and reconstruct the world based on feedback from your skin. We want to design those same capabilities for soft robots,” explains co-author Daniela Rus, director of CSAIL and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science. Fully integrated body sensors have been a long-term goal in soft robotics, MIT notes, as traditional rigid sensors detract from a soft robot body’s natural compliance, complicate its design and fabrication, and can cause various mechanical failures. Soft-material-based sensors are considered a more suitable alternative, but they require specialized materials and methods for their design, which makes them difficult for many robotics labs to fabricate and integrate into soft robots. While working in his CSAIL lab one day and looking for inspiration for sensor materials, Truby made an interesting connection. “I found these sheets of conductive materials used for electromagnetic interference shielding, that you can buy anywhere in rolls,” Truby says. These materials have “piezoresistive” properties, meaning they change in electrical resistance when they are strained.
Truby realized that if placed on certain spots on the trunk, they could make effective soft sensors. MIT explains that as a sensor deforms in response to the trunk’s stretching and compressing, its electrical resistance is converted to a specific output voltage, which is then used as a signal correlating to that movement. The material didn’t stretch much, though, which would limit its use for soft robotics. Truby was inspired by kirigami, a variation of origami that involves making cuts in a material, which led him to design and laser-cut rectangular strips of conductive silicone sheets into various patterns, such as rows of tiny holes or crisscrossing slices like a chain link fence. Truby says this made them far more flexible, stretchable, “and beautiful to look at.” The researchers’ robotic trunk is made up of three segments, each with four fluidic actuators (12 total) used to move the arm. They fused one sensor over each segment, with each sensor covering and gathering data from one embedded actuator in the soft robot. They used a technique called “plasma bonding,” which energizes the surface of a material to make it bond to another material. It takes a few hours to shape dozens of sensors that can be bonded to the soft robots using a handheld plasma-bonding device. The sensors captured the trunk’s general movement, as hypothesized, but they were “really noisy,” MIT notes. “Essentially, they’re nonideal sensors in many ways,” Truby says. “But that’s just a common fact of making sensors from soft conductive materials. Higher-performing and more reliable sensors require specialized tools that most robotics labs do not have.” To sift through the noise and capture meaningful feedback signals, the researchers built a deep neural network that does most of the heavy lifting, estimating the soft robot’s configuration using only the sensors. The researchers also developed a new model to kinematically describe the soft robot’s shape that significantly lowers the number of variables needed for their model to process. During experiments, the researchers had the trunk swing around and extend itself in random configurations for approximately 90 minutes, while a traditional motion-capture system recorded ground truth data. During training, the model analyzed data from its sensors to predict a configuration and compared its predictions to the ground truth data being collected at the same time. As a result, the model “learns” to map signal patterns from its sensors to real-world configurations. Results showed that for certain, steadier configurations, the robot’s estimated shape matched the ground truth. The next goal for the researchers is to explore new sensor designs for improved sensitivity and to develop new models and deep-learning methods to reduce the required training for every new soft robot. The researchers also hope to refine the system to better capture the robot’s full dynamic motions; the neural network and sensor skin are not yet sensitive enough to capture subtle or highly dynamic movements. Truby notes, though, that this is still considered an important first step for learning-based approaches to soft robotic control. “Like our soft robots, living systems don’t have to be totally precise,” Truby says. “Humans are not precise machines, compared to our rigid robotic counterparts, and we do just fine.”
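The paper's architecture and kinematic model are not reproduced in this article, so the snippet below only sketches the general recipe it describes: regressing from noisy sensor readings to a low-dimensional configuration using motion-capture ground truth. The input and output dimensions, network size, and synthetic data are assumptions for illustration, not the MIT implementation:

```python
import torch
from torch import nn

# Synthetic stand-ins: 12 noisy "sensor voltages" mapped to 6 configuration variables
# (dimensions are assumed; the real system learns from recorded sensor and mocap data).
torch.manual_seed(0)
true_map = torch.randn(12, 6)
sensors = torch.rand(2000, 12)                               # simulated sensor readings
config = sensors @ true_map + 0.05 * torch.randn(2000, 6)    # simulated mocap ground truth

model = nn.Sequential(nn.Linear(12, 64), nn.ReLU(),
                      nn.Linear(64, 64), nn.ReLU(),
                      nn.Linear(64, 6))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(sensors), config)   # compare predictions to ground truth
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.4f}")
```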
