
Autonomy Solution


From the search for life in our solar system
to terrestrial exploration of extreme environments

Is (or was) there life beyond Earth? The answer to this question lies underground on planetary bodies in our solar system. Planetary subsurface voids are among the most likely places to find signs of life (both extinct and extant). Subsurface voids are also one of the main candidates for future human colonization beyond Earth. To this end, TEAM CoSTAR is participating in the DARPA Subterranean Challenge to develop fully autonomous systems for exploring subsurface voids, with a dual focus on planetary exploration and terrestrial applications in search and rescue, the mining industry, and AI/autonomy in extreme environments.


NeBula


NeBula Autonomy Solution

To address the technical challenges that arise across multiple domains in the autonomous exploration of extreme environments, we develop a unified, modular software system called NeBula (Networked Belief-aware Perceptual Autonomy). JPL’s NeBula is specifically designed to address stochasticity and uncertainty in various elements of the mission, including sensing, the environment, motion, system health, and communication, among others. NeBula has been implemented on multiple heterogeneous robotic platforms (wheeled, legged, tracked, and flying vehicles), has been demonstrated across various terrestrial and planetary-analogue missions, and has won a DARPA Challenge focused on robotic autonomy.

1. Verifiable autonomy under extreme conditions:

NeBula develops an autonomy architecture that translates mission specifications into single- or multi-robot behaviors. NeBula quantifies risk and trust in this process by taking uncertainty in robot motion, control, sensing, and the environment into account when abstracting activities and behaviors. As a result, it provides quantitative guarantees on the performance of the autonomy framework under stated environment assumptions.
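
As a rough illustration of this kind of risk quantification (a minimal Python sketch, not NeBula's actual planner), the snippet below scores a candidate behavior by the Conditional Value-at-Risk (CVaR) of Monte-Carlo traversal-cost samples; the cost distribution and risk level are invented for the example.

import numpy as np

def cvar(costs: np.ndarray, alpha: float = 0.9) -> float:
    """Conditional Value-at-Risk: mean cost over the worst (1 - alpha) fraction of outcomes."""
    var = np.quantile(costs, alpha)      # Value-at-Risk threshold
    tail = costs[costs >= var]           # worst-case tail of the cost distribution
    return float(tail.mean())

# Hypothetical Monte-Carlo rollouts of one candidate behavior under motion/sensing uncertainty.
rng = np.random.default_rng(0)
traversal_costs = rng.normal(loc=10.0, scale=3.0, size=10_000)

print("expected cost:", traversal_costs.mean())
print("CVaR_0.9 cost:", cvar(traversal_costs))   # accept the behavior only if this stays within a risk budget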

2. Modularity and mobility-based adaptation:

NeBula focuses on a modular design to enable adaptation to various mobility platforms (legged, flying, wheeled, and tracked) and various computational capacities.

3. Resilient Navigation:

NeBula develops a GPS-free navigation solution that is resilient to perceptually-challenging conditions such as variable illumination, dust, darkness, smoke, and fog. The solution relies on degeneracy-aware fusion of complementary sensing modalities, including vision, IMU, lidar, radar, contact sensors, and ranging systems (e.g., magneto-quasistatic signals and UWB). The system can autonomously switch between and fuse different sensing modalities based on the features of the environment.
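
The sketch below illustrates the general idea of degeneracy-aware fusion with a toy confidence-weighted average over odometry sources; the scalar health scores, the threshold, and the pose-delta values are hypothetical stand-ins for the observability and degeneracy metrics a real system would compute.

from dataclasses import dataclass

@dataclass
class OdometrySource:
    name: str
    health: float        # 0 = degenerate (e.g. lidar in a featureless tunnel) .. 1 = fully observable
    pose_delta: tuple    # (dx, dy, dyaw) estimate over the last interval

def fuse(sources, min_health=0.2):
    """Confidence-weighted average of the sufficiently healthy sources."""
    usable = [s for s in sources if s.health >= min_health]
    if not usable:
        raise RuntimeError("all odometry sources degenerate; fall back to dead reckoning")
    total = sum(s.health for s in usable)
    return tuple(sum(s.health * s.pose_delta[i] for s in usable) / total for i in range(3))

estimate = fuse([
    OdometrySource("lidar",     health=0.05, pose_delta=(0.0, 0.0, 0.00)),  # degenerate: long smooth corridor
    OdometrySource("visual",    health=0.10, pose_delta=(0.4, 0.0, 0.01)),  # degraded: dust and darkness
    OdometrySource("wheel+imu", health=0.90, pose_delta=(0.5, 0.0, 0.02)),
])
print(estimate)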

4. Single- and multi-robot SLAM and dense 3D mapping:

NeBula develops GPS-denied, large-scale (several kilometers and beyond) SLAM solvers and 3D mapping frameworks that use confidence-rich mapping methods to provide precise topological, semantic, and geometric maps of extreme environments, such as subsurface caves and mine networks, under variable and challenging illumination conditions.
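
As a simplified stand-in for confidence-rich mapping (the actual method is described in the IJRR 2019 paper listed under Publications), the toy sketch below keeps a Beta belief per grid cell so that each occupancy estimate carries a variance (confidence) as well as a mean; the grid size, observation weights, and update rule are illustrative only.

import numpy as np

# Each cell keeps a Beta(a, b) belief over its occupancy probability.
alpha = np.ones((100, 100))   # pseudo-counts of "occupied" evidence
beta  = np.ones((100, 100))   # pseudo-counts of "free" evidence

def update_cell(i, j, hit: bool, weight: float = 1.0):
    """Incorporate one (possibly down-weighted) range-sensor observation of cell (i, j)."""
    if hit:
        alpha[i, j] += weight
    else:
        beta[i, j] += weight

def occupancy_mean():
    return alpha / (alpha + beta)

def occupancy_variance():
    s = alpha + beta
    return (alpha * beta) / (s ** 2 * (s + 1.0))   # low variance = high confidence

update_cell(10, 20, hit=True)
update_cell(10, 20, hit=True, weight=0.5)          # e.g. a noisier return contributes less evidence
print(occupancy_mean()[10, 20], occupancy_variance()[10, 20])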

5. Extreme Traversability:

NeBula develops solutions that have enabled robots to autonomously traverse extreme terrains with various traversability-stressing elements, such as loose and slippery surfaces (sand, water), muddy terrain, rock-laden terrain, and high-slope areas, and to autonomously climb and descend stairs in terrestrial applications.
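
A toy example of the kind of terrain assessment involved: the sketch below scores a local elevation patch by slope and roughness, with invented thresholds and weights that do not reflect NeBula's actual traversability stack (see the STEP paper under Publications for the real approach).

import numpy as np

def traversability_cost(heights: np.ndarray, cell_size: float, max_slope_deg: float = 30.0) -> float:
    """Toy cost for one elevation patch: penalize slope and roughness, reject steep patches."""
    gy, gx = np.gradient(heights, cell_size)
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))   # per-cell slope angle in degrees
    roughness = heights.std()                          # crude surface-roughness proxy
    if slope.max() > max_slope_deg:
        return float("inf")                            # treat the patch as non-traversable
    return float(slope.mean() / max_slope_deg + roughness)

patch = np.random.default_rng(1).normal(0.0, 0.02, size=(8, 8))   # mostly flat, slightly rough terrain
print(traversability_cost(patch, cell_size=0.1))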

6. Multi-robot operations and mesh communication:

NeBula is designed to run on multi-robot systems to enable faster and more efficient missions. Robots can also deploy static radios to create a wireless mesh-network backbone. Inter-robot communication relies on resilient mesh-networking solutions that can accommodate intermittent communication links between robots.
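
A minimal sketch of one plausible "breadcrumb" deployment policy: drop a relay radio whenever the link back to the last anchor weakens and enough distance has been covered since the previous drop. The thresholds and the RSSI input are hypothetical, not NeBula's actual logic.

DROP_THRESHOLD_DBM = -85     # link quality below which a new relay is warranted
MIN_SPACING_M = 20.0         # avoid clustering relays too closely

def should_drop_radio(rssi_dbm: float, dist_since_last_drop_m: float, radios_left: int) -> bool:
    return (radios_left > 0
            and rssi_dbm < DROP_THRESHOLD_DBM
            and dist_since_last_drop_m > MIN_SPACING_M)

print(should_drop_radio(rssi_dbm=-90, dist_since_last_drop_m=35.0, radios_left=3))   # True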

7. Autonomous skill learning:

NeBula applies and extends reinforcement learning and, more broadly, machine learning methods to enable fast and safe robot motion in perceptually-degraded environments.
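
One common pattern for keeping learned motions safe, shown here only as a hedged illustration rather than NeBula's method: a learned policy proposes actions and a safety filter masks out any action predicted to violate a constraint before execution. Both the policy and the constraint check below are placeholders.

import numpy as np

def learned_policy(state: np.ndarray, n_actions: int = 4) -> np.ndarray:
    """Stand-in for a trained policy: returns a preference score per discrete action."""
    rng = np.random.default_rng(int(abs(state.sum()) * 1000) % 2**32)
    return rng.random(n_actions)

def safety_mask(state: np.ndarray, n_actions: int = 4) -> np.ndarray:
    """Hypothetical constraint check: action 0 is unsafe when the first state feature is large."""
    mask = np.ones(n_actions, dtype=bool)
    if state[0] > 0.8:
        mask[0] = False
    return mask

def act(state: np.ndarray) -> int:
    scores = learned_policy(state)
    scores[~safety_mask(state)] = -np.inf    # a masked action can never be selected
    return int(np.argmax(scores))

print(act(np.array([0.9, 0.1])))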

Robot Ecosystem


Networked control of a multi-robot system: NeBula's modular design allows it to run across mobility platforms (legged, flying, wheeled, and tracked) with different computational capacities. It is designed to autonomously coordinate and allocate tasks among a team of robots with heterogeneous capabilities, dynamically mapping each robot's capabilities to its role during the operation.
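
As a toy illustration of capability-to-role mapping (not NeBula's actual task allocator), the sketch below greedily assigns each role to the free robot whose capabilities best cover it; the robot names, role names, and capability tags are hypothetical.

ROBOTS = {
    "spot-1":  {"legged", "camera", "lidar"},
    "husky-2": {"wheeled", "lidar", "radio-dropper"},
    "drone-3": {"flying", "camera"},
}
ROLES = {
    "stair-explorer": {"legged", "camera"},
    "comm-relay":     {"radio-dropper"},
    "vertical-scout": {"flying"},
}

def assign_roles(robots, roles):
    free = dict(robots)
    assignment = {}
    for role, needs in roles.items():
        # pick the free robot covering the most required capabilities
        best = max(free, key=lambda r: len(needs & free[r]), default=None)
        if best is not None and needs <= free[best]:
            assignment[role] = best
            free.pop(best)      # each robot takes at most one role
    return assignment

print(assign_roles(ROBOTS, ROLES))   # e.g. {'stair-explorer': 'spot-1', 'comm-relay': 'husky-2', 'vertical-scout': 'drone-3'}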

Legged robots

Legged and multi-limbed robots to handle extreme terrains. Learn more about NeBula-Spot

Wheeled rovers

Wheeled rovers for traversing relatively even surfaces. Learn more about NeBula-Husky

Tracked robots

Tracked robots with controllable flippers.

Drones and hybrid rolling/flying robots

Rollocopter, a hybrid vehicle with the ability to roll or fly depending on the mission state. Learn more about Rollocopter

Space Applications


BRAILLE

NASA’s BRAILLE (Biologic and Resource Analog Investigations in Low Light Environments) Project aims to investigate and quantify the geologic and biologic diversity of lava tube caves while developing strategies for their exploration in a multitude of scenarios.

NeBula-powered autonomous robots create new operational paradigms for the science team in remote detection activities hosted at Lava Beds National Monument, one of the largest collections of Mars-like lava tubes in North America. These exercises help NASA prepare for future missions to caves on Mars and other rocky bodies, as the agency continues its ongoing hunt for life in the universe.

Learn More

Lunar NeBula

In support of the Artemis program’s vision, and leveraging the Commercial Lunar Payload Services (CLPS) timeline, our team is studying how NeBula can empower next-generation robots to traverse extreme, long-range lunar terrain, support the construction of future outposts, and scout regions of the Moon (surface and subsurface) that hold resources such as reliable sources of water, oxygen, and construction materials.

In addition to enabling a capable and fully autonomous squad of versatile robots, NeBula develops intuitive human-robot interfaces that astronauts can use to facilitate various activities, such as carrying payloads or exploring areas that would pose a danger to the crew.

Learn More

For related details on NeBula deployments in support of R&D efforts and space missions targeting the Moon, Europa, Enceladus, and other bodies, see here.

Terrestrial Applications

DARPA Subterranean Challenge


The DARPA Subterranean or “SubT” Challenge is a robotics competition that seeks novel approaches to rapidly map, navigate, and search underground environments. The competition spans a period of three years. CoSTAR is a DARPA-funded team participating in the Systems Track, developing and implementing physical systems tasked with traversing, mapping, and searching various subterranean environments, including natural caves, mines, and the urban underground.

Learn More

Team and Partnerships

About

TEAM CoSTAR (Collaborative SubTerranean Autonomous Resilient Robots) is a collaboration between NASA’s JPL, MIT, Caltech, KAIST, LTU, and several industry partners (see below). With more than 60 key members, TEAM CoSTAR aims to revolutionize how we operate in underground domains and subsurface voids for both terrestrial and planetary applications.


Opportunities


CoSTAR Gallery


Partnerships

sponsor_logos

Interested in joining the team? Please contact us at subt.partnership@jpl.nasa.gov for more details on how to sponsor and support the team.

Media

Contact Us

Technical Lead – Principal Investigator
  • Dr. Ali-akbar Agha-mohammadi
  • NASA’s Jet Propulsion Laboratory, Caltech
  • aliagha@jpl.nasa.gov
  • Homepage

Charles C. Gates Jr.–Franklin Thomas Laboratory, Caltech
  • Prof. Joel W. Burdick
  • California Institute of Technology
  • Laboratory for Robotics and Bioengineering
  • Homepage

Laboratory for Information & Decision Systems, MIT
  • Prof. Luca Carlone
  • Massachusetts Institute of Technology
  • Laboratory for Information & Decision Systems
  • Homepage

Unmanned Systems Research Group, KAIST
  • Prof. "David" Hyunchul Shim
  • Korea Advanced Institute of Science and Technology
  • Unmanned Systems Research Group
  • Homepage

Robotics Team, LTU
  • Prof. George Nikolakopoulos
  • Luleå University of Technology
  • Homepage

Publications

Selected Publications

Below are several papers describing recent results from the NeBula framework:

  • "NeBula: Quest for Robotic Autonomy in Challenging Environments; TEAM CoSTAR at the DARPA Subterranean Challenge," Accepted for publication in the Journal of Field Robotics, 2021. [PDF]

  • "Autonomous Spot: Long-range Autonomous Exploration of Extreme Environments with Legged Locomotion," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, USA, 2020. Best Paper Award on Safety, Security, and Rescue Robotics.[PDF] [Video]

  • "Deep Learning Tubes for Tube MPC," Robotics: Science and Systems (RSS), Corvallis, USA, 2020. [PDF]

  • "Confidence-rich 3D Grid Mapping," International Journal of Robotics Research (IJRR), vol.38, pp.1352-1374, 2019. [PDF]

  • "Autonomous Navigation of Drones,” The International Symposium on Robotics Research (ISRR). Hanoi, Vietnam, 2019. [PDF]

  • "Bi-directional Value Learning for Risk-aware Planning Under Uncertainty," IEEE Robotics and Automation Letters (RA-L), vol.4, no.3, pp.2493-2500, March 2019. [PDF]

  • "LAMP: Large-Scale Autonomous Mapping and Positioning for Exploration of Perceptually-Degraded Subterranean Environments," IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 2020. [PDF]

  • "LOCUS - LiDAR Odometry for Consistent Operation in Uncertain Settings", IEEE Robotics and Automation Letters (RA-L). 2020. [PDF][Video]

  • "Contact Inertial Odometry: Collisions are your Friend," The International Symposium on Robotics Research (ISRR), Hanoi, Vietnam, 2019. [PDF]

Below is a list of papers describing recent results from the NeBula framework and papers that the NeBula framework is built on:

  • Ali-Akbar Agha-Mohammadi, et al., "NeBula: Quest for Robotic Autonomy in Challenging Environments; TEAM CoSTAR at the DARPA Subterranean Challenge," Accepted for publication in the Journal of Field Robotics, 2021. [PDF]

  • David D. Fan*, Kyohei Otsu*, Yuki Kubo, Anushri Dixit, Joel Burdick, and Ali-Akbar Agha-Mohammadi, "STEP: Stochastic Traversability Evaluation and Planning for Safe Off-road Navigation," Under review, 2021. [PDF]

  • S.K. Kim∗, A. Bouman∗, G. Salhotra, D. D. Fan, K. Otsu, J. Burdick, and Ali-Akbar Agha-Mohammadi, “PLGRIM: Hierarchical value learning for large-scale exploration in unknown environments,” in Proceedings of the International Conference on Automated Planning and Scheduling (ICAPS), vol. 31, Guangzhou, China, 2021. [PDF]

  • Muhammad Fadhil Ginting, Kyohei Otsu, Jeffrey Edlund, Jay Gao, and Ali-Akbar Agha-Mohammadi, “CHORD: Distributed Data-sharing via Hybrid ROS 1 and 2 for Multi-robot Exploration of Large-scale Complex Environments,” IEEE Robotics and Automation Letters (RA-L), 2021. [PDF]

  • Andrea Tagliabue, Jesus Tordesillas, Xiaoyi Cai, Angel Santamaria-Navarro, Jonathan P. How, Luca Carlone, and Ali-akbar Agha-mohammadi, “LION: Lidar-Inertial Observability-Aware Navigator for Vision-Denied Environments,” International Symposium on Experimental Robotics (ISER), Floriana, Malta, 2021. [PDF]

  • Rohan Thakker, Nikhilesh Alatur, David D. Fan, Jesus Tordesillas, Michael Paton, Kyohei Otsu, Olivier Toupet and Ali-akbar Agha-mohammadi, “Autonomous Off-road Navigation over Extreme Terrains with Perceptually-challenging Conditions,” International Symposium on Experimental Robotics (ISER), Floriana, Malta, 2021. [PDF] [Video]

  • Hyungho Chris Choi, Inhwan Wee, Micah Corah, Sahand Sabet, Taeyeon Kim, Thomas Touma, David Hyunchul Shim, and Ali-akbar Agha-mohammadi, “BAXTER: Bi-modal Aerial-Terrestrial Hybrid Vehicle for Long-endurance Versatile Mobility,” International Symposium on Experimental Robotics (ISER), Floriana, Malta, 2021. [PDF]

  • Kamak Ebadi, Matteo Palieri, Sally Wood, Curtis Padgett, and Ali-akbar Agha-mohammadi, "DARE-SLAM: Degeneracy-Aware and Resilient Loop Closing in Perceptually-Degraded Environments," Journal of Intelligent and Robotic Systems, 2021. [PDF]

  • Marcel Kaufmann, Tiago Stegun Vaquero, Kyohei Otsu, Giovanni Beltrame, Ali-akbar Agha-mohammadi, “Copilot MIKE: An Autonomous Assistant for Multi-Robot Operations in Cave Exploration,” IEEE Aerospace Conference, Big Sky Resort, USA, 2021.

  • Amanda Bouman*, Muhammad Fadhil Ginting*, Nikhilesh Alatur*, Matteo Palieri, David D. Fan, Thomas Touma, Torkom Pailevanian, Sung-Kyun Kim, Kyohei Otsu, Joel Burdick, and Ali-akbar Agha-mohammadi, “Autonomous Spot: Long-range Autonomous Exploration of Extreme Environments with Legged Locomotion,” IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, USA, 2020. [PDF] [Video]

  • Matteo Palieri, Benjamin Morrell, Abhishek Thakur, Kamak Ebadi, Jeremy Nash, Arghya Chatterjee, Christoforos Kanellakis, Luca Carlone, Cataldo Guaragnella, and Ali-akbar Agha-mohammadi, “LOCUS - LiDAR Odometry for Consistent Operation in Uncertain Settings,” IEEE Robotics and Automation Letters (RA-L), 2020. [PDF] [Video]

  • Nobuhiro Funabiki, Benjamin Morrell, Jeremy Nash and Ali-akbar Agha-mohammadi, "Range-Aided Pose-Graph-Based SLAM: Applications of Deployable Ranging Beacons for Unknown Environment Exploration," IEEE Robotics and Automation Letters (RA-L), vol.6, no.1, pp.48-55, 2020. [PDF]

  • David D. Fan, Ali-akbar Agha-mohammadi, and Evangelos A. Theodorou, "Deep Learning Tubes for Tube MPC," Robotics: Science and Systems (RSS), Corvallis, USA, 2020. [PDF]

  • K. Ebadi, Y. Chang, M. Palieri, A. Stephens, A. H. Hatteland, E. Heiden, A. Thakur, B. Morrell, L. Carlone, A. Agha-mohammadi, "LAMP: Large-Scale Autonomous Mapping and Positioning for Exploration of Perceptually-Degraded Subterranean Environments," IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 2020. [PDF]

  • David D. Fan, Jennifer Nguyen, Rohan Thakker, Nikhilesh Alatur, Ali-akbar Agha-mohammadi, and Evangelos A. Theodorou, "Bayesian Learning-Based Adaptive Control for Safety Critical Systems," IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 2020. [PDF]

  • Pierre-Yves Lajoie, Benjamin Ramtoula, Yun Chang, Luca Carlone, and Giovanni Beltrame, "DOOR-SLAM: Distributed, Online, and Outlier Resilient SLAM for Robotic Teams," IEEE Robotics and Automation Letters (RA-L), 2020. [PDF]

  • Andrew Kramer, Carl Stahoviak, Angel Santamaria-Navarro, Ali-akbar Agha-mohammadi, and Christoffer Heckman, "Radar-Inertial Ego-Velocity Estimation for Visually Degraded Environments," IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 2020. [PDF]

  • Takahiro Sasaki, Kyohei Otsu, Rohan Thakker, Sofie Haesaert, and Ali-akbar Agha-mohammadi, “Where to Map? Iterative Rover-Copter Path Planning for Mars Exploration,” IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 2020. [PDF]

  • Benjamin Morrell, Matteo Palieri, Nobuhiro Funabiki, Abhishek Thakur, Jennifer G Blank, and Ali-akbar Agha-mohammadi, “Robotic Localization and Multi-Sensor, Semantic 3D Mapping for Exploration of Subsurface Voids,” American Geophysical Union (AGU), San Francisco, CA, 2020. [PDF] [Video]

  • Thomas Touma, Jennifer G. Blank, Muhammad Fadhil Ginting, Christopher Patterson, and Ali-akbar Agha-mohammadi, “Mars Dogs: Biomimetic Robots for the Exploration of Mars, from its Rugged Surface to its Hidden Caves,” American Geophysical Union (AGU), San Francisco, CA, 2020. [Video]

  • Marcel Kaufmann, Tiago Stegun Vaquero, Kyohei Otsu, and Ali-akbar Agha-mohammadi, “One Operator to Rule Them All: Human-Robot Interaction for Real-World and Analog Subsurface Exploration,” American Geophysical Union (AGU), San Francisco, CA, 2020.

  • Muhammad Fadhil Ginting, Thomas Touma, Jeffrey A. Edlund, and Ali-akbar Agha-mohammadi, “Deployable Mesh Network for Enabling Reliable Communication from within Subsurface Voids to the Planetary Surface,” American Geophysical Union (AGU), San Francisco, CA, 2020. [Video]

  • Kyohei Otsu, Scott Tepsuporn, Rohan Thakker, Tiago Vaquero, Jeffrey A. Edlund, William Walsh, Michael T. Wolf, Ali-akbar Agha-mohammadi, “Autonomous Exploration and Mapping of Communication-degraded Environment with a Robot Team,” IEEE Aerospace Conference, Big Sky Resort, USA, 2020. [PDF]

  • Andrea Tagliabue, Stephanie Schneider, Marco Pavone, and Ali-akbar Agha-mohammadi, “The Shapeshifter: a Multi-Agent, Multi-Modal Robotic Platform for the Exploration of Titan,” IEEE Aerospace Conference, Big Sky Resort, USA, 2020. [PDF]

  • Ali-akbar Agha-mohammadi, Karl L. Mitchell, Penelope J. Boston, “Robotic Exploration of Planetary Subsurface Voids in Search for Life”, 2019. [PDF]

  • Sung-Kyun Kim, Rohan Thakker and Ali-akbar Agha-mohammadi, “Bi-directional Value Learning for Risk-aware Planning Under Uncertainty,” IEEE Robotics and Automation Letters (RA-L), vol.4, no.3, pp.2493-2500, March 2019. [PDF]

  • Thomas Lew, Tomoki Emmei, David Fan, Tara Bartlett, Angel Santamaria-Navarro, Rohan Thakker, and Ali-akbar Agha-mohammadi, "Contact Inertial Odometry: Collisions are your Friend," The International Symposium on Robotics Research (ISRR), Hanoi, Vietnam, 2019. [PDF]

  • Ali-akbar Agha-mohammadi, Andrea Tagliabue, Stephanie Schneider, Benjamin Morrell, Marco Pavone, Jason Hofgartner, Issa AD Nesnas et al. "The Shapeshifter: A Morphing, Multi-Agent, Multi-Modal Robotic Platform for the Exploration of Titan." NASA NIAC Phase I Study Final Report (2019). [PDF]

  • David Fan, Rohan Thakker, Tara Bartlett, Meriem Ben Miled, Leon Kim, Evangelos Theodorou, and Ali-akbar Agha-mohammadi, "Autonomous Hybrid Ground/Aerial Mobility in Unknown Environments," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, 2019. [PDF]

  • Jared Strader, Kyohei Otsu, and Ali-akbar Agha-mohammadi, “Perception-aware Mast Motion Planning for Planetary Exploration Rovers,” Journal of Field Robotics, 2019. [PDF]

  • Ali-akbar Agha-mohammadi, Eric Heiden, Karol Hausman and Gaurav S. Sukhatme, “Confidence-rich 3D Grid Mapping: Toward High-speed Vision-based UAV Navigation,” International Journal of Robotics Research (IJRR), vol.38, pp.1352-1374, 2019. [PDF]

  • A. Santamaria-Navarro, R. Thakker, D. D. Fan, B. Morrell, and A. A. Agha-mohammadi, "Autonomous Navigation of Drones," The International Symposium on Robotics Research (ISRR), Hanoi, Vietnam, 2019. [PDF]

  • Max Pflueger, Ali-akbar Agha-mohammadi and Gaurav S. Sukhatme, “Rover-IRL: Inverse Reinforcement Learning with Soft Value Iteration Networks for Planetary Rover Path Planning,” IEEE Robotics and Automation Letters (RA-L), vol.4, no.2, pp.1387-1394, 2019. [PDF]

  • Ali-akbar Agha-mohammadi, Saurav Agarwal, Sung-Kyun Kim, Suman Chakravorty and Nancy M. Amato, “SLAP: Simultaneous Localization and Planning for Physical Mobile Robots via Enabling Dynamic Replanning in Belief Space,” IEEE Transactions on Robotics (TRO), vol.34, no.5, pp.1195-1214, 2018. [PDF]

  • Kamak Ebadi and Ali-Akbar Agha-Mohammadi, "Rover Localization in Mars Helicopter Aerial Maps: Experimental Results in a Mars-Analogue Environment," International Symposium on Experimental Robotics (ISER), pp. 72-84, Springer, Buenos Aires, Argentina, 2018. [PDF]

  • Ali-akbar Agha-mohammadi, Suman Chakravorty and Nancy Amato, “FIRM: Sampling-based Feedback Motion Planning Under Motion Uncertainty and Imperfect Measurements,” International Journal of Robotics Research (IJRR), vol.33, no.2, pp.268-304, 2014. [PDF]