NASA Selects Astrobotic for 2 Small Business Awards to Improve Spacecraft Operations

by Douglas Messier
Managing Editor
NASA has selected Astrobotic Technology for two Small Business Innovation Research (SBIR) awards to develop technology to help spacecraft improve proximity operations in orbit and avoid hazards when landing on other worlds.
The SBIR Phase I awards are each worth up to $125,000 for projects lasting up to six months.
One project involves the creation of the Astrobotic LiDAR-Inertial Navigation (ALIN) software package to improve the ability of satellites to perform rendezvous, proximity operations and docking. The capability is vital for building infrastructure in orbit and for autonomous satellite inspection and servicing.
“This modular and versatile software leverages LiDAR Simultaneous Localization and Mapping (SLAM) to provide navigation and mapping capabilities. ALIN will specifically target applications requiring high fidelity relative navigation solutions to non-cooperative dynamic spacecraft, such as the inspection and servicing of satellites,” the proposal abstract said.
Astrobotic’s other proposal is focused on improving the ability of spacecraft to land precisely and to avoid hazards on the moon, Mars and other worlds.
“The proposed work applies a deep learning approach to this problem, as the highly parallelizable nature of learning-based computations naturally extends to hardware acceleration, enabling additional computational power to compute and combine hazard maps across both LiDAR and camera data,” the proposal abstract said.
“The output of this development will be a demonstration of the feasibility and performance of a deep-learning based hazard detection system that leverages both LiDAR and image data to achieve mission-speed performance on path-to-flight hardware,” the abstract added.
Summaries of the two selected projects follow.
SBIR PHASE I AWARD
Amount: up to $125,000
Duration: 6 months
LiDAR-based Navigation and Mapping for Rendezvous, Proximity Operations, Docking
Subtopic Title: Guidance, Navigation, and Control
Astrobotic Technology, Inc.
Pittsburgh, PA
Principal Investigator: Mr. Jeremy Hardy
Estimated Technology Readiness Level (TRL):
Begin: 3
End: 4
Technical Abstract
High-fidelity relative navigation and three-dimensional mapping are key competencies to achieve a variety of mission objectives in Earth, lunar, and eventually Martian orbit. Developing autonomous and reliable Rendezvous, Proximity Operations, and Docking (RPOD) technologies will play a key role in the ability to build infrastructure in orbit by providing autonomous satellite inspection and servicing capabilities, among many other applications.
Astrobotic, a Pittsburgh, PA-based space robotics company, proposes to further develop existing in-house technology to create the Astrobotic LiDAR-Inertial Navigation (ALIN) software package. This modular and versatile software leverages LiDAR Simultaneous Localization and Mapping (SLAM) to provide navigation and mapping capabilities. ALIN will specifically target applications requiring high fidelity relative navigation solutions to non-cooperative dynamic spacecraft, such as the inspection and servicing of satellites.
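For readers unfamiliar with the technique, the core of a LiDAR SLAM front end of the kind described above is scan registration: estimating the rigid transform that aligns a newly acquired point cloud with a reference cloud, which directly yields a relative pose. The following minimal point-to-point ICP sketch in Python (NumPy and SciPy) illustrates the general idea only; it is not Astrobotic's ALIN implementation, and the function names and parameters are placeholders.

import numpy as np
from scipy.spatial import cKDTree

# Illustrative point-to-point ICP; not Astrobotic's ALIN implementation.
def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping points src onto matched points dst."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(scan, reference, iterations=30, tol=1e-6):
    """Align an Nx3 LiDAR scan to an Mx3 reference cloud; returns the relative pose (R, t)."""
    tree = cKDTree(reference)
    R_total, t_total = np.eye(3), np.zeros(3)
    current = scan.copy()
    prev_err = np.inf
    for _ in range(iterations):
        dists, idx = tree.query(current)         # nearest-neighbor correspondences
        R, t = best_fit_transform(current, reference[idx])
        current = current @ R.T + t              # apply the incremental update
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dists.mean()
        if abs(prev_err - err) < tol:            # stop when the fit no longer improves
            break
        prev_err = err
    return R_total, t_total

In a complete LiDAR-inertial pipeline, relative pose estimates of this kind would typically be fused with inertial measurements and refined with loop-closure constraints to keep the map and trajectory consistent.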
Phase I will yield a prototype system featuring a space-relevant compute platform capable of real-time data collection from a terrestrial-grade scanning LiDAR, along with an analysis of the system that provides a clear path forward for achieving real-time mapping and relative navigation on space hardware.
Phase II, if awarded, would focus on optimizing algorithmic localization, mapping performance, and timing to meet RPOD-specific mission requirements. Specifically, development would focus on improving localization and mapping under the challenging circumstances of a very sparse scene with a single dynamic LiDAR-observable object being observed from a non-inertial reference frame, as is the case in most RPOD missions.
The results of Phase II would demonstrate the viability of the ALIN software package in simulation, and with follow-on investment the system could be infused into a flight program.
Potential NASA Applications
The proposed Phase I work will lead to a prototype LiDAR-based navigation and mapping solution geared toward the satellite servicing and inspection industry. Phase II would begin developing the sensor as a flight-ready module and conducting extensive testing on flight-ready hardware. The resulting technology could become flight ready in a Phase III, providing the opportunity for early mission infusion and for testing and data collection on smaller cubesat-style missions or on the ISS.
Potential Non-NASA Applications
Robust GPS-denied localization and mapping capabilities have strong potential in the private sector, filling the need to inspect and assess the severity of damage in hard-to-access locations. A navigation system that can safely operate in dark, unmapped locations could advance understanding of the maintenance needed in facilities where GPS is not available.
SBIR PHASE I AWARD
Amount: up to $125,000
Duration: 6 months
Real-time Hazard Detection via Deep Learning
Subtopic Title: Flight Dynamics and Navigation Technologies
Astrobotic Technology, Inc.
Pittsburgh, PA
Principal Investigator: Kori Macdonald
Estimated Technology Readiness Level (TRL):
Begin: 2
End: 3
Technical Abstract
On-board hazard detection is critical to the success of landed missions, as available orbiter data does not capture the lunar terrain at a resolution that enables identification of potentially mission-threatening rocks and craters at the centimeter scale. Current state-of-the-art hazard detection technologies typically use LiDAR data to address low and variable illumination conditions during landing operations; however, incorporating image data can yield a hazard detection solution that updates more frequently and at higher resolution.
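For context, a conventional LiDAR-only hazard detector of the kind referenced in the abstract typically grids LiDAR returns into a digital elevation map (DEM) and flags cells whose slope or roughness exceeds a lander's tolerance. The sketch below, in Python, is a generic illustration of that approach; the grid spacing and thresholds are assumed values, not mission parameters.

import numpy as np

# Illustrative LiDAR-only hazard map from a gridded DEM; thresholds are assumptions.
def hazard_map_from_dem(dem, cell_size_m=0.1, max_slope_deg=10.0, max_roughness_m=0.05):
    """dem: 2-D array of surface heights (meters) on a regular grid."""
    dz_dy, dz_dx = np.gradient(dem, cell_size_m)               # local surface gradients
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

    # Roughness proxy: height deviation from the local 3x3 mean surface.
    padded = np.pad(dem, 1, mode="edge")
    local_mean = sum(padded[i:i + dem.shape[0], j:j + dem.shape[1]]
                     for i in range(3) for j in range(3)) / 9.0
    roughness = np.abs(dem - local_mean)

    return (slope_deg > max_slope_deg) | (roughness > max_roughness_m)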
The proposed work applies a deep learning approach to this problem, as the highly parallelizable nature of learning-based computations naturally extends to hardware acceleration, enabling additional computational power to compute and combine hazard maps across both LiDAR and camera data. The output of this development will be a demonstration of the feasibility and performance of a deep-learning based hazard detection system that leverages both LiDAR and image data to achieve mission-speed performance on path-to-flight hardware.
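A learning-based fusion system of the sort the abstract describes would instead train a network to combine co-registered camera and LiDAR-derived channels into a per-pixel hazard map. The PyTorch sketch below shows the general shape such a model could take; the architecture, channel layout, and input sizes are illustrative assumptions rather than Astrobotic's design.

import torch
import torch.nn as nn

# Illustrative fusion network; the architecture and channel layout are assumptions.
class HazardFusionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1),   # fuse image + LiDAR channels
            nn.ReLU(inplace=True),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, kernel_size=1),              # per-pixel hazard logit
        )

    def forward(self, image, lidar_elevation):
        # Both inputs: (batch, 1, H, W) tensors registered to a common ground grid.
        x = torch.cat([image, lidar_elevation], dim=1)
        return torch.sigmoid(self.net(x))                 # hazard probabilities in [0, 1]

model = HazardFusionNet()
img = torch.rand(1, 1, 256, 256)                          # camera frame (placeholder data)
dem = torch.rand(1, 1, 256, 256)                          # co-registered LiDAR elevation map
hazard_map = model(img, dem)                              # shape (1, 1, 256, 256)

Because every layer of a fully convolutional model is a dense, data-parallel operation, this style of network maps naturally onto the hardware acceleration the abstract mentions.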
The proposing team is currently developing a LiDAR-based hazard detection module for Astrobotic’s Griffin Mission One to deliver NASA’s VIPER rover to the lunar south pole, planned for late 2023. Techniques developed in the proposed work will benefit from the V&V infrastructure developed for this and future missions.
Additionally, Astrobotic will leverage the LunaRay Suite, which is capable of generating and verifying accurate terrain data, including terrain models, photometrically accurate image data, as well as simulated LiDAR data at input locations, times, and viewing positions.
As such, a large and widely varied training dataset will be produced, enabling the training of a robust network. By providing a robustly trained solution on relevant hardware, the proposing team seeks to drive forward the market of applied deep learning technologies in the space industry.
Potential NASA Applications
As landing precision requirements continue to grow with increasingly complex mission scenarios, customers will look for a flexible solution that uses as much data as possible to produce an accurate result. Astrobotic’s own participation in NASA’s CLPS program provides an internal customer, enabling demonstration of this technology on a landed mission. With flight heritage and demonstrated successes, the system would become a strong candidate for future missions through the CLPS and Artemis programs.
Potential Non-NASA Applications
The ability of an airborne system to track objects in real time may be of interest to the DOD for gathering intelligence and ensuring troop safety in uncertain environments. The DOD may also be interested in a hazard detection system for missions landing in uncertain areas. Hardware acceleration for deep learning would find a host of applications, such as in the autonomous vehicle sector.