ORCID Identifier(s)

ORCID 0000-0002-7204-5262

Graduation Semester and Year

Fall 2024

Language

English

Document Type

Dissertation

Degree Name

Doctor of Philosophy in Computer Engineering

Department

Computer Science and Engineering

First Advisor

William J. Beksi

Second Advisor

Manfred Huber

Third Advisor

Farhad A. Kamangar

Fourth Advisor

Nicholas R. Gans

Abstract

Establishing a sustained human presence beyond Earth necessitates autonomous systems capable of extracting and utilizing local resources. On the Moon, in-situ resource utilization (ISRU) is essential to reduce dependency on Earth-based supplies. Leveraging lunar resources, such as water ice for life support and fuel production or regolith for surface construction, will enable long-term lunar missions and future deep space exploration. NASA's Artemis program is targeting the lunar south pole (LSP), an area with abundant yet untapped resources. Its harsh environmental conditions pose significant challenges for humans operating and installing surface infrastructure, making autonomous robotic systems essential for performing the "dirty" jobs such as excavation and construction. These systems face challenges of their own, however: extreme lighting conditions, abrasive lunar dust, and limited, high-latency communication all complicate their operation. This dissertation develops and applies methods for lunar excavation and navigation to support ISRU operations at the LSP. Recent advances in robot learning, including reinforcement learning (RL) and one-shot approaches such as learning from demonstration, have enabled adaptive and efficient control strategies for complex terrestrial robotic tasks. We extend these methods to address challenges specific to lunar operations, an area where such techniques remain largely unexplored. In the first part of the dissertation, we present an RL approach for autonomous bucket drum excavation. Using a simulation environment that approximates the effects of regolith excavation, we train an agent to perform efficient linear trenching with limited sensory input. Our method adapts to varying terrain conditions without explicit programming and demonstrates improved efficiency over traditional control strategies.
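As a toy illustration of this kind of RL setup (not the dissertation's simulator, state space, or reward design, none of which are specified here), the sketch below trains a tabular Q-learning agent on a hypothetical one-dimensional trenching task, trading a digging reward against a driving cost:

```python
import random

# Hypothetical 1-D trenching environment: the rover sits over one of N
# cells and either digs the current cell or advances to the next one.
N = 5  # trench length in cells

def step(state, action):
    """state = (position, frozenset of dug cells); action: 0=dig, 1=advance."""
    pos, dug = state
    if action == 0:  # dig
        if pos in dug:
            reward = -1.0        # wasted pass over an already-dug cell
        else:
            dug = dug | {pos}
            reward = 1.0         # useful excavation
    else:            # advance
        pos = min(pos + 1, N - 1)
        reward = -0.1            # small energy cost for driving
    done = len(dug) == N
    return (pos, dug), reward, done

def train(episodes=2000, alpha=0.5, gamma=0.95, eps=0.1, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    Q = {}
    for _ in range(episodes):
        state, done, steps = (0, frozenset()), False, 0
        while not done and steps < 4 * N:
            qs = Q.setdefault(state, [0.0, 0.0])
            a = rng.randrange(2) if rng.random() < eps else qs.index(max(qs))
            nxt, r, done = step(state, a)
            nq = Q.setdefault(nxt, [0.0, 0.0])
            qs[a] += alpha * (r + gamma * max(nq) * (not done) - qs[a])
            state, steps = nxt, steps + 1
    return Q

def greedy_rollout(Q):
    """Run the learned policy greedily from the start of the trench."""
    state, done, trace = (0, frozenset()), False, []
    while not done and len(trace) < 4 * N:
        qs = Q.get(state, [0.0, 0.0])
        a = qs.index(max(qs))
        trace.append(a)
        state, _, done = step(state, a)
    return trace, state

Q = train()
actions, final = greedy_rollout(Q)  # final = (position, dug cells)
```

The learned greedy policy alternates digging and advancing until every cell is excavated; the same loop structure scales to richer observations and actions, which is where the simulation-based approach above takes over.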
Second, we introduce a novel approach to lunar navigation using dynamic movement primitives (DMPs). We develop a modified formulation of the framework for circumnavigating a lander while avoiding known rock hazards. DMPs provide a compact representation of complex movement patterns, which we perturb to avoid obstacles. Third, we address the challenge of real-time hazard detection under LSP lighting conditions. We create a synthetic dataset of labeled images simulating the distinctly challenging polar lighting scenarios, then evaluate multiple state-of-the-art instance segmentation models for rock hazard detection and segmentation under these conditions. Finally, we integrate our vision-based hazard detection system with the modified movement primitives approach for adaptive lunar navigation. This combined system enables real-time obstacle avoidance while preserving the overall planned path structure, and we demonstrate its effectiveness on multiple primitives in a high-fidelity lunar simulation environment. The dissertation's contributions are validated through several simulations that approximate key conditions for excavation and navigation, demonstrating the potential of combining RL-based excavation, movement primitive-driven navigation, and vision-based hazard detection for autonomous ISRU operations. It is our hope that such advancements will support future efforts toward a sustained human presence on the Moon.
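For context, the standard dynamic movement primitive transformation system (following the Ijspeert et al. formulation, not the dissertation's specific modification) can be sketched as

\[
\tau \dot{z} = \alpha_z \bigl( \beta_z (g - y) - z \bigr) + f(x) + p(y, \dot{y}), \qquad
\tau \dot{y} = z, \qquad
\tau \dot{x} = -\alpha_x x,
\]

where \(y\) is the system state (e.g., rover position), \(g\) the goal, \(f(x)\) a learned forcing term driven by the phase variable \(x\), and \(p(y, \dot{y})\) an additive coupling term through which obstacle-avoidance perturbations of the kind described above are commonly injected.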

Keywords

Robotics, Moon, ISRU

Disciplines

Robotics

License

Creative Commons Attribution 4.0 International License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Available for download on Tuesday, November 25, 2025
