Why This Matters

Search and rescue operations require autonomous systems to make decisions under significant uncertainty about target locations and environmental conditions. Traditional planning approaches either require perfect information or are computationally intractable for large environments. The key contribution here is a scalable approach to planning under partial observability that focuses search resources on regions likely to contain targets, while retaining the ability to adapt to new information discovered during execution.

What We Did

This paper presents a framework for efficient path planning and search in urban environments with uncertain information about target locations and environmental hazards. The approach formulates the problem as a partially observable Markov decision process and develops the Shrinking POMCP algorithm that reduces computational complexity by focusing search on promising regions of the state space. The method combines belief state updates with efficient search tree planning, allowing autonomous systems to locate targets while managing uncertainty in a real-time setting.
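To make the "focus on promising regions" idea concrete, the following is a minimal illustrative sketch, not the paper's implementation: a Bayesian belief update over grid cells after a noisy detection, followed by a region-shrinking step that keeps only the smallest set of cells covering most of the belief mass. The sensor parameters (`p_detect`, `p_false`), the grid size, and the mass threshold are all hypothetical.

```python
def update_belief(belief, cell, detected, p_detect=0.9, p_false=0.05):
    """Bayesian update of a belief over grid cells after sensing one cell.

    belief: dict mapping cell -> probability; detected: sensor outcome.
    p_detect / p_false are assumed sensor parameters, not from the paper.
    """
    new = {}
    for c, p in belief.items():
        if detected:
            likelihood = p_detect if c == cell else p_false
        else:
            likelihood = (1 - p_detect) if c == cell else (1 - p_false)
        new[c] = p * likelihood
    z = sum(new.values())  # normalize so probabilities sum to 1
    return {c: p / z for c, p in new.items()}


def shrink_region(belief, mass=0.9):
    """Keep the smallest set of cells covering `mass` of the belief,
    mimicking the idea of restricting planning to promising regions."""
    ranked = sorted(belief.items(), key=lambda kv: kv[1], reverse=True)
    region, total = set(), 0.0
    for c, p in ranked:
        region.add(c)
        total += p
        if total >= mass:
            break
    return region


# Usage: a uniform belief over a 4x4 grid, one positive detection at (1, 1).
cells = [(x, y) for x in range(4) for y in range(4)]
belief = {c: 1 / len(cells) for c in cells}
belief = update_belief(belief, (1, 1), detected=True)
region = shrink_region(belief, mass=0.9)
```

After the detection, the belief concentrates on the sensed cell and the shrunk region excludes the least likely cells, so a planner searching only `region` explores a smaller state space each step, which is the intuition behind the time savings reported for Shrinking POMCP.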

Key Results

The Shrinking POMCP algorithm demonstrates substantial improvements in computational efficiency compared to standard POMCP approaches while maintaining solution quality. Experimental validation in simulated environments shows that the approach successfully localizes targets while minimizing search time and computational resources. The method proves effective at balancing the need for comprehensive environment coverage with the constraint of limited planning time.

Cite This Paper

@inproceedings{zhang2024,
  author = {Zhang, Yunuo and Luo, Baiting and Mukhopadhyay, Ayan and Stojcsics, Daniel and Elenius, Daniel and Roy, Anirban and Jha, Susmit and Maroti, Miklos and Koutsoukos, Xenofon and Karsai, Gabor and Dubey, Abhishek},
  booktitle = {2024 International Conference on Assured Autonomy (ICAA)},
  title = {Shrinking POMCP: A Framework for Real-Time UAV Search and Rescue},
  year = {2024},
  pages = {48--57},
  abstract = {Efficient path optimization for drones in search and rescue operations faces challenges, including limited visibility, time constraints, and complex information gathering in urban environments. We present a comprehensive approach to optimize UAV-based search and rescue operations in neighborhood areas, utilizing both a 3D AirSim-ROS2 simulator and a 2D simulator. The path planning problem is formulated as a partially observable Markov decision process (POMDP), and we propose a novel "Shrinking POMCP" approach to address time constraints. In the AirSim environment, we integrate our approach with a probabilistic world model for belief maintenance and a neurosymbolic navigator for obstacle avoidance. The 2D simulator employs surrogate ROS2 nodes with equivalent functionality. We compare trajectories generated by different approaches in the 2D simulator and evaluate performance across various belief types in the 3D AirSim-ROS simulator. Experimental results from both simulators demonstrate that our proposed shrinking POMCP solution achieves significant improvements in search times compared to alternative methods, showcasing its potential for enhancing the efficiency of UAV-assisted search and rescue operations.},
  contribution = {lead},
  doi = {10.1109/ICAA64256.2024.00016},
  keywords = {path planning, search and rescue, partial observability, Monte Carlo tree search, autonomous systems, UAV operations, planning under uncertainty, belief state management}
}