Autonomous Navigation in Unstructured Environments
Submission status: Open
Submission deadline
The ability of vehicles to navigate efficiently, without human intervention, through complex and dynamic environments opens up diverse opportunities, for example in transportation, search and rescue, monitoring and planetary exploration. This collection explores the technologies that will enable the safe, efficient operation of autonomous vehicles in these demanding environments. Vehicles may be terrestrial, marine or aerial.
Topics include:
Imaging and sensing technologies and data fusion techniques for enhanced environmental perception
Integration of machine learning algorithms for real-time decision-making, e.g. in obstacle avoidance, path planning and operational resilience
Autonomous management of vehicle dynamics, e.g. for terrain adaptability or environmental variability
Development of reliable communication systems including human-AI interactions
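As a concrete instance of the data-fusion theme above, a minimal textbook technique is inverse-variance weighting, which combines two noisy measurements of the same quantity into a lower-variance estimate. The sketch below is illustrative only; the sensor names and numbers are assumptions, not drawn from any paper in this collection.

```python
def fuse(z1, var1, z2, var2):
    """Inverse-variance (minimum-variance) fusion of two noisy measurements
    of the same quantity. Returns the fused estimate and its variance."""
    w1 = var2 / (var1 + var2)  # weight grows as the other sensor gets noisier
    w2 = var1 / (var1 + var2)
    fused = w1 * z1 + w2 * z2
    fused_var = (var1 * var2) / (var1 + var2)  # always below either input variance
    return fused, fused_var

# Hypothetical example: lidar and radar range estimates of the same obstacle.
z, v = fuse(10.0, 0.25, 10.4, 1.0)
# The fused variance (0.2) is smaller than either sensor's alone.
```

The same weighting is the scalar core of a Kalman filter's measurement update, which is why it often appears as a first step toward full multi-sensor perception pipelines.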
Liang Zhao and colleagues demonstrate a methodological framework that merges maritime knowledge with an autonomous manoeuvring model for an intelligent shipping application. It serves as a fully digitalised platform for route customisation and evaluation in maritime transportation, optimising both operational decision-making and safety assurance.
Long Chen and colleagues demonstrate a fully autonomous open-pit mine in which heterogeneous machinery and tasks are coordinated using parallel learning and digital twins.
Haughn and colleagues develop gust-rejection controllers that overcome the challenges of computationally expensive modelling and expansive distributed sensing networks. With only three pressure-tap sensors, small fixed-wing uncrewed aerial vehicles could extend their operation into more complex urban environments.
Xingyu Zhao and colleagues report a Bayesian learning framework for runtime self-verification of robotics systems. This framework allows robots to autonomously evaluate and reconfigure themselves after both regular and singular events, using only imprecise and partial prior knowledge.
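The authors' framework is not reproduced here, but the core idea of runtime Bayesian self-assessment can be sketched as a toy Beta-Bernoulli update, in which a robot revises its estimated task-success probability after each observed event and reconfigures when the estimate falls below a requirement. All priors, event data and thresholds below are illustrative assumptions, not the paper's implementation.

```python
# Toy Beta-Bernoulli self-assessment sketch (illustrative only).
# A robot starts from an imprecise prior over its task-success
# probability and updates it conjugately after each mission event.

def update(alpha, beta, success):
    """Conjugate Beta update after one Bernoulli event (success/failure)."""
    return (alpha + 1, beta) if success else (alpha, beta + 1)

def mean(alpha, beta):
    """Posterior mean estimate of the success probability."""
    return alpha / (alpha + beta)

# Imprecise prior: weakly informative Beta(2, 2), i.e. "roughly 50%, unsure".
alpha, beta = 2.0, 2.0

# Observed mission events: True = success, False = failure (made-up data).
events = [True, True, False, True, True, True]
for ok in events:
    alpha, beta = update(alpha, beta, ok)

estimate = mean(alpha, beta)  # posterior mean after the events
# A runtime monitor could trigger reconfiguration when the estimate
# drops below a required reliability level, here assumed to be 0.9.
needs_reconfig = estimate < 0.9
```

The conjugate update keeps the runtime cost trivial, which is one reason Beta-Bernoulli models are a common starting point for online self-monitoring, even though a full self-verification framework would reason over far richer state.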