Author:

Jan Giedl
Supervisor: Prof. Gudrun Klinker
Advisor: Mayer, Elisabeth (@ru68tob2)
Submission Date:[created]

Abstract

Urban Air Mobility (UAM) demands accurate vertical take-off and landing, but GNSS navigation is often degraded by multipath, shadowing, and atmospheric disturbances. This thesis develops a feature-detection error model for vision-augmented navigation and an immersive multi-display visualization for UAM pilots. Using Unreal Engine combined with the ProjectAirSim plugin, we create realistic vertiport scenes, including fiducial markers, and simulate vehicle dynamics. Image sequences with corresponding ground-truth camera poses are generated under varied weather, lighting, and motion conditions. A model of the feature-detection error is derived as a function of drone pose, environmental parameters, and sensor noise. Unreal Engine streams live simulation and navigation data to the LRZ CAVE, a five-wall VR environment, delivering a real-time pilot view while synchronizing multiple render nodes. This master's thesis offers a foundation for research and operational assessment of vision-assisted UAM navigation and demonstrates its results to pilots through a multi-display visualization.
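To illustrate the kind of feature-detection error model described above, the following is a minimal sketch, not the thesis's actual model: it projects the corners of a square fiducial marker through an ideal pinhole camera, perturbs the detections with Gaussian pixel noise, and back-projects the noise to a metric localization error as a function of camera-to-marker distance. All parameter values (focal length, marker size, noise level) and function names are illustrative assumptions.

```python
import numpy as np

def project_points(points_cam, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Pinhole projection of 3D points (camera frame, z > 0) to pixel coordinates."""
    z = points_cam[:, 2]
    u = fx * points_cam[:, 0] / z + cx
    v = fy * points_cam[:, 1] / z + cy
    return np.stack([u, v], axis=1)

def marker_corners(size=0.5):
    """Corners of a square fiducial marker of the given side length (marker frame, metres)."""
    s = size / 2.0
    return np.array([[-s, -s, 0.0], [s, -s, 0.0], [s, s, 0.0], [-s, s, 0.0]])

def metric_error_vs_distance(distances, pixel_noise_sigma=0.5, trials=1000, fx=800.0):
    """Mean corner localization error in metres, per camera-to-marker distance.

    For each distance d, the marker is placed frontally on the optical axis,
    its corners are projected, Gaussian pixel noise is added to the detected
    corners, and the pixel error is back-projected to depth d via d / fx.
    """
    rng = np.random.default_rng(0)
    corners = marker_corners()
    errors = []
    for d in distances:
        cam_pts = corners + np.array([0.0, 0.0, d])
        px = project_points(cam_pts)
        noisy = px[None] + rng.normal(0.0, pixel_noise_sigma, (trials, 4, 2))
        metric = np.linalg.norm(noisy - px[None], axis=2) * d / fx
        errors.append(metric.mean())
    return np.array(errors)

distances = [2.0, 5.0, 10.0, 20.0]
errors = metric_error_vs_distance(distances)
# For fixed pixel noise, the metric error grows roughly linearly with distance.
```

A full model as outlined in the abstract would additionally condition on drone attitude (oblique viewing angles foreshorten the marker) and on environmental parameters such as lighting and weather, which this sketch omits.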

Results/Implementation/Project Description

Conclusion

[ PDF (optional) ] 

[ Slides Kickoff/Final (optional)]