Author: Erick Alvarez Reyes
Supervisor: Prof. Gudrun Klinker
Advisor: Sandro Weber (@no68tap)
Submission Date: [created]

Abstract

As the number of applications employing virtual simulation environments has increased, so has the emphasis on the real-time integration of humans into these environments. To achieve this integration, many solutions utilise motion tracking to grant humans control over virtual characters. A remaining challenge is to use few, easily accessible tracking devices while still simulating the user’s movements with high fidelity. To address this, the authors of QuestSim propose a novel method to track human movement using only the headset and the two controllers of a Virtual Reality device. Using these trackers, they train a Reinforcement Learning agent on motion data to control physically simulated characters. Despite having only three trackers, all on the upper body, the agent achieves the unanticipated result of also imitating the movements of the lower body. We therefore consider it crucial to corroborate these findings and to investigate the capabilities and deficiencies of this method. Experiments conducted on our reimplementation confirm that QuestSim is capable of controlling the entire body of the character, but is limited in the variety of motions it can replicate. Furthermore, additional modifications that we test reveal that adding feet trackers only slightly improves the quality of certain movements, whereas variations of the Reinforcement Learning reward refine the imitation significantly.
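To make the reward variations mentioned above concrete, the following is a minimal sketch of a motion-imitation reward of the kind used in physics-based character control: per-joint tracking errors between the simulated and reference pose are turned into a bounded reward via negated exponentials. The function name, weights, and scale factors here are illustrative assumptions, not the exact terms or coefficients from QuestSim or from our reimplementation.

```python
import numpy as np

def imitation_reward(sim_pos, ref_pos, sim_rot, ref_rot,
                     w_pos=0.5, w_rot=0.5, k_pos=10.0, k_rot=2.0):
    """Illustrative imitation reward (hypothetical weights/scales).

    sim_pos, ref_pos: (J, 3) joint positions of simulated and reference pose.
    sim_rot, ref_rot: (J, 4) unit quaternions per joint.
    Returns a scalar in (0, 1]; identical poses score 1.
    """
    # Mean squared position error across joints.
    pos_err = np.mean(np.sum((sim_pos - ref_pos) ** 2, axis=-1))
    # Quaternion geodesic angle per joint via the absolute dot product.
    dots = np.abs(np.sum(sim_rot * ref_rot, axis=-1)).clip(0.0, 1.0)
    rot_err = np.mean(2.0 * np.arccos(dots)) ** 2
    # Each term is 1 for a perfect match and decays with the error.
    return w_pos * np.exp(-k_pos * pos_err) + w_rot * np.exp(-k_rot * rot_err)
```

Varying the weights `w_pos`/`w_rot` or the sensitivity scales `k_pos`/`k_rot`, or adding analogous velocity terms, is one way such reward variations can be explored; emphasising different terms changes which aspects of the reference motion the agent reproduces most faithfully.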

Results/Implementation/Project Description

Conclusion

[ PDF (optional) ] 

[ Slides Kickoff/Final (optional)]