Author: | Mohamed Ben Jazia, Sonja Stefani, Uzair Tajuddin, Konstantin Kraus |
---|---
Supervisor: | Prof. Gudrun Klinker |
Advisor: | |
Submission Date: | March 11, 2019 |
Abstract
HoloRPG is a classical role-playing game that uses Mixed Reality to immerse the player in a fabricated world. Game entities are spawned into the player's real-world surroundings and rendered as holograms by the HoloLens. Real-world walls and obstacles are scanned and taken into account when game entities interact with the surroundings.
The game casts the player as the main character of an adventure set in a fantasy-like world. The ultimate goal of the game is to raise the level of immersion so far that interaction with the virtual game entities feels as real to the player as interaction with real-world objects.
Multiple visualization and interaction techniques were discussed and evaluated within the scope of this project to move toward a more immersive experience.
Results
The original version of the game relied on a game master to create content in a previously mapped space. However, the current version of the Mixed Reality Toolkit (formerly HoloToolkit) adds spatial-understanding capabilities that let developers extract additional information from the spatial-mapping results. This data can be used to procedurally place objects in the environment.
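The idea of procedural placement on scanned surfaces can be sketched as follows. The patch data model, field names, and thresholds below are illustrative assumptions, not the Toolkit's actual spatial-understanding API:

```python
import random

# Each scanned surface patch: center position and extent, in the shape a
# spatial-understanding query might report (hypothetical data model).
floor_patches = [
    {"center": (0.0, 0.0, 2.0), "width": 1.5, "depth": 1.5},
    {"center": (3.0, 0.0, 4.0), "width": 0.8, "depth": 2.0},
]

def spawn_positions(patches, count, min_area=1.0, seed=42):
    """Pick `count` spawn points on patches large enough to hold an entity."""
    rng = random.Random(seed)
    usable = [p for p in patches if p["width"] * p["depth"] >= min_area]
    positions = []
    for _ in range(count):
        p = rng.choice(usable)
        cx, cy, cz = p["center"]
        # Jitter within the patch bounds so entities don't stack at its center.
        positions.append((cx + rng.uniform(-p["width"] / 2, p["width"] / 2),
                          cy,
                          cz + rng.uniform(-p["depth"] / 2, p["depth"] / 2)))
    return positions
```

Filtering by minimum area stands in for the kind of constraint (enough free floor space, clearance from walls) that the real spatial-understanding queries would enforce.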
The LeapMotion SDK provides a Unity API that exposes data about the tracked hands (e.g., hand position, facing direction, velocity). Although the website states that the API also includes a basic gesture-recognition implementation, all references to it have been removed from the actual API.
The starting point for this tool was the data received directly from the LeapMotion. The implementation defined a set of features of the tracked hand (e.g., hand openness, i.e., whether the palm is open or the fist is closed). We then used these features to define a set of states the hand can be in at any given time (e.g., hand facing forward, hand moving to the left). The original idea was to build a state machine that tracks the sequence of states the hand must pass through to finally perform a gesture.
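The feature → state → state-machine pipeline described above can be sketched in a few lines. The feature names, thresholds, and the example "swipe left" gesture are illustrative assumptions, not the project's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class HandFrame:
    """One frame of tracking data, as a LeapMotion-style API might report it."""
    palm_openness: float  # 0.0 = closed fist, 1.0 = fully open palm
    velocity_x: float     # lateral palm velocity in mm/s (negative = left)

def hand_state(frame: HandFrame) -> str:
    """Map raw features to a discrete hand state (thresholds are illustrative)."""
    if frame.palm_openness < 0.2:
        return "FIST"
    if frame.velocity_x < -300.0:
        return "MOVING_LEFT"
    return "OPEN_IDLE"

class SwipeLeftRecognizer:
    """Recognizes an open palm followed by fast leftward motion ('swipe left')."""
    # The gesture is the state sequence START -> ARMED -> RECOGNIZED;
    # any unlisted (machine state, hand state) pair leaves the machine unchanged.
    TRANSITIONS = {
        ("START", "OPEN_IDLE"): "ARMED",
        ("ARMED", "MOVING_LEFT"): "RECOGNIZED",
    }

    def __init__(self):
        self.state = "START"

    def update(self, frame: HandFrame) -> bool:
        """Feed one tracking frame; return True once the gesture completes."""
        observed = hand_state(frame)
        self.state = self.TRANSITIONS.get((self.state, observed), self.state)
        return self.state == "RECOGNIZED"

recognizer = SwipeLeftRecognizer()
frames = [HandFrame(0.9, 0.0),      # open palm, idle -> machine arms
          HandFrame(0.9, -450.0)]   # fast leftward motion -> gesture recognized
results = [recognizer.update(f) for f in frames]
```

Separating feature extraction (`hand_state`) from the transition table keeps new gestures cheap to add: each gesture is just another sequence of states over the same feature set.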
Conclusion
Having evaluated multiple approaches to improving the gameplay experience, keeping the advantages of an untethered mixed-reality experience was the obvious choice. However, this could not be fully achieved due to problems with the client-server-based LeapMotion connection and recent Unity updates. The final prototype still shows that fully tracked hands with gesture recognition significantly improve the user experience.
The use of spatial understanding to create a more dynamic and open environment was also demonstrated, but has not yet been fully explored.
The most practical way forward for this project would be to move on to the newly presented HoloLens 2. Not only does the next-generation device improve performance, field of view, and user comfort, it also adds hand-tracking capabilities similar to those provided by the LeapMotion. With these updates, the game could run entirely on the headset without any additional hardware.