| Author: | Marian Ludwig, Moritz Naser, Jakob Raith, Akbar Suriaganda |
| --- | --- |
| Supervisor: | Prof. Gudrun Klinker |
| Advisor: | Daniel Dyrda |
| Submission Date: | 30.09.2020 |
Storyworld
In the process of creating an immersive demo scene, we focused on a strong storyworld during the conception phase. We created a document to collect story ideas at the world level, the plot/scene level, and the character level.
Storyworld
For our game, we chose a Sci-Fi setting. Humans conquered space and interstellar travel centuries ago and have since spread across the galaxy. We were inspired by a setting similar to the Star Wars galaxy.
The game starts on a planet that was involved in a great war between different factions centuries ago. The planet was on the losing side and, in a final strike, was attacked by the aggressors with a special bomb. This bomb was equipped with a drill and bored itself deep into the planet, near its core. When detonated, it had the power to rip a planet apart. This is why there are floating islands on the planet and why its gravity is distorted.
Not all people were killed in the attack, though. They still live on the devastated planet, now under the rule of their former enemies. The bomb, however, left behind gravitational distortions. These create so-called “gravitational tremors” that are similar to earthquakes. Scanners around the settlements sense these tremors and warn the population.
Plot
This is where the story of our game starts. A young mechanic named Jason notices a malfunctioning scanner. He tries to fix it, but only causes it to shut down completely. Now blind to the gravitational tremors, the village is in grave danger. Jason sets out to locate the source of the signal that caused the malfunction in the first place.
Scene
At this point the game itself starts. We consciously tried to find the right spot to insert the player into our story: not at the very beginning, but still at a point that introduces the player to the world we built. Jason sets out to locate the signal and comes across an old drone he names Marvin. He turns it on, thinking it is the source of the signal, but in fact it was also affected by it. Together they set out to find the source and come across different points of interest, where the player learns more about the world, until they reach the entrance to a cave. The signal seems to originate from beyond this entrance, so they enter. Here the scene and our game end, leaving the player with a sense of achievement, but also with questions about the story to come.
Writing
The task of the writing is to deliver the chosen story and plot points to the player. We wanted to introduce the player to elements of the world without laying everything out too plainly. We designed different points of interest throughout the level that the player can interact with. Each of them reveals something different about the world, in a way that we hope feels organic and in service of the world building. The points of interest were laid out so that the player is naturally drawn to them through pathing and level geometry. An “ideal” path through the level can be seen below.
We tried to keep the dialogue light-hearted and interesting while still sounding natural and fitting both the world and the conversations between Jason and Marvin. We did this by using terms that are known in the game world but not necessarily in ours, without explicitly explaining their meaning. We hope this leads to a deeper sense of immersion without confusing the player. For example, the dialogue below hints at past events in the world that are not further explored in the level, but should help flesh out the believability of the game.
MARVIN
The structures are indicative of a ritualistic place.
JASON
Can you tell how old this ruin is?
MARVIN
The decay of the etch marks suggests at least 830 Sol years.
JASON
830? But that was even before the Diaspora.
MARVIN
Affirmative. This is new information. Adding sample to database.
JASON
What have I gotten myself into?
Dialogue System
To implement our dialogues, we used the plugin “Dialogue System for Unity” from PixelCrushers. It comes with everything we needed for our scope right out of the box and allowed for easy design of interactions and dialogues. Dialogues are written as graphs that visualize the flow of conversation. Although we did not make use of it, the plugin also supports branching dialogue choices and replies. Interacting with a point of interest triggers the respective dialogue.
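To give an idea of how an interaction hooks into the plugin, here is a minimal sketch of a trigger component. `DialogueManager.StartConversation` is the plugin's API; the conversation title, tag check, and use of the F key are illustrative placeholders rather than our exact setup.

```csharp
using UnityEngine;
using PixelCrushers.DialogueSystem;

// Attached to a point of interest with a trigger collider.
// "POI/AncientRuin" is a placeholder for a title defined in the dialogue database.
public class PoiDialogueTrigger : MonoBehaviour
{
    [SerializeField] private string conversationTitle = "POI/AncientRuin";
    private bool playerInRange;

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player")) playerInRange = true;
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player")) playerInRange = false;
    }

    private void Update()
    {
        // F is the interaction key (see Player Input below).
        if (playerInRange && Input.GetKeyDown(KeyCode.F))
        {
            DialogueManager.StartConversation(conversationTitle);
        }
    }
}
```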
Level Design
We set ourselves several design goals; two of them are a feeling of being lost in the alien environment and, as its counterpart, orientation through exploration. These two goals should be achieved through level design.
The Map
Early Designs
In the beginning, we wanted to achieve the lost feeling by creating a huge level. The level consisted of five stages: the landing zone, the forest path, the lake, the ruins, and the cave. The stages grew in size up to the lake and shrank again afterwards, while deepening in story. Each stage had at least one distinctive landmark the player could orientate by, such as a floating rock or a large ruin. Also, each stage had a lookout area from which the player could see the long path they had walked or would walk. Three of these stages had already been designed.
Final Level
However, the early designs went beyond our scope. So we took the most important aspects of all stages and combined them into a single stage.
Design Goal Satisfaction
Feeling lost
The term "feeling lost" is better understood as "being in a vast environment where the player is motivated to explore in order not to feel lost". It is harder to achieve this feeling with only one compact environment. Nevertheless, we used some tricks that help, such as verticality and unreachable distances. Our terrain has four levels: the arrival, the viewpoint, the crater, and the forest. With this verticality, the player is able to look out over other levels at some places, especially at the viewpoint. This way, the player notices the long way they have walked or will walk. We also added areas that are visibly unreachable, for example the huge mountains in the distance or the floating rocks above the level. The fog effect makes these distances even more apparent.
Orientation
As a counterweight to the lost feeling, the player should be able to orientate themselves through exploration. What is the point of exploration if you cannot recognize what you have already explored? That is why distinctive landmarks are placed at certain spots in the environment. These landmarks can be "natural" (as far as that is possible on an alien planet) or "man-made" (likewise). For example, behind the spawn zone at the beginning there is a huge floating boulder, visible from almost the whole stage. It can be used as a reference for a direction the player does not want to go, since it would lead back. It is therefore important for the level creator that these landmarks are easy to see and recognize.
Importing and placing photoscanned models in Unity
Nowadays, thanks to Quixel Megascans and other resources, it is possible even for small indie studios or student projects to create realistic-looking environments without spending thousands of hours modeling 3D assets. In the Quixel library you can find everything from a dry grass plant over a beautiful red flower to a gigantic rock. What is still missing in the library, however, are trees. Therefore we took the likewise photoscanned tree models from Unity's Book of the Dead demo and imported them into our project. This was rather difficult, because the Book of the Dead project uses its own customized HDRP render pipeline that is not compatible with the standard HDRP from Unity, which is why we could use neither its materials nor its wind system. Instead, we imported only the 3D models and textures and used a different material.

With the level design in mind, we first blocked out the important parts of the level, like the main trail, the river, the valley, and the viewpoint. After that, we carefully hand-placed the trees next to the main gameplay areas so that they sit naturally and correctly. The remaining trees were placed with the mass-placement tool of the Unity terrain system, which fills the remaining area with the selected trees randomly. After deleting unnecessary trees, the biggest challenge appeared: unlike tree placement, the Unity terrain in the HDRP version lacks a proper placement tool for grass, which in theory means that every single grass prefab has to be placed by hand. We soon figured out that this method is very time-consuming and frustrating. Having placed the first blades of grass near the trail, we dug deeper into the topic to find a solution. It turned out that there is a tool called Prefab Brush+ on the Asset Store which lets you place prefabs on terrain quite easily. With this tool installed, the further placement of grass was far easier and handier.

With all the ferns, clover, and grass from the Book of the Dead demo and new assets from Quixel, we were able to create a dense forest with reasonable performance. To get an eye for the right placement of foliage, we gathered quite a large amount of images and other references, for example from games like Star Wars Battlefront and The Vanishing of Ethan Carter.
How to do a proper wind system in HDRP?
As mentioned before, the trees from the Book of the Dead demo already had a material with wind animation; however, since we were not able to use it, we searched for a different solution. We came up with the idea of using the materials from Unity's Fontainebleau demo, since that project uses the same render pipeline and the same Unity version as ours. The Fontainebleau demo not only provides a wind material for trees, it also provides suitable materials for grasses and bushes. We only needed to assign the new materials with the old textures. After that, the correct settings had to be set in the wind volume as well as in the corresponding script that handles the wind movement globally, and the wind was working. Since the wind is driven by a regular wind volume and not baked into the shader as in the Book of the Dead demo, we could also use the wind source as the main influence on the leaf particles to give them a natural movement. By tweaking values like frequency, turbulence strength, and more, we finally got a result that looks reasonable and natural.
Character Controller
As our main goal was to create a very realistic-looking scene, we decided to use a first-person character controller, as we thought it would support the immersion.
To control the player, we used the “All-in-one first-person playercontroller” asset from the Unity Asset Store. It is a single script that is put on the player object in the scene and provides various options for player movement, such as jumping, crouching, and even different footstep sounds. However, we did have to change some lines of the script, as movement on a Unity terrain was not smooth but stuttering, even though correct physics materials were applied. The script was updating input, velocities, and camera rotations in Update() rather than in FixedUpdate(), which had to be changed. Also, to create a very smooth movement of the drone, which depends on the player’s movement and view direction, the script had to be adjusted accordingly. More on that in the Companion Drone Behavior section.
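As a minimal sketch of the pattern we moved the controller towards (simplified, not the asset's actual code): input is sampled every frame in Update(), while the rigidbody is driven in FixedUpdate() in sync with the physics step.

```csharp
using UnityEngine;

// Sketch: per-frame input sampling, physics-step movement.
// moveSpeed and axis names are illustrative defaults.
[RequireComponent(typeof(Rigidbody))]
public class PhysicsStepMovement : MonoBehaviour
{
    [SerializeField] private float moveSpeed = 4f;
    private Rigidbody body;
    private Vector2 input;

    private void Awake() => body = GetComponent<Rigidbody>();

    private void Update()
    {
        // Read input per rendered frame so no key presses are missed.
        input = new Vector2(Input.GetAxis("Horizontal"), Input.GetAxis("Vertical"));
    }

    private void FixedUpdate()
    {
        // Apply movement in the physics step to avoid the stutter we saw
        // when velocities were set in Update().
        Vector3 direction = transform.right * input.x + transform.forward * input.y;
        Vector3 velocity = direction.normalized * moveSpeed;
        body.velocity = new Vector3(velocity.x, body.velocity.y, velocity.z);
    }
}
```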
Player Input
The game only supports keyboard and mouse input; we did not include controller support. The input keys are the same as in most games: the WASD keys control the player’s movement direction, the mouse the looking direction. Additionally, left Ctrl is mapped to crouching. The F key triggers certain scene events, such as starting dialogues with the companion drone or sending the drone to scan a point of interest.
Player Movement
To move the player, the WASD input is used to calculate the player’s new velocity. The player has a rigidbody component, so movement is governed by Unity’s built-in physics engine. The player can move up slopes with a maximum angle of 55° with respect to the horizontal plane, which, for example, allows walking up staircases. If an obstacle leaves only a low gap above the ground, the player might be able to crouch beneath it to pass. Jumping down from an arbitrary height is possible without consequences, but not necessarily intended, as the level provides a fixed path to walk along.
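The slope limit can be checked against the ground normal under the player. The following is a simplified sketch; the 55° limit comes from our setup, while the ray length is a placeholder.

```csharp
using UnityEngine;

// Illustrative slope check against the ground normal.
public static class SlopeCheck
{
    private const float MaxSlopeAngle = 55f;

    public static bool CanWalkHere(Vector3 position)
    {
        if (Physics.Raycast(position, Vector3.down, out RaycastHit hit, 2f))
        {
            // The angle between the surface normal and world up equals the slope angle.
            float slope = Vector3.Angle(hit.normal, Vector3.up);
            return slope <= MaxSlopeAngle;
        }
        return false; // no ground found within range
    }
}
```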
Player Interactions
The level has several points of interest where the player can trigger interactions. For most of them, the drone will also start scanning one or multiple objects in the scene, and a dialogue with the drone is triggered. For some points, the player can only trigger a dialogue with the companion drone to learn more about the story. Lastly, the player is also able to interact with objects in the level in ways that change the environment, for example pushing a pillar forward to create a way across the river.
Companion Drone Behavior
One major game element is the player’s companion drone. It constantly follows the player, can scan artifacts in the environment, and provides the player with information about the world through conversations.
When the scene starts, the drone lies inactive on the ground in front of the player. The player has to interact with the drone first so that it boots up. After a short conversation, the drone decides to accompany the player.
Drone Movement
The basic idea was to keep the drone flying at a fixed position in front of, and thus relative to, the player, so that it can always be seen without blocking the first-person view. To achieve this, the drone “reverse-follows” the player: it reacts to the player’s movement but leads the way rather than actually following. When the player moves, the drone also moves, lerping its current position towards a fixed point in front of the player. Additionally, the drone’s look rotation adapts to the player’s look direction. The drone has two fixed points, one front-right and one front-left of the player. While lerping to its new position, the drone checks the distance to both fixed points and switches the point it is lerping to if the other one is suddenly closer, e.g. when the player quickly changes direction.
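A stripped-down sketch of this reverse-follow logic, assuming the two fixed points are anchor transforms parented to the player; the names and smoothing factors are illustrative.

```csharp
using UnityEngine;

// Sketch of the "reverse-follow" behaviour: the drone lerps to whichever
// of the two player-relative anchors is currently closer and copies the
// player's look direction.
public class DroneFollow : MonoBehaviour
{
    [SerializeField] private Transform player;
    [SerializeField] private Transform anchorFrontLeft;   // child of player
    [SerializeField] private Transform anchorFrontRight;  // child of player
    [SerializeField] private float positionLerp = 3f;
    [SerializeField] private float rotationLerp = 5f;

    private void LateUpdate()
    {
        // Pick the closer fixed point; this lets the drone swap sides
        // when the player turns around quickly.
        Transform target =
            (transform.position - anchorFrontLeft.position).sqrMagnitude <
            (transform.position - anchorFrontRight.position).sqrMagnitude
                ? anchorFrontLeft : anchorFrontRight;

        transform.position = Vector3.Lerp(
            transform.position, target.position, positionLerp * Time.deltaTime);
        transform.rotation = Quaternion.Slerp(
            transform.rotation, player.rotation, rotationLerp * Time.deltaTime);
    }
}
```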
If the player is not moving, the drone does not move either, apart from idle animations. As soon as the player stops moving, however, the drone changes its look direction to face the player, which makes it appear more “lively” and creates a more immersive feeling of having a companion drone at your side. If the player now looks around without moving, the drone remains in its position and only starts moving again once the player moves.
When the player interacts with a POI, the drone will in some cases fly in front of the artifact and start scanning it, which triggers a conversation with the player. After completing the scan, the drone flies back to its fixed position in front of the player.
Drone Interactions
The drone itself cannot trigger any interactions. However, if the player triggers an interaction, the drone reacts to it in most cases. It will either start a dialogue or start scanning an artifact.
Scan Effect
The scan effect consists of a group of rays that is set up in a horizontal line and rotated upwards. Each ray is constructed from an emissive sphere at its origin, an emissive line renderer, and a box light as narrow as the line, shining in the line’s direction to create the hit-highlight effect. To keep the line from showing through objects, its length is shortened to the distance returned by a raycast whenever the ray collides with an object within a defined length.
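The raycast clamping for a single ray can be sketched like this; the component wiring and the maximum length are illustrative.

```csharp
using UnityEngine;

// Sketch: shorten one scan ray's line renderer to the first raycast hit
// so the line does not pass through geometry.
[RequireComponent(typeof(LineRenderer))]
public class ScanRay : MonoBehaviour
{
    [SerializeField] private float maxLength = 5f;
    private LineRenderer line;

    private void Awake() => line = GetComponent<LineRenderer>();

    private void Update()
    {
        float length = maxLength;
        // Clamp the visible line to the first hit within maxLength.
        if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, maxLength))
        {
            length = hit.distance;
        }
        line.SetPosition(0, transform.position);
        line.SetPosition(1, transform.position + transform.forward * length);
    }
}
```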
Drone Animations
We also added animations to the drone to make its movements more realistic, but also to make it seem even more “lively”. The drone has two separate types of animations. The first type is the model animation, where the spikes of the drone are moved via rig animations. The spikes resemble the drone’s arms and have no functionality except expressing emotions. The second type is the display animation, which shows the facial emotion of the drone. The display itself is a world-space canvas on which different images are shown or animated depending on the drone’s current behaviour. To make the display look mechanical, an image effect is added to the canvas that overlays a grid for a “rasterized” look. An additional effect that pixelates the images would improve this further, but our effect already achieves the mechanical look.
Idle Animations
The drone enters the Idle state after both the player and the drone have been standing still for a couple of seconds. In this state, one of three emotions is triggered randomly every five to ten seconds. The emotions are happy, sad, and warning; for each, an animation clip of both the model and the display type is played to show the drone’s emotion.
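A minimal sketch of the random emotion timer, assuming animator triggers named after the three emotions; the check that the drone is actually in the Idle state is omitted here.

```csharp
using System.Collections;
using UnityEngine;

// Sketch: fire one of the three emotion triggers every five to ten seconds.
// Trigger names are placeholders for our actual animator parameters.
public class DroneIdleEmotions : MonoBehaviour
{
    [SerializeField] private Animator animator;
    private static readonly string[] Emotions = { "Happy", "Sad", "Warning" };

    private IEnumerator Start()
    {
        while (true)
        {
            yield return new WaitForSeconds(Random.Range(5f, 10f));
            // In the real behaviour this only runs while in the Idle state.
            animator.SetTrigger(Emotions[Random.Range(0, Emotions.Length)]);
        }
    }
}
```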
Movement Animations
As described before, the drone can be in three states: Idle, Moving, and Interacting. To be precise, there is also a fourth state in which the drone has not been turned on yet, but we can ignore it here, since both animation types are turned off in that state. In the Moving state, only the model animation changes, while the display animation is set to idle, which is the occasional blinking of an eye. The model animation blends between the two clips idle and flying depending on the drone’s speed. If the acceleration exceeds a specified threshold, however, the rolling clip is triggered, blending back to the two main clips once the rolling animation ends.
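Driven through an animator blend tree, this could look as follows; the parameter names “Speed” and “Roll” and the threshold value are placeholders, not our exact setup.

```csharp
using UnityEngine;

// Sketch: a blend-tree parameter follows the drone's speed, and a sudden
// acceleration fires the one-shot rolling clip.
public class DroneMovementAnimation : MonoBehaviour
{
    [SerializeField] private Animator animator;
    [SerializeField] private float rollAccelerationThreshold = 8f;
    private Vector3 lastPosition;
    private float lastSpeed;

    private void Start() => lastPosition = transform.position;

    private void Update()
    {
        float speed = (transform.position - lastPosition).magnitude / Time.deltaTime;
        float acceleration = (speed - lastSpeed) / Time.deltaTime;

        // Blend between the idle and flying clips via a float parameter.
        animator.SetFloat("Speed", speed);

        // A hard acceleration plays the rolling clip once.
        if (Mathf.Abs(acceleration) > rollAccelerationThreshold)
        {
            animator.SetTrigger("Roll");
        }

        lastPosition = transform.position;
        lastSpeed = speed;
    }
}
```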
Interaction Animations
The drone enters the Interacting state after the player has triggered an interaction, which is followed by either a scan or a dialogue as described above. Additionally, there is the initial interaction with the drone, which includes the booting sequence. Both the booting and the scanning behaviour have their own display animation that is played when the behaviour occurs.
For the dialogue, one of the three emotions is played whenever it fits the content of the conversation.
Audio
Music
To fit the setting of our game world, story, and level design, we decided to use a calm track that plays quietly throughout the game as background music. We chose the song “November” from Bensound.
SFX
To make the game feel cohesive and more lively, we added various SFX.
Footsteps
With our level design, we created different biomes the player can walk through. The main walkable part of the map is a forest, another big part is a swamp, and the rest is a grassy landscape. We also have many different rocks and rock formations, as well as concrete stairs and pillars throughout the level. For every different material the player can walk on, we added a different footstep sound. Therefore, we have walking sounds on grass, gravel, stone, mud, leaves, and bushes.
Drone
We added some sounds the drone emits when it shows certain emotions, such as being happy, sad, or when it is warning the player. Moreover, the drone emits SFX when it is booting and when it is scanning.
Ambient
The wind system we added comes with its own sound system that emits wind-blowing sounds throughout the level. Additionally, interactables such as the movable pillar play sounds when the player interacts with them. Another example is the campfire, which constantly emits a crackling sound as it burns.
Water Behavior
There are two kinds of water effects in our scene: a river that flows flat through the environment, and a waterfall made of individual particles.
Shading an interactive water surface
In order to visualize a river flowing through the environment, several effects need to be implemented: the usual effects that can be seen in basic water shaders (small waves, reflection, and refraction), collision with the static environment, and directional flow. Additionally, we add dynamic collisions to visualize the interaction with moving objects.
Basic water shader
A simple transparent plane object is used as the base of the river. The effect of small waves is mostly created by the distortion of reflection and refraction on the surface, which gives the impression of height displacement on a flat surface. Using Unity’s scriptable render pipeline HDRP (High Definition Render Pipeline), we have multiple options to render reflections: Sky Reflections, Screen Space Reflections, Reflection Probes, and Planar Reflection Probes. We use the Planar Reflection Probe because it is the most accurate method. The resulting reflection depends on the normal vector of the surface, so it is enough to manipulate the normals via a normal map. The normal map is sampled twice in the shader, and both samples are moved over time in different directions to create a less repetitive wave pattern. To visualize refraction, the pre-rendered image behind the surface is offset by the direction and strength of the normal vector at each pixel. Blending the results of reflection and refraction with the Fresnel term yields the base of the river. Optionally, we could have added a light-absorption effect under water, but since our river is not deep, this is not needed in our case.
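In our case the two normal-map samples are scrolled inside the shader itself, but the idea can be illustrated from C# by animating two texture offsets in different directions; the property names here are placeholders.

```csharp
using UnityEngine;

// Illustrative only: scroll two normal-map samples in different directions
// over time so the combined wave pattern repeats less obviously.
public class ScrollingWaterNormals : MonoBehaviour
{
    [SerializeField] private Renderer waterRenderer;
    [SerializeField] private Vector2 directionA = new Vector2(0.03f, 0.01f);
    [SerializeField] private Vector2 directionB = new Vector2(-0.02f, 0.025f);

    private void Update()
    {
        Material mat = waterRenderer.material;
        // "_NormalMapA"/"_NormalMapB" are placeholder shader property names.
        mat.SetTextureOffset("_NormalMapA", directionA * Time.time);
        mat.SetTextureOffset("_NormalMapB", directionB * Time.time);
    }
}
```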
Collision with the static environment
Since the river flows through a static environment, the waves must not move through the terrain but hit the border of the river. For a predefined wave movement, we use a flow map whose red and green values describe the flow direction on the x-z axes at the corresponding position. To generate the flow map, we take the heightmap of the terrain and convert it into a normal map, which can serve as a flow map, since the flow direction basically follows the normal direction. In the shader, the normal map created for the base water as described above is stretched over time, repeatedly, in the direction given by the flow map. This creates a flow effect on the surface. The effect is applied twice with an offset in time and blended to smoothen the loop.
Inspiration for this approach came from a tutorial video on flow shaders.
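The heightmap-to-flow-map conversion described above can be sketched as a simplified offline bake that encodes the downhill gradient into the red and green channels; this is an illustration of the idea, not our exact implementation.

```csharp
using UnityEngine;

// Sketch: derive a flow map from the terrain heightmap. The flow direction
// at each texel is the negated (downhill) gradient, remapped into RG.
public static class FlowMapBaker
{
    public static Texture2D Bake(TerrainData terrain, int resolution)
    {
        float[,] heights = terrain.GetHeights(0, 0, resolution, resolution);
        var flowMap = new Texture2D(resolution, resolution, TextureFormat.RGFloat, false);

        for (int y = 1; y < resolution - 1; y++)
        {
            for (int x = 1; x < resolution - 1; x++)
            {
                // Central differences give the slope; water flows downhill,
                // hence the negated gradient.
                float dx = heights[y, x + 1] - heights[y, x - 1];
                float dy = heights[y + 1, x] - heights[y - 1, x];
                Vector2 flow = new Vector2(-dx, -dy).normalized;

                // Remap from [-1, 1] to [0, 1] for storage in a texture.
                flowMap.SetPixel(x, y,
                    new Color(flow.x * 0.5f + 0.5f, flow.y * 0.5f + 0.5f, 0f));
            }
        }
        flowMap.Apply();
        return flowMap;
    }
}
```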
Directional flow
The river itself has a flow direction that is not yet visualized by the previously described effects. The flow map generated from the terrain heightmap could have achieved the river-flow effect, but the slope of the river on our terrain is too low to show up in it. That is why we create another flow map and add another flow effect onto our water surface. As input for this flow map, we place several “flow zone” objects in our environment and use their Transform properties to describe the flow inside each zone: the position describes the centre of the zone, the rotation the direction of the flow, and the scale the strength of the flow as well as the radius and falloff radius of the zone. These inputs are passed to a compute shader which generates a flow map from them. The new flow map is then used in the water shader to create an additional flow behaviour on the water surface. Potentially, the dynamically generated flow map can also be used for other effects such as wind zones.
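A sketch of the C# side that collects the flow zones and dispatches the compute shader; the struct layout, the kernel name “GenerateFlowMap”, and the shader property names are assumptions for illustration, and our actual shader differs in detail.

```csharp
using UnityEngine;

// Sketch: pack flow-zone Transforms into a buffer and dispatch a compute
// shader that writes the combined flow map.
public class FlowZoneBaker : MonoBehaviour
{
    private struct FlowZone
    {
        public Vector2 position;   // x-z centre, from transform.position
        public Vector2 direction;  // x-z forward, from transform.rotation
        public float strength;     // from transform.localScale
        public float radius;
        public float falloff;
    }

    [SerializeField] private ComputeShader flowShader;
    [SerializeField] private Transform[] flowZoneObjects;
    [SerializeField] private RenderTexture flowMap; // needs enableRandomWrite

    public void Bake()
    {
        var zones = new FlowZone[flowZoneObjects.Length];
        for (int i = 0; i < zones.Length; i++)
        {
            Transform t = flowZoneObjects[i];
            Vector3 fwd = t.forward;
            zones[i] = new FlowZone
            {
                position = new Vector2(t.position.x, t.position.z),
                direction = new Vector2(fwd.x, fwd.z).normalized,
                strength = t.localScale.y,
                radius = t.localScale.x,
                falloff = t.localScale.z
            };
        }

        // 7 floats per zone: 2 + 2 + 1 + 1 + 1.
        var buffer = new ComputeBuffer(zones.Length, sizeof(float) * 7);
        buffer.SetData(zones);

        int kernel = flowShader.FindKernel("GenerateFlowMap");
        flowShader.SetBuffer(kernel, "_FlowZones", buffer);
        flowShader.SetInt("_ZoneCount", zones.Length);
        flowShader.SetTexture(kernel, "_FlowMap", flowMap);
        flowShader.Dispatch(kernel, flowMap.width / 8, flowMap.height / 8, 1);
        buffer.Release();
    }
}
```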
Dynamic collisions
Our environment also contains moving objects that collide with the river, e.g. the falling pillar. We could have added another dynamically generated flow map with the moving objects as inputs, but to decouple this behaviour, which changes at runtime, we use decals instead. Decals are a feature of Unity HDRP that manipulates the color and normals of any surface a decal projector intersects. We manipulate only the normals of the surface, using a normal map with a water-ripple pattern. Our effect prefab, which is placed at the collision position, animates a group of decal projectors so that it behaves like water ripples.
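Spawning the ripple prefab at the point of impact can be sketched like this; the trigger setup on the water plane and the names are illustrative.

```csharp
using UnityEngine;

// Sketch: attached to the water plane (with a trigger collider). When a
// moving object such as the falling pillar enters, the decal-ripple prefab
// is placed on the surface at the point of impact.
public class WaterRippleSpawner : MonoBehaviour
{
    [SerializeField] private GameObject ripplePrefab;
    [SerializeField] private float waterHeight = 0f;

    private void OnTriggerEnter(Collider other)
    {
        // ClosestPoint approximates the contact point (convex colliders only).
        Vector3 contact = other.ClosestPoint(transform.position);
        contact.y = waterHeight; // keep the ripple on the water surface
        Instantiate(ripplePrefab, contact, Quaternion.identity);
    }
}
```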
Simulating a surrealistic water behaviour with particles
In our space setting with gravitational anomalies, we initially decided to also give the water a surrealistic behaviour, floating in weird, unnatural ways.
First approach: VFX graph
One way to achieve this effect is the liquid effect from the Unity project provided by alelievr on GitHub. In this method, the VFX Graph of HDRP is used to manipulate single water particles. Each particle can be moved however we like, which lets us visualize surrealistic water behaviour. To keep the overall effect looking like water, alelievr takes several steps.
First, the particles of the VFX Graph are rendered as unlit, stretched spheres with their local vertex offsets as colours (xyz as rgb). Second, a Custom Pass renders only the VFX particles into a temporary buffer. This buffer is blurred via a compute shader so that the particles blend smoothly into each other. Using the pre-rendered image of the main opaque pass and the blurred buffer as normal input, a dynamic and volumetric water effect with refraction and reflection is generated in the buffer. Finally, this buffer is applied to the final render seen by the player. However, we encountered a problem in this last step: the depth buffer of the effect seems to be written incorrectly, which leads to false culling of the effect. Because this bug is highly visible to the player, we decided to discard the effect.
Simple waterfall
Instead of starting and ending the river surrealistically, we now add a waterfall at each end using the Particle System. We use a particle effect from the Particle Pack by Unity Technologies from the Asset Store (https://assetstore.unity.com/packages/essentials/asset-packs/unity-particle-pack-5-x-73777) and adjust the materials to make the effect work in HDRP.