Department of Astromaterial Acquisition
Academic / Group
Unity, C#, GitLab, HDRP
Tech Lead, Game Dev
08/2020 - 04/2021
Department of Astromaterial Acquisition (DAA) is a first-person voxel sandbox game built in Unity. I leveraged some of Unity's newest technology, such as HDRP, to create it.
As a tech lead, I was mainly responsible for integrating all the technologies my teammates made into the main game. Iterating on the game experience was also a big part of my responsibilities.
A shoutout to my teammates for making the terrain generation algorithm! They successfully multithreaded a chunked marching-cubes algorithm on a spherical world and made it run at 60 FPS! In addition, they wrote a fantastic blog about how they did it here.
The team started with 7 programmers and grew to 11. Unfortunately, we had no artists, sound designers, or game designers.
The three videos below give an overview of what the game is, and my personal contributions are described under the colored bars.
Leadership: Technical Lead
I was the Technical Lead of up to ten other programmers. I set up most of the initial project and determined the technology stack we worked with.
As the member with the most Unity experience, I also served as tech support, going around helping people break through their problems. I generally gave the more creative and competent programmers space to do their magic while mentoring the less experienced ones.
As a leader, I also looked out for everyone's mental state and kept morale balanced. I hosted games of Among Us as team builders and organized a hotpot dinner to get everyone together.
As Tech Lead, I determined which third-party libraries we needed to use and why. Below is a sample of the Excel sheet I used to justify each third-party library.
I set up standards like the coding style and the folder structure in Unity. For coding style, we also used ClangFormat to format our code.
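As an illustration, a ClangFormat configuration for a C# Unity project can look like the sketch below; the options shown are hypothetical, not our actual settings.

```yaml
# Hypothetical sketch of a .clang-format for a C# codebase;
# the project's real configuration is not reproduced here.
Language: CSharp
BasedOnStyle: Microsoft
IndentWidth: 4
ColumnLimit: 120
BreakBeforeBraces: Allman
```

Checking a file like `.clang-format` into the repository root lets every teammate format consistently with `clang-format -i`.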
Over the course of the project, I managed my team members by keeping their task lists updated and staying informed about their progress. I was also responsible for integrating each feature into the "Main Scene," so I personally closed each issue.
I dedicated a portion of my work to designing and coding the main ways the player interacts with the game. First, I laid the foundation of the code following the Model-View-Controller (MVC) paradigm, where the View depends on the Model (data). This foundation facilitated easy iteration of the user interface, shown below.
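A minimal sketch of that dependency direction, with hypothetical class names (the actual names in the project differ):

```csharp
using System;

// Model: plain data plus a change event; knows nothing about the UI.
public class InventoryModel
{
    public event Action<int> SelectedSlotChanged;
    int selectedSlot;

    public int SelectedSlot
    {
        get => selectedSlot;
        set { selectedSlot = value; SelectedSlotChanged?.Invoke(value); }
    }
}

// View: subscribes to the Model and redraws itself. Because the Model
// never references the View, the UI can be swapped or restyled freely
// while iterating, which is what made the redesigns below cheap.
public class HotbarView
{
    public HotbarView(InventoryModel model)
    {
        model.SelectedSlotChanged += slot => Highlight(slot);
    }

    void Highlight(int slot) { /* move the highlight frame in the UI */ }
}
```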
User Interface Breakdown
I designed the initial UI to be similar to Minecraft's. Implementing it also validated the MVC architecture.
Initial UI design and explanations
After the initial concept, I wanted to keep the main screen open so the player could see the beautiful world. However, pushing all the UI elements to the edges caused the player's eyes to wander. Therefore, I wanted to minimize the area the eyes needed to scan.
There was also UI for orientation and the jetpack that gave little indication of what it represented, so I came up with the reactive UI concept to resolve the confusion.
The final iteration of the user interface with explanation
In the same way, I revised the crafting/inventory menu over time to shrink the amount of space the UI used and keep the eyes focused.
Old interface: suffers from large empty spaces, wandering eyes, and an unclear layout
New interface: abides by the rule of thirds, keeps the eyes on the center, and is more intuitive
Reactive UI: Jetpack
Equipping the jetpack lights up its UI, and the color corresponds to the jetpack level.
Level 1 Jetpack
Level 3 Jetpack
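The reactive jetpack icon can be sketched roughly as follows; the component, field, and method names are illustrative, not the project's actual code.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the reactive jetpack icon: hidden until equipped,
// tinted by the jetpack's level once shown.
public class JetpackIcon : MonoBehaviour
{
    [SerializeField] Image icon;
    // One tint per jetpack level (level 1 at index 0, and so on).
    [SerializeField] Color[] levelColors;

    public void OnJetpackEquipped(int level)
    {
        icon.enabled = true;                 // light the icon up
        icon.color = levelColors[level - 1]; // color matches the level
    }

    public void OnJetpackUnequipped() => icon.enabled = false;
}
```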
Reactive UI: Depth Meter
Formerly an orientation arrow, this was changed to a depth meter that only comes up when the player equips the Goggles.
Reactive UI: Selected Item
Feedback indicated players did not understand what their selected item did, so I added icons and UI that show which actions are available to the player at any point.
Reactive UI in Action
Informing User of the World - Feedback
Initially, I designed the feedback like Minecraft's: the player had to figure out which resources they needed and collect them. However, we quickly realized that this does not work well in our voxel design because a player cannot pinpoint a voxel, and a single voxel can contain multiple ores. Players therefore got a handful of everything when mining a voxel and became confused about what they were drilling.
So I quickly designed and implemented the Ore Analyzer. The team loved it so much that we eventually incorporated it into the advanced mining equipment as an attachment.
The necessity to 'aim' the deformer tools quickly became apparent. Minecraft does not need such a thing because its world is flat; our spherical game world was not. Hence I designed and implemented the aiming feedback system, which tells the player where the tool will deform.
A raycast + a projected circle to notify players where it'll deform
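The raycast-plus-circle feedback can be sketched like this; the field names (e.g. `aimCircle`) are illustrative rather than the project's actual code.

```csharp
using UnityEngine;

// Sketch of the aiming feedback: raycast from the camera, then place a
// circle visual (e.g. a decal or flattened quad) on the surface that
// the deformer tool would affect.
public class DeformerAim : MonoBehaviour
{
    [SerializeField] Camera playerCamera;
    [SerializeField] Transform aimCircle; // the projected circle visual
    [SerializeField] float maxRange = 8f;

    void Update()
    {
        var ray = new Ray(playerCamera.transform.position,
                          playerCamera.transform.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, maxRange))
        {
            aimCircle.gameObject.SetActive(true);
            aimCircle.position = hit.point;
            // Align the circle with the surface normal, which matters on
            // a spherical world where "up" changes constantly.
            aimCircle.rotation = Quaternion.LookRotation(hit.normal);
        }
        else
        {
            aimCircle.gameObject.SetActive(false);
        }
    }
}
```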
Real-Time Physically-Based Rendering
I took charge of the High Definition Render Pipeline (HDRP) integration with the game. A lot of this involved adjusting how the game looks without an artist.
Day-Night Cycle Lighting
I leveraged Unity's lighting system to write a day-night system. There were many difficulties adjusting the light so that it works all the way from space down to underground.
Another issue was that we wanted three global sources of light: the sun, the moon, and the stars. I had to delicately balance all three against each other.
Controlling the sun, moon, and star positions and emissions over time
I leveraged Unity's atmospheric lighting and adjusted it to work on a spherical world. Because of the open nature of the world, I spent much time tuning the numbers so the atmosphere would look pleasant from up high, from the surface, and from the caves. I also had to balance this with the real-time lighting system, from dawn to the night sky.
Spherical Atmospheric Lighting
Camera Exposure Settings
Unity bases its PBR on real-life cameras, which work with an exposure setting. In real life, a cameraperson would manually adjust the exposure; for example, high noon requires a vastly different exposure setting than nighttime.
I experimented with many different functions that change the exposure according to the player's condition. In the end, I calculated the 'brightness' from the past few frames, giving more weight to the pixels near the middle of the screen and taking the average. To get rid of outlier values in a frame, I removed values below the 10th and above the 90th percentile. Finally, I limited the final exposure value to the range [3, 13].
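The metering logic can be sketched in plain C# as below. This is only an illustration of the computation; in the game, HDRP's Automatic Histogram exposure mode performs the equivalent work on the GPU, and the names here are hypothetical.

```csharp
using System;
using System.Linq;

// Illustrative sketch of center-weighted, percentile-trimmed metering.
public static class ExposureMetering
{
    // brightness: per-pixel luminance; weights: larger near screen center.
    public static float Evaluate(float[] brightness, float[] weights)
    {
        var sorted = brightness.OrderBy(b => b).ToArray();
        float p10 = sorted[(int)(0.10f * (sorted.Length - 1))];
        float p90 = sorted[(int)(0.90f * (sorted.Length - 1))];

        // Center-weighted average of the non-outlier pixels.
        float sum = 0f, weightSum = 0f;
        for (int i = 0; i < brightness.Length; i++)
        {
            if (brightness[i] < p10 || brightness[i] > p90) continue;
            sum += brightness[i] * weights[i];
            weightSum += weights[i];
        }
        float average = weightSum > 0f ? sum / weightSum : 0f;

        // Clamp the final exposure value to [3, 13].
        return Math.Clamp(average, 3f, 13f);
    }
}
```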
Debug view of the Automatic Histogram mode with the described settings
Below is simplified code for controlling the sun and star emissions by the time of day.
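This sketch reconstructs the idea under stated assumptions: a directional light as the sun, an emissive star dome material, and intensity curves keyed over one day. The field names and the `_EmissionStrength` shader property are illustrative, not the project's actual identifiers.

```csharp
using UnityEngine;

// Simplified sketch: rotate the sun over one day and cross-fade the
// sun's and stars' emission by the time of day.
public class DayNightCycle : MonoBehaviour
{
    [SerializeField] Light sun;                   // directional light
    [SerializeField] Material starMaterial;       // emissive star dome
    [SerializeField] float dayLengthSeconds = 600f;
    [SerializeField] AnimationCurve sunIntensity; // keyed over [0, 1]
    [SerializeField] AnimationCurve starEmission; // bright at night

    float timeOfDay; // 0 = midnight, 0.5 = noon

    void Update()
    {
        timeOfDay = (timeOfDay + Time.deltaTime / dayLengthSeconds) % 1f;

        // One full revolution per day, offset so 0.25 is sunrise.
        sun.transform.rotation =
            Quaternion.Euler(timeOfDay * 360f - 90f, 0f, 0f);

        sun.intensity = sunIntensity.Evaluate(timeOfDay);
        // "_EmissionStrength" is a hypothetical shader property name.
        starMaterial.SetFloat("_EmissionStrength",
                              starEmission.Evaluate(timeOfDay));
    }
}
```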
Shader Graph and HLSL
My work also revolved around creating visual effects using Unity's Shader Graph and HLSL.
I leveraged Unity's High Definition Render Pipeline (HDRP) to create an outline shader. This shader is used throughout the game to highlight objects of interest to the user.
Early in the project, I was in charge of making the ground "burn." To this end, I prototyped many different burning effects and evaluated their use in the game. I used Unity's Shader Graph to create these effects. In the end, these effects were too difficult to implement with a spherical chunked world. However, we reused these effects for the death animations of the enemies.
The effect is made by mixing a distortion effect with a heatwave effect. The heatwave effect is split into four different levels and then recombined to create a more "wavey" look and to make the colors more distinct.
Crafting Tool for Designers
I created a graphical tool that lets designers visually arrange the ingredients needed to craft an item. Again, I leveraged Unity's graph system and the open-source xNode project to do this.
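With xNode, a crafting recipe can be modeled roughly as below. This is a sketch only: the node, graph, and field names differ from the actual tool, and `ItemDefinition` is a stand-in stub for the project's item type.

```csharp
using UnityEngine;
using XNode;

// Stand-in for the project's item ScriptableObject.
public class ItemDefinition : ScriptableObject { }

// A node that emits one ingredient.
public class IngredientNode : Node
{
    [Output] public ItemDefinition item;

    public override object GetValue(NodePort port) => item;
}

// A node that combines connected ingredients into a crafted result.
public class RecipeNode : Node
{
    [Input] public ItemDefinition ingredient; // supports multiple connections
    [Output] public ItemDefinition result;

    public override object GetValue(NodePort port) => result;
}

// An asset of this graph type is what designers arrange visually.
[CreateAssetMenu(menuName = "Crafting/Recipe Graph")]
public class RecipeGraph : NodeGraph { }
```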
Backend Item Database
I spearheaded the decision to use Unity's ScriptableObjects (SO) for items. Using SOs made life much easier because only one canonical instance is defined at any point in time; when we need the object in-game, we clone it from the prototype. This decision simplified much of the code.
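The prototype-and-clone pattern can be sketched as follows; the class and menu names are illustrative.

```csharp
using UnityEngine;

// One asset of this type per item is the single source of truth.
[CreateAssetMenu(menuName = "Items/Item")]
public class ItemSO : ScriptableObject
{
    public string displayName;
    public Sprite icon;
    public int maxStack = 64;
}

public static class ItemFactory
{
    // Clone the prototype so runtime mutations never touch the asset.
    public static ItemSO CreateInstance(ItemSO prototype) =>
        Object.Instantiate(prototype);
}
```

Cloning via `Object.Instantiate` keeps the authored asset immutable at runtime while giving each in-game item its own mutable state.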