Fall semester [Systems & Level Design]:

Stage 1: Planning

We wanted the vertical slice for Visualizer to blend together elements of games like Crypt of the Necrodancer and Doom.

We took inspiration from how Crypt of the Necrodancer imbues every aspect of its levels and environment with the music, making these elements move and flash on beat. Using it as a reference, our main focus was making the player not only predict the beat in Visualizer, but feel it.

We drew inspiration from Doom to help guide the development of our weapon and enemy behavior and interactions. In Doom, each enemy feels unique and distinct from every other enemy, and as such asks the player to adopt a specific strategy to best defeat it. We wanted to mimic this by creating enemies that could be easily classified into roles like “tank,” “grunt,” or “ranged,” giving us more control over the gameplay that would emerge from different enemy spawn combinations and locations.

For our vertical slice, we aimed to set our game in a night club, as this was the environment most conducive to musically influenced level design. We planned to have three distinct enemies and two unique weapons at the player’s disposal. We wanted to enrich combat by developing a combo system: the combo would increase when the player shot on beat, and with a high enough combo the player could use either a dash or an area-of-effect blast ability. We wanted to reward skilled play and give the player as many reasons to shoot on beat as possible. For the vertical slice game loop, we planned a wave-based progression system that would increase in difficulty on each subsequent floor of the club.
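The combo logic described above can be sketched in a few lines. This is a minimal illustration, not our actual Unity code (the game was written in C#); the class name, threshold values, and reset-on-miss behavior are all assumptions for the example.

```python
class ComboTracker:
    """Tracks the player's on-beat shot streak and gates abilities on it."""

    DASH_COST = 5   # combo needed to dash (hypothetical value)
    AOE_COST = 10   # combo needed for the AoE blast (hypothetical value)

    def __init__(self):
        self.combo = 0

    def register_shot(self, on_beat: bool) -> None:
        # On-beat shots build the combo; an off-beat shot resets it,
        # giving the player a constant incentive to stay on rhythm.
        self.combo = self.combo + 1 if on_beat else 0

    def can_dash(self) -> bool:
        return self.combo >= self.DASH_COST

    def can_aoe_blast(self) -> bool:
        return self.combo >= self.AOE_COST
```

The key design point is that abilities are read-only checks against the streak, so adding a new combo-gated ability is just another threshold.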


We were very cognizant of the difficulties of tying elements of our game to the beat of the music, and we designed both Visualizer and our development road-map with that in mind. I had prior experience creating audio visualizers in Unity and reading audio samples via microphone or file input, so I had a basic idea of how we could build a rudimentary beat detection system. That said, we wanted to allocate plenty of development time to polishing and refining this behavior, as it was the backbone of the game.

With no dedicated programmer on our team, Simon Estabrook and I were very meticulous about how we split our design and programming tasks. We decided that I would take the game loop, weapon functionality, sound design/programming, and level design, while Simon would take AI, player movement/abilities, UI programming, and beat detection. On top of this, we kept up a strong back-and-forth design discussion whenever questions or concerns arose about a system or the overall game direction. This was the most effective means of task delegation because our respective tasks overlapped enough that we were both constantly in the loop on what the other was working on. It gave us an equal understanding of the design direction and allowed us to address any issues very quickly.

Additionally, since we had no dedicated producer either, we needed to do our part in team management, sprint planning, and road-mapping. The team decided very early on to handle these production roles as a collective, but we recognized that we needed dedicated members for certain aspects in order to stay organized. As such, one member was dedicated to sprint planning, one to road-mapping and documentation organization, and I took on the loose role of product manager and team manager.

Stage 2: Vertical Slice Development

By the time we decided to continue with our prototype of Visualizer, we had rudimentary player movement, a combo system, AI, and beat detection implemented, and we were ready to start building out our play area as we continued to refine these features. I worked to block out our first level to get a sense of scale and shape language, so I could establish a strong pipeline with my environment artist that would make building future levels faster and more efficient.

Level 1: The starting area where the player has time to get their bearings on movement, abilities, and shooting before being put into combat.

Due to the lack of AI programming experience on the team, we knew that our vertical slice AI would be a bit basic. We planned for the AI to have a more generic “swarm” tendency, since that was more plausible given our time and experience. Taking this into account, I started playing with areas of verticality in the level design that would let the player more easily create distance from the AI by moving to higher areas of the level. I also included areas of cover, like bars and counters, in the flatter areas that the player could use to kite enemies and maintain that distance.

Level 1: The blockout for the main room, areas in blue represent couches, counters, and stairs, while the areas in green represent plant beds, decorations and other non-gameplay influencing assets.

With our basic play area blocked out, it was ready to be populated, receiving edits as needed while we playtested. With the first level headed in the right direction, my focus shifted from level design to designing and programming the basic functionality for our two primary weapons. At this point our player movement and AI behavior were starting to come together, which highlighted the weaknesses in our weapon behavior. Our prototype was using projectile-based weapons, which made shooting an enemy on beat extremely hard: the player had to factor in bullet travel time and account for player and enemy movement on every shot. It was a quick and easy decision to move to hitscan weapons, allowing shots to make contact instantly with whatever the cursor was aimed at. This entirely removed the travel-time element from our weapons, and we immediately saw a huge uptick in playability in our QA tests.
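The hitscan idea reduces to a single instantaneous ray test at the moment of the shot. In Unity this would be a Physics.Raycast call; the 2D Python sketch below is only an illustration of the underlying math, with circular enemies and illustrative names standing in for the real colliders.

```python
import math

def hitscan(origin, direction, enemies, max_range=100.0):
    """Return the nearest enemy hit by an instantaneous ray, or None.

    origin and direction are 2D tuples; each enemy is ((x, y), radius).
    Simplified stand-in for an engine raycast against sphere colliders.
    """
    ox, oy = origin
    # Normalize the aim direction so t measures distance along the ray.
    length = math.hypot(*direction)
    dx, dy = direction[0] / length, direction[1] / length

    best, best_t = None, max_range
    for enemy in enemies:
        (ex, ey), radius = enemy
        # Project the enemy center onto the ray (closest-approach parameter).
        t = (ex - ox) * dx + (ey - oy) * dy
        if t < 0 or t > best_t:
            continue  # behind the muzzle, or farther than the current hit
        # Perpendicular distance from the ray to the enemy center.
        px, py = ox + t * dx, oy + t * dy
        if math.hypot(ex - px, ey - py) <= radius:
            best, best_t = enemy, t
    return best
```

Because the hit resolves in the same frame as the trigger pull, the on-beat timing check can be made against the shot itself with no travel-time fudge factor.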

In our early concepting phase we wanted to include a number of unique weapons to vary gameplay. However, since we were unsure we would be able to create and polish that many, we settled on featuring just our basic pistol and shotgun in the vertical slice. As a stretch goal we wanted to leave room to create more weapons, so I built our weapon manager with that in mind, writing a script modular enough to allow new weapon integration with relative ease. I also took extra care to make each weapon’s functionality as customizable as possible to allow rapid iteration on design and balance; for example, we had full control over the number of pellets the shotgun fired and over the pellet spread.
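A data-driven setup like the one described can be sketched as a config record plus one shared firing path. This is a minimal Python illustration, not our actual weapon manager; the field names and balance numbers are invented for the example.

```python
import random
from dataclasses import dataclass

@dataclass
class WeaponConfig:
    """Designer-tunable weapon parameters (example values, not shipped balance)."""
    name: str
    pellets: int            # rays fired per trigger pull
    spread_degrees: float   # total width of the spread cone
    damage: int             # damage per pellet

def fire(config: WeaponConfig, aim_angle: float) -> list:
    """Return one ray angle per pellet, jittered within the spread cone.

    A pistol with pellets=1 and spread=0 degenerates to a single
    perfectly accurate shot, so every weapon shares this one code path.
    """
    half = config.spread_degrees / 2.0
    return [aim_angle + random.uniform(-half, half)
            for _ in range(config.pellets)]

pistol = WeaponConfig("pistol", pellets=1, spread_degrees=0.0, damage=10)
shotgun = WeaponConfig("shotgun", pellets=8, spread_degrees=12.0, damage=4)
```

Adding a new weapon is then just a new config record, which is the property that made rapid balance iteration cheap.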

At this point, with our weapon manager created, we had all of our basic systems in place and testable. We were ready to implement our character models, animations, and environment art as we continued to polish our systems and mechanics. We jumped back into level design to finish level 1, populating it with the new assets and incorporating as much feedback as we could.

One of the biggest challenges early on with Visualizer was communicating the beat to the player. We went through a few iterations of our beat detection system before landing on a solution that worked surprisingly well. In the earliest rendition of our prototype, we tried to isolate the drum frequency in Nirvana’s “Smells Like Teen Spirit,” which only occasionally detected the proper beat. Through some research we realized that detecting the beat this way would be impossible, and that building a system to detect a beat automatically was beyond our current skill sets and time frame. We instead transitioned to tracking the beat in real time using the song’s BPM, effectively driving gameplay like a metronome. We still wanted the flexibility of using other songs, however, so we created a basic framework for following a song’s beat by playing a muted “click track” over the music: whenever the click track produced a sound, we called our beat functions. This gave us more control over the game’s music and allowed for a more varied and interesting beat.
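The metronome approach boils down to one timing function: given the song’s BPM and a timestamp, how far is that timestamp from the nearest beat? The sketch below illustrates the idea; the tolerance window and function name are assumptions for the example, not Visualizer’s tuned values.

```python
def is_on_beat(shot_time: float, bpm: float,
               song_start: float = 0.0, window: float = 0.1) -> bool:
    """Metronome-style beat check.

    True if shot_time falls within `window` seconds of the nearest
    beat, where beats tick every 60/bpm seconds from song_start.
    """
    beat_interval = 60.0 / bpm
    elapsed = shot_time - song_start
    # Offset into the current beat, then distance to the nearer edge,
    # so a shot slightly early or slightly late both count as on beat.
    offset = elapsed % beat_interval
    distance = min(offset, beat_interval - offset)
    return distance <= window
```

At 120 BPM the beat interval is 0.5 s, so a shot at t = 1.45 s lands 0.05 s before a beat and counts, while a shot at t = 1.25 s sits exactly between beats and misses.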

With a more established beat detection system, we were better able to play the game, but we were still lacking in visual feedback. We wanted our UI to be as diegetic and unobtrusive as possible, leaning into our Crypt of the Necrodancer inspiration a bit more. The idea was to have level elements pulse to the beat, immersing the player in the environment and making them feel the beat rather than just see it in a visualizer or UI bar. Below is an example of how we tried to do this:

The club walls were given an audio visualizer effect, environment materials were swapped on beat, and the player’s reticle pulsed on the beat.

In addition to visual feedback, the game’s audio feedback was sorely lacking. Players had a very hard time telling whether they were shooting on beat or off beat, with only a particle trail to gauge their success by. To remedy this, I developed our on-beat and off-beat sounds for both the pistol and the shotgun, remixing and combining different weapon sound files to distinguish the two. One of the biggest challenges was designing the sounds so that it was immediately clear whether a shot was on beat or off beat. I tried a number of variations, making on-beat sounds feel more “full” and off-beat sounds feel pathetic and unpleasant. Ultimately, with the assistance of particle effects, camera shake, and multiple rounds of iteration, we reached a point where on- and off-beat shots felt distinct from each other.

In addition to this, I had to be very mindful of removing the sense of realism, not only from the weapon sounds but from all game sounds. We were careful to design our enemies and setting in a way that removed a sense of “realness,” as we were concerned about the implications of setting our game in a night club. Because of this, I faced unique challenges like making a shotgun sound simultaneously like a laser gun while retaining enough of a shotgun blast to still read as a shotgun to the player.

By the time most of our SFX were in the game, the semester was closing in and we needed to iron out our final game loop. This meant adding a second level. Learning from the mistakes I had made in the first level, I created a much more open floor plan.

Level 2: A bird’s eye view

In the previous level, I had noticed players getting stuck or hung up on the corners of objects, which made traversing the level very difficult and punishing. This was especially true given that we had been using our strongest enemy, the “Brutilizer,” in tandem with our two weaker enemies, the “Drifter” and the “Strider.” Opening up the design in level 2 let the player make better use of their dash ability, kite enemies to pull off a strong AoE attack, and avoid the Brutilizer’s projectiles. This contributed to a much fairer fight on both levels 1 and 2: level 1 now featured only the “Drifter” and “Strider,” and level 2 was large and open enough that the player could face all three enemy types at once.

Level 2: I provided the player with two raised platforms each with multiple exit points that the player could use to pick off enemies from a distance and take a moment to release some tension.

Creating the second level highlighted the biggest flaw in our game at the time: the game loop. Upon entering the level, the player was required to shoot the play button by the stage to start the game, and upon completing the four waves, they needed to exit through a different elevator than the one they had entered. To compound this, almost none of that was effectively communicated to the player, barring the play button mechanic. Instead we relied on context clues and vague UI text telling the player to “go to the elevator” or “shoot the play button.”


I developed a tutorial in which the player walked down a one-way hallway leading to the elevator we had been using for scene switching and general progression. Along the way, the player was stopped by walls that each contained a checklist of items to complete, like “Shoot on beat 5 times” or “Use the Dash ability.” Only once all of a wall’s requirements had been met was the player allowed to shoot the play button at its base, which removed the wall and let the player progress to the next one. This both taught the player how all of our game mechanics worked and developed a design language that read “Play button = Start” and “Elevator = Finish.” Overall this helped communicate to the player how they needed to play, but the communication was still a bit lacking.
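The checklist-wall gating described above is essentially a counter per objective plus an all-complete check that enables the play button. The sketch below is a minimal illustration of that pattern; the class, event names, and counts are hypothetical, not taken from the game’s code.

```python
class TutorialWall:
    """A barrier listing objectives; its play button only activates
    once every objective on the checklist has been completed."""

    def __init__(self, requirements):
        # Map each objective name to how many completions it still needs,
        # e.g. {"on_beat_shot": 5, "dash": 1}.
        self.remaining = dict(requirements)

    def record(self, event: str) -> None:
        # Gameplay systems report events; only listed, unfinished
        # objectives tick down.
        if self.remaining.get(event, 0) > 0:
            self.remaining[event] -= 1

    def play_button_active(self) -> bool:
        # The wall's play button becomes shootable only when
        # every objective has reached zero.
        return all(count == 0 for count in self.remaining.values())
```

Chaining several of these walls down the hallway, each with its own checklist, gives exactly the stepwise teaching structure the tutorial used.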

At this point, the vertical slice for Visualizer was ready to be shown to the faculty. This was the state of the game at the end of the Fall semester of 2018: