Hamster Ball VR Project Week 8

Hamster Cage Level

This is the second to last week working on our VR Hamster Ball game.

Tree Bounce Puzzle

During our meeting, we decided that the tree bounce puzzle should involve the water bowl. On the second floor, the water bowl is placed on top of the wooden overhang. The blue rock placed below the overhang is used to retrieve the water bowl. Blue arrows are placed on each palm tree as context clues. This is technically an optional puzzle, but it would be inconvenient to skip. The rock's mass is five times that of the bowl, so it is harder to push up a ramp, which is the mechanic required to finish the next puzzle. Food pellets are placed alongside the water bowl to reinforce the act of the water bowl falling.


Discussing VR Concern

Spending money on a Google Cardboard XG Virtual Reality headset.

Virtual Reality has its concerns; I was asked to write about some of the concerns and standardization regarding the state of virtual reality.

Is VR Safe for kids?

Over Thanksgiving, I discussed the idea of a Samsung Gear VR as a holiday gift with my uncle. He said that there were some scientific studies suggesting that kids are too young to use VR headsets.

I wanted to learn more. Sony announced that PlayStation VR is not suitable for kids under the age of 12. From what I found, it is not that VR is inherently dangerous, but rather that the distance between one's eyes, otherwise known as interpupillary distance (IPD), is key to a smooth VR experience. Adult IPD ranges from 48mm to 73mm, varying with ethnicity and gender. Kids' IPD ranges between 40mm and 55mm. This smaller IPD is not supported by these headsets, meaning kids would get headaches from blurry and distorted images. For example, the HTC Vive supports an IPD between 63mm and 73mm. This addresses concerns about the health and safety precautions involved in VR. With further concrete information made available to concerned consumers, VR will continue to see better adoption.
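As a toy illustration of that compatibility check, here is a small Python sketch using the numbers above (a hypothetical helper, not any vendor's actual API):

```python
def supports_ipd(user_ipd_mm, headset_min_mm=63, headset_max_mm=73):
    """Return True if a user's IPD falls inside a headset's supported range.

    Defaults use the HTC Vive's 63-73mm range mentioned above.
    """
    return headset_min_mm <= user_ipd_mm <= headset_max_mm

print(supports_ipd(65))  # a typical adult IPD
print(supports_ipd(50))  # a typical child IPD, outside the Vive's range
```

A child's 50mm IPD fails the check, which is exactly the blurry-image problem the age recommendations are guarding against.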

How do we know if VR will work?

Another concern is VR hardware. The accessibility and price of mobile VR mean a lower barrier of entry. Should consumers find the mobile VR space valuable but not fully satisfying, they may opt for the higher-end PC route. Many companies, such as AMD and Oculus, are working together and using the term "VR Ready". This term identifies a computer that can run VR games at 90fps per eye and meets the minimum specifications to run a VR headset. This is another standardization that helps ease the fears of consumers.

Will VR provide good value for the price?

A large concern for mobile VR is the idea of disposable games. The marketplace for VR is broad enough that content exists to meet the demands of consumers. Is the game too short, was it worth it, and did it have cool features? Breadth provides variety, not quality. The depth of content is still being developed, and that is where some concerns lie.

It is easy to find short and simple VR games. A bit of research will turn up many solid games full of depth, as well as premium experiences. What constitutes a deep game is subjective and at the discretion of the consumer. As a market, this deepness should have a definition.

Look back to 1983, when consumers lost confidence in console games. The Nintendo "Official Seal of Quality" gave wary consumers confidence in the Nintendo Entertainment System during the Video Game Crash of 1983. Such seals may matter less to games now than they once did, but for a new marketplace, depth needs a neutral basis that the public can identify and relate to. Perhaps this is one method to standardize what constitutes a deep and engaging VR experience.


Consumer confidence is a major force that influences the adoption rate of new products, especially for VR. By providing standards that are neutral, unbiased, and informative, people will be willing to consider VR as an option for media consumption.



Hamster Ball VR Project Week 7

This week I was assigned a few tasks.

The first task was to update the game's audio. This included two new songs, one for each scene. There were also some improved sound clips that needed to be swapped in.

Organizing the assets folder was next. The directory had many imported assets in their own folders, while the rest were clumped together under common names (e.g. materials). I decided to create two folders: "content" and "no longer using". All active assets go in the content folder, which has two subfolders named assets and prefabs. Assets no longer in use, such as those from the prototype, go in the other folder.

The final task was to set up a puzzle involving launching an object using the palm tree colliders. I expect this will be a difficult puzzle to solve, since it does not involve a button like the other puzzles in our game. Instead, the player will need to hit wooden beams to free a ramp that leads to a button. Like all designs, the team will need to agree on this first.

Hamster Ball VR Project Week 6

This week I was required to add sound into our hamster VR game.

I have performed this task many times, so it was straightforward. Normally, games use FMOD or Wwise for dynamic sound, but we decided to use Unity's built-in audio.

My first session included setting up the system that would play the sound. My second session involved getting it to play correctly. I started by making a class to manage sound playback. Each AudioSource, along with the information needed to play it, such as whether it loops, is stored in a separate class. At the start of the application, these sounds are loaded into a dictionary keyed by an enum called Sounds. When a sound needs to be played, the SoundManager plays it without other classes needing to know about the AudioClip. Furthermore, I have a separate class inherit from SoundManager that initializes sounds specific to each scene. This approach is designer friendly: our designer can swap out AudioClips in the inspector without having to know the code. In addition, since the sounds were documented ahead of time, we can be sure that all the enumeration values are included.
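A minimal sketch of this pattern, written in Python rather than our actual Unity C# (all names here are illustrative stand-ins for the real classes):

```python
from dataclasses import dataclass
from enum import Enum, auto

class Sounds(Enum):
    # Enum keys let callers request sounds without touching clips.
    ROLL = auto()
    BUTTON_PRESS = auto()
    MENU_MUSIC = auto()

@dataclass
class SoundEntry:
    clip_name: str        # stands in for a Unity AudioClip reference
    loop: bool = False
    volume: float = 1.0

class SoundManager:
    def __init__(self):
        self._sounds = {}  # dictionary keyed by the Sounds enum

    def register(self, key, entry):
        self._sounds[key] = entry

    def play(self, key):
        # Callers only know the enum value, not the underlying clip.
        entry = self._sounds[key]
        return f"playing {entry.clip_name} (loop={entry.loop})"

manager = SoundManager()
manager.register(Sounds.ROLL, SoundEntry("hamster_roll", loop=True))
print(manager.play(Sounds.ROLL))
```

A scene-specific manager would subclass SoundManager and call register for its own sounds, which is the per-scene initialization described above.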

Another important sound in this game is the rolling sound. It fades in and out depending on the velocity of an object. I programmed this in a separate class that can be attached as a component. Currently, the ball, rocks, and other movable objects use it.
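The fade logic can be sketched like this (a Python stand-in for the Unity component; the names and rates are illustrative, not our tuned values):

```python
def update_rolling_volume(current, speed, max_speed=10.0, fade_rate=0.1):
    """Move the rolling sound's volume toward a speed-based target.

    Called once per frame: the target volume scales with the object's
    speed, and the current volume steps toward it, giving a smooth
    fade in when the object speeds up and fade out when it slows.
    """
    target = max(0.0, min(1.0, speed / max_speed))
    if current < target:
        return min(target, current + fade_rate)
    return max(target, current - fade_rate)

# Object starts rolling: volume ramps up over several frames.
volume = 0.0
for _ in range(5):
    volume = update_rolling_volume(volume, speed=5.0)
```

Stepping toward the target instead of snapping to it avoids audible pops when an object's velocity changes suddenly, such as after a collision.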


Anti-Aliasing Research

This week I watched the Technical and Design VR tips Unite video from 2015: https://www.youtube.com/watch?v=_2T0dwGYP0s.

The most important takeaway is that "framerate is king". Most of the tips were basic. For example, humans do not like terrifying imagery or surprises, especially in VR. Audio quality is very important. Stereoscopic 3D combined with other depth perception methods like parallax is effective at producing the illusion of depth. Most of all, humans do not like acceleration, especially vertical forces.

What struck me was their suggestion to include anti-aliasing. They say to have 4x MSAA enabled for mobile devices and 16x MSAA for gaming computers. I decided to look into this further. Needless to say, there are many forms of anti-aliasing.

Types of Anti-Aliasing

SSAA, or Supersampling Anti-Aliasing, is the simplest approach. The scene is rendered at a higher resolution, and each final pixel's color is averaged from several samples.
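As a toy illustration of that averaging, here is a 2x box-filter downsample in Python (operating on grayscale values for simplicity; real renderers work per color channel):

```python
def downsample_2x(image):
    """Average each 2x2 block of a supersampled image into one pixel.

    image: 2D list of grayscale values rendered at twice the target
    resolution; returns the half-resolution, anti-aliased result.
    """
    h, w = len(image) // 2, len(image[0]) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = (image[2 * y][2 * x] + image[2 * y][2 * x + 1] +
                 image[2 * y + 1][2 * x] + image[2 * y + 1][2 * x + 1])
            out[y][x] = s / 4.0
    return out

# A hard black/white edge becomes a softened gray pixel.
print(downsample_2x([[1.0, 1.0], [0.0, 0.0]]))
```

This brute-force averaging over every pixel is exactly why SSAA is expensive, and why MSAA's edge-only sampling below is the cheaper alternative.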

MSAA, or Multisample Anti-Aliasing, is more efficient than SSAA. It performs extra sampling only at the edges of polygons. It works with forward rendering and, unlike post-process methods, it does not blur the image. This is important for virtual reality.

FXAA, or Fast Approximate Anti-Aliasing, is faster than MSAA. It is a post-processing effect that also works with deferred rendering. It can fail to smooth edges, especially those with high contrast against their surroundings. Despite being "good enough" for some, the remaining edge aliasing makes it less adequate for virtual reality.

CSAA, or Coverage Sampling Anti-Aliasing, is an NVIDIA technique. It looks comparable to 16x MSAA at roughly the performance cost of 4x MSAA.

MLAA, or Morphological Anti-Aliasing, is, like FXAA, a post-processing effect that uses blurring. It is very fast.


No AA (left), 4x MSAA (middle), FXAA (right). Image used under the creative commons license. Author: Jeff Atwood

I understand why Unity's talk would suggest MSAA rather than other methods. From my research, I believe that a combination of supersampling and MSAA would produce an adequate result. Unlike looking at a computer screen, proper anti-aliasing matters more in virtual reality, where the pixels sit right next to the eyes.

This is very much related to our class's hamster VR game. Last week, we fixed an issue with transparency by using cutout geometry. This made our hamster cage very nauseating and disorienting; pixels appeared to move even when idle. I added an FXAA anti-aliasing shader to the main camera. I will change it to MSAA and see if there are improvements.


In-depth explanation of Anti-Aliasing: https://mynameismjp.wordpress.com/2012/10/24/msaa-overview/

MSAA: https://en.wikipedia.org/wiki/Multisample_anti-aliasing

A cool video comparing different anti-aliasing methods: http://www.iryoku.com/smaa/downloads/SMAA-Enhanced-Subpixel-Morphological-Antialiasing.mp4

Hamster Ball VR Project Week 5

top down view

This week for our virtual reality hamster game, I was tasked with prototyping a main menu. I did not have much direction to follow. Given that this is a VR game, I decided to make the menu a playable space. The walls have text that explains how to play the game. In summary, this is a decent prototype tutorial that is expected to change.

play area

A simple play area to learn the game’s mechanics


There is an area to push rocks around, an area to bounce into a palm tree, and a how-to-play wall. I was a little clever and put a credits wall down a small passage. I figure curiosity will maximize the chance that someone sees it.

I did manage to figure out why global illumination stopped working when I unchecked auto build: it clears the cache. To fix it, the build button needs to be pressed. Obvious.

I spent some time making a mystical game entrance, since this task did not take very long. I played with a combination of semitransparent walls, point lights, and particle emitters. The latter two ended up being used. Unity's built-in particle system is flexible. Now for some great screenshots.

Big Entrance

This is definitely going to change when the game is finished, but it looks cool.

The beloved how to play wall. This is literally a wall of text.



Hamster Ball VR Project Week 4

This is what the skybox looks like. It is a little distorted.

My job for this week was to make a skybox of the interior of the Design Building. I was also tasked with fixing a transparency ordering bug. The skybox was a process to figure out.


A skybox requires a cubemap, a set of six images that form a panorama. To make these six images, you take a panorama and convert it. My phone can take panoramas, and this Blender-based tool can convert them into the desired images: https://aerotwist.com/tutorials/create-your-own-environment-maps/

I took a panorama with my phone and turned it into a cubemap. I inserted the six images into Unity and it worked. One problem persisted: it was distorted incorrectly. Apparently, these cubemaps work best with panoramas shot using a fisheye lens.
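For reference, the conversion from a cube face pixel to panorama coordinates is just spherical math. A rough Python sketch for the front (+Z) face (illustrative only; this is not necessarily how the Blender tool implements it):

```python
import math

def face_to_equirect(face_u, face_v):
    """Map a point on the +Z cubemap face to equirectangular coords.

    face_u, face_v are in [-1, 1] across the face; the return value
    is (u, v) in [0, 1] into an equirectangular panorama image.
    """
    x, y, z = face_u, face_v, 1.0            # direction through the face
    theta = math.atan2(x, z)                 # longitude, -pi..pi
    phi = math.atan2(y, math.hypot(x, z))    # latitude, -pi/2..pi/2
    u = (theta / math.pi + 1.0) / 2.0
    v = (phi / (math.pi / 2) + 1.0) / 2.0
    return u, v

# The center of the front face lands at the center of the panorama.
print(face_to_equirect(0.0, 0.0))
```

Running this mapping backward over every pixel of each of the six faces, and sampling the panorama at the resulting (u, v), is the essence of an equirectangular-to-cubemap converter.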

I could not find any free or paid tools to convert a regular panorama into a fisheye one. There are plenty of tutorials that show how to remove fisheye distortion from photos, but not how to add it to panoramas. There are tutorials for adding fisheye distortion to a normal image; interesting, but not useful for my needs. https://www.youtube.com/watch?v=PfhBDt5qZ5I

Facebook has a 360-photo capture tool. I do not want to use Facebook to do this work, but their promotional video showed a person taking multiple panoramic shots. This gave me an idea. I decided to take five panoramic shots of my room at multiple angles.

Photoshop has a Photomerge tool that can stitch photos together, which is nice; a similar tool is available in Photoshop Elements. It is hit or miss with panoramic photos and often fails. The best result, which still requires further editing, came with distortion correction enabled. I inserted my five panoramas into Photomerge. It worked… kind of.

Another look at our awesome skybox.



Legend of Dungeon VR Review

Legend of Dungeon


I decided to play Legend of Dungeon for my VR midterm report. Legend of Dungeon was developed by Robot Loves Kitty. It is "a randomly generated action RPG Beat'em'up with heavy Rogue-like elements, striking visuals, and dynamic music" in which you will die, many times. I discovered the game at PAX East two years ago and bought it because VR support was in development. I am glad to finally have a chance to use this awesome feature.


How Legend of Dungeon plays in VR

This game plays very well, and it feels comfortable to play in VR. It supports both the Oculus Rift and the HTC Vive.

The game requires a controller to play in VR using the Vive. The keyboard works without VR, but in VR the game world is rendered away from the keyboard. I believe the world is mapped to the edge of the play area so one can sit in the middle. The thumbstick moves the character, the X and Y buttons cycle through items, A attacks, and B drops an item. All controls can be remapped in-game, which is quite convenient.


The Music

The ambient music does not really add to the VR experience. It uses stereo sound rather than spatial 3D sound. This game added VR after release; like the controls, the music was not adapted to support all VR features, and that is okay.


This is a sitting down VR game

As I sat using my controller to move the character, the weight of the headset became fairly noticeable. This game would lend itself better to a lighter headset; the HTC Vive is quite heavy. I adjusted it during my second play session, but the weight was still apparent.


The View and Controls

Unlike many VR games, the scene is presented from an omniscient third-person point of view. Legend of Dungeon offers an option to have the camera follow the character or stay stationary. The stationary option feels like a museum: the entire scene is rendered during play and can be seen from a distance. Sitting down, this is pretty tiring. I found myself looking left and right many times to see where the character was.

The stationary camera made it difficult to control the character at stage left or right. Pushing the joystick up or down moved the character in and out of the scene, not closer to or further from the camera. The follow option led to a better experience: better character control with a small trade-off. There was an initial discomfort when the camera follows the player; it is like motion sickness, because the camera moves but the user does not. My eyes also had to adjust to the screen moving left and right. After ten minutes, this woozy feeling went away.

Normally, the camera renders only what is around the character. In VR, the character is just one part of the scene. All objects can be viewed, unless the game generates a room with no lights. Then you need a lantern or a light-emitting hat.


Lava pit

Screenshot by Mesa.


Room Safety

Legend of Dungeon has rooms. Each room has doors, which lead to more rooms. The player can normally only see one room at a time, and in the door hallway, the player is safe from attack. In VR, one can see the entire room. While playing, I could plan my attack before advancing, which was very beneficial. I could also absorb the room's aesthetics in safety if I wanted a small breather.


Depth Perception

The game is 3D, but the characters, items, and NPCs are 2D sprites. The camera's mid-distance, semi-high angle shot makes this distinction clear, but the angle is low enough that depth perception can be difficult. I ask myself, "Am I aligned so my attack will hit the enemy?"

Depth perception is not an issue in VR. The head tracking lets me move around and look at the game from various angles. It also makes it easier to judge the position of the character. I never jumped into a lava pit by accident because of this. Fun fact: "fell into a lava pit" is a common sight on the leaderboards.

Another benefit in VR is that no light seeps in from the "real world". Legend of Dungeon has a black background, favoring play in the dark. This made the game inherently more immersive, since the darkness helped me focus on what was in front of me.


A dark open room. Not so spooky, not so deadly.



Session Overview

I played for a total of 2 hours across these sessions. I had played the game for 3 hours before this review, so I am familiar with how it plays. Again, this review is targeted at the VR feature.

Session 1: 1 hour

To play Legend of Dungeon in VR, beta mode needs to be enabled: right-click on the game, go to Properties, and change it there. This mode requires a controller to play in VR. The controller was not initially working; apparently, the headset needed to be close enough to the play area for it to work. I think this is to prevent accidental button inputs. There were no other real setup problems. In fact, it was straightforward. I then played for the hour.


Session 2: 30 minutes

I put the game down for about 20 minutes before playing again. I readjusted the headset, but it was still heavy on my forehead. I quickly died in-game. It's a roguelike; it happens. On the next run, I found a strong weapon and totally forgot about what was going on around me. It was very relaxing. Everything was presented in front of me. I did not have to look around. I did not have to worry about what was going on behind me because it was dark and quiet. I was completely immersed in the game, even though it is rendered in a third-person perspective. That was, until a horde of ogre frog monsters sent their minions to attack me.


Session 3, the next day: 30 minutes

I played Legend of Dungeon without using the Vive. The game was physically easier to play because there was no heavy headset, but the added level of immersion was lost. The most apparent non-VR usability difference is the item-cycling menu. In VR, this menu is located below the play area, and I needed to shift my focus to use it. Normally, the menu sits in the lower-left corner of the screen. I did not have to spend as much time remembering what I had available, which also made the game mentally easier to play.

There is one challenge many roguelikes face: making inventory management accessible and easy to discern. One way this could be solved is to use hand gestures to pick an item from a grid, similar to Terraria's inventory screen. Still, Legend of Dungeon's system works, especially since VR was added after release.


A Little Legend of Dungeon background

I decided to look into the development history of Legend of Dungeon. It is pretty neat. Robot Loves Kitty has a Tumblr blog. The game was made in Unity, successfully funded on Kickstarter, Greenlit on Steam, presented at PAX, and is available for sale. It even has, or is developing, Twitch integration.

The verdict: I like playing it more in VR. A great add-on feature. I just wish the Vive headset was lighter.


Hamster Ball VR Project Week 3

This is the week 3 update. The project is going very well. The artists have made palm trees, a hamster house, and some paws. A cage is being modeled this week. The level designers have been making some great designs too. Notably, the level will feature a two-story hamster house instead of a simple ground area. The puzzle pressure plates and switches are still being implemented. I was tasked with two jobs. First, I needed to prevent the hamster ball from moving up steep inclines. Second, I needed to slow the hamster ball down when the controller's triggers are pressed, rather than stopping it immediately.

  • After some consideration, I decided it was good to use invisible walls to address places with steep inclines… Classic.
  • The second problem was rather easy; it was finished within a few minutes. Instead of multiplying the ball’s velocity by zero, it is multiplied by 0.95. This might require some tuning, since the deceleration can cause dizziness.

  • In addition, I fixed a weird acceleration bug related to falling. The code that checks whether an object is falling was referencing the wrong game object, which allowed the player to move in midair. It has been fixed.

  • Lastly, I pushed the project to GitHub. No more flash drive backups; the project is under source control now.
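The trigger braking in the second bullet boils down to exponential damping. A minimal Python sketch (the real code is a Unity script; names and the step count here are illustrative):

```python
def damp_velocity(velocity, braking, factor=0.95):
    """One physics step of trigger braking.

    Instead of zeroing the velocity, scale it by a factor each step,
    producing a smooth exponential slowdown rather than a hard stop.
    """
    return velocity * factor if braking else velocity

# Holding the trigger for ~60 physics steps: the ball eases to a stop.
v = 10.0
for _ in range(60):
    v = damp_velocity(v, braking=True)
```

Because the speed decays by the same proportion every step, the ball never stops instantly; tuning the factor trades stopping distance against the dizziness mentioned above.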


A Virtual Reality App for Education

The growing field of virtual reality has many applications, and education is one of them. As seen in the many exhibits of The Art of Video Games museum series, education and serious games can work together. What if there was a virtual reality application that could be used for engineering and architecture?


Students using this application would be able to design and draw their concepts in virtual reality. A comparison would be Tilt Brush. In that application, the player can paint in 3D. In this school application, students can create CAD drawings, blueprints, or mock-ups in virtual reality. At the same time, the computer would build the 2D drawings to exact specifications. This would allow creations to be printed and shared.

A key user experience concern would be navigating menus. This application would most likely use an in-app menu, with users selecting items using the controllers. Hand gestures would be more precise, good for tuning the thread of a bolt.

This would be more than an interactive design interface. It would provide additional physics and engineering features. People could interact with their structures and drawings by seeing particular metrics of interest. Examples would include air flow, heat distribution, structural integrity, and points of failure. These features would elevate the application from a drawing simulation, like Tilt Brush, to a suite of tools that can train students to handle real-world scenarios.

The logistics are possible. Collaborative editing would be very important, but it still requires more advancements in motion tracking; for example, a teacher could give a class demonstration. The simulation applied to each model might require faster computing, unless a college has access to a supercomputer. Unlike games, these simulations would need to be accurate and process information in real time.

This would truly be a great product for the STEM field.