My job this week was to make a skybox of the interior of the Design Building. I was also tasked with fixing a transparency ordering bug. The skybox was a process to figure out.
A skybox requires a cubemap: a set of six images that together form a panorama. To produce these six images, you take a panorama and convert it into them. My phone can take panoramas, and this tool, a Blender project, can convert them into the desired images: https://aerotwist.com/tutorials/create-your-own-environment-maps/
I took a panorama with my phone and turned it into a cubemap. I imported the six images into Unity, and it worked. One problem persisted: it was distorted incorrectly. Apparently, these cubemaps work best with panoramas shot using a fisheye lens.
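The conversion itself boils down to mapping each pixel of a cube face to a direction vector, then to longitude/latitude in the equirectangular panorama. The Blender tool handles this internally, but the underlying math can be sketched in plain Python. The face names and orientation conventions below are my own assumptions, not necessarily the tool's:

```python
import math

def cube_face_to_equirect(face, u, v):
    """Map a point (u, v) in [0, 1]^2 on a cube face to
    equirectangular (longitude, latitude) in radians.
    Face labels '+x'/'-x'/'+y'/'-y'/'+z'/'-z' are illustrative."""
    # Convert (u, v) to [-1, 1] face-local coordinates.
    a = 2.0 * u - 1.0
    b = 2.0 * v - 1.0
    # Direction vector from the cube center through this pixel.
    if face == '+z':   x, y, z = a, b, 1.0      # front
    elif face == '-z': x, y, z = -a, b, -1.0    # back
    elif face == '+x': x, y, z = 1.0, b, -a     # right
    elif face == '-x': x, y, z = -1.0, b, a     # left
    elif face == '+y': x, y, z = a, 1.0, -b     # up
    elif face == '-y': x, y, z = a, -1.0, b     # down
    else:
        raise ValueError(face)
    lon = math.atan2(x, z)                               # [-pi, pi]
    lat = math.asin(y / math.sqrt(x*x + y*y + z*z))      # [-pi/2, pi/2]
    return lon, lat
```

For example, the center of the front face (u = v = 0.5) maps to longitude 0, latitude 0, the exact center of the panorama, and the right face's center lands a quarter turn away.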
I could not find any free or paid tools to add fisheye distortion to a panorama. There are plenty of tutorials on removing fisheye distortion from photos, but none on adding it to panoramas. There are tutorials on adding fisheye distortion to a normal image, which is interesting but not useful for my needs. https://www.youtube.com/watch?v=PfhBDt5qZ5I
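For reference, the distortion those tutorials add to a normal image is typically a simple radial remap: each pixel is pushed outward by a factor that grows with its distance from the center. A minimal sketch, assuming a single-coefficient barrel model (the coefficient `k` is illustrative, not taken from any particular tutorial):

```python
def barrel_distort(u, v, k=0.3):
    """Push a normalized coordinate (centered at the origin,
    range roughly [-1, 1]) outward radially. k > 0 bulges the
    image like a fisheye; single-coefficient model, assumed."""
    r2 = u * u + v * v       # squared distance from center
    scale = 1.0 + k * r2     # displacement grows with radius
    return u * scale, v * scale
```

The center stays fixed while edge pixels move outward the most, which is why this looks right on a rectangular photo but does not solve the panorama problem above.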
Facebook has a 360-photo capture tool. I did not want to use Facebook for this work, but their promotional video showed a person taking multiple panoramic shots, which gave me an idea: I decided to take five panoramic shots of my room at different angles.
Photoshop has an image converter tool that can stitch photos together, which is nice; a similar PhotoMerge tool is available in Photoshop Elements. It is hit or miss with panoramic photos and often fails. I fed my shots into PhotoMerge, and the best result, which still requires further editing, came with distortion correction enabled. It worked… kind of.
Our college held a 24-hour game jam in the Design Building. Two hours before the event, the building was empty, so I shot some panoramas and went back to my room to stitch them together.
Out of curiosity, I looked up what hamsters see. I learned that hamsters have blurry vision: their eyes are mostly filled with rods, which help them see in low light but do not help during the day. Unity has prebuilt image effects, which I used to emulate their vision: I applied blur, depth of field, and desaturation. I also added a checkbox to enable and disable this effect should the team decide against it.
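I relied on Unity's built-in effects rather than writing the math myself, but the desaturation step is simple enough to sketch: blend each color toward its own luminance. A rough Python version using Rec. 709 luminance weights (the `amount` knob is my own parameter, not Unity's):

```python
def desaturate(rgb, amount):
    """Blend an RGB color (components in [0, 1]) toward its
    luminance. amount=0 leaves the color alone; amount=1 is
    full grayscale."""
    r, g, b = rgb
    lum = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 weights
    return tuple(c + (lum - c) * amount for c in rgb)
```

With `amount` exposed as a slider (or gated behind a boolean, like my checkbox), the effect can be dialed back or switched off entirely without touching the rest of the pipeline.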
The next day, after the game jam, I implemented the features and synced them with GitHub.