There’s no denying that Unity, which started out as a Mac-exclusive game engine, grew in popularity with the advent of the App Store, where its roots made it a favoured platform for the burgeoning mobile gaming industry. Many indie game developers adopted Unity as their engine of choice, not only for apps but for games across multiple platforms—and VR is no exception! Here are the top five tips for success with developing in Unity for VR from the lead developer of Flow Weaver.
Some of VR gaming’s biggest hits have been developed in Unity, including Beat Saber, Superhot VR, Boneworks, The Room VR: A Dark Matter, and last year’s hit Phasmophobia VR.
I had the opportunity to chat with Tony Tovar, Lead Developer at indie game studio Stitch Media, whose latest project—Flow Weaver—could be added to that list.
I asked Tony about Stitch’s foray into VR gaming: “Stitch Media’s main goal is to use new media like VR to focus on enhanced storytelling, rather than jumping on a particular game trend—like, for instance, battle royale. We’re looking for a way to weave more immersive narratives through new forms of technology.”
It’s not uncommon for studios with a history in transmedia and interactive storytelling projects to get swept up in the potential virtual reality has to offer; it’s a medium like no other and offers creatives an exciting new avenue to explore. But transitioning from the interactive web to full game development can be a difficult road for some teams; choosing which game engine to use for your projects can make a big difference.
Game engines act as a base platform of code to which the features of your particular game can be added—like the skeleton framework of a building. Historically, games were built from the ground up, each one a completely unique entity. As game development matured, certain aspects of development were used over and over, so the idea of a framework makes sense: character or camera movement, dialog, inventory, and other systems are features found in many games, and having a modular engine that helps in their implementation can save a lot of effort.
“Back in the early 2000s, Unity was just for mobile game development on iOS—Android wasn’t even a player in that space yet. Non-mobile games were being developed in Unreal, Source, or CryEngine, but back then most companies would just build their own.” As the game engine from Unity Technologies evolved, it began to branch out, being used by many independent studios to build games for console and PC. “Unity is just a little more accessible to newer developers,” says Tony. As more and more studios sprung up around the indie game resurgence, the community of Unity developers supported those entering the market. “Unreal isn’t the most user-friendly tool for new game developers. Unity took that and ran with that.”
For Tony, the choice for Stitch Media was clear: “As far as video games are concerned, we’re still getting into our stride, and Unity is a lot more useful for us. An engine like Unreal requires a lot more side-development for tools and utilities which isn’t as out-of-the-box and would take more time and cost a lot more money.”
For the team developing Flow Weaver, getting started with the engine allowed for a smooth transition from concept development to actual production; according to Tony, “Unity has always lent itself to an ease of use that makes it jump in and get going very quickly.”
“From a scripting perspective, Unity benefits from using C# (C Sharp), a programming language that requires less knowledge of specific macros and code preparation than an engine based on C++.”
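As a rough illustration of that point (the class and values below are invented for the example, not taken from Flow Weaver), a complete, working Unity behaviour in C# needs no header files, macros, or manual memory management:

```csharp
using UnityEngine;

// Hypothetical example: rotates whatever object it's attached to.
// Note there are no headers, macros, or manual allocations to manage,
// in contrast to a typical C++ engine workflow.
public class Spinner : MonoBehaviour
{
    // Exposed in the Inspector; degrees per second (example value).
    public float degreesPerSecond = 45f;

    void Update()
    {
        // Time.deltaTime keeps the rotation framerate-independent.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```

Dropping this file into a project and attaching it to a GameObject is the entire workflow—no build-system configuration required.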
I asked Tony if he had any advice that he gives members of his team; things that aspiring VR game developers could use when starting out with Unity. Here are five tips that—according to Tony—help developers to work towards a successful project!
“The first step—before even starting up the Unity project—you need to use some form of source control like Unity Collaborate, GitHub, or Apache Subversion, among others.” It’s imperative that developers take the time to properly set up their work environment, regardless of the size of the team they’re working with.
“Whichever one you choose, Unity has documentation on how to set up a project to use source control. It helps by setting up the project parameters to force metafiles to be unhidden and separate, or forces Unity scenes and prefabs to be plaintext instead of binary. A lot of version control systems are more efficient when using text and ASCII over binary.”
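As a sketch of what that looks like in practice for Git (the folder names below are the standard Unity project layout, but check Unity’s own documentation for your version), the ignore list only needs to cover generated folders:

```
# Minimal Unity .gitignore sketch -- generated folders only;
# everything under Assets/, Packages/, and ProjectSettings/ is tracked.
/Library/
/Temp/
/Obj/
/Logs/
/Build/
/UserSettings/
```

The settings Tony describes live under Edit > Project Settings > Editor: setting Version Control Mode to “Visible Meta Files” keeps the metafiles unhidden and separate, and setting Asset Serialization Mode to “Force Text” stores scenes and prefabs as plaintext instead of binary.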
Tony also advises that developers ensure they’re all using the same tools; “If you are working with a team, make sure that everyone is using the same versions of everything, including Unity itself and the Universal Render Pipeline, if you’re using it. You want to make sure that you’re consistent across the board otherwise it could cause a lot of wild issues that are going to be super-hard to track down.”
Additionally, Tony warns against upgrading versions during production. It might be tempting to upgrade to new features, but this often results in unforeseen consequences that can set development back significantly and impact your timelines.
Designers and developers both like to customize their digital workspaces, to emphasize the tools they use and maximize screen real estate. However, Tony points out that in Unity, there are fewer options to individualize your interface layout; “As far as the Unity interface goes, it’s not as open-ended as something like say Adobe Photoshop.” This is by design, as the key windows and tools need to be present. “You can drag windows and tabs around, so as far as the location of things that’s up to the user, but we all still have the same things present… just in different places.”
It’s important to familiarize yourself with the different parts of the interface, which to the uninitiated can seem overwhelming at first glance.
Game engines need to render all that visual information to present it to the player in real-time, but unlike screen-based games, virtual and augmented reality (collectively known as XR) games need to do this twice, once for each eye. Because of the nature of stereoscopy—how humans perceive depth—those renders will be slightly different based on whether they’re for the left or right eye.
“While we were working on Flow Weaver you would go to the Package Manager—where Unity handles all the project add-ons for things like web development or networking & multiplayer, among others,” explains Tony. “Valve and Oculus would have packages you could add to set how graphics are rendered specifically for their brand of VR headsets.”
“Applying one of these packages created a section in the project settings labeled ‘XR’ where you could set the type of rendering method you wanted to use. Depending on the package you chose, you would have different options for these settings.”
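While the exact options depend on the package, you can inspect what the XR layer reports at runtime through `UnityEngine.XR.XRSettings`. A minimal sketch (the logged values will vary by headset, package, and Unity version):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: logs the basic XR rendering state at startup.
public class XRReport : MonoBehaviour
{
    void Start()
    {
        // True when an XR device is active and rendering in stereo.
        Debug.Log($"XR enabled: {XRSettings.enabled}");

        // Which device/loader is driving rendering (e.g. an Oculus headset).
        Debug.Log($"Device: {XRSettings.loadedDeviceName}");

        // Per-eye render target scale; values below 1 trade
        // resolution for performance.
        Debug.Log($"Eye texture scale: {XRSettings.eyeTextureResolutionScale}");
    }
}
```

Because XR games render the scene once per eye, that last value is one of the cheapest performance levers available when a headset struggles to hold framerate.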
Unity Technologies and the community of developers that made Unity’s growth possible have continued to support the engine through online resources to help developers learn how to use the engine or find premade assets that can help ease devs into game development.
Unity Learn is a platform created by Unity Technologies itself. “I would definitely direct new users to the Unity tutorials,” remarks Tony of the platform. Covering everything from the basics to more advanced aspects of Unity development, the Learn platform is a great way to get started and continue to improve using Unity, but there are plenty of other sources for tutorials.
“There’s also just a phenomenal number of resources out there on YouTube—outstanding people who have devoted all of their free time to just making Unity tutorials and helping people learn how Unity works.” YouTube has become the go-to for tutorials on just about everything, and Unity development is no exception. “Brackeys alone was just an amazing YouTube channel; he’s got years of Unity tutorials on everything from setting up your first game to walkthroughs of game basics, to shaders, to animation, to all sorts of things.”
While the channel creators announced in September 2020 that they would no longer be releasing regular new content, their existing catalog of videos on YouTube as well as their Discord server and downloadable assets would remain available.
One thing to keep in mind: make sure to find tutorials that cover the version of Unity you’re using. There’s a wealth of information available, and while some of it is helpful regardless of when it was released, you’ll want to make sure you’re watching the most relevant content.
So now you’ve learned enough to get started, but looking at a blank project, you don’t know exactly where to start. Or maybe you’ve made it a ways into the development of your first game, but are looking for a simple solution to a feature you’d like to include. “If there’s something that we need—in terms of art assets or systems—the Unity Asset Store is well populated with items we can purchase or license and just run with, rather than building it all out ourselves.” Tony mentions that much of what makes Unity so enticing is the availability of existing plugins, 3D assets, and entire game systems that can be added to a game; “There are a lot of individuals and even whole companies whose business model is based on making and selling assets in the Asset Store.” By leveraging these assets, developers can avoid needlessly reinventing an aspect of their game already accomplished by others.
Our final tip—the one that I’m personally most invested in—has more to do with game design and development, regardless of the game engine you choose… but it will have the greatest impact on your project.
You need to reconsider the aspects of game design that are tied to the medium’s screen-based roots.
Designing a game where the player is effectively inside the game world has some very significant implications on the systems we’re used to from straight-up PC/console gaming.
According to Tony, “unlike a game played on a screen, you don’t have that static frame… the screen is your eyes.” I’ve said before that VR is a frameless medium, and what Tony is alluding to here follows that principle: the player’s perspective is from within the game-world, not sitting in the real world looking at a frame (monitor, TV, phone screen, etc.) that contains the game-world.
Think about the typical heads-up display (HUD): information such as health, score, and subtitles that is typically mounted to the corners of that frame. Where does it go in VR? There are no corners; there isn’t even a frame! “It’s been noted that games that have that static UI in VR are almost nauseating because anywhere you look that HUD follows you around.” By simply following the same approach in VR as was used on-screen, interface design can cause discomfort and lead to a less enjoyable game experience.
In order to create a better interface, we need to consider the virtual environment more like our real-world one. Diegetic interface design uses in-world elements for the player to interact with, and according to Tony, “it makes it so that you have that same sort of user interface without it being stuck to your face… so that you can detach yourself from it. A lot of people who are just entering into VR don’t think about that.”
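In Unity terms, the usual way to move a HUD off the player’s face is a world-space canvas: instead of the default screen overlay, the UI becomes an object in the scene that the player can look toward or away from. A hedged sketch (the anchor and scale values here are placeholders, not Flow Weaver’s implementation):

```csharp
using UnityEngine;

// Sketch: converts a UI canvas from a screen overlay (the
// "stuck to your face" HUD) into an object placed in the world.
public class DiegeticCanvas : MonoBehaviour
{
    public Canvas canvas;          // assign in the Inspector
    public Transform anchorPoint;  // e.g. the back of the player's hand

    void Start()
    {
        // World Space makes the canvas a 3D object in the scene
        // instead of an overlay pinned to the camera.
        canvas.renderMode = RenderMode.WorldSpace;

        // Attach it to an in-world anchor so the player can
        // glance at it, then look away from it.
        canvas.transform.SetParent(anchorPoint, worldPositionStays: false);

        // UI canvases are authored in pixel units; scale down
        // to sensible scene units (placeholder value).
        canvas.transform.localScale = Vector3.one * 0.001f;
    }
}
```

Anchoring the canvas to a wrist, a clipboard, or a fixed spot in the room are all variations on the same idea: the interface lives in the world, not on the player’s eyes.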
One aspect of development that can easily be overlooked by the uninitiated is that—when you’re in VR—stereo sound isn’t quite enough. “You don’t just have left ear and right ear… you have things in front of you, behind you, or above you.” Describing the omnidirectional aspect of sound design in VR, Tony hits on its core problem: “because you can move your head so freely the audio needs to change how it reflects sound.” Unity has spatial audio tools out of the gate, and Oculus Integration comes with a spatializer for Oculus-branded headsets.
It’s fine to look at a screen and hear left/right stereo audio… with varying degrees of volume representing distance, but in VR this approach will lead to a ‘flatter’ soundscape which breaks the immersion. Soundwaves that emanate from behind you aren’t just quieter, they’re hitting your ears from a different angle, bouncing off your environment, and travelling through the air. This isn’t something that simple stereo sound can replicate; “You’re not just dealing with stereo anymore—sound has to be completely spatialized.”
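In Unity’s built-in audio system, the key switch is per-AudioSource: `spatialBlend` moves a source from flat 2D stereo to fully positional 3D audio. A sketch under those assumptions (the distance values are placeholders, and a dedicated spatializer plugin such as the one in Oculus Integration layers head-related processing on top of this):

```csharp
using UnityEngine;

// Sketch: configures an AudioSource for 3D positional sound
// rather than flat left/right stereo.
[RequireComponent(typeof(AudioSource))]
public class SpatialSound : MonoBehaviour
{
    void Start()
    {
        var source = GetComponent<AudioSource>();

        // 0 = pure 2D stereo, 1 = fully spatialized 3D.
        source.spatialBlend = 1f;

        // Distance-based attenuation; logarithmic rolloff tends
        // to sound the most natural.
        source.rolloffMode = AudioRolloffMode.Logarithmic;
        source.minDistance = 1f;   // full volume inside this radius
        source.maxDistance = 25f;  // inaudible beyond this (example values)

        // Hands the source to the spatializer plugin, if one is
        // selected under Project Settings > Audio.
        source.spatialize = true;
    }
}
```

With `spatialBlend` at 1, a sound behind the player actually reads as behind them as they turn their head, which is exactly the effect plain stereo can’t deliver.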
Once you’ve gotten started with Unity development, you’ll discover more tools and methods for building game mechanics and graphics. Meanwhile, Unity Technologies is constantly improving the product and introducing new ways to get things done. Unity’s Universal Render Pipeline is replacing the built-in render pipeline, and because it’s scriptable, developers can look through the pipeline code and see exactly what the engine is doing.
Tony sees this as a significant development in optimizing your Unity project; “how the pipeline goes through and renders a scene are now parameters you can go through and tweak, or you can write your own from scratch.”
Writing a render pipeline from scratch has some clear benefits for Unity development, primarily cutting down on needless processes your particular game doesn’t need; “if your game doesn’t have any sort of real-time lighting, you could create a pipeline that avoids all that code and is faster for not having to process all that.”
But Tony also cautions against neophytes becoming over-ambitious; “If you’re interested in getting into graphics programming—and very intelligent—then I would say that looking through the package that comes with the Universal Render Pipeline would be a great way to discover how a typical render pipeline works. Otherwise I don’t think I would recommend it at all.”
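For readers who do want to peek under the hood, the skeleton of a from-scratch pipeline is surprisingly small. This sketch targets Unity’s Scriptable Render Pipeline API (method signatures have shifted between Unity versions, so treat it as an outline rather than drop-in code):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: a pipeline that only culls and draws unlit geometry --
// no shadow maps or real-time lighting passes ever run.
public class MinimalPipeline : RenderPipeline
{
    protected override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        foreach (var camera in cameras)
        {
            // Push this camera's view/projection matrices.
            context.SetupCameraProperties(camera);

            // Work out what the camera can actually see.
            if (!camera.TryGetCullingParameters(out var cullingParams))
                continue;
            CullingResults cull = context.Cull(ref cullingParams);

            // Draw geometry with an unlit shader pass only.
            var sorting = new SortingSettings(camera);
            var drawing = new DrawingSettings(new ShaderTagId("SRPDefaultUnlit"), sorting);
            var filtering = FilteringSettings.defaultValue;
            context.DrawRenderers(cull, ref drawing, ref filtering);

            context.Submit();
        }
    }
}
```

Everything the pipeline skips—shadows, light loops, post-processing—is simply code that never executes, which is the performance win Tony describes for games with no real-time lighting.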
So there you have it; some great tips on Unity development in VR from the Lead Developer on Flow Weaver, an exciting new game on the Oculus platform!