Introduction

This is a VR demo I produced of a fighter-style game for the Oculus Rift. I used the demo as an opportunity to develop an understanding of the fundamentals of VR development, focusing mostly on the aspects that differ from, or are absent in, traditional non-VR games.

Porting popular non-VR mechanics to a VR game was my main concern, as VR development is still very new and the standard “best practices” for VR are underdeveloped. Another area I was interested in was workflow within the Unity game engine, although many of the lessons and best practices apply equally to other engines.

 

[Screenshot]

 

Adaptation of Non-VR Mechanics

 

General Flying Mechanics

I initially created quite a complex set of controls for the flying, using most of a standard Xbox controller's inputs for flight alone. For example, I had flaps/airbrakes mapped to B, an afterburner/boost button mapped to A, and a cruise control button mapped to Y, along with rudder, aileron, and thrust control mapped to the left stick, RB and LB, and LT respectively.
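For reference, a polling loop for this kind of scheme in Unity's legacy Input class might look like the sketch below. The axis and button names are hypothetical Input Manager entries for an Xbox controller, not the demo's exact configuration.

```csharp
using UnityEngine;

// Sketch of the original (later simplified) control scheme, polled each
// frame. Axis/button names are hypothetical Input Manager entries.
public class ComplexFlightInput : MonoBehaviour
{
    void Update()
    {
        float rudder  = Input.GetAxis("Horizontal");       // left stick
        float thrust  = Input.GetAxis("LeftTrigger");      // LT
        float aileron = (Input.GetButton("RB") ? 1f : 0f)
                      - (Input.GetButton("LB") ? 1f : 0f);

        bool flaps       = Input.GetButton("ButtonB");     // flaps/airbrakes
        bool afterburner = Input.GetButton("ButtonA");     // boost
        bool cruise      = Input.GetButtonDown("ButtonY"); // toggle cruise control

        // ...these values would then drive the flight model...
    }
}
```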

This control scheme, although fun to use in a non-VR game with various boosts and flying modes, was not appropriate for a VR game. The complexity of the layout confused the player, and not being able to physically see the controller made the problem worse than a difficult layout would be in a non-VR game. Furthermore, the lack of visual feedback from the highly technical flying controls I had created broke immersion: the player had to think about a controller in their hands that is not part of the VR world, and, again, that they cannot see in-game.

I solved these issues in the following ways:

Mechanics in VR should use the controller minimally, relying instead on the player's gaze at in-world controls (cockpit controls in this case) to initiate a mechanic such as the afterburners (sketched in code after these points).

I added animations to all the flight controls, and to every dial in the cockpit that had an associated mechanic (also sketched below). If I had the budget for a full game rather than a demo, I would certainly have implemented a 3D character model as well, to further emphasise the feedback from the controls.

Lastly, I stripped out many of the complex controls (such as the afterburners and auto-fly) before publishing the game. I found that the player had far more to focus on within VR, and in fact a couple of interesting mechanics and controls in gameplay were more than enough to occupy the player's full attention.
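Here is a minimal sketch of the gaze-based approach from the first point, assuming cockpit switches are colliders on a "CockpitControls" layer. The layer name, dwell time, and OnGazeActivate message are illustrative, not the demo's exact code.

```csharp
using UnityEngine;

// Minimal sketch of gaze-initiated cockpit controls: dwell your view on
// a switch for a moment to activate it.
public class GazeSwitchActivator : MonoBehaviour
{
    public float maxGazeDistance = 2f; // cockpit controls sit close to the camera
    public float dwellSeconds = 1f;    // how long the player must look at a switch

    private float gazeTimer;

    void Update()
    {
        Transform cam = Camera.main.transform;
        RaycastHit hit;
        // Cast a ray from the centre of the player's view (the HMD camera).
        if (Physics.Raycast(cam.position, cam.forward, out hit,
                            maxGazeDistance, LayerMask.GetMask("CockpitControls")))
        {
            gazeTimer += Time.deltaTime;
            if (gazeTimer >= dwellSeconds)
            {
                // Trigger whichever control the player has dwelt on,
                // e.g. an afterburner toggle component on that switch.
                hit.collider.SendMessage("OnGazeActivate",
                                         SendMessageOptions.DontRequireReceiver);
                gazeTimer = 0f;
            }
        }
        else
        {
            gazeTimer = 0f;
        }
    }
}
```

And for the animated feedback from the second point, a control's visible state can simply be driven from the same input value each frame; again the axis name and travel angle are assumptions:

```csharp
using UnityEngine;

// Sketch of animating a cockpit control from its input value: a throttle
// lever that visibly tracks the player's thrust input.
public class ThrottleLever : MonoBehaviour
{
    public float maxAngle = 60f; // lever travel in degrees

    void Update()
    {
        float thrust = Mathf.Clamp01(Input.GetAxis("LeftTrigger"));
        // Pivot the lever around its local X axis in proportion to thrust.
        transform.localRotation = Quaternion.Euler(-maxAngle * thrust, 0f, 0f);
    }
}
```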

 

Shooting Mechanic

The mechanics of shooting an enemy are an adaptation of classic mechanics, but they make use of the new control that VR head tracking provides. Having to keep focus on the enemy while shooting leads to interesting manoeuvres for the player to perform.

Although this would be considered a basic port of a classic mechanic, I did find that motion sickness was a major issue when developing the shooting system. If the player spent long periods with the cockpit out of view, i.e. during a long engagement, the motion sickness was at its worst. Because the player has free-look in VR, they would often try to spot enemy targets by turning around and looking off to the sides.

I solved this issue by providing the player with additional visual information within the cockpit (a compass and the heading to the enemy target), naturally guiding the player to look back at the cockpit periodically when the enemy was out of sight, and removing the need to rely solely on looking around to find enemy targets.

[Screenshot]

 

Presentation of Information

At this point in development I was sure that well-thought-out presentation of information was invaluable in a VR game. Almost every mechanic I had implemented by that point had required its presentation of information to be tweaked just to make the mechanic feasible.

After fact-finding in a few other popular VR titles, I came up with a comprehensive set of methods for presenting information appropriate to a dogfighter-style game. Below are a couple of examples that I implemented:

Audio.

As soon as a target spawns, the player is informed over radio-style comms of the enemy's distance and the heading to the target. Furthermore, once within 2000 metres of the target, its clock direction (as in 1-12 o'clock) is also called out. (A sketch of how these values can be computed follows the visual example below.)

Visual.

The current heading and the heading to the target are shown below the compass, in black and red respectively. At close range the compass itself becomes a radar map showing the enemy's position. The compass is rendered as a physical display within the cockpit.
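To make the underlying maths concrete, here is a minimal C# sketch of how the distance, compass heading, and clock-direction callouts could be computed; the class and method names are my own, not the demo's actual code.

```csharp
using UnityEngine;

// Sketch of the target callout maths. 'player' is the aircraft transform,
// 'target' the enemy.
public static class TargetCallouts
{
    public static float DistanceTo(Transform player, Transform target)
    {
        return Vector3.Distance(player.position, target.position);
    }

    // Compass heading (0-360 degrees, 0 = world +Z, treated as north)
    // from the player to the target, measured in the horizontal plane.
    public static float HeadingTo(Transform player, Transform target)
    {
        Vector3 toTarget = target.position - player.position;
        toTarget.y = 0f;
        float heading = Mathf.Atan2(toTarget.x, toTarget.z) * Mathf.Rad2Deg;
        return (heading + 360f) % 360f;
    }

    // Clock direction (1-12 o'clock) relative to the aircraft's current yaw:
    // dead ahead is 12 o'clock, and each "hour" spans 30 degrees clockwise.
    public static int ClockDirection(Transform player, Transform target)
    {
        float relative = (HeadingTo(player, target)
                          - player.eulerAngles.y + 360f) % 360f;
        int hour = Mathf.RoundToInt(relative / 30f) % 12;
        return hour == 0 ? 12 : hour;
    }
}
```

Both the radio callouts and the cockpit display can read from the same helpers, which keeps the audio and visual channels consistent with each other.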

 

I found the combination of good audio and visual information to be a necessity. Early on I tinkered with switching between audio and visual as the primary method of informing the player of the enemy's location, for example relying on audio only while the target was out of the player's sight and switching to visual cues once within range; however, I felt this not only broke the immersion of the game but actually made tracking the enemy much harder in gameplay. The immersion that VR gives allows us to trick the mind into believing the scenario is real, much like a Boeing flight simulator does. This lets the player take in far more information simultaneously than would be appropriate in a non-VR game, just as you could in real life, since there are no external distractions. I believe this leads to the best experience for the player: a total VR experience.

Note that both of the examples of information presentation I gave took significant resources and time to implement. They proved, however, to be as essential to the game as terrain textures or 3D models, so if you are making your own VR game, I advise you to make this a priority in your planning.

 

[Screenshot: labelled information presentation]

 

General Workflow

The workflow I used when developing this demo was different to the one I usually use. My standard practice is to split the features/mechanics of a game into chunks and work on them sequentially (one after the other), as they come up or are depended upon within gameplay. For example, I would not usually have included sounds such as wind, or effects such as clouds, as early as I did during this development.

 

However, VR poses new challenges compared to normal non-VR games. In the first couple of weeks of development, progress ground to a halt as I suffered from motion sickness; at the time I had merely a plane and a terrain, with no sound or other visual effects, and was trying to work on the shooting system with some simple ground targets. I swiftly changed tack and moved to a more iterative design process (Link to Unreal engine example), as I had seen earlier in a workflow demonstration from Unreal. This allowed me to get decent-quality visual effects and reasonable placeholder sounds in place to an adequate level to continue development. I would count this as one of the more important lessons I learned during development.

 

[Screenshot]

 

Procedurally generated clouds using 3D Perlin noise and the Unity particle system. The alpha-blended particles make the clouds quite realistic to fly through.
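For anyone curious how this can be put together: Unity's built-in Mathf.PerlinNoise is 2D only, so a common approximation of 3D noise is to average 2D samples taken on three axis-aligned planes. The sketch below uses that trick to scatter particles into a cloud layer; all names, thresholds, and sizes are illustrative rather than the demo's actual values.

```csharp
using UnityEngine;

// Sketch: scatter cloud particles where an approximate 3D Perlin field
// exceeds a density threshold.
[RequireComponent(typeof(ParticleSystem))]
public class NoiseClouds : MonoBehaviour
{
    public int samples = 2000;         // candidate positions to test
    public float areaSize = 4000f;     // horizontal extent of the cloud layer
    public float layerHeight = 600f;   // vertical thickness of the layer
    public float noiseScale = 0.001f;  // world units -> noise space
    public float threshold = 0.55f;    // density above which a particle spawns

    // Approximate 3D Perlin noise by averaging three 2D slices.
    static float Perlin3D(Vector3 p)
    {
        float xy = Mathf.PerlinNoise(p.x, p.y);
        float yz = Mathf.PerlinNoise(p.y, p.z);
        float xz = Mathf.PerlinNoise(p.x, p.z);
        return (xy + yz + xz) / 3f;
    }

    void Start()
    {
        var ps = GetComponent<ParticleSystem>();
        for (int i = 0; i < samples; i++)
        {
            Vector3 pos = new Vector3(
                Random.Range(-areaSize, areaSize),
                Random.Range(0f, layerHeight),
                Random.Range(-areaSize, areaSize));

            if (Perlin3D(pos * noiseScale) > threshold)
            {
                var emit = new ParticleSystem.EmitParams { position = pos };
                ps.Emit(emit, 1);
            }
        }
    }
}
```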

 

Engine-related problems I had during development (Unity-specific)

Overall the VR framework was relatively stable while I was developing the game; however, I did encounter a couple of game-breaking bugs within Unity (that had not been patched at the time).

1) The game camera's positioning would become unstable at large position coordinates (i.e. if you flew too far or too high): the camera would shake, from what I assume is a float-precision issue in passing the VR head-tracking data to Unity.

I solved this issue in a rather hacky way, but it gets the job done until Unity patches the issue. The player's position is reset to (0, 0, 0) whenever they move more than 500 metres from the origin, and all objects in the world sit under one GameObject, which is offset by minus the player's position, thus keeping the coordinates stable.
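A minimal sketch of that workaround, assuming all world geometry sits under a single worldRoot transform (the field names are mine; the 500 metre threshold is from the text above):

```csharp
using UnityEngine;

// Floating-origin workaround: when the player drifts too far from the
// origin, shift the whole world back and recentre the player, so the
// camera always works with small, precise coordinates.
public class FloatingOrigin : MonoBehaviour
{
    public Transform player;     // the tracked player/cockpit rig
    public Transform worldRoot;  // parent of every world object
    public float threshold = 500f;

    void LateUpdate()
    {
        Vector3 offset = player.position;
        if (offset.magnitude > threshold)
        {
            worldRoot.position -= offset;  // move the world the opposite way
            player.position = Vector3.zero; // reset the player to the origin
        }
    }
}
```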

 

2) You must set forward rendering on cameras that you intend to use as a separate UI layer (i.e. drawing over another camera); with deferred rendering it was inconsistent which camera would draw over which, regardless of the draw order set. Again, I expect this to be patched in the not-too-distant future.
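In practice this amounts to a few camera settings. Below is a sketch of how the overlay camera could be configured; the component and layer names are illustrative.

```csharp
using UnityEngine;

// Sketch of the overlay-camera setup: force forward rendering to avoid
// the inconsistent overdraw seen with deferred rendering.
[RequireComponent(typeof(Camera))]
public class UIOverlayCameraSetup : MonoBehaviour
{
    void Awake()
    {
        Camera uiCam = GetComponent<Camera>();
        uiCam.renderingPath = RenderingPath.Forward;        // avoid the deferred overdraw bug
        uiCam.clearFlags = CameraClearFlags.Depth;          // keep the main camera's colour image
        uiCam.depth = 10f;                                  // render after the main camera
        uiCam.cullingMask = LayerMask.GetMask("CockpitUI"); // draw only the UI layer
    }
}
```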

 

[Screenshot]

 

3D terrain of a real place, generated using a third-party asset that uses Bing Maps for terrain heights and textures.