My Favorite VR Experiences So Far

Now that I’ve had plenty of time to go through the launch content of both Oculus and Vive, I figured I’d highlight my favorite experiences you can try for both devices instead of a typical product review. Many of these games are available for both platforms, while some are exclusive.

My Retail Vive Finally Arrived!

Adr1ft (Oculus)

This is the flagship release for Oculus and deservedly so. Although not a pure VR experience (it also works as a standard game), it’s an absolutely wild trip in VR. Billed as a First Person Experience (FPX), it ranks somewhere between a walking simulator like Firewatch and an adventure such as Bioshock on the “Is It a Game?” scale.

This is consistently one of the top-selling Oculus titles, yet it ranks near the bottom on comfort. I had no nausea issues at all, but I generally don’t feel uncomfortable in most VR games. I can see how free-floating in zero gravity, desperately grasping at oxygen canisters as you slowly suffocate inside a claustrophobic space suit, could cause issues for those prone to simulation sickness. Regardless, this shows that it pays to be hardcore when making VR experiences–especially at this early adopter stage of the market.

A stunning debut for Adam Orth’s threeonezero studio.

Firma (Oculus)

This is perhaps one of my absolute favorite pure VR games so far. Think Lunar Lander, Space Taxi, or Thrust in VR. If this were a standard video game, it would be mundane, but as a VR experience I really do feel like I have a job piloting a tiny lander craft on a desolate moon base. It actually sort of achieves presence for me–not the feeling of being in another reality, but more like being in an ‘80s sci-fi movie.

Originally available via Oculus Share for years, the game has obviously had a lot of work put into it for the commercial Oculus release. There are tons of missions, great voice acting, and a lot of fun mechanics and scenarios. This game is giving me plenty of ideas on how to adapt my old Ludum Dare game to VR.

Strangely, this game is in Oculus’ Early Access section, even though I consider it a complete game.

The Apollo 11 Virtual Reality Experience (Oculus, Vive)

An astounding educational journey through America’s moon landing told via VR. This is better than any field trip I took as a kid to the Boston Museum of Science, that’s for sure. This is just the tip of the spear when it comes to education and VR.

Hover Junkers (Vive)

Hover Junkers requires the most physical activity of any VR game I’ve played–so much so that after 20 minutes of shooting, cowering behind my hovercraft’s hull for cover, and frantically speeding around post-apocalyptic landscapes, my Vive was soaked in sweat. One thing is for sure: public VR arcades are going to need some kind of solution to keep these headsets sanitary. Hover Junkers is certainly the most exciting multiplayer experience I’ve had in VR so far.

Budget Cuts (Vive)

The absolute best example of room scale VR. I didn’t really get it when watching the videos, but when I was finally able to try the demo on my own Vive…wow. This is the game I let everyone try when they first experience Vive. It really nails the difference between seated, controller-based VR and a room scale, hand-tracked experience. This is the first “real game” I’ve played that uses all of these elements. So many firsts here, and done so well.

The past month has been a very encouraging start for VR. At this early stage there are already several games that give me that “just one more try” lure. This is surprising given that many current VR titles are small in scope, and in some cases partially-finished early access experiences. With the launch of PSVR later this year, we’re sure to see more full-sized VR games…whatever that means.

The Beginner’s Guide: Dave the Madman Edition

I recently played The Beginner’s Guide after buying it during the annual Holiday Steam Sale over the break. It’s a quick playthrough, and an interesting way to tell a story within a game. Without giving too much away, the experience reminded me of a similar event in my young adulthood, when I encountered an amazing game developer who created incredible works I couldn’t hope to match. I’ve since forgotten his real name and don’t know much about him. But I do have the 4 double-sided floppy disks he sent me of all his games at the time.

Madsoft 1-4, recovered in great condition

This was the early ‘90s–I’d say around 1990-1991. I had made a bunch of Commodore 64 games (often with my late friend Justin Smith) using Shoot ‘Em Up Construction Kit: an early game development tool that let you build neat scrolling shooters without any programming knowledge.

Adventures in Stupidity, one of my SEUCK creations

I used to upload my games to local BBSes in the New England area and wait for the response on the message boards. In the process, I downloaded some games made by a user known by the handle “MADMAN.” Some of his games also used the moniker “Dave the Madman.” He made seemingly professional-quality games using Garry Kitchen’s Game Maker (not to be confused with YoYo Games’ GameMaker Studio).

Garry Kitchen’s Game Maker was an early game development tool published by Activision in 1985. I actually got it for my birthday in 1986, thinking it was my key to becoming a superstar game designer. The thing is, Game Maker was a full-blown programming language that, strangely, used the joystick to edit. It also included a sprite designer, music editor, and other tools. Everything a budding game developer would need to get started, right?

Although I did make a few simple games in Game Maker, its complexity was beyond my grasp at the time. Which is why Madman’s creations blew me away. They were so polished! He had developed so many completely different types of games! They all had cool graphics, animation, music, and effects I couldn’t figure out how to duplicate! My favorite was Space Rage: a sprawling, multi-screen space adventure that I simply could not comprehend. I had so many questions about how these games were made!

SPACE RAGE!

We messaged each other on a local BBS. I blathered about how much of a fan I was of his work and he said he liked my games, too. I figured he was just being kind. After all, this was a MASTER saying this! We eventually exchanged phone numbers.

I have vague memories of talking to him on the phone, asking how he accomplished such amazing feats using Game Maker. I think he was a little older than me, but many of his games had a 1987 copyright date. Considering I was now about the same age he’d been in 1987, this made me feel quite inadequate.

As I recall, Madman was humble and didn’t have many aspirations beyond distributing his little games on BBSes. He seemed like a hobbyist that figured out Game Maker and really liked making games with it–nothing more, nothing less.

Fat Cat probably has the best animation of them all

After our call, he mailed me a complete collection of his games. A few years ago I found these floppy disks and copied them to my Mac using a 1541 transfer cable. The disks bear his handwriting, labeled “Madsoft” 1 – 4. I was able to rescue all of the disks, converting them to d64 format.

Playing through his creations was a real trip down memory lane. The most shocking thing I discovered is on the second side of the fourth disk. His Archon-like game, Eliminators, features the text “Distributed by Atomic Revolution” at the bottom of the title screen. Atomic Revolution was a game ‘company’ I briefly formed with my childhood friend Cliff Bleszinski around 1990 or so. It was a merger of sorts between my label, “Atomic Games,” and Cliff’s, “Revolution Games.” (The story about the C64 game he made in my parents’ basement is a whole other post!)

An Atomic Revolution production?

I must have discussed handling the distribution of Eliminators with Dave: uploading and promoting his awesome game all over the local BBS scene and sending it to mail-order shareware catalogs. At least that’s my best guess–I really have no recollection of how closely we worked together. I must have done a terrible job, since this game was almost completely lost to the mists of time.

I think we talked about meeting up and making a game together–but I didn’t even have my learner’s permit yet. Online communication tools were primitive, if they existed at all. We never really collaborated. I wonder what happened to “Dave the Madman” and his “Madsoft” empire. Is he even still alive? Did he go on to become a game developer, or at least a software engineer? Maybe he’ll somehow see this post and we’ll figure out the answer to this mystery!

I remember he was most proud of his Ataxx homage

Until then, I’ll add the disk images of Madsoft 1-4 to this post. Check the games out and let me know what you think. I’ve also put up some screenshots and videos of his various games–but I’m having problems finding a truly accurate C64 emulator for OS X. If anyone has any suggestions, let me know!

Here’s the link to the zip file. Check these games out for yourself!

The Basics of Hand Tracked VR Input Design

Ever since my revelation at Oculus Connect I’ve been working on a project using hand tracking and VR. For now, it’s using my recently acquired Vive devkit. However, I’ve been researching design techniques for PSVR and Oculus Touch to keep the experience portable across many different hand tracking input schemes. Hand tracking has presented a few new problems to solve, similar to my initial adventures in head tracking interfaces.

The Vive's hand controller

Look Ma, No Hands!

The first problem I came across when designing an application that works on both Vive and Oculus Touch is the representation of your hands in VR. With Oculus Touch, most applications feature a pair of “ghost hands” that mimic the current pose of your hands and fingers. Since Oculus’ controllers can track your thumb and first two fingers, and presumably the rest are gripped around the handle, these ghost hands tend to accurately represent what your hands are doing in real life.

Oculus Touch controller

This metaphor breaks down with Vive as it doesn’t track your hands, but the position of the rod-like controllers you are holding. Vive games I’ve tried that show your hands end up feeling like waving around hands on a stick–there’s a definite disconnect between the visual of your hands in VR and where your brain thinks they are in real life. PSVR has this problem as well, as the Move controllers used with the current devkit are similar to Vive’s controllers.

You can alleviate this somewhat. Because most users tend to grip the Move and Vive controllers in the same natural way, you can model and position the “hand on a stick” to match that most likely grip. This can make static hands in VR more convincing.

In any case, you have a few problems when you grab an object.

For Oculus, the act of grabbing is somewhat natural–you can clench your first two fingers and thumb into a “grab” type motion to pick something up. In the case of Bullet Train, this is how you pick up guns. The translucent representation of your hands means you can still see your hand pose and the gripped object at the same time. There’s not much to think about other than where you attach the held object to the hand model.

It also helps that in Bullet Train the objects you can grab have obvious handles and holding points. You can pose the hand to match the most likely hand position on a grabbed object without breaking immersion.

With Vive and PSVR you have a problem if you are using the “hand on a stick” technique. When you “grab” a virtual object by pressing the trigger, how do you show the hand holding something? It seems like the best answer is, you don’t! Check this video of Uber Entertainment’s awesome Wayward Sky PSVR demo:

Notice anything? When you grab something, the hand disappears. All you can see is the held object floating around in front of you.

This is a great solution for holding arbitrarily shaped items because you don’t have to create a potentially infinite number of hand grip animations. Because the user isn’t really grabbing anything and is instead clicking a trigger on a controller, there is no “real” grip position for the hand anyway. You also don’t have the problem of parts of the hand intersecting with the held object.

This isn’t a new technique. In fact, one of the earliest Vive demos, Job Simulator, does the exact same thing. Your brain fills in the gaps and it feels so natural that I just never noticed it!
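
In Unity, the core of this trick is tiny. Here’s a minimal sketch (the component and field names are my own placeholders, not code from any of these games):

```csharp
using UnityEngine;

// Minimal sketch of the "hide the hand while holding" trick.
// handModel and attachPoint are placeholder references you'd wire
// up to your controller rig in the inspector.
public class GrabController : MonoBehaviour
{
    public GameObject handModel;   // the visible hand mesh
    public Transform attachPoint;  // where held objects snap to

    GameObject heldObject;

    public void Grab(GameObject target)
    {
        heldObject = target;
        heldObject.transform.parent = attachPoint;
        heldObject.transform.localPosition = Vector3.zero;
        heldObject.transform.localRotation = Quaternion.identity;
        handModel.SetActive(false);  // the hand vanishes; only the object remains
    }

    public void Release()
    {
        if (heldObject == null) return;
        heldObject.transform.parent = null;
        heldObject = null;
        handModel.SetActive(true);   // the hand pops back in
    }
}
```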

Virtual Objects, Real Boundaries

The next problem I encountered: what do you do when your hand can pass through virtual objects, but the object you’re holding can’t? For instance, you can be holding an object and physically move your real, tracked hand through a virtual wall. The held object, bound by the engine’s physics simulation, will hit the wall while your hand continues to drag it through. Chaos erupts!

You can turn off collisions while an object is held, but what fun is that? You want to be able to knock things over and otherwise interact with the world while holding stuff. Plus, what happens when you let go of an object while inside a collision volume?

What I ended up doing is making the object detach, or fall out of your virtual hand, as soon as it hits something else. You can tweak this by making collisions with smaller, non-static objects less likely to detach the held object since they will be pushed around by your hand.
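
Here’s a rough sketch of that detach logic, assuming the held object keeps an active Rigidbody while held, and reusing the hypothetical GrabController from the earlier snippet:

```csharp
using UnityEngine;

// Rough sketch: drop the held object when it hits something solid.
// Assumes the held object keeps an active (non-kinematic) Rigidbody
// while held, and uses the hypothetical GrabController from above.
public class HeldObject : MonoBehaviour
{
    public GrabController owner;
    public float detachMassThreshold = 5f;  // tune to taste

    void OnCollisionEnter(Collision collision)
    {
        Rigidbody other = collision.rigidbody;

        // Static geometry (no rigidbody), kinematic bodies, and heavy
        // objects force a detach; small dynamic props just get shoved.
        if (other == null || other.isKinematic || other.mass > detachMassThreshold)
        {
            owner.Release();
        }
    }
}
```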

For most VR developers these are the first two things you encounter when designing an experience for hand-tracking VR systems. It seems Oculus Touch makes a lot of these problems go away, but we’ve just scratched the surface of the issues that need to be solved when your real hands interact with a virtual world.

HoloLens is Ready for Prime Time

Microsoft recently invited me to try out HoloLens at their Venice space on Abbot Kinney. Having just won the AR/VR award in the Tango App Contest for InnAR Wars, I jumped at the chance. After developing for Google Tango for over a year, I had a long wishlist of features I’m looking for in an AR platform. In the unlikely event that HoloLens was for real, I could make InnAR Wars exactly how I envisioned it at the start of the project.

Entering the HoloLens Demo Zone

My skepticism was well warranted. Having worked in AR for the past 5 years, I’ve seen my share of fake demo videos and smoke-and-mirrors pitches. Every AR company seems obligated to put out faked demo videos that the real product inevitably fails to live up to. Just look at this supercut of utterly ridiculous promotional videos. I suspected the staged HoloLens demos weren’t much better.

I had heard many polarizing opinions about HoloLens from people who have tried it. Some felt it was an incredible experience while others told me it was the single most overhyped disappointment in consumer electronics history.

After trying it for myself, I can say HoloLens is the real deal.

The demo I tried was the “X-Ray” game shown at BUILD not too long ago. The version I played was a little simpler than that staged demonstration–your arm isn’t magically turned into a plasma cannon. Instead, you hold your hand out in front of the device and “air tap” to fire at enemies that appear to be coming out of cracks in the walls. Occasionally you can give a voice command to freeze time, Matrix-style, and take out enemies while they are vulnerable.

A simple game, for sure, but a great demonstration of HoloLens’ capabilities.

The device is clearly a prototype. It’s bulky, looks like a vision of the future from a ’90s sci-fi flick, and it even BSODed on me, which was kind of hilarious. Still, I was thoroughly impressed with HoloLens, and here’s why:

Meshing

When the game starts, you have to look around the room and watch it build geometry out of the walls and other objects in the area. HoloLens uses a depth camera to construct a mesh of the real world, which is then used in gameplay. It’s kind of like building a video game level out of all the walls and floors in your room. This mesh is used to place virtual wall cracks and spawn points for enemies. Once you’ve built a complete mesh of the room, the game begins.

This same meshing process is possible with Google Tango, but it’s slow and temperamental. Still, it’s very impressive given it’s in a tablet you can buy right now. In fact, I used Tango’s meshing capabilities to place floor traps in InnAR Wars.

I was impressed with HoloLens’ rapid meshing as I moved around my environment. It even handled dynamic objects, such as the woman guiding me through the demo. When I looked at her during the meshing phase, she quickly transformed into a blocky form of red polygons.

Display

Initially I was disappointed in the display. Much like Epson’s Moverio or ODG’s R7 glasses, it projects the augmentation onto a part of the “glass” in front of your eyes. This means you see a distracting bright rectangle in the middle of your vision where the augmentation will appear. Compared to ODG’s R7s, HoloLens seemed to have higher contrast between the part of the display that’s augmented and the part that’s not. There’s also an annoying reflection that looks like a blue and green “shadow” of the augmentation above and below the rectangle.

While playing the game, none of this mattered. Although everything is somewhat translucent, if the colors are bright enough the virtual objects appear solid. Combined with rock-solid tracking on the environment, I soon forgot all about the contrast issues and internal reflections. Many of these issues can be dealt with through art and the lighting of the room you’re playing in. Plus, a Microsoft representative assured me the display is even better in the current version still in their labs.

Field of View

The top issue people have with HoloLens is the field of view. People complain that it only shows graphics in a postcard-sized space in front of your vision. I had rock-bottom expectations here, having developed applications on previous-generation wearable AR displays. Although HoloLens’ augmentation is limited to a small rectangle in front of your vision, this space is easily twice the size of what other platforms offer. In the heat of the action while playing X-Ray, I mostly forgot about this restriction.

Field of view is not an easy thing to solve–it’s a fundamental problem with the physics of waveguide optics. I’m not sure how much progress Microsoft can make here. But the FOV is already wide enough for a lot of applications.

I’m All In

Part of the pitch is the “Windows Holographic” platform. That is, in the future you won’t have screens. With HoloLens you’ll be able to place Windows applications in mid-air anywhere in your office. Applications will float around you like monitors hovering in space. (Magic Leap’s fake demo video shows an example of how this would work.) This can actually be done right now with HoloLens and existing Windows applications. Supposedly, some Microsoft engineers wear HoloLens and have integrated the “holographic” workspace into their daily work.

Beyond gaming, I am on board with this screen-free future. Your tablet, computer, even your phone or smartwatch will merely be a trackable object to display virtual screens on. Resolution and screen size will be unlimited, and you can choose to share these virtual screens with other AR users for collaboration. No need to rearrange your office for that huge 34-inch monitor. You can simply place it in 3D space and overlay it on top of reality. Think of all the extra stuff your phone could do if it didn’t have to power a giant LCD! It’s definitely on its way. I’m just not sure exactly when.

VR with a Gamepad Sucks

I was kind of bummed my first day of Oculus Connect 2.

Last year’s Oculus Connect was revelatory to me. Even though I had worked on two different Gear VR titles by then, the Crescent Bay demo was incredible in comparison. From Oculus’ own vignette demos to Epic’s Showdown sequence, the leap in quality from DK2 to Crescent Bay was astounding. Everyone walked out of that demo with a huge smile on their face.

The first demos I tried at OC2 were the gamepad demos. Oculus spent an absurd amount of time at their E3 keynote talking about how amazing it was that they were launching with the Xbox 360 controller as the input device. At Oculus Connect, I put this claim to the test.

Every game from EVE Valkyrie to Edge of Nowhere seemed like playing a regular video game strapped to my face. I felt like I was playing an Xbox One through binoculars. In fact, a few of the games made me a little queasy–which I’m usually not susceptible to.

Maybe I’m just jaded having been developing gamepad VR experiences on Gear VR for a while, I thought.

Later on I tried Toybox, which is a cool tech demo but doesn’t really illustrate how you’d play an actual game for any length of time with the Touch controllers. In fact, I found the controllers a little hard to use compared to the Vive’s. They have tons of confusing buttons, and getting the finger gestures right took a little bit of work.

I was leaving the demo area and getting ready to head home when a friend of mine who works for Oculus stopped to ask what I thought. I told him honestly that I felt last year’s demos were better–they were more immersive and interesting. Although a little taken aback by my impressions, he strongly suggested I come by the next day for the next set of demos. He couldn’t tell me what they were, but promised they’d be awesome.

The Oculus Connect app sent a notification alerting me that new demo registrations would be available at 8 AM. I set my alarm and woke up the next morning to register for the Touch demos via my iPhone. I promptly slept through the keynote and arrived on the scene at noon for my demo.

We were only allowed to try two games, and it was heavily suggested I try Epic’s “Bullet Train” experience. Having not seen the keynote, I had no idea what I was getting into.

Bullet Train is mind blowing.

Bullet Train is essentially Time Crisis in VR. When I saw the Showdown demo last year, I thought a game like this in VR would be a killer app. One of my favorite coin-ops of all time is Police 911, which motion-tracks your body with a pair of cameras so you can duck behind obstacles. I thought doing this in VR would be amazing. However, last year there were no hand tracking controls–it was just a vague idea.

Here, Epic took the Touch controllers and made an incredible arcade shooter that could be a killer app if they choose to develop it further. Oculus really needs to do everything in their power to get Epic to produce this as a launch title for the Touch controllers.

The Touch controls make all the difference. From handling weapons and grenades to plucking bullets out of the air in slow motion, Bullet Train really drives home how flexible the Touch controllers are. Unlike the Vive’s controllers, which feel like holding a set of tools, these let you reach out and grab stuff–even pump a shotgun.

The combination of standing in a motion-tracked volume and visceral interaction with the world using your hands–even with Touch’s primitive finger-gesture technology–immerses you in an experience far beyond what’s possible sitting in a chair with an Xbox controller.

It’s disappointing that Touch won’t launch with Oculus’ headset. Hand tracking is absolutely required for a truly immersive experience. Developing games that support both gamepad and Touch controls is going to be difficult without diluting features for one or the other. I’ve experienced similar issues developing games that work with both Gear VR’s touchpad and a Bluetooth gamepad.

I left Oculus Connect 2 reinvigorated, with the feeling that VR with hand tracking is the One True VR Experience. Gamepad is fine for mobile VR at the moment, but all of my PC and console VR projects are now being designed around Touch and hand-tracked input. It’s the only way!

The Challenge of Building Augmented Reality Games In The Real World

Last week I submitted the prototype build of my latest augmented reality project, InnAR Wars, to Google’s Build a Tango App Contest. It’s an augmented reality multiplayer space RTS built for Google’s Tango tablet that utilizes the environment around you as a game map. The game uses the Tango’s camera and Area Learning capabilities to superimpose an asteroid-strewn space battlefield over your real-world environment. Two players holding Tangos walk around the room hunting for each other’s bases while sending attack fleets at the other player’s structures.

Making InnAR Wars fun is tricky because I essentially have no control over the map. The battlefield has to fit inside the confines of the real-world environment the tablets are in. Using the Tango’s Area Learning capabilities with the positions of players, I know the rough size of the play area. With this information I adjust the density of planetoids and asteroids based on the size of the room. It’s one small way I can make sure the game at least has an interesting number of objects in the playfield regardless of the size of the area. As you can see from the videos in this post, it’s already being played in a variety of environments.
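
To illustrate the idea (this isn’t the shipped InnAR Wars code, and the numbers are made up), the density adjustment boils down to something like this:

```csharp
using UnityEngine;

// Illustrative sketch: scale asteroid count with the measured play area
// so small rooms aren't cluttered and big rooms aren't empty. The bounds
// would come from Tango's Area Learning data; all numbers are placeholders.
public class BattlefieldSpawner : MonoBehaviour
{
    public GameObject asteroidPrefab;
    public float asteroidsPerCubicMeter = 0.5f;

    public void Populate(Bounds playArea)
    {
        float volume = playArea.size.x * playArea.size.y * playArea.size.z;
        int count = Mathf.Clamp(
            Mathf.RoundToInt(volume * asteroidsPerCubicMeter), 8, 64);

        for (int i = 0; i < count; i++)
        {
            Vector3 pos = new Vector3(
                Random.Range(playArea.min.x, playArea.max.x),
                Random.Range(playArea.min.y, playArea.max.y),
                Random.Range(playArea.min.z, playArea.max.z));
            Instantiate(asteroidPrefab, pos, Random.rotation);
        }
    }
}
```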

This brings up the biggest challenge of augmented reality games: how do you make a game fun when you have absolutely no control over the environment in which it’s played? One way is to require the user to set up the play space as if she were playing a board game. By using Tango’s depth camera, you could detect the shapes and sizes of objects on a table and use those as the playfield. It’s up to the user to set it up in a way that’s fun–much like playing a tabletop war game.

For the final release, I’m planning on using Tango’s depth camera to figure out where the room’s walls, ceilings, and floors are. Then I can have ships launch from portals that appear to open on the surfaces of the room. Dealing with the limited precision and performance of the Tango depth camera along with the linear algebra involved in plane estimation is a significant challenge. Luckily, there are a few third-party solutions for this I’m evaluating.
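
For a sense of what’s involved, here’s a bare-bones RANSAC plane fit over a point cloud. The third-party solutions are far more robust than this sketch, but the basic technique looks something like:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Bare-bones RANSAC plane estimation over depth-camera points.
// A sketch of the general technique, not production code.
public static class PlaneEstimator
{
    public static bool Estimate(List<Vector3> points, out Plane best,
                                int iterations = 200, float inlierDist = 0.05f)
    {
        best = new Plane();
        int bestInliers = 0;

        for (int i = 0; i < iterations; i++)
        {
            // Pick three random points and form a candidate plane.
            Vector3 a = points[Random.Range(0, points.Count)];
            Vector3 b = points[Random.Range(0, points.Count)];
            Vector3 c = points[Random.Range(0, points.Count)];
            Vector3 normal = Vector3.Cross(b - a, c - a);
            if (normal.sqrMagnitude < 1e-6f) continue;  // degenerate sample

            Plane candidate = new Plane(normal.normalized, a);

            // Count points lying close to the candidate plane.
            int inliers = 0;
            foreach (Vector3 p in points)
                if (Mathf.Abs(candidate.GetDistanceToPoint(p)) < inlierDist)
                    inliers++;

            if (inliers > bestInliers)
            {
                bestInliers = inliers;
                best = candidate;
            }
        }

        // Only call it a wall/floor if a decent fraction of points agree.
        return bestInliers > points.Count / 4;
    }
}
```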

Judging by augmented reality startups’ obligatory fake demo videos, the future of AR gaming seems exciting. But the practical reality of designing a game to be played in reality–which is itself rather poorly designed–can prevent even the most amazing technology from enabling great games. It’s probably going to take a few more hardware generations to not only make the technology usable, but also to develop the design language for great games that work in AR.

If you want to try out the game, I’ll have a few Tangos on hand at FLARB’s VRLA Summer Expo table. Stop by and check it out!

How To Support Gear VR and Google Cardboard In One Unity3D Project

Google Cardboard is a huge success. Cardboard’s userbase currently dwarfs that of Gear VR. Users, investors, and collaborators who don’t have access to Gear VR often ask for Cardboard versions of my games. As part of planning what to do next with Caldera Defense, I decided to create a workflow to port between Gear VR and Cardboard.

Always keep a Cardboard on me at ALL TIMES!

I used my VR Jam entry, Duck Pond VR, as a test bed for my Unity3D SDK switching scripts. It’s much easier to do this on a new project. Here’s how I did it:

Unity 4 vs. Unity 5

Google Cardboard supports Unity 4 and Unity 5. Although Oculus’ mobile SDK will technically work on Unity 5, you can’t ship with it because bugs in the current version of Unity 5 cause memory leaks and other issues on the Gear VR hardware. Unity is working on a fix but I haven’t heard any ETA on Gear VR support in Unity 5.

This is a bummer since the Cardboard SDK for Unity 5 supports skyboxes and other features in addition to the improvements Unity 5 has over 4. Unfortunately you’re stuck with Unity 4 when making a cross-platform Gear VR and Cardboard app.

Dealing With Cardboard’s Lack of Input

Although Gear VR’s simplistic touch controls are a challenge to develop for, the vast majority of Cardboards have no controls at all! Yes, Google Cardboard includes a clever magnetic trigger for a single input event. Yet the sad fact is that this magnetometer-based trigger doesn’t work reliably on many Android devices.

You have a few other control options that are universal to all Android devices: the microphone and Bluetooth controllers. By keeping the microphone open, you can use loud sounds (such as a shout) to trigger an action. You can probably use something like the Pitch Detector plug-in for this. Or, if your Cardboard has a head strap for hands-free operation, you can use a Bluetooth gamepad for controls.
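
As an example of the microphone route, a crude shout trigger using Unity’s built-in Microphone API could look like this. It’s a sketch that keys off raw loudness rather than pitch, and the threshold is a made-up number you’d tune per device:

```csharp
using UnityEngine;

// Crude "shout to act" trigger using Unity's Microphone API.
// Watches the last few samples of a looping mic recording and fires
// when the peak amplitude crosses a threshold. All numbers are guesses.
public class ShoutTrigger : MonoBehaviour
{
    public float threshold = 0.5f;
    string micDevice;
    AudioClip micClip;

    void Start()
    {
        if (Microphone.devices.Length == 0) return;
        micDevice = Microphone.devices[0];
        micClip = Microphone.Start(micDevice, true, 1, 44100);  // 1s looping clip
    }

    void Update()
    {
        if (micClip == null) return;
        int pos = Microphone.GetPosition(micDevice);
        const int window = 128;
        if (pos < window) return;

        float[] samples = new float[window];
        micClip.GetData(samples, pos - window);

        float peak = 0f;
        foreach (float s in samples)
            peak = Mathf.Max(peak, Mathf.Abs(s));

        if (peak > threshold)
            OnShout();
    }

    void OnShout()
    {
        Debug.Log("Shout detected!");  // trigger your action here
    }
}
```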

Because of this general lack of input, many Cardboard apps use what I call “stare buttons” for GUIs. These are buttons that trigger if you look at them long enough. I’ve implemented my own version. The prefab is here, the code is here. It even hooks into the new Unity UI event system so you can use it with my Oculus world space cursor code.
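
Stripped way down, the gist is just a gaze raycast plus a dwell timer. This is an illustration rather than the exact code linked above:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Stripped-down gaze-dwell ("stare") button: fires after the user looks
// at it continuously for dwellTime seconds. Assumes the main camera is
// the player's head and this object has a collider.
public class StareButton : MonoBehaviour
{
    public float dwellTime = 2f;
    public UnityEvent onActivated;

    float gazeTimer;

    void Update()
    {
        Transform head = Camera.main.transform;
        RaycastHit hit;
        bool gazedAt = Physics.Raycast(head.position, head.forward, out hit)
                       && hit.collider.gameObject == gameObject;

        gazeTimer = gazedAt ? gazeTimer + Time.deltaTime : 0f;

        if (gazeTimer >= dwellTime)
        {
            gazeTimer = 0f;
            onActivated.Invoke();
        }
    }
}
```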

Gear VR apps must be redesigned to fit within Cardboard’s constraints, whether that means the limited controls or the performance of low-end devices. Most of my Cardboard ports are slimmed-down Gear VR experiences. In the case of Caldera Defense, I’m designing a simplified auto-firing survival mode for the Cardboard port. I’ll merge this mode back into the Gear VR version as an extra game mode in the next update.

Swapping SDKs

This is surprisingly easy. You can install the Cardboard and Gear VR SDKs in a single Unity project with almost no problems. The only conflict is they both overwrite the Android manifest in the plugin folder. I wrote an SDK swapper that lets you switch between the Google Cardboard and Oculus manifests before you do a build. You can get it here. This editor script has you pick where each manifest file is for Cardboard and Gear VR and will simply copy over the appropriate file to the plugin folder. Of course for iOS Cardboard apps this isn’t an issue.
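
The core of a swapper like this is just a file copy wrapped in editor menu items. A simplified sketch (the paths are placeholders, and the script has to live in an Editor folder):

```csharp
using System.IO;
using UnityEditor;
using UnityEngine;

// Simplified manifest swapper. Must live in an Editor folder.
// Paths are placeholders for wherever you stash each SDK's manifest.
public static class ManifestSwapper
{
    const string cardboardManifest = "Assets/Manifests/Cardboard/AndroidManifest.xml";
    const string gearVRManifest = "Assets/Manifests/GearVR/AndroidManifest.xml";
    const string target = "Assets/Plugins/Android/AndroidManifest.xml";

    [MenuItem("Tools/VR/Use Cardboard Manifest")]
    static void UseCardboard() { Swap(cardboardManifest); }

    [MenuItem("Tools/VR/Use Gear VR Manifest")]
    static void UseGearVR() { Swap(gearVRManifest); }

    static void Swap(string source)
    {
        File.Copy(source, target, true);   // overwrite the live manifest
        AssetDatabase.Refresh();           // make Unity notice the change
        Debug.Log("Copied " + source + " to " + target);
    }
}
```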

Supporting Both Prefabs

Both Oculus and Cardboard have their own prefabs that represent the player’s head and eye cameras. In Caldera Defense, I originally attached a bunch of game objects to the player’s head to use for traces, GUI positioning, HUDs, and other things that need the player’s head position and orientation. For these to work with both the Cardboard and Oculus prefabs, I moved everything attached to the head onto a separate prefab that is attached to the Cardboard or Oculus head model at runtime.
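
A minimal sketch of that runtime parenting (the object names are stand-ins for whatever your SDK prefabs actually call their head transforms):

```csharp
using UnityEngine;

// Minimal sketch: one "head attachments" prefab (HUDs, trace origins,
// GUI anchors) that parents itself to whichever SDK head exists in the
// scene. The names below are stand-ins for your actual prefab hierarchy.
public class HeadAttachments : MonoBehaviour
{
    void Start()
    {
        GameObject head = GameObject.Find("CenterEyeAnchor");  // Oculus rig?
        if (head == null)
            head = GameObject.Find("Head");                    // Cardboard rig?
        if (head == null)
            return;

        transform.parent = head.transform;
        transform.localPosition = Vector3.zero;
        transform.localRotation = Quaternion.identity;
    }
}
```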

Wrapping Both APIs

Not only do both SDKs have similar prefabs for the head model, they also have similar APIs. In both the Cardboard and Oculus versions, I need to refer to the eye and head positions for various operations. To do this, I created a simple class that detects which prefab is present in the scene and wraps the appropriate SDK’s eye and head references. The script is in the prefab’s package.
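
In spirit, the wrapper looks something like this, using the same scene-detection trick as the snippet above (again, the names are placeholders for the real SDK objects):

```csharp
using UnityEngine;

// Conceptual sketch of the wrapper: the rest of the game asks this class
// for head position/orientation and never touches either SDK directly.
// The Find() names are placeholders for the real prefab objects.
public class VRHead : MonoBehaviour
{
    Transform head;

    void Awake()
    {
        GameObject rig = GameObject.Find("CenterEyeAnchor");  // Oculus present?
        if (rig == null)
            rig = GameObject.Find("Head");                    // fall back to Cardboard
        head = rig != null ? rig.transform : Camera.main.transform;
    }

    public Vector3 Position { get { return head.position; } }
    public Quaternion Rotation { get { return head.rotation; } }
    public Vector3 Forward { get { return head.forward; } }
}
```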

Conclusion

For the final step, I made separate Cardboard versions of all my relevant Gear VR scenes, which include the Cardboard prefabs and modified gameplay and interfaces. If no actual Oculus SDK code is referenced by any of the classes used in the Cardboard version, the Oculus SDK should be stripped out of that build and you’ll have no problem running on Cardboard. This probably means I really need to make separate Oculus- and Cardboard-specific versions of that CameraBody script.

The upcoming Unity 5.1 includes native Oculus support, which may make this process a bit more complicated. Until then, these steps are the best way I’ve found to support both Cardboard and Gear VR in one project. I’m a big fan of mobile VR, and I think it’s necessary for any developer at this early stage of the market to get content out to as many users as possible.