Location-Based VR World Tour or THE VOID VS ZERO LATENCY VS VRCADE VS IMAX VR

Ever since developing last year’s Holographic Easter Egg Hunt with Microsoft for VRLA, I’ve been interested in creating location-based VR and AR experiences. These are cool projects to me since you can build hardware specific to the experience, design software for one fixed hardware configuration, and really go wild within the constraints of your budget, location, and audience. Plus, there’s the additional challenge of keeping the event profitable based on the number of customers you can run through the exhibit per hour.

Throughout the past year, I’ve managed to try most major location-based VR experiences. After finally trying The VOID this week at Disneyland, I figured I’d write up a quick series of impressions of all the ones I’ve tried.

The VOID / Secrets of the Empire

The newest location-based VR I’ve experienced is “Secrets of the Empire” by The VOID installed at Downtown Disney in Anaheim. Taking place before the events of Rogue One, this is a Star Wars adventure that puts you and a friend in the roles of two Rebel Alliance agents disguised as Stormtroopers who have to sneak into an Imperial base on Mustafar and retrieve critical intelligence for the Rebellion’s survival.

The VOID uses a custom headset and vest with backpack PC. The first thing I noticed is that it was really heavy–it felt like I was wearing at least 20 pounds of gear. However, the vest and headset have a lot of innovative features. My favorite is the force feedback pads placed all around your body. When you are hit by blaster fire you can feel the impact and know where it’s coming from.

The headset has image quality comparable to the Oculus Rift and uses Leap Motion so you can see your hands. This is important because you can reach out and grab real-world objects, such as blaster rifles that are tracked in VR when you pick them up, or even press real buttons on virtual control panels to unlock doors. If you see a droid, reach out and touch it! It’s really there! Your hands don’t quite line up with the real-world positions of the objects you see in VR, but it’s close enough.

The game itself is a 20 or so minute experience where you team up with another player to infiltrate an Imperial base. While sneaking around you’ll be shot at by Stormtroopers, clamber out on perilous ledges over lakes of molten lava (you can feel the heat!), and use teamwork to solve puzzles and defend against waves of enemies.

The graphics are great and tracking for both the player and your weapon is rock solid. The redirected walking and other tricks done with space and movement effectively give the sensation of exploring a small section of a large Imperial base. Everything does kind of feel cramped and constrained, but this adds to the tension of firefights when you and your partner are jammed up in a room with hordes of Stormtroopers firing through the door.

[Image: Mission complete!]

I really enjoyed Secrets of the Empire–it’s perhaps less ambitious than Zero Latency’s offering, but executed FAR better than anything else I’ve tried. At $30 a pop (not to mention merch sales), they’re supposedly doing 700-800 people a day on weekends. I’m not sure how the math works out, but this seems like a success to me.
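For what it’s worth, the back-of-envelope math isn’t hard to sketch. Everything below is an assumption: the guest count and ticket price come from the figures above, while the cycle time and operating hours are guesses.

```typescript
// Rough revenue and throughput check (all figures are assumptions):
const guestsPerDay = 750;  // midpoint of the reported 700-800 weekend figure
const ticketPrice = 30;    // USD per guest
const hoursOpen = 14;      // guessed operating hours
const cycleMinutes = 30;   // suiting up plus ~20 minutes in VR, guessed

const dailyRevenue = guestsPerDay * ticketPrice; // tickets only, no merch

// How many guests must enter per 30-minute window to hit 750/day:
const cyclesPerDay = (hoursOpen * 60) / cycleMinutes;
const guestsPerCycle = guestsPerDay / cyclesPerDay;

console.log(dailyRevenue);   // 22500
console.log(guestsPerCycle); // ~27, so many two-player pods must run in parallel
```

That works out to roughly $22,500 a day in tickets alone on a busy weekend, but it also means moving well over two dozen guests through the building every half hour, which is plausible only with several pods running at once.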

Zero Latency / Singularity

I tried Zero Latency’s “Singularity” experience at LevelUp in the Las Vegas MGM Grand several months back. Zero Latency’s “Free Roam VR” platform shares similarities with The VOID in that it uses a backpack PC with a positionally tracked weapon. However, instead of teams of two moving around inside a constrained area that you can reach out and touch, Zero Latency accommodates up to 8 players at once in a large, empty trackable space.

The Singularity is a shooter game where your team has to exit a shuttlecraft and venture into a dangerous, killer robot-infested base ruled by a hostile AI. Armed with a gun that can be switched between various ammo types (shotgun, laser, blaster, etc.) you and your team must journey to the core and take out the AI once and for all in an epic boss battle.

The experience amounts to a lot of mindless shooting. The gameplay itself doesn’t seem very well designed: robots get stuck on parts of the scenery, the different weapon types don’t seem to matter much, and the visuals at times are just downright bad. It supposedly has positional audio, but it’s not very well done, as I kept getting surprised by enemies firing from behind that I simply didn’t notice.

There are flashes of brilliance–and, dare I say, ambition. Zero Latency does some pretty crazy things with redirected walking and developed one particularly thrilling scenario where your party gets split in half and both groups must fend off drone attacks while carefully walking along a catwalk suspended hundreds of feet in the air. There’s even a part that does the whole 2001 thing where you walk up a wall in zero gravity. They take a lot of chances in this experience which makes those parts of Singularity very memorable.

Zero Latency’s backpack is much lighter than The VOID’s. However, they use vastly inferior OSVR headsets with terrible positional tracking for both the player and the weapon. I’m assuming the backpack PC has a much lower spec, because the visuals are quite a step down from The VOID.

Tracking is an issue. Singularity was a jittery, janky mess: characters skidded all around while their IK contorted them into unnatural poses. The game also blares a klaxon in your ear when someone is out of position or close to touching another player, which got super annoying after a while.


After finishing the 30-minute experience, I came to the conclusion that it’s a really solid alpha. I can’t tell if the game is underwhelming because of weak game development or because there isn’t enough juice in the hardware. I tend to think it’s the former, given the quality of VR I’ve experienced on far less powerful platforms. Content aside, the tracking is just so awful that I can’t imagine a better game alone would fix things. They need to upgrade the hardware, too.

VRStudios / VR Showdown in Ghost Town

On the lower end is VRStudios’ “VR Showdown in Ghost Town,” which you can currently play at Knott’s Berry Farm in Southern California. This has to be judged on a different scale because it’s much smaller in scope: a $6, six-minute experience using much simpler hardware in a single-room-sized tracking volume. It seems much less expensive for the operator to install and maintain, and cheaper for the user to play (although the price per minute is about the same as The VOID).

It uses VRStudios’ VRCade platform, which seems like Gear VR on steroids. You wear a somewhat unwieldy, self-contained VR headset with tracking balls on it, along with a gun tracked by the same technology. Two players in the same room defend against a seemingly infinite number of zombies attacking an Old West town. You can pick up power-ups that give you more effective shots and some cool bullet-time effects, but at the end of 6 minutes it’s over regardless.


The headset is clunky with a low refresh rate and narrow FOV, and the game itself really isn’t very good. But it’s a cheap way for people to try VR for the first time and a seemingly inexpensive way for locations to provide a VR experience. Still, you can have far better experiences at home with a game like Farpoint.

IMAX VR

IMAX VR is perhaps the most disappointing: it has the ambiance of a dentist’s office and a bunch of VR you can largely experience at home on Rift, Vive, or PSVR. It’s notable for being one of the few places you can try Starbreeze’s wide-FOV StarVR headset, but the John Wick StarVR game I tried isn’t even as good as Time Crisis, and that came out over 20 years ago! Honestly, they need to gut this place and start over. Doing something ambitious like what The VOID or Zero Latency has done makes more sense than a bunch of kiosks playing games you can already get at home.

[Image: The sterile, featureless waiting room at IMAX VR]

Then again, maybe the economics work out–it might be easier to sell individual tickets to solo experiences than waiting to fill up an 8-player co-op session at a premium price. Last year they were bragging about how much money the site was bringing in–but $15,000 a week isn’t a lot. I bet a Starbucks at the same location would do 3 times the business. In fact, the VOID does 3 times that on any given Saturday.

The Future of Location-Based VR

I’m really encouraged by the range of experiences I’ve tried at these different VR facilities. Many of these platforms seem to boast a similar set of features–including the ability to update the physical location with a new experience in a matter of minutes. A representative from The VOID told me it would be possible to swap out Secrets of the Empire for a new game (say, Ghostbusters) in about 15 minutes.

I can’t help but think many of the companies building these locations will be disrupted by a new generation of developers who can use off-the-shelf tracking solutions and next-generation backpack computers to build far more compelling experiences. With the Vive Pro’s vastly improved Lighthouse tracking and the Vive Wireless Adapter removing the need for cables, we might see a generational leap in quality as experienced game developers enter the market, displacing companies that merely shoehorned a tracking solution into whatever mall storefront they had access to.

ARKit, ARCore, Facebook and Snapchat or THE BATTLE FOR SMARTPHONE AR WORLD SUPREMACY

I haven’t written a blog post in a while. Over the past 6 months, every time I tried to pontificate on the topic of augmented reality, some major new development would occur. I have a bunch of scrapped posts sitting in Google Drive that are now totally irrelevant. Cruising through December, I figured the coast was clear. I was considering writing a dull year-in-review post when the final paradigm shift occurred with Snap’s release of Lens Studio. So, let’s try and get this out before it’s obsolete!

The Return of Smartphone AR

Smartphone AR is definitely back. After Apple’s announcement, everyone wanted to talk about ARKit. Even though I developed the award-winning Holographic Easter Egg Hunt for HoloLens with Microsoft this past spring, my discussions with clients and investors became laser-focused on smartphone AR instead of mixed reality.

It looks like 2018 will be a big year for these platforms while mixed reality headset makers gear up for 2019 and beyond. Because of this renewed interest in smartphone AR, this is a good time to investigate your options if you’re looking to get into this platform.

ARKit and ARCore

Despite being announced after Facebook’s AR Camera Effects platform, it was really Apple’s ARKit announcement that set off this new hype cycle for smartphone AR. Google’s announcement of ARCore for Android was seemingly a me-too move, but also quite significant.

This isn’t really ARKit versus ARCore, since the two don’t compete: they do similar things on different platforms. ARCore and ARKit share a common set of features but implement them in ways that are subtly different from the user’s perspective. Because of this, it’s not terribly difficult to port applications between the two platforms if you’re using Unity.
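In practice, a port usually means hiding each SDK behind a thin platform-agnostic layer and writing app code against that, which is essentially what Unity’s cross-platform AR plugins do for you. A minimal sketch of the idea (every interface and name here is hypothetical, not either vendor’s actual API):

```typescript
// A hypothetical platform-agnostic AR layer. App code targets ARPlatform;
// each backend (ARKit, ARCore) would get its own adapter.
interface Pose {
  position: [number, number, number];
  rotation: [number, number, number, number]; // quaternion
}

interface ARPlatform {
  name: string;
  hitTest(screenX: number, screenY: number): Pose | null;
  addAnchor(pose: Pose): string; // returns an anchor id
}

// This function ports between backends unchanged:
function placeObject(ar: ARPlatform, x: number, y: number): string | null {
  const hit = ar.hitTest(x, y);
  return hit ? ar.addAnchor(hit) : null;
}

// A mock backend standing in for either SDK:
const mock: ARPlatform = {
  name: "mock",
  hitTest: () => ({ position: [0, 0, -1], rotation: [0, 0, 0, 1] }),
  addAnchor: (pose) => `anchor-${pose.position.join(",")}`,
};

console.log(placeObject(mock, 0.5, 0.5)); // "anchor-0,0,-1"
```

The per-platform differences (plane detection behavior, hit-test semantics, lighting estimation) live inside the adapters, so the experience itself doesn’t care which SDK it’s running on.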

The biggest limitation of both ARKit and ARCore is that when you quit the application, everything it knew about the world is forgotten. Although you can place anchors in the scene to position virtual objects in the real world, there is no persistence between sessions. I suspect ARCore might advance quicker in this department, as Google’s ill-fated Tango technology had this in its SDK for years; I’m assuming we’ll see more and more Tango features merged into ARCore in 2018. Rumors suggest ARKit 2.0 will also see similar improvements.
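You can fake a crude version of persistence yourself by serializing anchor poses to disk, but doing so mostly illustrates why it isn’t enough: without relocalizing against a saved map of the space, which neither SDK exposes, the reloaded coordinates are only meaningful if tracking happens to start from the same origin. A sketch with hypothetical types:

```typescript
// Naive anchor persistence: save poses as JSON, reload them next session.
// Restored positions are relative to wherever tracking starts, so without
// platform-level relocalization they will generally be in the wrong place.
type SavedAnchor = { id: string; position: [number, number, number] };

function saveAnchors(anchors: SavedAnchor[]): string {
  return JSON.stringify(anchors);
}

function loadAnchors(json: string): SavedAnchor[] {
  return JSON.parse(json) as SavedAnchor[];
}

const saved = saveAnchors([{ id: "lamp", position: [1.2, 0, -0.5] }]);
console.log(loadAnchors(saved)[0].position); // [1.2, 0, -0.5]
```

True persistence needs the SDK to recognize the room and re-establish the old coordinate frame, which is exactly what Tango’s area learning did.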

ARKit does one-up ARCore with the addition of face tracking on the iPhone X. This is the most advanced facial tracking system currently available on mobile phones. However, it’s only on one device–albeit a wildly popular one. ARKit’s facial tracking seems to produce results far beyond current mask filter SDKs, as it builds a mesh of your face using the TrueDepth camera. Still, there doesn’t seem to be a reason why many of the basic facial tracking features couldn’t be brought to phones with standard cameras. Maybe we’ll see a subset of these features trickle down to other iOS devices in the near future.

ARKit has far more penetration than ARCore, which runs on a tiny fraction of Android devices–and this isn’t likely to improve soon. ARKit requires an iPhone 6S or above, but that’s still a large chunk of iOS devices. There is probably zero business case for focusing on ARCore first. If you truly need to develop a standalone AR app, your best bet is to target iOS primarily and Android second (if at all). If ARCore starts gaining Tango’s features ahead of ARKit, though, there will be compelling use cases for ARCore-exclusive apps.

Facebook Camera Effects Platform vs. Snapchat World Lens

When ARKit was first announced, I had a few meetings at large companies. They all thought it was cool, but didn’t want to develop standalone apps. Getting users to download yet another app is expensive and somewhat futile, as most apps go unused after a few tries. There’s a lot more interest in distributing AR experiences inside apps people already have installed. Before Facebook Camera Effects was announced, the only option was Blippar–which really isn’t an option, since hardly anyone uses it.

I got access to Facebook Camera Effects early on and was really impressed with the tools. Leading up to the public release, Facebook added a lot of features. I’ve seen everything from simple masks to full-blown multiplayer games built with Facebook’s AR Studio.

[Image: Facebook’s AR Studio]

Facebook developed an entire 3D engine inside the Facebook Camera. It has an impressive array of features: a full-featured JavaScript API, facial tracking, SLAM/plane detection, bones (sadly only animatable in code), 2D sprite animation, particles, shaders, UI, and advanced lighting and material options. You can also access part of the Facebook graph, as well as any external URL you want. If you can fit within the filter’s size, poly count, and community-guideline restrictions, you can make a fairly elaborate AR app far beyond simple masks.

The great thing about Camera Effects Platform is that you can distribute an AR experience through an app that already has hundreds of millions of users. The flip side of that reach is that filters run on a huge range of devices–whether they have native AR SDKs or not–so each one must be tested on a wide variety of phones to account for per-platform limitations and bugs.

What’s tricky is after getting approval for distribution of your filter, you still have to somehow tell users to use it. Facebook provides a few options, such as attaching a filter to a promoted Facebook page, but discovery is still a challenge.

As Camera Effects Platform opened to all, Snap released Lens Studio for both Windows and Mac, a platform that lets developers create World Lens effects for Snapchat. I was really excited about this because a lot of clients were just not very enthusiastic about Facebook’s offering. I kept hearing that the valuable eyeballs are all on Snapchat and not Facebook, despite Snapchat’s flatlining growth. Brands and marketers were champing at the bit to produce content for Snapchat without navigating Snap’s opaque advertising platform.

[Image: Snap’s Lens Studio]

Lens Studio shares many similarities with Facebook’s AR Studio, including the use of JavaScript as its scripting language. The big difference is that Lens Studio does not expose Snapchat’s facial tracking features. You can only make World Lenses: basically, animated 3D objects placed on a plane recognized by the rear camera.

World Lenses also have much tighter size and polycount restrictions than Facebook Camera Effects. However, Lens Studio supports the importing of FBX bone animations and morph targets, along with a JavaScript API to play and blend simultaneous animations. Lens Studio also supports Substance Designer for texturing and a lot of great material and rendering options that make it easier to build a nice looking World Lens despite having lower detail than Facebook.
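If you plan to ship the same asset to both platforms, it helps to validate it against the tighter budget first. The numbers below are placeholders for illustration, not the platforms’ actual published limits:

```typescript
// Hypothetical per-platform asset budgets. Real limits differ and change
// over time; these values are illustrative assumptions only.
type Budget = { maxTriangles: number; maxTextureSize: number; maxFileKB: number };

const budgets: Record<string, Budget> = {
  lensStudio: { maxTriangles: 10_000, maxTextureSize: 1024, maxFileKB: 4_000 },
  arStudio:   { maxTriangles: 50_000, maxTextureSize: 2048, maxFileKB: 10_000 },
};

function fitsBudget(
  platform: string,
  triangles: number,
  textureSize: number,
  fileKB: number
): boolean {
  const b = budgets[platform];
  return (
    triangles <= b.maxTriangles &&
    textureSize <= b.maxTextureSize &&
    fileKB <= b.maxFileKB
  );
}

// An asset reduced to fit Snapchat's tighter budget "ports up" cleanly:
console.log(fitsBudget("lensStudio", 8_000, 1024, 2_500)); // true
console.log(fitsBudget("arStudio", 8_000, 1024, 2_500));   // true
```

This is the logic behind the “port up” strategy discussed below: an asset authored to the strictest budget automatically fits the looser one, while the reverse requires rework.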

As for distribution, you still have to go through an approval process, which includes making sure your lens is performant on low-end devices as well as current phones. Once approved, you can link your lens to a Snapcode, which you can distribute any way you want.

Which should you develop for? Unlike ARCore and ARKit, Facebook and Snapchat have wildly different feature sets. You could start with a Facebook Camera Effect and then produce a World Lens with a subset of its features using detail-reduced assets.

The easier path may be to port up: start with a simple World Lens and then build a more elaborate Facebook AR filter with the same assets. Given how few people use Facebook’s Stories feature, I feel it may be smarter to target Snapchat first. Once Facebook’s Camera Effects Platform works on Instagram, I’d probably target Facebook first. It really depends on what demographic you’re trying to hit.

App vs. Filters

Should you develop a standalone AR app or a filter inside a social network platform? It really depends on what you’re trying to accomplish. If you want to monetize users, the only option is a standalone ARKit or ARCore app. You are free to add in-app purchases and ads in your experience as you would any other app. Facebook and Snap’s guidelines don’t allow this on their respective platforms. Are you using AR to create branded content? In the case of AR filters, they are usually ads in themselves. If you are trying to get as much reach as possible, a properly marketed and distributed AR filter is a no-brainer. A thorough mobile AR strategy may involve a combination of both native apps and filters–and in the case of Facebook’s Camera Effects Platform, they can even link to each other via REST calls.

[Image: How each platform ranks, sorted by feature complexity]

2018 is going to be an exciting year for smartphone AR. With the explosive growth of AR apps on the App Store and the floodgates opening for filters on social media platforms, you should include smartphone AR in your mixed reality strategy. Give your users a taste of the real thing before the mixed reality revolution arrives.

Why I Don’t Care About Your New Mixed Reality Headset

I’m often approached by entrepreneurs in the AR/MR space offering me demos of new hardware. Competition in this space is fierce, and you need three major elements for me to take a new platform seriously.


You Need These Three Things To Have A Successful Mixed Reality Device

The three requirements for any successful AR (or, more specifically, MR) device are a display, computer vision, and an operating system.

Display

This is the first element of an AR/MR wearable, and usually it’s the one thing all hardware companies have. There are a number of different displays out there, but they all seem to share the same limitations: additive translucent graphics, small FOV, and relatively low resolution. Devices claiming wider FOVs often end up with even lower-resolution visuals as a compromise. Every display I’ve seen, low or high resolution, is additive, so images appear translucent. Some companies claim to have solved these problems; as far as I’ve seen, we’re a long way off from a commercial reality.


Operating System

When I got my HoloLens devkits, the first thing that impressed me was that Microsoft had ported the entirety of Windows 10 to mixed reality. Until then, most AR headsets ran simple gaze-optimized skins on top of Android. Windows Holographic lets even traditional 2D applications run in mixed reality as windows floating in space or attached to your walls. It’s all tied to a bulletproof content delivery ecosystem (the Windows Store), so distribution is solved as well.


Your device needs to be more than something worn only to run a specific app. Mixed reality wearables will one day replace your computer, your phone, and just about anything with a screen. For that inevitable use case, you need a complete mixed reality operating system that can run everything from the latest games to a browser and your email client.

Computer Vision

I can’t tell you how many device manufacturers have shown me their new display but “just don’t have the computer vision stuff in.” Sorry, but this is the most important element of mixed reality. Amazing localization, spatialization, tracking, and surface reconstruction features are what puts HoloLens light years ahead of its nearest competition.

This stuff is hard to do. Computer vision was formerly an obscure avenue of computer science that not many people studied. Now augmented reality has created a war for talent in the sector, with a small (but growing) number of computer vision PhDs commanding huge salaries from well-funded startups. Very few companies have the computer vision expertise to make mixed reality work, and that talent is jealously guarded.

[BONUS] Cloud Super-intelligence

The AR headset of the future is a light, comfortable, and truly mobile device you wear everywhere. This requires a constant, fast connection to the Internet. HoloLens is Wi-Fi only for now, but LTE support must be on the horizon. Not only is this critical for everyday-everywhere use, but many advanced computer vision functions such as object recognition need cloud-based AI systems to analyze images and video. With the explosion of deep learning and machine learning technology, a fast 5G connection to these services will make mixed reality glasses something you never want to leave the house without.

Don’t Waste My Time

A lot of people seem impressed with highly staged demos of half-baked hardware. It’s only when you begin to develop mixed reality apps that you understand what’s really needed to make these platforms successful. Demos missing the critical elements listed in this post will impress fewer and fewer people as more of them become familiar with the technology.

My Week with PSVR

Full disclosure, I’ve had a PSVR devkit for some time now, so this isn’t my first experience with the device. However, this certainly is my first taste of most PSVR launch content. I figured I’d post my impressions after a week with my PSVR launch bundle.

Best Optics In the Business

PSVR does not use Fresnel lenses, so you don’t see god rays or glare in high-contrast scenes. Vive and Rift both suffer from these problems, which makes PSVR look a lot better than the competition. Many cite the lower resolution of the PSVR display as a problem, but I don’t think the numbers tell the whole story. The screen-door effect is not very noticeable, and I suspect PSVR’s full-RGB subpixel layout (versus the PenTile displays in Rift and Vive) makes the slightly lower resolution a non-issue. PSVR looks great.

Fully Integrated With Sony’s Ecosystem

The great thing about the platform is that it combines a mature online store and gaming social network with VR; in many community features, PSVR is ahead of the competition. When you first don the headset, you’ll see the standard PlayStation 4 interface hovering in front of you as a giant virtual screen, so all current PSN features are already available to you in VR. You can even click the Share button and stream VR gameplay live, and there’s a pop-up menu to manage your friends list, invites, etc. inside any VR experience. The only weird thing is that when you earn a trophy, you hear the sound but don’t see any overlay telling you what you did.

Tracking Issues

PSVR uses colored LED lights for optical tracking–essentially the same solution Sony created for its PS3 Move controllers back in 2010. In fact, the launch bundle comes with what seem to be new, deadstock Move controllers as its hand-tracking solution.

Tracking is iffy. Lamps, bright lights, and sunlight streaming through windows can all throw PSVR’s tracking off. I find it works much better at night, with any room lights in the camera’s view turned off. I also replaced my original PlayStation Camera with the V2 version in the launch bundle, to no avail.

Even more annoying is calibration. Holding the PSVR headset in precise positions so its lights are visible to the camera can be quite a pain. On top of that, many games require their own calibration, which involves standing where your head fits inside a camera overlay representing the best position to play in.

The hand controllers are jittery even under the best circumstances. Some games seem to have smoother tracking than others–probably via filtering Move input data. Still, given the price of the bundle, Move is an acceptable solution. Just not ideal.
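That kind of per-game smoothing is typically something like an exponential moving average over the raw controller positions. This is a generic sketch of the technique, not Sony’s or any studio’s actual filter:

```typescript
// Exponential moving average over tracked positions: each frame blends the
// new raw sample toward the previous smoothed value. alpha near 1 tracks
// quickly but keeps jitter; alpha near 0 smooths heavily but adds lag.
function makeSmoother(alpha: number) {
  let last: number[] | null = null;
  return (sample: number[]): number[] => {
    last = last === null
      ? sample.slice()
      : last.map((v, i) => v + alpha * (sample[i] - v));
    return last;
  };
}

const smooth = makeSmoother(0.3);
smooth([0, 0, 0]);              // first sample initializes the filter
console.log(smooth([1, 0, 0])); // [0.3, 0, 0], the jitter spike is damped
```

The trade-off is that lower alpha hides jitter at the cost of perceptible lag, which is why adaptive filters (e.g. the one-euro filter) are usually preferred for tracked controllers.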

One advantage of this approach is that PSVR can also track the DualShock 4 via its previously annoying light bar. Having a positionally tracked gamepad adds an element of immersion to non-hand-tracked games previously unseen.

The Content

Despite PSVR running on a PS4, which pales in power compared to, say, a juiced-up Oculus-ready PC, the PSVR launch experiences are second to none. Sony is an old pro at lining up strong titles to launch a new platform, and they have made some great choices here.

Worlds

The amount of free content you get with the Launch Bundle is staggering. In addition to the new VR version of Playroom and a disc filled with free demos, you also get Worlds–Sony London’s brilliant showcase of VR mini games and experiences. The Deep is a perfect beginner’s VR introduction–a lush, underwater experience that rivals anything I’ve seen on Rift or Vive. London Heist is my favorite, combining storytelling and hand-tracked action in what is often compared to a VR Guy Ritchie film.

Arkham VR

This is the single coolest VR experience I’ve ever had. It’s really more like a narrative experience with some light gameplay elements. Some are complaining that this barely qualifies as a game and is way too short for $20, but I disagree. This is the gold standard in VR storytelling–a truly unique experience that a lot of developers can learn from. It combines puzzle solving, story, interactive props, and immersive environments into a VR experience that makes you really feel like the Caped Crusader. This is the game I use to showcase PSVR and nobody has left disappointed.


Battlezone

Battlezone is my other favorite launch title right now, when I can find other people online (a definite problem given the small but growing PSVR user base). This is a VR update to Atari’s coin-op classic in the form of a co-op multiplayer vehicle shooter. Guide a team of futuristic tank pilots over a randomly generated hexagonal map as you journey on a quest to destroy the enemy base. The game requires great teamwork and voice communication, which makes it all the more immersive. The positionally tracked DualShock 4 adds to the immersion in the cockpit as well.

Rigs

Guerrilla does everything “wrong” in VR here (including uninterruptible tutorials), defying all conventions. I have no problems with it, but it makes almost everyone I know violently ill; apparently I am immune to VR sickness. Rigs is probably unplayable for the vast majority of players even with all the comfort modes turned on. If you want to test your so-called “VR legs,” try this game. If you can manage to play it without puking, you’re in for a great competitive online experience–that is, if you can find other players easily.

Wayward Sky

This game started out last year as a Gear VR launch title called Ikarus, which was pulled from the store shortly after its release. Uber’s small mobile VR demo has now reappeared on PSVR as the expanded and enhanced Wayward Sky, an innovative take on point-and-click adventure games in VR. The first stage is essentially a remixed and remastered version of the short Gear VR demo; once you complete it, the game opens up with many more levels and an all-new story line. This is another gentle introduction to VR, as it doesn’t involve a lot of movement or complicated mechanics. It’s largely a point-and-click puzzle-solving affair, with a few areas that require you to use your hands to manipulate objects.

In Conclusion


My dream VR platform would be PSVR’s optics, Vive’s tracking, and Oculus’ controllers. Until that singularity happens, we’re stuck with all of these different systems. PSVR is incredibly compelling, and the platform I recommend to most people. It’s cheap and surprisingly good. Most of my current favorite VR games are on PSVR right now. I personally don’t find its limitations a problem–but it will be interesting to see how the average gaming public responds. Initial sales are promising, and there is way more high profile VR content on the horizon. Dare I say Sony has won this first round?

My Favorite VR Experiences So Far

Now that I’ve had plenty of time to go through the launch content of both Oculus and Vive, I figured I’d highlight my favorite experiences you can try for both devices instead of a typical product review. Many of these games are available for both platforms, while some are exclusive.

[Image: My Retail Vive Finally Arrived!]

Adr1ft (Oculus)

This is the flagship release for Oculus and deservedly so. Although not a pure VR experience (it also works as a standard game), it’s an absolutely wild trip in VR. Billed as a First Person Experience (FPX), it ranks somewhere between a walking simulator like Firewatch and an adventure such as Bioshock on the “Is It a Game?” scale.

This is consistently one of the top-selling Oculus titles, yet ranks near the bottom on comfort. I had no nausea issues at all, but I generally don’t feel uncomfortable in most VR games. I can see how free-floating in zero gravity, desperately grasping at oxygen canisters as you slowly suffocate to death inside a claustrophobic space suit can cause issues with those prone to simulation sickness. Regardless, this shows that it pays to be hardcore when making VR experiences–especially at this early adopter stage of the market.

A stunning debut for Adam Orth’s threeonezero studio.

Firma (Oculus)

This is perhaps one of my absolute favorite pure VR games so far. Think Lunar Lander, Space Taxi, or Thrust in VR. As a standard video game it would be mundane, but as a VR experience I really do feel like I have a job piloting a tiny lander craft on a desolate moon base. It almost achieves presence for me–not the feeling of being in another reality, but more like being in an ’80s sci-fi movie.

Originally available via Oculus Share for years, it’s obvious that a lot of work went into getting this game ready for the commercial Oculus release. There are tons of missions, great voice acting, and a lot of fun mechanics and scenarios. This game is giving me plenty of ideas on how to adapt my old Ludum Dare game to VR.

Strangely, this game is in Oculus’ Early Access section, even though I consider it a complete game.

The Apollo 11 Virtual Reality Experience (Oculus, Vive)

An astounding educational journey through America’s moon landing told via VR. This is better than any field trip I took as a kid to the Boston Museum of Science, that’s for sure. This is just the tip of the spear when it comes to education and VR.

Hover Junkers (Vive)

Hover Junkers requires the most physical activity of any VR game I’ve played–so much so that after 20 minutes of shooting, cowering behind my hovercraft’s hull for cover, and frantically speeding around post-apocalyptic landscapes, my Vive was soaked in sweat. One thing is for sure: public VR arcades are going to need some kind of solution for keeping these headsets sanitary. Hover Junkers is certainly the most exciting multiplayer experience I’ve had in VR so far.

Budget Cuts (Vive)

The absolute best example of room scale VR. I didn’t really get it when watching the videos, but when I was finally able to try the demo on my own Vive… wow. This is the game I let everyone try when they first experience the Vive. It really nails the difference between seated, controller-based VR and a room scale, hand-tracked experience. This is the first “real game” I’ve played that uses all of these elements. So many firsts here, and done so well.

The past month has been a very encouraging start for VR. At this early stage there are already several games that give me that “just one more try” lure. This is surprising given that many current VR titles are small in scope, and in some cases partially-finished early access experiences. With the launch of PSVR later this year, we’re sure to see more full-sized VR games…whatever that means.

The Beginner’s Guide: Dave the Madman Edition

I recently played The Beginner’s Guide after buying it during the annual Holiday Steam Sale over the break. It’s a quick playthrough, and an interesting way to tell a story within a game. Without giving too much away, the experience reminded me of a similar event from my young adulthood, when I encountered an amazing game developer who created incredible works I couldn’t hope to match. I’ve since forgotten his real name and don’t know much about him, but I do have the four double-sided floppy disks he sent me containing all of his games at the time.


Madsoft 1-4, recovered in great condition

This was the early ‘90s–I’d say around 1990-1991. I had made a bunch of Commodore 64 games (often with my late friend Justin Smith) using Shoot ‘Em Up Construction Kit: an early game development tool that let you build neat scrolling shooters without any programming knowledge.


Adventures in Stupidity, one of my SEUCK creations

I used to upload my games to local BBSes in the New England area and wait for responses on the message boards. In the process, I downloaded some games made by a user known by the handle “MADMAN.” Some of his games also used the moniker “Dave the Madman.” He made seemingly professional-quality games using Garry Kitchen’s Game Maker–not to be confused with YoYo Games’ GameMaker Studio.

Garry Kitchen’s Game Maker was an early game development tool published by Activision in 1985. I actually got it for my birthday in 1986, thinking that this was my key to becoming a superstar game designer. The thing is, Game Maker was a full-blown programming language that, strangely, used the joystick to edit. It also included a sprite designer, music editor, and other tools. Everything a budding game developer would need to get started, right?

Although I did make a few simple games in Game Maker, its complexity was beyond my grasp at the time. Which is why Madman’s creations blew me away. They were so polished! He had developed so many completely different types of games! They all had cool graphics, animation, music, and effects I couldn’t figure out how to duplicate! My favorite was Space Rage: a sprawling, multi-screen space adventure that I simply could not comprehend. I had so many questions about how these games were made!


SPACE RAGE!

We messaged each other on a local BBS. I blathered about how much of a fan I was of his work and he said he liked my games, too. I figured he was just being kind. After all, this was a MASTER saying this! We eventually exchanged phone numbers.

I have vague memories of talking to him on the phone, asking how he accomplished such amazing feats using Game Maker. I think he was a little older than me, but many of his games had a 1987 copyright date. Considering I was probably the same age at this time as he was in 1987, this made me feel quite inadequate.

As I recall, Madman was humble and didn’t have many aspirations beyond distributing his little games on BBSes. He seemed like a hobbyist that figured out Game Maker and really liked making games with it–nothing more, nothing less.


Fat Cat probably has the best animation of them all

After our call, he mailed me a complete collection of his games. A few years ago I found these floppy disks and copied them to my Mac using a 1541 transfer cable. The disks bear his handwriting, labeled “Madsoft” 1 – 4. I was able to rescue all of the disks, converting them to d64 format.

Playing through his creations was a real trip down memory lane. The most shocking thing I discovered is on the 2nd side of the 4th disk. His Archon-like game, Eliminators, features the text “Distributed by Atomic Revolution” on the bottom of the title screen. Atomic Revolution was a game ‘company’ I briefly formed with childhood friend, Cliff Bleszinski, around 1990 or so. It was a merger of sorts between my label, “Atomic Games”, and Cliff’s, “Revolution Games.” (The story about the C64 game he made in my parents’ basement is a whole other post!)


An Atomic Revolution production?

I must have discussed handling the distribution of Eliminators with Dave: uploading and promoting his awesome game all over the local BBS scene and sending it to mail-order shareware catalogs. At least that’s my best guess–I really have no recollection of how closely we worked together. I must have done a terrible job, since this game was almost completely lost to the mists of time.

I think we talked about meeting up and making a game together–but I didn’t even have my learner’s permit yet, and online communication tools were primitive if they existed at all. We never really collaborated. I wonder what happened to “Dave the Madman” and his “Madsoft” empire. Is he even still alive? Did he go on to become a game developer, or at least a software engineer? Maybe he’ll somehow see this post and we’ll figure out the answer to this mystery!


I remember he was most proud of his Ataxx homage

Until then, I’ll add the disk images of Madsoft 1–4 to this post. Check the games out, and let me know what you think. I’ve also put up some screenshots and videos of his various games–but I’m having problems finding a truly accurate C64 emulator for OS X. If anyone has any suggestions, let me know!

Here’s the link to the zip file. Check these games out for yourself!


My Week With Project Tango

A few weeks back I got into Google’s exclusive Project Tango developers program. I’ve had a Tango tablet for about a week and have been experimenting with the available apps and Unity3D SDK.

Project Tango uses Movidius’ Myriad 1 Vision Processor chip (or “VPU”), paired with a depth camera not unlike the original Kinect for the Xbox 360. Except instead of being a giant hideous block, it’s small enough to stick in a phone or tablet.

I’m excited about Tango because it’s an important step in solving many of the problems I have with current Augmented Reality technology. What issues can Tango solve?

POSITIONAL TRACKING

First, the Tango tablet can determine its own pose. Sure, pretty much every mobile device out there can detect its precise orientation by fusing together compass and gyro information. But by using Tango’s array of sensors, the Myriad 1 processor can also detect position and translation. You can walk around with the tablet and it knows how far and where you’ve moved. This makes SLAM algorithms much easier to develop and more precise than strictly optical solutions.
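To make that concrete, here’s a minimal sketch (plain Python, all numbers hypothetical–this is not Tango’s actual API) of how a tracker accumulates a device’s pose by composing per-frame motion estimates as 4×4 homogeneous transforms. Only translation is shown, but rotation composes the same way:

```python
# Sketch: accumulate a device pose from incremental motion estimates,
# each represented as a 4x4 homogeneous transform (translation only here).

def mat_mul(a, b):
    # multiply two 4x4 matrices
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translate(dx, dy, dz):
    # homogeneous transform for a pure translation
    return [[1, 0, 0, dx],
            [0, 1, 0, dy],
            [0, 0, 1, dz],
            [0, 0, 0, 1]]

pose = translate(0, 0, 0)  # start at the origin
# three hypothetical per-frame motion deltas reported by the tracker
for dx, dy, dz in [(0.1, 0, 0), (0.1, 0, 0), (0, 0, 0.5)]:
    pose = mat_mul(pose, translate(dx, dy, dz))

print(pose[0][3], pose[2][3])  # total x and z displacement so far
```

An orientation-only device can never fill in the translation column of that matrix; Tango’s sensor array is what makes it observable.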

Another problem with AR as it exists now is that there’s no way to know whether you or the image target moved. Rendering-wise, there’s no difference. But this poses a problem for game physics. If you smash your head (while wearing AR glasses) into a virtual box, the box should go flying. If the box is thrown at you, it should bounce off your head–a big distinction!

Pose and position tracking has the potential to factor out the user’s movement and determine the motion of both the observer and the objects that are being tracked. This can then be fed into a game engine’s physics system to get accurate physics interactions between the observer and virtual objects.
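A toy illustration of factoring out observer motion (all coordinates hypothetical): convert each observation from camera space to world space using the tracked camera pose, then diff across frames. A box that appears to move across the camera image can turn out to be stationary once the head’s own motion is removed:

```python
# Sketch: recover an object's world-space motion by removing the
# observer's own movement from what the camera sees.

def to_world(cam_pos, pt_in_cam):
    # camera assumed unrotated for simplicity; just add its position
    return tuple(c + p for c, p in zip(cam_pos, pt_in_cam))

# two frames: the camera slid right, the object stayed put in the world
cam_t0, obs_t0 = (0.0, 0.0, 0.0), (0.0, 0.0, 2.0)
cam_t1, obs_t1 = (0.5, 0.0, 0.0), (-0.5, 0.0, 2.0)

world_t0 = to_world(cam_t0, obs_t0)
world_t1 = to_world(cam_t1, obs_t1)
motion = tuple(b - a for a, b in zip(world_t0, world_t1))
print(motion)  # (0.0, 0.0, 0.0): the box never moved; the head did
```

Without the tracked camera position, those two frames are indistinguishable from the box itself sliding half a meter–which is exactly the ambiguity that breaks physics.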

OCCLUDING VIRTUAL CHARACTERS WITH THE REAL WORLD

Anyway, that’s kind of an esoteric problem. The biggest issue with AR is that most solutions can only overlay graphics on top of a scene. As you can see in my Ether Drift project, the characters appear on top of specially designed trading cards. However, wave your hand in front of the characters, and they will still draw on top of everything.

Ether Drift uses Vuforia to superimpose virtual characters on top of trading cards.

With Tango, it is possible to reconstruct the 3D geometry of your surroundings using point cloud data received from the depth camera. Matterport already has an impressive demo of this running on the Tango. It allows the user to scan an area with the tablet (very slowly) and it will build a textured mesh out of what it sees. When meshing is turned off the tablet can detect precisely where it is in the saved environment mesh.

This geometry can possibly be used in Unity3D as a mesh collider which is also rendered to the depth buffer of the scene’s camera while displaying the tablet camera’s video feed. This means superimposed augmented reality characters can accurately collide with the static environment, as well as be occluded by real world objects. Characters can now not only appear on top of your table, but behind it–obscured by a chair leg.
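The depth test behind that occlusion can be sketched in miniature (hypothetical numbers, a single scanline of four pixels): the reconstructed environment mesh contributes a real-world depth per pixel, and the virtual character is drawn only where it is closer to the camera:

```python
# Sketch: per-pixel occlusion test. real_depth comes from rendering the
# reconstructed environment mesh into the depth buffer; virt_depth from
# the virtual character. Draw the character only where it is nearer.

real_depth = [3.0, 1.2, 1.2, 3.0]   # e.g. a chair leg scanned at 1.2 m
virt_depth = [2.0, 2.0, 2.0, 2.0]   # character standing 2 m away

visible = [v < r for v, r in zip(virt_depth, real_depth)]
print(visible)  # [True, False, False, True]: occluded by the chair leg
```

The GPU does exactly this comparison for free once the environment mesh is rendered into the depth buffer, which is why the trick costs almost nothing at runtime.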

ENVIRONMENTAL LIGHTING

Finally, this solves the challenge of how to properly light AR objects. Most AR apps assume there’s a light source on the ceiling and place a directional light pointing down. With a mesh built from local point cloud data, you can generate a panoramic render of where the observer is standing in the real world. This image can be used as a cube map for image-based lighting systems like Marmoset Skyshop. This produces accurate lighting on 3D objects, which, when combined with environmental occlusion, makes for a truly next-generation AR experience.
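In miniature, image-based lighting boils down to weighting environment samples by how much they face the surface. This sketch stands in a hypothetical three-sample “cube map” for a real one (directions and colors invented for illustration):

```python
# Sketch: diffuse image-based lighting. Average environment radiance
# weighted by the cosine between each sample direction and the normal.

# hypothetical environment samples: direction -> RGB radiance
env = {(0, 1, 0):  (1.0, 1.0, 0.9),   # bright ceiling light
       (0, -1, 0): (0.1, 0.1, 0.1),   # dark floor
       (1, 0, 0):  (0.4, 0.3, 0.3)}   # warm wall

def diffuse(normal):
    total, weight = [0.0, 0.0, 0.0], 0.0
    for direction, rgb in env.items():
        cos = max(0.0, sum(n * d for n, d in zip(normal, direction)))
        for i in range(3):
            total[i] += cos * rgb[i]
        weight += cos
    return tuple(t / weight for t in total)

print(diffuse((0, 1, 0)))   # upward-facing surface: lit by the ceiling
print(diffuse((0, -1, 0)))  # downward-facing surface: dim floor bounce
```

A real cube map just does this at thousands of samples per face (usually pre-convolved), but the principle–light the object with the room it’s actually standing in–is the same.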

A QUICK TEST

The first thing I did with the Unity SDK is drop the Tango camera in a Camera Birds scene. One of the most common requests for Camera Birds was to be able to walk through the forest instead of just rotating in place. It took no programming at all for me to make this happen with Tango.

This technology still has a long way to go–it has to become faster and more precise. Luckily, Movidius has already produced the Myriad 2, which is reportedly 3-5X faster and 20X more power efficient than the chip currently in the Tango prototypes. Vision processing technology is a supremely nerdy topic–after all, it’s literally rocket science. But it has far-reaching implications for wearable platforms.