How To Demo HoloLens Apps In Public

Last week's VRLA Summer Expo was the first time the public got a look at my current HoloLens project, Ether Wars. Tons of people lined up to try it, and I must have done well over 100 demos over the two-day event. Since then, I've shown it to a variety of developers, executives, and investors, ranging from people with zero experience to those who have used HoloLens quite a bit. Combined with all the demoing done at HoloHacks a few months ago, I've picked up a lot of common-sense tips for demoing mixed reality apps. I figured I'd sum up some of my presentation tricks here.

Know Your Space

HoloLens can be a very temperamental device. Although it features the most robust tracking I’ve ever seen with an AR headset, areas with a lot of moving objects (pets, crowds of people), featureless walls, windows, and mirrors can really mess things up. Also, rooms that are too dark or too bright can make the display look not so great.

If you are travelling somewhere to show your app, try to find out ahead of time what the room you'll be demoing in is like. It might be possible to ask for an alternative room if the space they've got you in is inappropriate.

And how do you know whether the space is inappropriate? Scan the room before the demo starts. In the case of Ether Wars, you have to scan the room before you play the game. This scan is saved, so subsequent games don't have to go through that process. When I demo the game, I scan the room myself before letting anyone else use it. This both confirms the room works and lets everyone else skip this sometimes lengthy step.

Consider building demo-specific safety features. For instance, Ether Wars needs ceilings to spawn space stations from. In a room with a vaulted ceiling the HoloLens can't scan, a fallback that automatically spawns the bases at a default ceiling height keeps the demo running.
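
Here's a minimal sketch of that kind of fallback, assuming the scan step produces a ceiling height or nothing at all. The function name and the 3-meter default are made up for illustration, not taken from the actual game code:

```python
# Fallback spawn height for rooms whose ceilings won't scan.
# FALLBACK_CEILING_HEIGHT is an assumed, demo-safe default.

FALLBACK_CEILING_HEIGHT = 3.0  # meters above the floor; tune per venue


def station_spawn_height(scanned_ceiling_height=None):
    """Return the height at which to spawn space stations.

    Uses the scanned ceiling if the room scan produced one; otherwise
    falls back to a default so a vaulted or unscannable ceiling
    doesn't break the demo.
    """
    if scanned_ceiling_height is not None and scanned_ceiling_height > 0:
        return scanned_ceiling_height
    return FALLBACK_CEILING_HEIGHT
```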

Teach The Air Tap

Microsoft's mantra for HoloLens interfaces is "Gaze, Gesture, and Voice": essentially a controller-free interface for all HoloLens apps. Very cool in concept, but I find at least half the people who try the device can't reliably perform the air tap. It's a tricky and unnatural gesture. Most people want to reach out and poke the holograms with their finger. It takes quite a bit of explanation to teach users that they must aim with their head and perform that weird air tap motion to click on whatever is highlighted by the cursor.

Teach the user how to perform the air tap before the demo–perhaps by having them actually launch and pin the app on a wall. It might help to put a training exercise in the app itself. For instance, to start Ether Wars you have to gaze and air tap on a button to start the experience. I use this moment to teach the player how to navigate menus and use the air tap.

Worst case scenario, you can stick your arm over the player’s shoulder in view of the HoloLens and perform the air tap yourself if the user just can’t figure it out.

Check The Color Stack

Unlike VR, it's difficult to see what the user is viewing when demoing a HoloLens app. You can get a live video preview from the Windows Device Portal, but this can affect the speed and resolution of the app, degrading the performance of your demo. One trick I've used to figure out where the user currently is in the demo is to learn what the colors of the stacked display look like on different screens.

Each layer of the display shows different colors

If you look at the side of the HoloLens display you’ll see a stack of colored lights. These colors change depending on what is being shown on the screen. By observing this while people are playing Ether Wars, I’ve learned to figure out what screen people are on based on how the lights look on the side of the device. Now I don’t have to annoyingly ask “what are you seeing right now” during the demo.

None of this is rocket science–just some tips and tricks I've learned while demoing HoloLens projects over the past month or so. Let me know if you've got any others to add to the list.

So, You Wanna Make A Pokemon Go Clone?

I told you not to do it.

But my 2013 blog post about displaying maps in Unity3D is suddenly my top page of the month, so there are clearly a lot of Pokemon Go clones being built right now.

Well, if you absolutely insist, here’s how I’d go about it.

Step 1: Raise tons of money

You’re going to need it. And it’s not just for user acquisition. You’ll need a lot of dry powder for scaling costs in the unlikely event this game is as successful as you’ve claimed to your investors. For small apps, accessing something like the Foursquare API may be free–but it will require an expensive licensing deal to use it at the scale you’re thinking of and without restrictions.

Step 2: Buy every single location based game you can

Just having access to a places API such as Foursquare or Factual isn’t enough. You need location data relevant to a game–such as granular details about places inside of larger locations that are of interest to players. Pokemon Go has this from years of Ingress players submitting and verifying locations around the world.

Nearly 10 years ago, there was a frenzy of investment in location-based games. The App Store is now littered with the dead husks of old LBS games and ones that are on life support. With that pile of money you raised, it should be easy to go on a shopping spree and buy up these games. Not for their users, or even the technology, but for the data. Most of these games may have been fallow for years, making their location data stale. Yet it may be possible, with machine learning or old-fashioned elbow grease, to work that data into a layer of interesting sub-locations for your game to be designed around.
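
For what it's worth, here's a toy version of that cleanup pass in Python. It just collapses points of interest that sit within a few dozen meters of each other; the 25-meter radius and the dictionary shape of the records are assumptions, and a real pipeline would also dedupe names, score freshness, and so on:

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def merge_nearby_pois(pois, radius_m=25.0):
    """Greedy pass that collapses points of interest closer than radius_m.

    `pois` is a list of dicts like {"name": ..., "lat": ..., "lon": ...}
    pulled from the acquired games' databases.
    """
    merged = []
    for poi in pois:
        if any(haversine_m(poi["lat"], poi["lon"], m["lat"], m["lon"]) < radius_m
               for m in merged):
            continue  # treat it as a duplicate of an existing sub-location
        merged.append(poi)
    return merged
```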

Step 3: Plan for Database Hell

Designing for scale at the start is a classic mistake for any startup. You’re effectively building a football stadium for a carload of people. That doesn’t mean you shouldn’t entertain the idea of scaling up a service once it’s successful.

Full disclosure: I've never built an app at the scale of Pokemon Go. Few people have. I suspect many of the server issues are related to scaling a geospatial database with that many users. It's much harder to optimize your data around location than around other usage patterns. Don't take my word for it; check out this analysis.

It’s been years since I’ve looked at geospatial databases. Despite some announcements, it doesn’t look like a lot has changed. A cursory search suggests PostGIS is still a solid choice. Plus, there are a lot of Postgres experts out there that can help with scaling issues. MongoDB’s relatively new spatial features may also be an option.
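
To make that concrete, here's roughly what a proximity query looks like against PostGIS from Python. The `spawns` table, its `location` geography column, and the connection string are placeholders I made up for the example; `ST_DWithin` itself is standard PostGIS:

```python
import psycopg2


def spawns_near(conn, lat, lon, radius_m=100):
    """Return spawn rows within radius_m meters of the player's position."""
    sql = """
        SELECT id, kind
        FROM spawns
        WHERE ST_DWithin(
            location,
            ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
            %s
        );
    """
    with conn.cursor() as cur:
        cur.execute(sql, (lon, lat, radius_m))  # note: point takes (lon, lat)
        return cur.fetchall()


# usage (connection string is a placeholder):
# conn = psycopg2.connect("dbname=game")
# print(spawns_near(conn, 34.0522, -118.2437))
```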

As for fancier alternatives: Google App Engine is an easy way to "magically" scale an app. They have also started releasing really interesting new geospatial services, not to mention some great support for mobile apps that may make integrating with Unity3D a bit easier. However, GAE is very expensive at scale, and the location features are still in alpha. Choosing Google App Engine is a gamble, but it may be the fastest way to get started.

To avoid vendor lock-in, have a migration strategy in mind. One option is to use your pile of money to recruit backend engineers from startups that have already dealt with large user bases.

Step 4: Get Ready for the Disappointing State of Mobile AR

Pokemon Go has sparked a lot of renewed interest in AR. Much like geospatial databases, not much has changed in the past five years in terms of what your average smartphone can do. Sure, beefier processors and higher-res cameras enable some limited SLAM functionality, but these features are very finicky. Your best bet is to keep AR to a minimum, as Pokemon Go smartly did. Placing virtual objects on real-world surfaces in precise locations, especially outdoors, is the realm of next-generation hardware.

Step 5: ??????

Ok, this isn’t a precise recipe for a Pokemon Go clone. But hey, if you’ve completed step one, maybe you should contact me for more details?

The Beginner’s Guide: Dave the Madman Edition

I recently played The Beginner's Guide after buying it during the annual Holiday Steam Sale over the break. It's a quick playthrough and an interesting way to tell a story within a game. Without giving too much away, the experience reminded me of a similar event in my young adulthood, when I encountered an amazing game developer who created incredible works I couldn't hope to match. I've since forgotten his real name and don't know much about him, but I do have the four double-sided floppy disks he sent me of all his games at the time.

Madsoft 1-4, recovered in great condition

This was the early ‘90s–I’d say around 1990-1991. I had made a bunch of Commodore 64 games (often with my late friend Justin Smith) using Shoot ‘Em Up Construction Kit: an early game development tool that let you build neat scrolling shooters without any programming knowledge.

Adventures in Stupidity, one of my SEUCK creations

I used to upload my games to local BBSes in the New England area and wait for the response on the message boards. In the process, I downloaded some games made by a user known by the handle "MADMAN." Some of his games also used the moniker "Dave the Madman." He made seemingly professional-quality games using Garry Kitchen's Game Maker (not to be confused with YoYo Games' GameMaker Studio).

Garry Kitchen's Game Maker was an early game development tool published by Activision in 1985. I actually got it for my birthday in 1986, thinking that this was my key to becoming a superstar game designer. The thing is, Game Maker was a full-blown programming language that, strangely, used the joystick to edit. It also included a sprite designer, music editor, and other tools. Everything a budding game developer would need to get started, right?

Although I did make a few simple games in Game Maker, its complexity was beyond my grasp at the time. Which is why Madman’s creations blew me away. They were so polished! He had developed so many completely different types of games! They all had cool graphics, animation, music, and effects I couldn’t figure out how to duplicate! My favorite was Space Rage: a sprawling, multi-screen space adventure that I simply could not comprehend. I had so many questions about how these games were made!

SPACE RAGE!

We messaged each other on a local BBS. I blathered about how much of a fan I was of his work and he said he liked my games, too. I figured he was just being kind. After all, this was a MASTER saying this! We eventually exchanged phone numbers.

I have vague memories of talking to him on the phone, asking how he accomplished such amazing feats using Game Maker. I think he was a little older than me, but many of his games had a 1987 copyright date. Considering I was probably the same age at this time as he was in 1987, this made me feel quite inadequate.

As I recall, Madman was humble and didn’t have many aspirations beyond distributing his little games on BBSes. He seemed like a hobbyist that figured out Game Maker and really liked making games with it–nothing more, nothing less.

Fat Cat probably has the best animation of them all

After our call, he mailed me a complete collection of his games. A few years ago I found these floppy disks and copied them to my Mac using a 1541 transfer cable. The disks bear his handwriting, labeled “Madsoft” 1 – 4. I was able to rescue all of the disks, converting them to d64 format.

Playing through his creations was a real trip down memory lane. The most shocking thing I discovered is on the second side of the fourth disk. His Archon-like game, Eliminators, features the text "Distributed by Atomic Revolution" at the bottom of the title screen. Atomic Revolution was a game 'company' I briefly formed with my childhood friend Cliff Bleszinski around 1990 or so. It was a merger of sorts between my label, "Atomic Games," and Cliff's, "Revolution Games." (The story about the C64 game he made in my parents' basement is a whole other post!)

An Atomic Revolution production?

I must have discussed handling the distribution of Eliminators with Dave: uploading and promoting his awesome game all over the local BBS scene and sending it to mail-order shareware catalogs. At least that's my best guess; I really have no recollection of how closely we worked together. I must have done a terrible job, since this game was almost completely lost to the mists of time.

I think we talked about meeting up and making a game together, but I didn't even have my learner's permit yet, and online communication tools were primitive if they existed at all. We never really collaborated. I wonder what happened to "Dave the Madman" and his "Madsoft" empire. Is he even still alive? Did he go on to become a game developer, or at least a software engineer? Maybe he'll somehow see this post and we'll figure out the answer to this mystery!

I remember he was most proud of his Ataxx homage

Until then, I'll add the disk images of Madsoft 1-4 to this post. Check the games out, and let me know what you think. I've also put up some screenshots and videos of his various games, but I'm having problems finding a truly accurate C64 emulator for OS X. If anyone has any suggestions, let me know!

Here’s the link to the zip file. Check these games out for yourself!

The Basics of Hand Tracked VR Input Design

Ever since my revelation at Oculus Connect I’ve been working on a project using hand tracking and VR. For now, it’s using my recently acquired Vive devkit. However, I’ve been researching design techniques for PSVR and Oculus Touch to keep the experience portable across many different hand tracking input schemes. Hand tracking has presented a few new problems to solve, similar to my initial adventures in head tracking interfaces.

The Vive's hand controller

Look Ma, No Hands!

The first problem I came across when designing an application that works on both Vive and Oculus Touch is the representation of your hands in VR. With Oculus Touch, most applications feature a pair of “ghost hands” that mimic the current pose of your hands and fingers. Since Oculus’ controllers can track your thumb and first two fingers, and presumably the rest are gripped around the handle, these ghost hands tend to accurately represent what your hands are doing in real life.

Oculus Touch controller

This metaphor breaks down with Vive as it doesn’t track your hands, but the position of the rod-like controllers you are holding. Vive games I’ve tried that show your hands end up feeling like waving around hands on a stick–there’s a definite disconnect between the visual of your hands in VR and where your brain thinks they are in real life. PSVR has this problem as well, as the Move controllers used with the current devkit are similar to Vive’s controllers.

You can alleviate this somewhat. Because there is a natural way most users tend to grip Move and Vive controllers, you can model and position the “hand on a stick” in the most likely way the controllers are gripped. This can make static hands in VR more convincing.
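
As a rough illustration of that idea, here's how you might derive a static hand pose by composing a fixed grip offset onto the tracked controller pose. The offset numbers are invented; in practice you'd eyeball them against how players actually hold the wand, and you'd do this in your engine rather than in Python:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Controller-local grip offset: where the "hand" sits relative to the wand.
# These values are assumptions for the sketch, not measured constants.
GRIP_POS_OFFSET = np.array([0.0, -0.03, 0.10])        # meters
GRIP_ROT_OFFSET = R.from_euler("x", -40, degrees=True)  # tilt the palm forward


def hand_pose_from_controller(controller_pos, controller_rot):
    """controller_pos: (3,) array, controller_rot: scipy Rotation.

    Returns the position and rotation to draw the static hand model at.
    """
    hand_pos = controller_pos + controller_rot.apply(GRIP_POS_OFFSET)
    hand_rot = controller_rot * GRIP_ROT_OFFSET
    return hand_pos, hand_rot


# usage with an identity controller pose:
# hand_pose_from_controller(np.zeros(3), R.identity())
```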

In any case, you have a few problems when you grab an object.

For Oculus, the act of grabbing is somewhat natural–you can clench your first two fingers and thumb into a “grab” type motion to pick something up. In the case of Bullet Train, this is how you pick up guns. The translucent representation of your hands means you can still see your hand pose and the gripped object at the same time. There’s not much to think about other than where you attach the held object to the hand model.

It also helps that in Bullet Train the objects you can grab have obvious handles and holding points. You can pose the hand to match the most likely hand position on a grabbed object without breaking immersion.

With Vive and PSVR you have a problem if you are using the “hand on a stick” technique. When you “grab” a virtual object by pressing the trigger, how do you show the hand holding something? It seems like the best answer is, you don’t! Check this video of Uber Entertainment’s awesome Wayward Sky PSVR demo:

Notice anything? When you grab something, the hand disappears. All you can see is the held object floating around in front of you.

This is a great solution for holding arbitrarily shaped items because you don't have to create a potentially infinite number of hand-grip animations. Because the user isn't really grabbing anything and is instead clicking a trigger on a controller, there is no "real" grip position for your hand anyway. You also don't have the problem of parts of the hands intersecting with the held object.

This isn’t a new technique. In fact, one of the earliest Vive demos, Job Simulator, does the exact same thing. Your brain fills in the gaps and it feels so natural that I just never noticed it!
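
In pseudocode terms, the trick boils down to a tiny state machine: hide the hand the moment something is grabbed, show it again on release. This Python sketch uses stand-in classes for the engine objects, so everything here (the class names, attach_to, set_visible) is illustrative rather than any real API:

```python
class HandModel:
    """Stand-in for whatever renders the hand mesh in your engine."""
    def __init__(self):
        self.visible = True

    def set_visible(self, visible):
        self.visible = visible


class Grabbable:
    """Stand-in for a physics prop that can follow the controller."""
    def __init__(self, name):
        self.name = name
        self.parent = None

    def attach_to(self, hand):
        self.parent = hand   # in-engine: parent the transform, pause physics

    def detach(self):
        self.parent = None   # in-engine: unparent, re-enable physics


class GrabController:
    """Hide the hand while holding; bring it back on release."""
    def __init__(self, hand):
        self.hand = hand
        self.held = None

    def grab(self, obj):
        self.held = obj
        obj.attach_to(self.hand)
        self.hand.set_visible(False)   # only the held object stays visible

    def release(self):
        if self.held is not None:
            self.held.detach()
            self.held = None
        self.hand.set_visible(True)
```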

Virtual Objects, Real Boundaries

The next problem I encountered: what do you do when your hand can pass through virtual objects but the thing it's holding can't? For instance, you can be holding an object and physically move your real, tracked hand through a virtual wall. The held object, bound by the engine's physics simulation, will hit the wall while your hand continues to drag it through. Chaos erupts!

You can turn off collisions while an object is held, but what fun is that? You want to be able to knock things over and otherwise interact with the world while holding stuff. Plus, what happens when you let go of an object while inside a collision volume?

What I ended up doing is making the object detach, or fall out of your virtual hand, as soon as it hits something else. You can tweak this by making collisions with smaller, non-static objects less likely to detach the held object since they will be pushed around by your hand.
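
The rule I settled on is simple enough to sketch. The static/mass test and the 2 kg threshold below are assumptions for illustration; the real version lives in the engine's collision callback:

```python
DETACH_MASS_KG = 2.0  # arbitrary threshold; tune per scene


def handle_held_object_collision(other_is_static, other_mass_kg, release):
    """Called when the held object collides with something else.

    `release` is whatever your grab code uses to drop the object. Static
    geometry and heavy props force a drop; light, movable clutter just gets
    pushed around while the player keeps holding on.
    """
    if other_is_static or other_mass_kg > DETACH_MASS_KG:
        release()
```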

For most VR developers, these are the first two things you encounter when designing an experience for hand-tracking VR systems. It seems Oculus Touch makes a lot of these problems go away, but we've just scratched the surface of the issues that need to be solved when your real hands interact with a virtual world.

The Challenge of Building Augmented Reality Games In The Real World

InnAR Wars splash image

Last week I submitted the prototype build of my latest augmented reality project, InnAR Wars, to Google’s Build a Tango App Contest. It’s an augmented reality multiplayer space RTS built for Google’s Tango tablet that utilizes the environment around you as a game map. The game uses the Tango’s camera and Area Learning capabilities to superimpose an asteroid-strewn space battlefield over your real-world environment. Two players holding Tangos walk around the room hunting for each other’s bases while sending attack fleets at the other player’s structures.

Making InnAR Wars fun is tricky because I essentially have no control over the map. The battlefield has to fit inside the confines of the real-world environment the tablets are in. Using the Tango’s Area Learning capabilities with the positions of players, I know the rough size of the play area. With this information I adjust the density of planetoids and asteroids based on the size of the room. It’s one small way I can make sure the game at least has an interesting number of objects in the playfield regardless of the size of the area. As you can see from the videos in this post, it’s already being played in a variety of environments.
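
The density adjustment itself is nothing fancy; it's something along these lines, where the per-square-meter constant and the clamps are made-up numbers tuned by playtesting rather than anything principled:

```python
# Scale playfield population with the scanned play area.
ASTEROIDS_PER_SQUARE_METER = 0.5
MIN_ASTEROIDS, MAX_ASTEROIDS = 8, 60


def asteroid_count_for_room(width_m, depth_m):
    """Pick how many asteroids to spawn for a room with the given footprint."""
    area = width_m * depth_m
    count = round(area * ASTEROIDS_PER_SQUARE_METER)
    return max(MIN_ASTEROIDS, min(MAX_ASTEROIDS, count))


# A 4m x 5m conference room gets a sparse field:
# asteroid_count_for_room(4, 5) -> 10
```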

This brings up the biggest challenge of augmented reality games–How do you make a game fun when you have absolutely no control over the environment in which it’s played? One way is to require the user to set up the play space as if she were playing a board game. By using Tango’s depth camera, you could detect the shapes and sizes of objects on a table and use those as the playfield. It’s up to the user to set it up in a way that’s fun–much like playing a tabletop war game.

For the final release, I’m planning on using Tango’s depth camera to figure out where the room’s walls, ceilings, and floors are. Then I can have ships launch from portals that appear to open on the surfaces of the room. Dealing with the limited precision and performance of the Tango depth camera along with the linear algebra involved in plane estimation is a significant challenge. Luckily, there are a few third-party solutions for this I’m evaluating.
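
For the curious, the core of plane estimation is just a least-squares fit. Here's the textbook SVD version in Python with numpy; it isn't Tango code, and a real pipeline would wrap it in something like RANSAC to reject the depth camera's plentiful outliers:

```python
import numpy as np


def fit_plane(points):
    """Least-squares plane through an Nx3 array of depth-camera points.

    Returns (centroid, unit normal): the direction of least variance in the
    point cloud is the plane normal.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]


def is_horizontal(normal, tolerance=0.9):
    """Rough floor/ceiling test; assumes the last axis is 'up' in your frame."""
    return abs(normal[2]) > tolerance
```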

Especially when looking at augmented reality startups’ obligatory fake demo videos, the future of AR gaming seems exciting. But the practical reality of designing a game to be played in reality–which is itself rather poorly designed–can prevent even the most amazing technology from enabling great games. It’s probably going to take a few more hardware generations to not only make the technology usable, but also develop the design language to make great games that work in AR.

If you want to try out the game, I’ll have a few Tangos on hand at FLARB’s VRLA Summer Expo table. Stop by and check it out!

Why I’m All In On Mobile VR

Last month I released Caldera Defense, a virtual reality tower defense game on Gear VR. This is the second Gear VR title I've worked on, and the first I've built and published from the ground up (not including my Oculus Mobile VR Jam submission). Caldera Defense is a free early access demo, basically a proof of concept of the full game, and the reaction has been great. Thousands of people have downloaded it, rated it, and given us valuable feedback. We're busy incorporating that feedback into the first update.

Caldera Defense featured on the Gear VR store

Originally I planned to use this as a demo to fund an expanded PC and Morpheus launch version of the game with greatly improved graphics, hours of gameplay, and additional features such as multiplayer and second-screen options.

However, pitching even a modestly budgeted console and PC VR game experience to publishers, or even the platforms themselves, is a tough sell. I’m sure at E3 next month we will see all sorts of AAA VR announcements. Yet, many traditional funding avenues for games remain skeptical of the opportunity VR presents.

Since the Caldera project began last year, mobile VR has morphed into a unique opportunity. With over a million Google Cardboards in the wild and new versions of the Gear VR headset in retail stores worldwide, there will be millions of mobile VR users before there are comparable numbers on Oculus desktop, Vive, and Morpheus.

Is it possible that mobile VR will be a viable business before it is on PC and consoles? Most of my colleagues are skeptical. I’m not.

The economics work out. Due to the mobile nature of the experience, games and apps for these platforms tend towards the bite-sized. This greatly reduces the risk of mobile VR since assets optimized for mobile are simpler and casual VR experiences require less content to be built overall.

I can make a dozen mobile VR minimum viable products for the same budget as one modestly scoped Morpheus experience. From these MVPs I can determine what types of content gain the most traction with VR users and move in that direction. I can even use this data to guide development of larger AAA VR experiences later.

By this time next year it will be possible to monetize these users significantly, whether through premium content or advertising. It may be more valuable to collect a lot of eyeballs in mobile VR than to break even on a multi-million dollar AAA launch title. As we've seen in the past, acquiring a huge audience of mobile players can lead to tremendous revenue streams.

Being on the Oculus desktop, Vive, or Sony’s Morpheus deck at launch is an enormous opportunity. In fact, I’m still searching for ways to produce the console and desktop version of Caldera Defense. However, if you lack the capital to produce at that scale, smaller mobile projects are much easier to bootstrap and the upside is huge.

Adult Contemporary Video Games

One of my favorite Combat Jack podcasts of 2014 is the episode from over the summer where they interviewed legendary hip-hop producer Marley Marl. Marley Marl invented the modern hip-hop sound most take for granted and created the Juice Crew, one of the most important groups of MCs ever.

The Juice Crew

Before producing hit records, Marley had a career as an on-air DJ, starting on Mr. Magic‘s show on KISS-FM in New York.  In the ’90s he went on to host “Future Flavas” with Pete Rock on Hot 97.  Marley Marl was also still producing hit albums for the likes of LL Cool J and Lords of the Underground.

Times change, and Marley Marl isn't producing music for 20-year-olds anymore. While many DJs desperately hang on to their fading youth, Marley tried another tactic: he moved over to WBLS, which plays old-school hip-hop for a mature audience.

It just so happens that rap fans in their forties and beyond have far more disposable income than those in their teens and twenties. His WBLS show has gone on to be a great success. It turns out that despite hip-hop being a youth-powered movement, there's plenty of advertising money in appealing to older rap fans.

This got me thinking about video games.

A lot of veteran developers are debating the decline of AAA games in the face of the disruptive waves of free2play and mobile. Many gamers in their demographic agree. If that's the case, why not appeal to this older audience?

The challenge in monetizing these gamers is that although their taste in games may be the same as it was a decade ago, their play styles are vastly different due to lifestyle changes. If you've got kids or a demanding job, perhaps you no longer have 120+ hours to spend playing an RPG. However, you might digest the same style of game in shorter episodic bursts on a tablet or smartphone.

Some developers have caught on to this and produce what I call Adult Contemporary Video Games. A good example is the 1980s pencil-and-paper RPG Shadowrun. Microsoft's attempt at a AAA shooter based on Shadowrun was an abject failure (although I quite liked it). Five years later, Harebrained Schemes went from a surge of support on Kickstarter for "Shadowrun Returns" to a series of popular mobile and PC downloadable games based on the franchise.

Shadowrun for iPad

This is a smart strategy: delivering content aimed at an older audience on newer devices. Those of us who grew up not just on the original RPG but also on the SNES and Genesis games were ripe for a new entry in the series. This model has also seen success with Wasteland 2, and surely the upcoming Bard's Tale sequel will continue the trend.

It remains to be seen whether you can develop a new IP targeted at this audience. A lot of what you hear on Adult Contemporary radio is old artists making new music, and in games it may be the same. So far, the genre seems to bank on nostalgia by resurrecting classic franchises for an older audience on new devices with updated play styles, especially if you include the current wave of retro remakes. While some veteran developers excel at creating games for the new mobile f2p masses, others may be more suited to this viable slice of the market.