How To Demo HoloLens Apps In Public

Last week’s VRLA Summer Expo was the first time the public got a look at my current HoloLens project, Ether Wars. Tons of people lined up to try it, and I must have done well over 100 demos over the two-day event. Since then, I’ve shown it to a variety of developers, executives, and investors, ranging from people with zero experience to those who have used HoloLens quite a bit. Combined with all the demoing I did at HoloHacks a few months ago, I’ve picked up a lot of common-sense tips for demoing mixed reality apps. I figured I’d sum up some of my presentation tricks here.

Know Your Space

HoloLens can be a very temperamental device. Although it features the most robust tracking I’ve ever seen with an AR headset, areas with a lot of moving objects (pets, crowds of people), featureless walls, windows, and mirrors can really mess things up. Also, rooms that are too dark or too bright can make the display look not so great.

If you are travelling somewhere to show your app, try to find out ahead of time what the room you’ll be demoing in is like. It might be possible to ask for a few alternative rooms if the space they’ve got you in is inappropriate.

And how do you know if the space is inappropriate? Scan the room before the demo starts. In the case of Ether Wars, you have to scan the room before you play the game. This scan is saved, so subsequent games don’t have to go through that process. When I demo the game, I scan the room myself before I let others use it. This not only tells me whether the room works, but lets everyone else skip this sometimes lengthy step.

Consider building demo-specific safety features. For instance, Ether Wars needs ceilings to spawn space stations from. For a room with a vaulted ceiling the HoloLens can’t scan, a safety feature could automatically spawn the bases at a fixed ceiling height for demo purposes.
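A fallback like that only takes a few lines of Unity C#. This is just an illustrative sketch, not code from Ether Wars: the basePrefab field, the fallbackHeight value, and the 10-meter ray length are all made-up assumptions. The idea is to raycast upward against the scanned room mesh, and if no ceiling comes back, spawn at a safe demo height instead.

```csharp
using UnityEngine;

// Hypothetical demo-safety fallback: if the spatial scan never finds a
// ceiling above the spawn point, use a fixed height instead.
public class BaseSpawner : MonoBehaviour
{
    public GameObject basePrefab;       // assumed prefab reference
    public float fallbackHeight = 2.5f; // meters above the spawn point

    public void SpawnBase(Vector3 position)
    {
        RaycastHit hit;
        // Cast upward against the spatial mapping mesh to find the ceiling.
        if (Physics.Raycast(position, Vector3.up, out hit, 10f))
        {
            Instantiate(basePrefab, hit.point, Quaternion.identity);
        }
        else
        {
            // Vaulted or unscannable ceiling: spawn at a safe demo height.
            Vector3 demoPoint = position + Vector3.up * fallbackHeight;
            Instantiate(basePrefab, demoPoint, Quaternion.identity);
        }
    }
}
```

The nice thing about this pattern is that the fallback only triggers when the scan fails, so it never affects a demo in a well-behaved room.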

Teach The Air Tap

Microsoft’s mantra for HoloLens interfaces is “Gaze, Gesture, and Voice”: essentially a controller-free interface for all HoloLens apps. Very cool in concept, but I find at least half the people who try the device can’t reliably perform the air tap. It’s a tricky and unnatural gesture. Most people want to reach out and poke the holograms with their finger. It takes quite a bit of explanation to teach users that they must aim with their head and perform that weird air tap motion to click on whatever is highlighted by the cursor.

Teach the user how to perform the air tap before the demo–perhaps by having them actually launch and pin the app on a wall. It might help to put a training exercise in the app itself. For instance, to start Ether Wars you have to gaze and air tap on a button to start the experience. I use this moment to teach the player how to navigate menus and use the air tap.
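On device, that gaze-and-air-tap start button can be handled with Unity’s GestureRecognizer. Here’s a minimal sketch using the HoloLens-era UnityEngine.VR.WSA.Input API; the StartButton class, its raycast check, and the log message are my own illustration, not Ether Wars’ actual code:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input; // HoloLens-era namespace (later UnityEngine.XR.WSA.Input)

// Sketch of a gaze-and-air-tap start button: an air tap "clicks"
// whatever the user's head ray is pointed at.
public class StartButton : MonoBehaviour
{
    GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.TappedEvent += (source, tapCount, headRay) =>
        {
            RaycastHit hit;
            // Did the tap land while the user was gazing at this button?
            if (Physics.Raycast(headRay, out hit) && hit.collider.gameObject == gameObject)
            {
                Debug.Log("Start button tapped; beginning the experience.");
            }
        };
        recognizer.StartCapturingGestures();
    }
}
```

Because the tap event hands you the head ray, the same handler doubles as a teaching moment: if the tap fires but the ray misses the button, you know the user tapped without gazing first.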

Worst case scenario, you can stick your arm over the player’s shoulder in view of the HoloLens and perform the air tap yourself if the user just can’t figure it out.

Check The Color Stack

Unlike VR, it’s difficult to see what the user is viewing when demoing a HoloLens app. You can get a live video preview from the Windows Device Portal, but this can reduce the speed and resolution of the app, degrading the performance of your demo. One trick I’ve used to figure out where the user currently is in the demo is to learn what the colors of the stacked display look like on different screens.

Each layer of the display shows different colors

If you look at the side of the HoloLens display you’ll see a stack of colored lights. These colors change depending on what is being shown on the screen. By observing this while people are playing Ether Wars, I’ve learned to figure out what screen people are on based on how the lights look on the side of the device. Now I don’t have to annoyingly ask “what are you seeing right now” during the demo.

None of this is rocket science; just some tips and tricks I’ve learned while demoing HoloLens projects over the past month or so. Let me know if you’ve got any others to add to the list.

So, You Wanna Make A Pokemon Go Clone?

I told you not to do it.

But suddenly my 2013 blog post about displaying maps in Unity3D is now my top page of the month. There are lots of Pokemon Go clones being built right now.

Well, if you absolutely insist, here’s how I’d go about it.

Step 1: Raise tons of money

You’re going to need it. And it’s not just for user acquisition. You’ll need a lot of dry powder for scaling costs in the unlikely event this game is as successful as you’ve claimed to your investors. For small apps, accessing something like the Foursquare API may be free–but it will require an expensive licensing deal to use it at the scale you’re thinking of and without restrictions.

Step 2: Buy every single location based game you can

Just having access to a places API such as Foursquare or Factual isn’t enough. You need location data relevant to a game–such as granular details about places inside of larger locations that are of interest to players. Pokemon Go has this from years of Ingress players submitting and verifying locations around the world.

Nearly 10 years ago, there was a frenzy of investment in location based games. The App Store is now littered with dead husks of old LBS games and ones that are on life support. With that pile of money you raised, it should be easy to go on a shopping spree and buy up these games. Not for their users, or even the technology, but for the data. Most of these games may have been fallow for years, making their location data stale. Yet, it may be possible with machine learning or old fashioned elbow grease to work that data into a layer of interesting sub-locations for your game to be designed around.

Step 3: Plan for Database Hell

Designing for scale at the start is a classic mistake for any startup. You’re effectively building a football stadium for a carload of people. That doesn’t mean you shouldn’t entertain the idea of scaling up a service once it’s successful.

Full disclosure: I’ve never built an app at the scale of Pokemon Go. Few people have. I suspect many of the server issues are related to scaling a geospatial database with that many users; it’s much harder to optimize your data around location than around other usage patterns. Don’t take my word for it, check out this analysis.

It’s been years since I’ve looked at geospatial databases. Despite some announcements, it doesn’t look like a lot has changed. A cursory search suggests PostGIS is still a solid choice. Plus, there are a lot of Postgres experts out there that can help with scaling issues. MongoDB’s relatively new spatial features may also be an option.

As for fancier alternatives, Google App Engine is an easy way to “magically” scale an app. Google has also started releasing really interesting new geospatial services, not to mention some great support for mobile apps that may make integrating with Unity3D a bit easier. However, GAE is very expensive at scale, and the location features are still in alpha. Choosing Google App Engine is a risky decision, but it may also be an easy way to get started.

To avoid vendor lock-in, have a migration strategy in mind. One option is to use your pile of money to recruit backend engineers from startups that have handled large numbers of users.

Step 4: Get Ready for the Disappointing State of Mobile AR

Pokemon Go has sparked a lot of renewed interest in AR. Much like geospatial databases, not much has changed in the past 5 years as far as what your average smartphone can do. Sure, beefier processors and higher res cameras can get away with some limited SLAM functionality. But, these features are very finicky. Your best bet is to keep AR to a minimum, as Pokemon Go smartly did. Placing virtual objects on real world surfaces in precise locations, especially outdoors, is the realm of next generation hardware.

Step 5: ??????

Ok, this isn’t a precise recipe for a Pokemon Go clone. But hey, if you’ve completed step one, maybe you should contact me for more details?

There’s Nothing To Be Learned From Pokemon Go

Pokemon Go is a watershed moment in gaming. I’ve never seen a game get this much traction this fast. My neighborhood is filled with wandering players of all demographics, strolling around with phone in hand looking for Pokemon. Since the game’s launch, every day has looked like Halloween without the costumes.

In general, the job of a venture capitalist is really easy. For most, you simply wait around for another firm to invest in something and then add to that round. Or, you can wait for something to be really successful and cultivate clones of it. I can guarantee there are now a few VCs with deals in motion to build a “fast follow” mimic of Pokemon Go.

Please don’t.

There is absolutely no way another developer can duplicate the success of this game. In fact, it remains to be seen if this game will be a success beyond its initial pop. No game has ever had an opening weekend of this scale–but still, remember Draw Something or maybe even Fallout Shelter? I’m enjoying Pokemon Go myself, but many of my colleagues are questioning whether it has legs. Regardless of that, any location-based game you may be thinking of making is probably missing a few key ingredients to Pokemon Go’s success.

My pathetically low level character

Niantic has the Best Location Data in the Business

I’ve spent time building location-based service apps in the past. The biggest problem with making games that play out over the real world is populating the map with interesting stuff to do. First, there’s access to map data; at Pokemon Go’s scale, this is not cheap (although there are open source solutions). But simply having a map is only one piece of the puzzle: you need information about how the locations are used. Which places are busiest? Where do players like to group up?

Niantic has this data from years of running Ingress–pretty much the largest location-based game ever made. Over the years Ingress was running as a project fully funded and supported by Google, Niantic built an incredibly valuable data layer on top of the real world that has been repurposed for Pokemon Go.

You could possibly license similar information from other companies (Foursquare comes to mind), but Niantic’s data is probably more geared towards the activity patterns of mobile gamers than those who want to Instagram their lunch. (Granted, there’s a lot of overlap there)

Pokemon Is One of the Biggest IPs in the World

Before Pokemon Go, and even before Ingress, there were plenty of location-based games. Anyone remember Shadow Cities? Or Booyah? They may have just been too early; back then there weren’t enough smartphones to solve the density problem location-based games have. Now that smartphones are ubiquitous, how do you get enough players to fill up the world map? One way is to use one of the biggest video game IPs on the planet.

The demand for Nintendo IPs on other platforms is unprecedented. The fervor for Pokemon in particular is huge, with lots of fake Pokemon apps taken off Google Play and the App Store over the years. Investors have responded to this craze, with Nintendo’s stock jumping 25% since the release of Pokemon Go.

There really isn’t another IP as big as Pokemon that can be applied to a game of this scale. Sprinkle a little Pokemon on to a little Ingress and the results are explosive.

There’s nobody else on the planet that can do this. 

HoloHacks Is A Step Towards Mixed Reality Domination for Microsoft

Microsoft has been holding a series of hackathons for their new HoloLens platform in a number of cities across the US. I couldn’t make it to the original event in Seattle, but managed to compete in HoloHacks Los Angeles at the Creative Technology Center downtown.

The event went from Friday evening to Sunday afternoon, concluding with final presentations and the judges awarding prizes to the winning apps. My team’s app, “A Day at the Museum,” won the visual design prize with a mixed reality replacement for museum audio tour earpieces.

The event worked like most hackathons I’ve attended: come up with ideas, form teams, and get to hacking. Microsoft had us covered on the hardware front. Not only did each team get to borrow two HoloLens devices, but you could even get a loaner Surface Book for the event if you didn’t have a Windows PC to hack on. (As a MacBook Pro user, that option was a lifesaver)

I was lucky to sit at a table with Leone Ermer, Edward Dawson-Taylor, Chris Horton, Ed Hougardy, and Steven Winston. After brainstorming a few ideas we came up with the museum tour concept and got to work. We worked brilliantly as a team and everyone was critical in taking this project across the finish line.

HoloHacks isn’t just a promotional event. It’s an important step in building a community around the platform. By operating with refreshing openness and letting even complete novices build apps and learn HoloLens’ capabilities firsthand, Microsoft is placing itself way ahead of the competition, not just in technology but in the ecosystem that supports it.

No other augmented reality platform can operate at this scale. By the time other platforms catch up to HoloLens’ features, Microsoft will have a thriving ecosystem of Windows Holographic developers that other hardware vendors just can’t compete with. Why use some other platform when you can easily find developers and tools for HoloLens to get the job done? If the competition were smart, they’d focus on the developer community just as much as the technology.

Debugging HoloLens Apps in Unity3D

I’ve been developing on HoloLens for a few weeks now, and I’m being re-acquainted with the tricky parts of debugging hardware-specific augmented reality apps in Unity3D. I went through a lot of these issues with my Google Tango project, InnAR Wars, so I’m somewhat used to it. However, having to wear the display on your head while testing code brings a whole new dimension of difficulty to debugging augmented reality applications. I figured I’d share a few tips I use when debugging Unity3D HoloLens apps, beyond the standard Unity3D remote debugging tools you’re used to from mobile development.

Debugging in the Editor vs. Device

The first thing you need to do is figure out how to test code without deploying to the device. Generating a Visual Studio project, compiling, and uploading your application to one (or more) HoloLens headsets is a real pain when iterating on simple code changes. True, Unity3D can run none of HoloLens’ AR features in the editor, but there are times when you just have to test basic gameplay code that doesn’t require spatialization, localization, or any HoloLens-specific features. There are a few steps that make this easier.

Make A Debug Keyboard Input System

HoloLens relies mostly on simple gestures (Air Tap) and voice for input. The first thing you need to test HoloLens code in the Unity3D editor is a simple way to trigger, from the keyboard, whatever event fires off via Air Tap or speech commands. In my case, I wrote a tiny bit of code to use the space bar to trigger the Air Tap. Basically, anywhere you add a delegate to handle an Air Tap or speech command, you add some input code to trigger that same method via keyboard.
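Here’s roughly what that looks like, a minimal sketch assuming your gesture code already routes taps through a delegate. The DebugInput class name and AirTapped event are hypothetical, not part of HoloToolkit:

```csharp
using UnityEngine;

// Sketch of a keyboard stand-in for the Air Tap: in the editor, the
// space bar fires the same delegate the GestureRecognizer would
// invoke on device, so gameplay code can't tell the difference.
public class DebugInput : MonoBehaviour
{
    // Subscribe the same handlers here that you attach to the
    // on-device tap event.
    public event System.Action AirTapped;

    void Update()
    {
#if UNITY_EDITOR
        if (Input.GetKeyDown(KeyCode.Space) && AirTapped != null)
        {
            AirTapped();
        }
#endif
    }
}
```

Wrapping the check in UNITY_EDITOR keeps the debug path out of device builds entirely, so there’s no risk of shipping it.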

Use An Oculus Rift Headset

I was pleasantly surprised to find that the Unity HoloLens Technical Preview supports Oculus Rift. Keep your Rift plugged in when developing for HoloLens: when you run your application in the Unity editor, it will show up inside the Rift, albeit against a black background. This is extremely helpful when debugging code that uses gaze, positional audio, and even limited player movement via Oculus’ positional tracking.

Use The HoloLens Companion App

Microsoft provides a HoloLens companion app in the Windows Store with a few handy features. The app connects to HoloLens via WiFi and lets you record videos live from the headset (very useful for documenting reproducible bugs and crashes). It lets you stop and start apps remotely, which is useful when trying to launch an app on multiple HoloLenses at the same time. You can also use your PC’s keyboard to send input to a remote HoloLens. This is convenient for multiplayer testing: use Air Tap on the device on your face, and the companion app to trigger input on the other one.

These tips may make building HoloLens apps a little easier, but I really hope Microsoft adds more debugging features to future versions of the SDK. There are some simple things Microsoft could do to make development more hassle-free, although there’s really a limit to what you can do in the Unity Editor versus the device.

Developing Applications for HoloLens with Unity3D: First Impressions

I started HoloLens game development with Unity3D over the past week. This included going through all of the example projects, as well as building simple games and applications to figure out how the platform’s features work. Here are some takeaways from my first week as a HoloLens developer.

Baby steps…

The Examples Are Great, But Lack Documentation

If you go through all of the Holo Academy examples Microsoft provides, you’ll go from displaying a basic cube to a full-blown multi-user Augmented Reality experience. However, most of the examples involve dragging and dropping pre-made prefabs and scripts into the scene. Not a lot about the actual SDK is explained. The examples are a good way to get acquainted with HoloLens features, but you’re going to have to do more work to figure out how to write your own applications.

HoloToolkit is Incredibly Full Featured

All of the examples are based on HoloToolkit, Microsoft’s collection of scripts and prefabs that handle just about every major HoloLens application feature: input, spatial mapping, gesture detection, speech recognition, and even some networking.

I also found that the features I needed (such as placing objects in the real world using the real-time mesh as a collider) are in the examples, and I could easily strip them out and adapt them for my own C# scripts. Using these techniques I was able to get a very simple carnival milk-bottle game running in a single Saturday afternoon.
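The placement technique boils down to a gaze raycast against the room mesh. This is a sketch under the assumption that HoloToolkit’s spatial mapping mesh sits on a layer named "SpatialMapping" with mesh colliders attached (the class name, 5-meter range, and surface-alignment choice are mine):

```csharp
using UnityEngine;

// Rough sketch of gaze-based placement against the spatial mapping
// mesh: the object snaps to whatever real-world surface the user is
// currently looking at.
public class TapToPlace : MonoBehaviour
{
    void Update()
    {
        // Only hit the scanned room geometry, not other holograms.
        int layerMask = LayerMask.GetMask("SpatialMapping");
        Vector3 headPos = Camera.main.transform.position;
        Vector3 gazeDir = Camera.main.transform.forward;

        RaycastHit hit;
        if (Physics.Raycast(headPos, gazeDir, out hit, 5f, layerMask))
        {
            // Sit the object on the surface, oriented to its normal.
            transform.position = hit.point;
            transform.up = hit.normal;
        }
    }
}
```

Restricting the raycast to the spatial mapping layer is the important part; without the mask, the object would snap onto other holograms instead of the room.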

Multiplayer Gets Complicated

I’m working on moving my award-winning Tango RTS, InnAR Wars, to HoloLens. However, multiplayer experiences work much differently on HoloLens than on Tango. On Tango, each device shares a single room scan file and is localized in the same coordinate space. This means that once the game starts, placing an object (like a floating planet or asteroid) at any position will make it appear in the same real-world location on both Tangos.

HoloLens shares objects between devices using what are called Spatial Anchors. Spatial Anchors mark parts of the scanned room geometry as an anchored position. You can then place virtual objects in the real world relative to this anchor. When you share a Spatial Anchor with another device, the other HoloLens will look for a similar location in its own scan of the room to position the anchor. These anchors are constantly being updated as the scan continues, which is part of the trick to how HoloLens’ tracking is so rock solid.
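In Unity terms, the flow looks roughly like this. It’s a hedged sketch built on the WorldAnchor and WorldAnchorTransferBatch APIs from the HoloLens-era Unity builds; the anchor ID, the sharedObject field, and the networking are placeholders:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA;         // HoloLens-era namespace (later UnityEngine.XR.WSA)
using UnityEngine.VR.WSA.Sharing;

// Minimal sketch of anchoring a hologram to the real world and
// serializing that anchor so a second HoloLens can import it and
// resolve the same physical location in its own room scan.
public class AnchorSharer : MonoBehaviour
{
    public GameObject sharedObject; // the hologram both players should see

    void Start()
    {
        // Lock the object to a real-world location.
        WorldAnchor anchor = sharedObject.AddComponent<WorldAnchor>();

        // Serialize the anchor into bytes that can go over the network.
        var batch = new WorldAnchorTransferBatch();
        batch.AddWorldAnchor("sharedBase", anchor);
        WorldAnchorTransferBatch.ExportAsync(batch,
            data => { /* send this byte chunk to the other HoloLens */ },
            result => Debug.Log("Anchor export finished: " + result));
    }
}
```

The receiving device runs the mirror-image ImportAsync call and then positions its copy of the hologram relative to the resolved anchor, which is why each player sees it in the same physical spot.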

Sure, having a single coordinate frame on the Tango is easier to deal with, but the Tango also suffers from drift and inaccuracies that may be symptomatic of its approach. Spatial Anchoring is a rather radical change from how Tango works, which means a lot of refactoring for InnAR Wars, or even a redesign.

First Week Down

This first week has been an enlightening experience. Progress has been fast but also made me aware of how much work it will be to produce a great HoloLens app. At least two independently published HoloLens games popped up in the Windows Store over the past few days. The race is on for the first great indie HoloLens application!

My Week With HoloLens

Microsoft ships the HoloLens and Clicker accessory in the box

My HoloLens development kits finally arrived a week ago. I’ve spent a great deal of time using the device over the past week. I figured I’d post my impressions here.

This Really Works

When I first put my HoloLens on, I placed an application window floating above my kitchen table. Suddenly, I realized I hadn’t taken out the garbage. Still wearing the device, I ran downstairs to put something in the trash. I narrowly missed my neighbor–successfully avoiding an awkward conversation about what this giant contraption on my face does.

When I returned to my kitchen, the application was still there–hovering in space.

As I’ve stated before, Microsoft blew me away. HoloLens is an absolutely incredible leap from previous-generation AR glasses (and MUCH cheaper, believe it or not). It also does everything Tango does, but at a much higher level of performance and precision, which means most applications built on Tango can be moved directly over to HoloLens.

HoloLens is fast and intuitive enough to attempt getting actual work done with it. Yet a lot of my time is spent just trying to make silly videos like this.

It’s A Full Blown Windows Machine

HoloLens isn’t just a prototype headset–it’s a full featured desktop Windows PC on your face. Not only can you run “Windows Holographic” apps, but any Universal Windows App from the Windows Store. Instead of these applications running in a window on a monitor, they float around in space–positioned wherever you choose.

HoloLens really does need a taskbar of some kind, though. It’s way too easy to forget where Skype physically is because you launched it in the bathroom.

It also helps to connect a Bluetooth keyboard and mouse when running standard applications. Gestures can’t give you the input fidelity of a traditional mouse, and typing in the air is a chore.

HoloLens’ narrow FOV makes using a regular Windows app problematic–as the screen will get cut off and require you to move your head around to see most of it. Also, if you push a window far enough into the background so you can see the whole thing, you’ll notice HoloLens’ resolution is a little low to read small text. We’re going to need a next generation display for HoloLens to really be useful for everyday computing.

Microsoft Has Created A New Input Paradigm

HoloLens can seemingly only recognize two gestures: bloom and “air tap”. Blooming is cool–I feel like a person in a sci-fi movie making the Windows start menu appear in the air by tossing it up with a simple gesture.

The air tap can be unintuitive. Most people I let try the HoloLens poke at the icons by stabbing them with a finger. That’s not what the air tap is for. You still have to gaze at a target by moving your head and then perform the lever-like air tap gesture within the HoloLens camera’s view to select whatever the reticle is on.

HoloLens can track the motion of your finger and use it as input to move stuff around (such as application windows), but not detect collisions between it and virtual objects. It’s as if it can detect the amount your finger moves but not its precise location in 3D space.

Using apps while holding your hand out in front of the headset is tiring, which is why Microsoft includes the Clicker: a simple Bluetooth button that, when pressed, triggers the air tap gesture. Disappointingly, the Clicker isn’t trackable, so you can’t use it as a true finger replacement.

Microsoft has adapted Windows to the holographic model successfully. This is the first full blown window manager and gesture interface for augmented reality I’ve ever seen and it’s brilliant. After a few sessions with the device, most people I’ve let use it are launching apps and moving windows around the room like a pro.

This Thing Is Heavy

Although the industrial design is cool in a retro-’90s way, this thing is really uncomfortable to use for extended periods of time. Maybe I don’t have it strapped on correctly, but after a 20-minute Skype session I had to take the device off. I felt pain above the bridge of my nose, and when I looked in the mirror, I saw what can only be described as ‘HoloHead’.

The unfortunate symptom of “HoloHead”

The First Generation Apps Are Amazing

There are already great free apps in the Windows Store that show off the power of the HoloLens platform, many made by Asobo Studio, a leader in Augmented Reality game development.

Young Conker

Young Conker is a great example of HoloLens as a games platform. The game is simple: after scanning your surroundings, play a familiar platform game over the floors, walls, tables and chairs as Conker runs, jumps and collects coins scattered about your room.

Conker will jump on top of your coffee table, run into walls, or be occluded by a chair as if he were walking behind it–well, depending on how accurate your scan is. The fact that this works as well as it does is amazing to me.

Fragments

One of the first true game experiences I’ve ever played in augmented reality. You play the part of a futuristic detective, revisiting memories of crimes as their events are re-created holographically in your location. Characters sit on your furniture. You’ll hunt for pieces of evidence scattered about your room, even under tables. It really is an incredible experience. As with Conker, it requires some pre-scanning of your environment. However, applications apparently can share scans with each other, as Fragments was able to re-use a scan of my office I had previously made with another app.

Skype

When Skyping with a person not using HoloLens, you simply place their video on a wall in your surroundings. It’s almost like talking to someone on the bridge of the Enterprise, depending on how big you make the video window.

When Skyping with another HoloLens user, you can swap video feeds so either participant can see through the other’s first-person view. While looking at someone else’s video feed as a floating window, you can sketch over it with drawing tools or even place pictures from your photos folder in the other person’s environment. 2D lines drawn over the video feed will wrap around the other user’s real-world surroundings in 3D, bending around corners or sticking to the ceiling.

Conclusion

As a consumer electronics device, HoloLens is clearly beta–maybe even alpha, but surprisingly slick. It needs more apps. With Wave 2 underway, developers are working on just that. In my case, I’m moving all of my Tango projects to HoloLens–so you’ll definitely be seeing cool stuff soon!