Debugging HoloLens Apps in Unity3D

I’ve been developing on HoloLens for a few weeks now, and I’m being re-acquainted with the tricky parts of debugging hardware-specific augmented reality apps in Unity3D. I went through a lot of these issues with my Google Tango project, InnAR Wars, so I’m somewhat used to it. However, having to wear the display on your head while testing code brings a whole new dimension of difficulty to debugging augmented reality applications. I figured I’d share a few tips I use when debugging Unity3D HoloLens apps, beyond the standard Unity3D remote debugging tools you may know from mobile development.


Debugging in the Editor vs. Device

The first thing you need to do is figure out how to test code without deploying to the device. Generating a Visual Studio project, compiling, and uploading your application to one (or more) HoloLens headsets is a real pain when you’re trying to iterate on simple code changes. It’s true that Unity3D can’t simulate any of HoloLens’ AR features in the editor, but there are times when you just have to test basic gameplay code that doesn’t require spatialization, localization, or any other HoloLens-specific features. There are a few steps that make this easier.

Make A Debug Keyboard Input System

HoloLens relies mostly on simple gestures (Air Tap) and voice for input. The first thing you need in order to test HoloLens code in the Unity3D editor is a simple way to trigger via the keyboard whatever event fires off an Air Tap or speech command. In my case, I wrote a tiny bit of code that uses the space bar to trigger the Air Tap. Basically, anywhere you add a delegate to handle an Air Tap or speech command, you need to add some input code that triggers that same method via the keyboard.
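Here’s a minimal sketch of the idea. The class and method names are placeholders of my own, not HoloToolkit APIs–the point is just that the keyboard shim and the gesture delegate funnel into the same method:

```csharp
using UnityEngine;

// Debug-only input shim: in the editor, the space bar stands in for Air Tap.
// OnAirTap() is whatever method your gesture delegate already calls on device.
public class DebugAirTap : MonoBehaviour
{
    void Update()
    {
#if UNITY_EDITOR
        // Space bar simulates an Air Tap when running in the Unity editor.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            OnAirTap();
        }
#endif
    }

    void OnAirTap()
    {
        // Fire the same gameplay logic an Air Tap would trigger on device.
        Debug.Log("Air Tap (simulated)");
    }
}
```

On device, you’d wire the same `OnAirTap()` into your gesture recognizer’s tapped event, so editor and hardware exercise identical code paths.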

Use An Oculus Rift Headset

I was pleasantly surprised to find out the Unity HoloLens Technical Preview supports Oculus Rift. Keep your Rift plugged in when developing for HoloLens. When you run your application in the Unity editor, it will show up inside the Rift–albeit against a black background. This is extremely helpful when debugging code that uses gaze, positional audio, and even limited movement of the player via Oculus’ positional tracking.

Use The HoloLens Companion App

Microsoft provides a HoloLens companion app in the Windows Store with a few handy features. The app connects to HoloLens via WiFi and lets you record videos live from the headset (very useful for documenting reproducible bugs and crashes). It lets you stop and start apps remotely, which can be useful when trying to launch an app on multiple HoloLenses at the same time. You can also use your PC’s keyboard to send input to a remote HoloLens. This is convenient for multiplayer testing: use Air Tap on the headset you’re wearing and the companion app to trigger input on the other device.

These tips may make building HoloLens apps a little easier, but I really hope Microsoft adds more debugging features to future versions of the SDK. There are some simple things Microsoft could do to make development more hassle-free, although there’s really a limit to what you can do in the Unity Editor versus the device.

Developing Applications for HoloLens with Unity3D: First Impressions

I started work on HoloLens game development with Unity3D over the past week. This included going through all of the example projects, as well as building simple games and applications to figure out how all of the platform’s features work. Here are some takeaways from my first week as a HoloLens developer.


Baby steps…

The Examples Are Great, But Lack Documentation

If you go through all of the Holo Academy examples Microsoft provides, you’ll go from displaying a basic cube to a full-blown multi-user Augmented Reality experience. However, most of the examples involve dragging and dropping pre-made prefabs and scripts into the scene. Not a lot about the actual SDK is explained. The examples are a good way to get acquainted with HoloLens features, but you’re going to have to do more work to figure out how to write your own applications.

HoloToolkit is Incredibly Full Featured

All of the examples are based on HoloToolkit, Microsoft’s collection of scripts and prefabs that handle just about every major HoloLens application feature: input, spatial mapping, gesture detection, speech recognition, and even some networking.

I also found that features I needed (such as placing objects in the real world using the real-time mesh as a collider) already exist in the examples, and I could easily strip them out and modify them for my own C# scripts. Using these techniques I was able to get a very simple carnival milk bottle game running in a single Saturday afternoon.

Multiplayer Gets Complicated

I’m working on moving my award-winning Tango RTS, InnAR Wars, to HoloLens. However, multiplayer works very differently on HoloLens than on Tango. On Tango, each device shares a single room scan file and is localized in the same coordinate space. This means that once the game starts, placing an object (like a floating planet or asteroid) at any position will make it appear in the same real-world location on both Tangos.

HoloLens shares objects between devices using what are called Spatial Anchors. Spatial Anchors mark parts of the scanned room geometry as an anchored position. You can then place virtual objects in the real world relative to this anchor. When you share a Spatial Anchor with another device, the other HoloLens will look for a similar location in its own scan of the room to position the anchor. These anchors are constantly being updated as the scan continues, which is part of the trick to how HoloLens’ tracking is so rock solid.
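As a rough sketch of what this looks like in code, here’s how sharing an anchor works with the HoloLens-era `UnityEngine.VR.WSA` APIs. The class name and the networking call are placeholders I made up; the transfer-batch calls are the Unity API as I understand it, but treat this as a sketch rather than a drop-in implementation:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA;          // WorldAnchor (HoloLens-era namespace)
using UnityEngine.VR.WSA.Sharing;  // WorldAnchorTransferBatch

// Rough sketch of anchor sharing: export a WorldAnchor as bytes, ship them
// over your own network layer, and import them on the other HoloLens so it
// can resolve the same real-world spot in its own scan of the room.
public class AnchorSharing : MonoBehaviour
{
    public void ExportAnchor(WorldAnchor anchor)
    {
        var batch = new WorldAnchorTransferBatch();
        batch.AddWorldAnchor("shared-anchor", anchor);

        WorldAnchorTransferBatch.ExportAsync(
            batch,
            data => SendToOtherDevice(data),  // serialized data arrives in chunks
            reason => Debug.Log("Export finished: " + reason));
    }

    public void ImportAnchor(byte[] serializedData, GameObject target)
    {
        WorldAnchorTransferBatch.ImportAsync(serializedData, (reason, batch) =>
        {
            if (reason == SerializationCompletionReason.Succeeded)
                batch.LockObject("shared-anchor", target); // pins target to the shared spot
        });
    }

    // Placeholder: wire this up to whatever transport your game uses.
    void SendToOtherDevice(byte[] data) { }
}
```

Once the anchor is locked, anything you parent relative to `target` appears in the same real-world location on both devices.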

Sure, having a single coordinate frame on the Tango is easier to deal with, but the Tango also suffers from drift and inaccuracies that may be symptomatic of its approach. Spatial Anchoring is a rather radical change from how Tango works–which means a lot of refactoring for InnAR Wars, or even a redesign.

First Week Down

This first week has been an enlightening experience. Progress has been fast, but it’s also made me aware of how much work it will take to produce a great HoloLens app. At least two independently published HoloLens games popped up in the Windows Store over the past few days. The race is on for the first great indie HoloLens application!

My Week With HoloLens


Microsoft ships the HoloLens and Clicker accessory in the box

My HoloLens development kits finally arrived a week ago. I’ve spent a great deal of time using the device over the past week. I figured I’d post my impressions here.

This Really Works

When I first put my HoloLens on, I placed an application window floating above my kitchen table. Suddenly, I realized I hadn’t taken out the garbage. Still wearing the device, I ran downstairs to put something in the trash. I narrowly missed my neighbor–successfully avoiding an awkward conversation about what this giant contraption on my face does.

When I returned to my kitchen, the application was still there–hovering in space.

As I’ve stated before, Microsoft blew me away. HoloLens is an absolutely incredible leap from previous-generation AR glasses (and MUCH cheaper, believe it or not). It also does everything Tango does, but at a much higher level of performance and precision–which means most applications built on Tango can be moved directly over to HoloLens.

HoloLens is fast and intuitive enough to attempt getting actual work done with it. Yet a lot of my time is spent just trying to make silly videos like this.

It’s A Full Blown Windows Machine

HoloLens isn’t just a prototype headset–it’s a full featured desktop Windows PC on your face. Not only can you run “Windows Holographic” apps, but any Universal Windows App from the Windows Store. Instead of these applications running in a window on a monitor, they float around in space–positioned wherever you choose.

HoloLens really does need a taskbar of some kind, though. It’s way too easy to forget where Skype physically is because you launched it in the bathroom.

It also helps to connect a Bluetooth keyboard and mouse when running standard applications. Gestures can’t give you the input fidelity of a traditional mouse, and typing in the air is a chore.

HoloLens’ narrow FOV makes using a regular Windows app problematic: the screen gets cut off, requiring you to move your head around to see most of it. Also, if you push a window far enough into the background that you can see the whole thing, you’ll notice HoloLens’ resolution is a little too low to read small text. We’re going to need a next-generation display for HoloLens to really be useful for everyday computing.

Microsoft Has Created A New Input Paradigm

HoloLens can seemingly only recognize two gestures: bloom and “air tap”. Blooming is cool–I feel like a person in a sci-fi movie making the Windows start menu appear in the air by tossing it up with a simple gesture.

The air tap can be unintuitive. Most people I let try the HoloLens poke at the icons by stabbing them with a finger. That’s not what the air tap is for. You still have to gaze at a target by moving your head, then perform the lever-like air tap gesture within the HoloLens camera’s view to select whatever the reticle is on.

HoloLens can track the motion of your finger and use it as input to move stuff around (such as application windows), but not detect collisions between it and virtual objects. It’s as if it can detect the amount your finger moves but not its precise location in 3D space.

Using apps while holding your hand out in front of the headset is tiring. This is why Microsoft includes the Clicker: a simple Bluetooth button that triggers the air tap gesture when pressed. Disappointingly, the Clicker isn’t trackable–so you can’t use it as a true finger replacement.

Microsoft has adapted Windows to the holographic model successfully. This is the first full blown window manager and gesture interface for augmented reality I’ve ever seen and it’s brilliant. After a few sessions with the device, most people I’ve let use it are launching apps and moving windows around the room like a pro.

This Thing Is Heavy

Although the industrial design is cool in a retro ‘90s way, this thing is really uncomfortable to use for extended periods of time. Maybe I don’t have it strapped on correctly, but after a 20-minute Skype session I had to take the device off. I felt pain above the bridge of my nose. When I looked in the mirror, I saw what can only be described as ‘HoloHead’.


The unfortunate symptom of “HoloHead”

The First Generation Apps Are Amazing

There are already great free apps in the Windows Store that show off the power of the HoloLens platform. Many are made by Asobo Studio–a leader in Augmented Reality game development.

Young Conker

Young Conker is a great example of HoloLens as a games platform. The game is simple: after scanning your surroundings, play a familiar platform game over the floors, walls, tables and chairs as Conker runs, jumps and collects coins scattered about your room.

Conker will jump on top of your coffee table, run into walls, or be occluded by a chair as if he were walking behind it–well, depending on how accurate your scan is. The fact that this works as well as it does is amazing to me.

Fragments

One of the first true game experiences I’ve ever played in augmented reality. You play the part of a futuristic detective, revisiting memories of crimes as their events are re-created holographically in your location. Characters sit on your furniture. You’ll hunt for pieces of evidence scattered about your room–even under tables. It really is an incredible experience. As with Conker, it requires some pre-scanning of your environment. However, applications apparently can share scans with each other, as Fragments was able to re-use a scan of my office I had previously made with another app.

Skype

When Skyping with a person not using HoloLens, you simply place their video on a wall in your surroundings. It’s almost like talking to someone on the bridge of the Enterprise, depending on how big you make the video window.

When Skyping with another HoloLens user, you can swap video feeds so either participant can see through the other’s first-person view. While looking at someone else’s video feed as a floating window, you can sketch over it with drawing tools or even place pictures from your photos folder in the other person’s environment. 2D lines drawn over the video feed will form around the other user’s real-world environment in 3D–bending around corners, or sticking to the ceiling.

Conclusion

As a consumer electronics device, HoloLens is clearly beta–maybe even alpha, but surprisingly slick. It needs more apps. With Wave 2 underway, developers are working on just that. In my case, I’m moving all of my Tango projects to HoloLens–so you’ll definitely be seeing cool stuff soon!

My Favorite VR Experiences So Far

Now that I’ve had plenty of time to go through the launch content of both Oculus and Vive, I figured I’d highlight my favorite experiences you can try for both devices instead of a typical product review. Many of these games are available for both platforms, while some are exclusive.


My Retail Vive Finally Arrived!

Adr1ft (Oculus)

This is the flagship release for Oculus and deservedly so. Although not a pure VR experience (it also works as a standard game), it’s an absolutely wild trip in VR. Billed as a First Person Experience (FPX), it ranks somewhere between a walking simulator like Firewatch and an adventure such as Bioshock on the “Is It a Game?” scale.

This is consistently one of the top-selling Oculus titles, yet ranks near the bottom on comfort. I had no nausea issues at all, but I generally don’t feel uncomfortable in most VR games. I can see how free-floating in zero gravity, desperately grasping at oxygen canisters as you slowly suffocate to death inside a claustrophobic space suit can cause issues with those prone to simulation sickness. Regardless, this shows that it pays to be hardcore when making VR experiences–especially at this early adopter stage of the market.

A stunning debut for Adam Orth’s threeonezero studio.

Firma (Oculus)

This is perhaps one of my absolute favorite pure VR games so far. Think Lunar Lander, Space Taxi, or Thrust in VR. If it were a standard video game it would be mundane, but as a VR experience I really do feel like I have a job piloting a tiny lander craft on a desolate moon base. It actually sort of achieves presence for me–but not the feeling of being in another reality…more like being in an ‘80s sci-fi movie.

The game was originally available via Oculus Share for years, and it’s obvious that a lot of work has been put into it to get it ready for the commercial Oculus release. There are tons of missions, great voice acting, and a lot of fun mechanics and scenarios. This game is giving me plenty of ideas on how to adapt my old Ludum Dare game to VR.

Strangely, this game is in Oculus’ Early Access section, even though I consider it a complete game.

The Apollo 11 Virtual Reality Experience (Oculus, Vive)

An astounding educational journey through America’s moon landing told via VR. This is better than any field trip I took as a kid to the Boston Museum of Science, that’s for sure. This is just the tip of the spear when it comes to education and VR.

Hover Junkers (Vive)

Hover Junkers requires the most physical activity of any VR game I’ve played–so much so that after 20 minutes of shooting, cowering behind my hovercraft’s hull for cover, and frantically speeding around post-apocalyptic landscapes, my Vive was soaked in sweat. One thing is for sure: public VR arcades are going to need some kind of solution to keep these headsets sanitary. Hover Junkers is certainly the most exciting multiplayer experience I’ve had in VR so far.

Budget Cuts (Vive)

The absolute best example of room-scale VR. I didn’t really get it from watching the videos, but when I was finally able to try the demo on my own Vive…wow. This is the game I let everyone try when they first experience Vive. It really nails the difference between seated, controller-based VR and a room-scale, hand-tracked experience. This is the first “real game” I’ve played that uses all of these elements. So many firsts here, and done so well.

The past month has been a very encouraging start for VR. At this early stage there are already several games that give me that “just one more try” lure. This is surprising given that many current VR titles are small in scope, and in some cases partially-finished early access experiences. With the launch of PSVR later this year, we’re sure to see more full-sized VR games…whatever that means.

The Beginner’s Guide: Dave the Madman Edition

I recently played The Beginner’s Guide after buying it during the annual Holiday Steam Sale over the break. It’s a quick playthrough, and an interesting way to tell a story within a game. Without giving too much away, the experience reminded me of a similar event in my young adulthood–when I encountered an amazing game developer who created incredible works I couldn’t hope to match. I’ve since forgotten his real name and don’t know much about him. But I do have the 4 double-sided floppy disks he sent me of all his games at the time.


Madsoft 1-4, recovered in great condition

This was the early ‘90s–I’d say around 1990-1991. I had made a bunch of Commodore 64 games (often with my late friend Justin Smith) using Shoot ‘Em Up Construction Kit: an early game development tool that let you build neat scrolling shooters without any programming knowledge.


Adventures in Stupidity, one of my SEUCK creations

I used to upload my games to local BBSes in the New England area and wait for the response on the message boards. In the process, I downloaded some games made by a user known by the handle “MADMAN.”  Some of his games also used the moniker, “Dave the Madman.” He made seemingly professional quality games using Garry Kitchen’s Game Maker.  Not to be confused with YoYo’s GameMaker Studio.

Garry Kitchen’s Game Maker was an early game development tool published by Activision in 1985. I actually got it for my birthday in 1986, thinking that this was my key to becoming a superstar game designer. The thing is, Game Maker was a full blown programming language that, strangely, used the joystick to edit. It also included a sprite designer, music editor, and other tools. Everything a budding game developer would need to get started, right?

Although I did make a few simple games in Game Maker, its complexity was beyond my grasp at the time. Which is why Madman’s creations blew me away. They were so polished! He had developed so many completely different types of games! They all had cool graphics, animation, music, and effects I couldn’t figure out how to duplicate! My favorite was Space Rage: a sprawling, multi-screen space adventure that I simply could not comprehend. I had so many questions about how these games were made!


SPACE RAGE!

We messaged each other on a local BBS. I blathered about how much of a fan I was of his work and he said he liked my games, too. I figured he was just being kind. After all, this was a MASTER saying this! We eventually exchanged phone numbers.

I have vague memories of talking to him on the phone, asking how he accomplished such amazing feats using Game Maker. I think he was a little older than me, but many of his games had a 1987 copyright date. Considering I was probably the same age at this time as he was in 1987, this made me feel quite inadequate.

As I recall, Madman was humble and didn’t have many aspirations beyond distributing his little games on BBSes. He seemed like a hobbyist that figured out Game Maker and really liked making games with it–nothing more, nothing less.


Fat Cat probably has the best animation of them all

After our call, he mailed me a complete collection of his games. A few years ago I found these floppy disks and copied them to my Mac using a 1541 transfer cable. The disks bear his handwriting, labeled “Madsoft” 1 – 4. I was able to rescue all of the disks, converting them to d64 format.

Playing through his creations was a real trip down memory lane. The most shocking thing I discovered is on the 2nd side of the 4th disk. His Archon-like game, Eliminators, features the text “Distributed by Atomic Revolution” on the bottom of the title screen. Atomic Revolution was a game ‘company’ I briefly formed with childhood friend, Cliff Bleszinski, around 1990 or so. It was a merger of sorts between my label, “Atomic Games”, and Cliff’s, “Revolution Games.” (The story about the C64 game he made in my parents’ basement is a whole other post!)


An Atomic Revolution production?

I must have discussed handling the distribution of Eliminators with Dave–uploading and promoting his awesome game all over the local BBS scene and sending it to mail-order shareware catalogs. At least that’s my best guess; I really have no recollection of how closely we worked together. I must have done a terrible job, since this game was almost completely lost to the mists of time.

I think we talked about meeting up and making a game together–but I didn’t even have my learner’s permit yet. On-line communication tools were primitive if they existed at all. We never really collaborated. I wonder what happened to “Dave the Madman” and his “Madsoft” empire? Is he even still alive? Did he go on to become a game developer, or at least a software engineer? Maybe he’ll somehow see this post and we’ll figure out the answer to this mystery!


I remember he was most proud of his Ataxx homage

Until then, I’ll add the disk images of Madsoft 1-4 to this post. Check the games out, and let me know what you think. I’ve also put up some screenshots and videos of his various games–but I’m having problems finding a truly accurate C64 emulator for OS X. If anyone has any suggestions, let me know!

Here’s the link to the zip file. Check these games out for yourself!


The Basics of Hand Tracked VR Input Design

Ever since my revelation at Oculus Connect I’ve been working on a project using hand tracking and VR. For now, it’s using my recently acquired Vive devkit. However, I’ve been researching design techniques for PSVR and Oculus Touch to keep the experience portable across many different hand tracking input schemes. Hand tracking has presented a few new problems to solve, similar to my initial adventures in head tracking interfaces.

The Vive's hand controller

Look Ma, No Hands!

The first problem I came across when designing an application that works on both Vive and Oculus Touch is the representation of your hands in VR. With Oculus Touch, most applications feature a pair of “ghost hands” that mimic the current pose of your hands and fingers. Since Oculus’ controllers can track your thumb and first two fingers, and presumably the rest are gripped around the handle, these ghost hands tend to accurately represent what your hands are doing in real life.

Oculus Touch controller

This metaphor breaks down with Vive as it doesn’t track your hands, but the position of the rod-like controllers you are holding. Vive games I’ve tried that show your hands end up feeling like waving around hands on a stick–there’s a definite disconnect between the visual of your hands in VR and where your brain thinks they are in real life. PSVR has this problem as well, as the Move controllers used with the current devkit are similar to Vive’s controllers.

You can alleviate this somewhat. Because there is a natural way most users tend to grip Move and Vive controllers, you can model and position the “hand on a stick” in the most likely way the controllers are gripped. This can make static hands in VR more convincing.

In any case, you have a few problems when you grab an object.

For Oculus, the act of grabbing is somewhat natural–you can clench your first two fingers and thumb into a “grab” type motion to pick something up. In the case of Bullet Train, this is how you pick up guns. The translucent representation of your hands means you can still see your hand pose and the gripped object at the same time. There’s not much to think about other than where you attach the held object to the hand model.

It also helps that in Bullet Train the objects you can grab have obvious handles and holding points. You can pose the hand to match the most likely hand position on a grabbed object without breaking immersion.

With Vive and PSVR you have a problem if you are using the “hand on a stick” technique. When you “grab” a virtual object by pressing the trigger, how do you show the hand holding something? It seems the best answer is: you don’t! Check out this video of Uber Entertainment’s awesome Wayward Sky PSVR demo:

Notice anything? When you grab something, the hand disappears. All you can see is the held object floating around in front of you.

This is a great solution for holding arbitrary shaped items because you don’t have to create a potentially infinite amount of hand grip animations. Because the user isn’t really grabbing anything and is instead clicking a trigger on a controller, there is no “real” grip position for your hand anyway. You also don’t have the problem of parts of the hands intersecting with the held object.

This isn’t a new technique. In fact, one of the earliest Vive demos, Job Simulator, does the exact same thing. Your brain fills in the gaps and it feels so natural that I just never noticed it!
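The technique boils down to very little code. Here’s a generic Unity sketch of the idea–the component and field names are my own, not from any particular SDK:

```csharp
using UnityEngine;

// "Ghost grab": when the trigger grabs an object, hide the hand model and
// parent the object to the tracked controller; on release, drop it and
// show the hand again. No grip animations needed for arbitrary shapes.
public class GhostGrab : MonoBehaviour
{
    public Renderer handRenderer;   // the "hand on a stick" mesh
    public Transform controller;    // tracked controller transform

    Rigidbody held;

    public void Grab(Rigidbody target)
    {
        held = target;
        held.isKinematic = true;              // let the hand drive it directly
        held.transform.SetParent(controller);
        handRenderer.enabled = false;         // the trick: no hand at all
    }

    public void Release()
    {
        if (held == null) return;
        held.transform.SetParent(null);
        held.isKinematic = false;             // hand control back to physics
        handRenderer.enabled = true;
        held = null;
    }
}
```

You’d call `Grab()` and `Release()` from whatever trigger-press events your controller SDK exposes.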

Virtual Objects, Real Boundaries

The next problem I encountered: your hand can pass through virtual objects, but the objects it’s holding can’t. For instance, you can be holding an object and physically move your real, tracked hand through a virtual wall. The held object, bound by the engine’s physics simulation, will hit the wall while your hand continues to drag it through. Chaos erupts!

You can turn off collisions while an object is held, but what fun is that? You want to be able to knock things over and otherwise interact with the world while holding stuff. Plus, what happens when you let go of an object while inside a collision volume?

What I ended up doing is making the object detach, or fall out of your virtual hand, as soon as it hits something else. You can tweak this by making collisions with smaller, non-static objects less likely to detach the held object since they will be pushed around by your hand.
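A sketch of that detach rule in Unity might look like the following. It assumes the held object keeps an active (non-kinematic) Rigidbody while carried, so collision callbacks still fire; the names are placeholders of my own:

```csharp
using UnityEngine;

// Attach to the held object: if it hits something solid while being carried,
// it falls out of the virtual hand instead of being dragged through the wall.
public class DetachOnHit : MonoBehaviour
{
    public float detachMassThreshold = 5f; // lighter dynamic props get pushed, not dropped
    public System.Action onDetach;         // hook your "release from hand" logic here

    void OnCollisionEnter(Collision collision)
    {
        Rigidbody other = collision.rigidbody;

        // Static geometry (walls, floors) has no rigidbody; heavy objects
        // also force a detach. Small dynamic objects just get shoved aside.
        if (other == null || other.mass >= detachMassThreshold)
        {
            if (onDetach != null)
                onDetach();
        }
    }
}
```

The mass threshold is the tweakable part: raising it lets you plow through more of the scenery before the object drops out of your hand.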

For most VR developers, these are the first two things you’ll encounter when designing an experience for hand-tracked VR systems. It seems Oculus Touch makes a lot of these problems go away, but we’ve just scratched the surface of the issues that need to be solved when your real hands interact with a virtual world.

HoloLens is Ready for Prime Time

Microsoft recently invited me to try out HoloLens at their Venice space on Abbot-Kinney. Having just won the AR/VR award in the Tango App Contest for InnAR Wars, I jumped at the chance.  After developing for Google Tango for over a year, I had a long wishlist of features I’m looking for in an AR platform. In the unlikely event that HoloLens was for real, I could make InnAR Wars exactly how I envisioned it at the start of the project.

Entering the HoloLens Demo Zone


My skepticism was well warranted. Having worked in AR for the past 5 years, I’ve seen my share of fake demo videos and smoke-and-mirrors pitches. Every AR company seems obligated to put out faked demo videos that their products inevitably fail to live up to. Just look at this supercut of utterly ridiculous promotional videos. I suspected the staged HoloLens demos weren’t much better.

I had heard many polarizing opinions about HoloLens from people who have tried it. Some felt it was an incredible experience while others told me it was the single most overhyped disappointment in consumer electronics history.

After trying it for myself, I can say HoloLens is the real deal.

The demo I tried was the “X-Ray” game shown at BUILD not too long ago. This version was a little simpler than that staged demonstration–your arm isn’t magically turned into a plasma cannon. Instead, you hold your hand out in front of the device and “air tap” to fire at enemies that appear to be coming out of cracks in the walls. Occasionally you can give a voice command to freeze time, Matrix-style, and take out enemies while they are vulnerable.

A simple game, for sure, but a great demonstration of the HoloLens’ capabilities.  

The device is clearly a prototype. It’s bulky, looks like a vision of the future from a ’90s sci-fi flick, and it even BSODed on me, which was kind of hilarious. Still, I was thoroughly impressed with HoloLens, and here’s why:

Meshing

When the game starts, you look around the room and watch it build geometry out of the walls and other objects in the area. HoloLens builds a mesh of the real world with a depth camera, which is then used in gameplay. It’s kind of like building a video game level out of all the walls and floors in your room. This mesh is then used to place virtual wall cracks and spawn points for enemies. Once you’ve built a complete mesh of the room, the game begins.

This same meshing process is possible with Google Tango, but it’s slow and temperamental. Still, very impressive given it’s in a tablet you can buy right now. In fact, I used Tango’s meshing capabilities to place floor traps in InnAR Wars.

I was impressed with HoloLens’ rapid meshing as I moved around my environment. It even handled dynamic objects, such as the woman guiding me through the demo. When I looked at her during the meshing phase, she quickly transformed into a blocky form of red polygons.

Display

Initially I was disappointed in the display. Much like Epson’s Moverio or ODG’s R7 glasses, it projects the augmentation on a part of the “glass” in front of your eyes. This means you see a distracting bright rectangle in the middle of your vision where the augmentation will appear. Compared to ODG’s R7s, HoloLens seemed to have higher contrast between the part of the display that’s augmented and the part that’s not. There also is an annoying reflection that looks like a blue and green “shadow” of the augmentation above and below the square.

While playing the game, this didn’t matter at all. Although everything is somewhat translucent, if the colors are bright enough the virtual objects appear solid. Combined with rock-solid tracking on the environment, I soon forgot all about contrast issues and internal reflection problems on the display. All of these issues can be dealt with through art and the lighting of the room you are playing in. Plus, a Microsoft representative assured me the display is even better in the current version still in their labs.

Field of View

The top issue people have with HoloLens is the field of view. People complain that it only shows graphics in a postcard-sized space in front of your vision. I had rock-bottom expectations here, having developed applications on previous-generation wearable AR displays. Although HoloLens’ augmentation is limited to a small square in front of your vision, this space is easily 2X the size of other platforms’. In the heat of the action while playing X-Ray, I mostly forgot about this restriction.

Field of view is not an easy thing to solve–it’s a fundamental problem with the physics of waveguide optics. I’m not sure how much progress Microsoft can make here. But the FOV is already wide enough for a lot of applications.

I’m All In

Part of the pitch is the “Windows Holographic” platform: in the future, you won’t have screens. With HoloLens you’ll be able to place Windows applications in mid-air anywhere in your office. Applications will float around you like virtual monitors hovering in space. (Magic Leap’s fake demo video shows an example of how this would work.) This can actually be done right now with HoloLens and existing Windows applications. Supposedly, some Microsoft engineers wear HoloLens and have integrated the “holographic” workspace into their daily work.

Beyond gaming applications, I am on board with this screen-free future. Your tablet, computer, even your phone or smartwatch will merely be a trackable object to display virtual screens on. Resolution and screen size will be unlimited, and you can choose to share these virtual screens with other AR users for collaboration. No need to re-arrange your office for that huge 34-inch monitor; you can simply place it in 3D space and overlay it on top of reality. Think of all the extra stuff your phone could do if it didn’t have to power a giant LCD display! It’s definitely on its way. I’m just not sure exactly when.