Unity 4.6 is here! (Well, in public beta form). Finally–the GUI that I’ve waited YEARS for is in my hands. Just in time, too. I’ve just started building the GUI for my latest Oculus Rift project.
One of the trickiest things to do in VR is a GUI. It seems easy at first but many lessons learned from decades of designing for the web, apps, and general 2D interfaces have to be totally reinvented. Given we don’t know what the standard controls may be for the final kit, many VR interfaces at least partially use your head as a mouse. This usually means having a 3D cursor floating around in world space which bumps into or traces through GUI objects.
Unity 4.6’s GUI features the World Space Canvas–which helps greatly. You can design beautiful, fluid 2D interfaces that exist on a plane in the game world making it much more comfortable to view in VR. However, by default Unity’s new GUI assumes you’re using a mouse, keyboard, or gamepad as an input device. How do you get this GUI to work with your own custom world-space VR cursor?
The answer is Input Modules. However, in the current beta these are mostly undocumented. Luckily, Stramit at Unity has put up the source to many of the new GUI components as part of Unity's announced open source policy. Using this code, I managed to write a short VRInputModule class that takes the result of a trace from my world-space VR cursor and feeds it into the GUI. The code is here. Add this behavior to the EventSystem object, alongside the default input modules.
In my current project, I have a 3D crosshair object that floats around the world, following the user’s view direction. The code that manages this object performs a trace, seeing if it hit anything in the UI layer. I added box colliders to the buttons in my World Space Canvas. Whenever the cursor trace hits one of these objects, I call SetTargetObject in the VRInputModule and pass it the object the trace hit. VRInputModule does the rest.
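For illustration, here's a minimal sketch of what that crosshair trace could look like. This is not my actual project code – the class name, the `inputModule` field, and the "UI" layer name are assumptions; it just shows the idea of raycasting along the view direction against colliders on the Canvas and handing the result to the input module:

```csharp
using UnityEngine;

// Hypothetical gaze-cursor sketch. Assumes the World Space Canvas buttons
// have box colliders on a layer named "UI", and that a VRInputModule with
// a SetTargetObject method sits on the EventSystem object.
public class GazeCursor : MonoBehaviour
{
    public VRInputModule inputModule;   // drag in the EventSystem's module
    public float maxDistance = 100f;

    void Update()
    {
        // Trace from the camera along the user's view direction.
        Ray gaze = new Ray(Camera.main.transform.position,
                           Camera.main.transform.forward);
        RaycastHit hit;
        int uiMask = 1 << LayerMask.NameToLayer("UI");

        if (Physics.Raycast(gaze, out hit, maxDistance, uiMask))
        {
            transform.position = hit.point;   // park the 3D cursor on the UI
            inputModule.SetTargetObject(hit.collider.gameObject);
        }
        else
        {
            inputModule.SetTargetObject(null);
        }
    }
}
```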
Note that the Process function polls my own input code to see if a select button has been pressed–and if so, it executes the Submit action on that Button. I haven’t hooked up any event callbacks to my Buttons yet–but visually the GUI is responding to events (highlighting, clicking, etc.)
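To make the shape of the module concrete, here's a stripped-down sketch of the approach – not the actual linked source. The polling of `Input.GetButtonDown("Submit")` stands in for my own input code, and `SetTargetObject` is the hook the cursor script calls:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical sketch of a gaze-driven input module: an external raycast
// hands it a target each frame, and Process() forwards enter/exit and
// submit events to that target through the uGUI event system.
public class VRInputModule : BaseInputModule
{
    private GameObject targetObject;     // set each frame by the cursor's trace
    private PointerEventData pointerData;

    // Called by the crosshair/cursor code with whatever its trace hit (or null).
    public void SetTargetObject(GameObject target)
    {
        targetObject = target;
    }

    public override void Process()
    {
        if (pointerData == null)
            pointerData = new PointerEventData(eventSystem);

        // Fires pointer-enter/exit handlers as the gaze target changes,
        // which is what drives the Button highlight states.
        HandlePointerExitAndEnter(pointerData, targetObject);

        // Poll for a select press and execute Submit on the current target.
        if (targetObject != null && Input.GetButtonDown("Submit"))
        {
            BaseEventData data = GetBaseEventData();
            ExecuteEvents.Execute(targetObject, data, ExecuteEvents.submitHandler);
        }
    }
}
```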
It’s quick and dirty, but this should give you a good start in building VR interfaces using Unity’s new GUI.
Thanks Ralph! Excellent explanation and code. This has gotten me a good way towards Rift + Hydra input with uGUI. I’m still struggling getting InputFields to take focus; I’ve tried ExecuteEvents.Execute(…selectHandler), field.OnPointerClick(), and a few more, and all are bust so far. Have you made any headway with InputFields?
I haven’t tried anything other than buttons. And while this works for buttons, it doesn’t work all the time. Sometimes the buttons don’t respond to the enter/exit event, yet can still be clicked. The whole ExecuteEvents thing is a black box that’s kind of impossible to debug.
Got it, thanks. I haven’t noticed that problem with the buttons yet (Unity 4.6b18), but I need to test more. FYI, about the fields, apparently a .ActivateInputField() method is being made public in an upcoming release (http://forum.unity3d.com/threads/how-do-i-control-which-gui-item-has-input-focus.263679/)
Thank you so much for posting this! Was able to finally get a good GUI system going for my VR game.
Does this work with Oculus DK2? Currently I can’t even get the OVRCameraController to see the GUI… so was this built for DK1?
Hmm, I don’t think it should really matter–to be honest, I’ve only actually tested this with the Gear VR. But I think it should work on everything–it’s basically just a raycast from the camera center through the GUI.
Well, I’m just wondering if anyone has figured out how to get the DK2 to work with the new uGUI. It seems like the DK2 integration is overriding the uGUI by using render textures to render the cameras.
Pingback: Ceiba3D Studio | Use reticle like mouse for WorldSpace UI's
Pingback: Frequently Asked UI Questions | Ceiba3D Studio
Pingback: How To Support Gear VR and Google Cardboard In One Unity3D Project | Ralph Barbagallo's Self Indulgent Blog
Hi, I’m working on a project that involves the DK2 and I’m having some issues trying to make a screen cursor (kind of like a mouse cursor) within the DK2’s central camera, which would let me highlight objects in the scene that I’m looking at, so I can select an object and interact with it. I’ve tried using your code, but it seems some libraries (scripts) are missing from my project. Can you give me a hand? It’s a school research project.
Hey Ralph, if I click on your link I get your piece of code, but it tells me it needs the crosshair object and references to things in the script on that crosshair object (errors). Could you post a download link, or send me that script and/or the crosshair prefab as well?
Stop it! No one wants to point their heads at menu options in order to select them. I want to push up and down on my d-pad. It’s quick it’s easy, that or use a mouse. Please don’t use VR to be gimmicky, use it for immersion. Make the gui look lovely and 3D, but leave the selection to what the hands do best.
Yeah, it’s funny–I used the d-pad/touch pad to navigate dialogs in Caldera Defense but a lot of VR devs HATE that scheme.
The problem is, on some platforms (such as Cardboard) you really only have gaze as an input.
Pingback: The Basics of Hand Tracked VR Input Design | Ralph Barbagallo's Self Indulgent Blog
This script is outdated, but I can’t seem to find an alternative to it in my OVR package. Is there any alternative that works with Unity 5.3?
Is there any updated version of this? I still don’t see an alternative that would work with EventSystem, and your script is outdated.
I’ve found the solution that works currently (using OVR 1.0 in Unity): https://forums.oculus.com/developer/discussion/16710/new-unity-ui-ovr-look-based-input-howto/p1
Yeah nowadays what I use is a combination of the VR eye raycaster from Unity’s VR Standard Assets and a modified version of this laser pointer UI input module that’s originally for Vive–except I attach it to your head and turn off the laser: http://wacki.me/blog/2016/06/vr-gui-input-module-for-unity-htc-vive/
Okay, I’ve just noticed the one I found was a simplified script. It works fine for the basics, but it can’t handle scrolling, and the buttons stay highlighted when we press a button (pointer exit isn’t handled well). So I’m still looking for a better-implemented solution. Can yours handle Oculus (e.g. Gear VR) too? Is there no similarly simple solution that you can just add to the EventSystem and have it work?