As part of my brain dump series on writing creative tools off the back of Substance 3D Modeler and Dreams I want to talk about the state of Virtual Reality for productivity and creativity. So let’s talk about VR fatigue.
Conclusion / TL;DR
- Use VR; you will solve fatigue issues as you find them
- Users will be seated, head tilted down, arms at their side
- Passthrough is important, immersion is optional
- You should support XR_ENVIRONMENT_BLEND_MODE_ALPHA_BLEND
- Precision: Controllers. Comfort: Hands
- Most apps should explore One Controller / One Hand as the default
- Headless is a great comfort option; advice on how below
- Hardware vendors should support XR_MND_HEADLESS
USE VR EVERY DAY!
Look, you need to have some courage and use VR every single day you are developing for it. I'm so tired of VR devs avoiding VR; it is by far our biggest weakness. If you can't stomach it, why would your users? If you actually spend time in VR, you will find the issues and fix them. Dumb and simple, but it needs to be said, because the number of dusty dev headsets I've seen is staggering.
Hands down the best solution?
I’m trying not to repeat things said in a million other talks, but this one needs retreading.
Your user will be:
- Seated
- Head Tilted Down
- Hands at their side
Yes, some users will stand; about as many as actually use a standing desk. Not just own one, but actually use it in standing mode. Turns out that wears you out.
Your hands will be down and at your side because lifting your arms takes energy, and humans are creatures of optimisation. Unless there is a supporting surface like a desk, which is a valid UX I want to talk about in the future, they are going to keep their hands down low. They will want the bulk of their boring work done with the most efficient, low-energy motions possible. This isn’t a game where we pump up their excitement and get them moving. BE LAZY!

There is a counterpoint: if you are working inside a 3D volume you will have your arms up and in the work area, and in cases where the user wants a high degree of control they will want big motions, for the same reason artists draw with their arms, not their wrists.
Head tilted down comes from two factors: the above paragraph about arms, and the fact that there is a weight torquing your head forward and down. Headset weight gets discussed constantly; torque and angular momentum do not get discussed nearly enough.
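To put a rough number on it: assuming a 500 g headset with its centre of mass around 8 cm in front of the neck’s pivot point (both numbers are my own ballpark, not a measurement of any specific device), that is roughly 0.5 kg × 9.8 m/s² × 0.08 m ≈ 0.4 N·m of constant torque your neck has to fight before you even start looking down.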
Design your UX for these basic realities of what I call the VR worker pose.
Why is Passthrough Critical?
I’m not going to ask you how often you fullscreen a productivity application. Because even when you do, I bet you have a second monitor, a phone close at hand, or some other distraction. So instead I will ask you how often you IMMERSE yourself in a productivity tool. Yes, sometimes you get into a flow state and are head down in your productivity software, but often that is not the reality of creativity or productivity.
So we need a non-immersive default for productivity and creative apps in VR, and that means we need passthrough. VR devices may in the future be multi-application devices; Apple is certainly building an OS for that reality today. Their contribution to the space has been vital in waking people up to this reality, but honestly I want to give a lot of the credit to Varjo, who not only pushed this early on and had Apple as one of their biggest customers, but also set the standard for HOW TO DO PASSTHROUGH.
Meta released XR_FB_passthrough early on as a solution for passthrough. The extension became popular and was adopted by Pico and HTC, as they all used similar chipsets. It was the wrong solution because it was built under the assumption that the application controls the passthrough and has access to the camera feed. That was a huge problem for PCVR via streaming, the most common use case for PCVR with Quest, Pico and HTC: because the camera data was sent through the system, the PC-based compositor ended up working with a delayed camera image. I understand some, though not all, of the legacy decisions behind this; the thinking was that applications would do more with mixed reality.
APPLICATIONS DON’T CARE ABOUT YOUR DESKTOP!
Neither should your passthrough. I will put spatial anchors and some other cool tech to one side for now; that is a whole separate article. The right solution is what Varjo did instead: XR_ENVIRONMENT_BLEND_MODE_ALPHA_BLEND.
Unlike the default opaque mode, or the additive mode designed for see-through waveguide displays, this lets our application send a framebuffer with alpha. The headset can then show passthrough cameras, a virtual environment, or other applications behind us; we don’t care. And when a user wants to focus on our application, we can simply render as much as we need and no more.
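To make that concrete, here is a minimal OpenXR sketch of the idea: ask the runtime which blend modes it supports, prefer ALPHA_BLEND, and tell the compositor to honour the alpha channel of your projection layer. Names like projectionLayer and frameState are placeholders, and error handling is omitted.

```c
#include <openxr/openxr.h>

// Prefer ALPHA_BLEND when the runtime offers it, otherwise fall back to OPAQUE.
XrEnvironmentBlendMode PickBlendMode(XrInstance instance, XrSystemId systemId)
{
    uint32_t count = 0;
    xrEnumerateEnvironmentBlendModes(instance, systemId,
        XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, 0, &count, NULL);

    XrEnvironmentBlendMode modes[16];
    if (count > 16) count = 16;
    xrEnumerateEnvironmentBlendModes(instance, systemId,
        XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, count, &count, modes);

    for (uint32_t i = 0; i < count; ++i)
        if (modes[i] == XR_ENVIRONMENT_BLEND_MODE_ALPHA_BLEND)
            return XR_ENVIRONMENT_BLEND_MODE_ALPHA_BLEND;
    return XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
}

// At frame end: keep alpha in the colour buffer (clear to 0,0,0,0 wherever
// passthrough should show through) and flag the layer so the compositor uses it.
void SubmitFrame(XrSession session, XrFrameState frameState,
                 XrCompositionLayerProjection* projectionLayer,
                 XrEnvironmentBlendMode blendMode)
{
    projectionLayer->layerFlags = XR_COMPOSITION_LAYER_BLEND_TEXTURE_SOURCE_ALPHA_BIT
                                | XR_COMPOSITION_LAYER_UNPREMULTIPLIED_ALPHA_BIT;

    const XrCompositionLayerBaseHeader* layers[] = {
        (const XrCompositionLayerBaseHeader*)projectionLayer };

    XrFrameEndInfo endInfo = { XR_TYPE_FRAME_END_INFO };
    endInfo.displayTime = frameState.predictedDisplayTime;
    endInfo.environmentBlendMode = blendMode;   /* from PickBlendMode() */
    endInfo.layerCount = 1;
    endInfo.layers = layers;
    xrEndFrame(session, &endInfo);
}
```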
This is the golden secret for productivity applications, and something I’ve pushed in every vendor meeting for 3+ years. Progress is finally being made, but it is a big shift.
Doing it one-handed?
You need to ask:
- Do I need precision control?
- Do I need precision control on both hands?
Most productivity and creativity applications need precision control, but almost none need it in both hands. There are exceptions, including a handful of apps that don't need precision at all.
- Precision, two-handed: the default with two controllers
- Lower precision: hand tracking
Hand tracking is great: no batteries, you can drink your coffee, and it is one less thing to worry about. But the truth is most applications need precision in one hand. Well, Meta have been working on a solution called Multi Modal, an OpenXR extension outlined here in Meta's docs.
This is fantastic: it gives you the freedom to work with a dominant hand while using the other for real-world interactions (assuming you have passthrough), and it also unlocks a secret power. Hands have a MUCH larger vocabulary than any controller could hope to offer. That gives you a huge menu of shortcuts, should you wish to use them.
A bit of creativity here could really remove 2-3 click deep menus and provide detailed shortcuts for a range of actions.
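As a taste of what that vocabulary looks like in code, here is a minimal sketch using XR_EXT_hand_tracking (not Meta's Multi Modal extension specifically): treat a thumb-to-index pinch on the off hand as a shortcut trigger while the dominant hand stays on the controller. The function names, the left-hand assumption and the 1.5 cm threshold are all mine.

```c
#include <math.h>
#include <stdbool.h>
#include <openxr/openxr.h>

// Create a hand tracker for the non-dominant hand (assuming a right-handed user).
static XrHandTrackerEXT CreateOffHandTracker(XrInstance instance, XrSession session)
{
    // EXT entry points must be loaded through xrGetInstanceProcAddr.
    PFN_xrCreateHandTrackerEXT createHandTracker = NULL;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction*)&createHandTracker);

    XrHandTrackerCreateInfoEXT info = { XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT };
    info.hand = XR_HAND_LEFT_EXT;
    info.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;

    XrHandTrackerEXT tracker = XR_NULL_HANDLE;
    createHandTracker(session, &info, &tracker);
    return tracker;
}

// True while thumb tip and index tip are within ~1.5 cm of each other.
static bool OffHandIsPinching(XrInstance instance, XrHandTrackerEXT tracker,
                              XrSpace baseSpace, XrTime predictedTime)
{
    PFN_xrLocateHandJointsEXT locateHandJoints = NULL;
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction*)&locateHandJoints);

    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = { XR_TYPE_HAND_JOINT_LOCATIONS_EXT };
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    XrHandJointsLocateInfoEXT locateInfo = { XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT };
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = predictedTime;

    if (XR_FAILED(locateHandJoints(tracker, &locateInfo, &locations)) ||
        !locations.isActive)
        return false;

    XrVector3f a = joints[XR_HAND_JOINT_THUMB_TIP_EXT].pose.position;
    XrVector3f b = joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose.position;
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return sqrtf(dx * dx + dy * dy + dz * dz) < 0.015f;
}
```

Map that pinch (or a pinch-and-hold, or a two-finger variant) onto whatever action currently hides behind a nested menu, and you get the shortcut layer described above without adding a single button.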
Going Headless!
The final mega upgrade. Look, I believe in VR and I think it has a bright future. I have a saying I will repeat here: VR is the most uncomfortable it will ever be today. Year by year and cycle by cycle it will improve. That said, there is a dirty secret: a large part of why productivity and creativity apps, when done well, excel in VR way beyond their flat rivals is the controllers. VR six-degrees-of-freedom controllers are magic.
I still think that, for most users, the Dreams TV + Move controllers setup is the ideal experience most of the time. Do I think it gets better when you put on a headset? Yes. Is the improvement bigger than the inconvenience of the headset? Maybe.
The answer used to be no, but headsets have improved. That being said, we need to stop ignoring this use case. It should be supported using the extension XR_MND_HEADLESS. Pico and SteamVR have support for it, but it is slightly bugged. I fought this fight for a while, but there wasn’t much vendor interest in solving it because it runs against their commercial narrative. Modeler now has a fake headless mode instead: we don’t use the extension, we just never send any frames to the headset.
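In OpenXR terms, a fake headless mode can be as simple as keeping the session and frame loop alive while submitting empty frames, so nothing is ever rendered to the display but tracking and input keep updating. A rough sketch of the idea, not Modeler's actual code:

```c
#include <openxr/openxr.h>

// Pump the frame loop but submit zero layers: the headset displays nothing,
// while controller tracking and actions stay live for the monitor-side app.
void PumpEmptyFrame(XrSession session)
{
    XrFrameWaitInfo waitInfo = { XR_TYPE_FRAME_WAIT_INFO };
    XrFrameState frameState = { XR_TYPE_FRAME_STATE };
    xrWaitFrame(session, &waitInfo, &frameState);

    XrFrameBeginInfo beginInfo = { XR_TYPE_FRAME_BEGIN_INFO };
    xrBeginFrame(session, &beginInfo);

    XrFrameEndInfo endInfo = { XR_TYPE_FRAME_END_INFO };
    endInfo.displayTime = frameState.predictedDisplayTime;
    endInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
    endInfo.layerCount = 0;      /* the "never send any frames" part */
    endInfo.layers = NULL;
    xrEndFrame(session, &endInfo);
}
```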
The fake mode works quite well, but it often requires the user to shove a t-shirt or similar into the headset and point it at the controller volume, unless there is an option to disable the presence sensor and the power-saving behaviour. Less than ideal, but it works.
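For contrast, with proper XR_MND_HEADLESS support none of the t-shirt business is needed: the application enables the extension and creates a session with no graphics binding at all, so there is no display to keep awake and no swapchain to feed. A minimal sketch of that setup, error handling omitted and names mine:

```c
#include <string.h>
#include <openxr/openxr.h>

// Create an OpenXR instance and session suitable for "monitor + VR controllers":
// with XR_MND_headless enabled, no graphics binding or frame loop is required.
XrSession CreateHeadlessSession(XrInstance* outInstance)
{
    const char* extensions[] = { XR_MND_HEADLESS_EXTENSION_NAME };

    XrInstanceCreateInfo instInfo = { XR_TYPE_INSTANCE_CREATE_INFO };
    strcpy(instInfo.applicationInfo.applicationName, "HeadlessControllers");
    instInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;
    instInfo.enabledExtensionCount = 1;
    instInfo.enabledExtensionNames = extensions;
    xrCreateInstance(&instInfo, outInstance);

    XrSystemGetInfo sysInfo = { XR_TYPE_SYSTEM_GET_INFO };
    sysInfo.formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY;
    XrSystemId systemId;
    xrGetSystem(*outInstance, &sysInfo, &systemId);

    // No graphics binding in the next chain: the runtime accepts this when
    // the headless extension is enabled, and the app only consumes input/poses.
    XrSessionCreateInfo sessInfo = { XR_TYPE_SESSION_CREATE_INFO };
    sessInfo.systemId = systemId;
    XrSession session = XR_NULL_HANDLE;
    xrCreateSession(*outInstance, &sessInfo, &session);
    return session;
}
```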
You will need an easy-to-use shortcut the user can frequently press to reset to the rest position, since the headset is now detached from the user. I ended up using a double stick click, but honestly the mode never got the love it deserved. If you want to see it done well, load up Dreams with the Move controllers.
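If you go down the same route, a double click is easy to detect with a plain boolean action bound to the thumbstick click; something along these lines (the 400 ms window and all the names are mine):

```c
#include <stdbool.h>
#include <openxr/openxr.h>

static XrTime g_lastClick = 0;

// stickClickAction: a boolean XrAction bound to .../input/thumbstick/click.
// Returns true when two press edges land within the double-click window,
// at which point the app re-anchors the workspace to the controller pose.
bool ShouldResetRestPose(XrSession session, XrAction stickClickAction, XrTime now)
{
    XrActionStateGetInfo getInfo = { XR_TYPE_ACTION_STATE_GET_INFO };
    getInfo.action = stickClickAction;

    XrActionStateBoolean state = { XR_TYPE_ACTION_STATE_BOOLEAN };
    xrGetActionStateBoolean(session, &getInfo, &state);

    // Only react on the press edge, not while held.
    if (!(state.changedSinceLastSync && state.currentState))
        return false;

    const XrTime kDoubleClickWindow = 400LL * 1000 * 1000;  /* 400 ms in ns */
    bool isDouble = (now - g_lastClick) < kDoubleClickWindow;
    g_lastClick = now;
    return isDouble;
}
```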
There are 3D TV displays, and Lighthouse-based tracking should be ideal for this. You can level it up with a small tracker on the head, or by using a webcam for facial tracking à la that old Wii demo; no 3D TV required.
The point is, we are only one decent hardware vendor or creative OpenXR runtime implementation away from proper support for this completely valid use case. Hell, the Quest Pro controllers track themselves, and I believe there was an interesting reference design from Intel not long ago. There were also a few interesting monitors at AWE and CES, so who knows what the future holds. I just hope they build on OpenXR, because if they do, the difference between supporting monitor + VR controllers and full VR is smaller than most people realise for productivity applications.
Conclusion is at the top
Thanks for reading. Have a 🍪