Meta Quest Pro

My Meta Quest Pro arrived three days ago, and I’ve been working in VR, playing in VR, and socializing in VR. Whilst not ready to give a full review, I’m happy to share some thoughts on my experience with the $1,500 headset. Let’s address the price first. I get it. I had initial sticker shock at this headset, but the more I saw of other people’s experiences, the more I felt that, as a VR prosumer, evangelist, and academic, I wanted this headset.

My previous VR HMD is the Oculus/Meta Quest 2, so I’ll primarily draw comparisons to that.

Overall Design

First up, let’s consider the aesthetics. The Meta Quest Pro looks like a prosumer device; I can definitely see it on the desk next to an employee’s regular set-up. I’m a Senior Director of DevSecOps, and I host Horizon Workrooms-based Office Hours drop-in sessions on a near-daily basis for colleagues to pair-program, review deployments, swarm issues, or just drop in and chat, given we’re spread over five countries.

The inclusion of the charging station is a huge bonus for me and is a great move to passively encourage users to create deskspace for the device, and to give it a dedicated location in their daily activities. Other reviews have stated that the pin alignment between the charging station and the HMD & controllers has to be exact and can be difficult to achieve. Total nonsense: it’s easy, and all the charging indicator lights point upwards, so it’s very easy to confirm a successful connection.

The elimination of the top strap does make sense for enterprise usage. Who wants to worry about wrecking their hairstyle whilst in the office, or flipping between different media meetings? But it does drive some pressure to the top of my forehead, which, if the headset is not seated correctly, can escalate over time. The comfiest HMD I’ve ever worn is the CV1, and it took me a few months to find the perfect position and angle for everything. It’d be nice to have the option of a third-party top strap for the Quest Pro for extra-long sessions.

Unless I charge from the Oculus/Meta Link cable whilst using the device, battery time with eye-tracking and expression-tracking enabled is just over two hours, which puts it in the same region as the regular Oculus/Meta Quest 2. I’ve been a little spoiled in that area, as I’ve been using the pro strap with the extended battery since I first got my Oculus/Meta Quest 2.

Setup

Now keep in mind that I’ve not added a new Meta HMD to my account since the Oculus/Meta Quest 2 launch day. I found the in-headset experience very slick: reassuring ambience, informative graphics and animations. The whole process seemed geared towards making the hardware more accessible to non-XR-savvy folks. Considering the market being targeted, this is a very smart move. After all, with enterprise adoption we’re always battling the third of Douglas Adams’s technology rules:

  1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
  2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
  3. Anything invented after you’re thirty-five is against the natural order of things.

— Douglas Adams.

Visuals

Well, of course, the first thing you notice once the headset is on is the visuals. The screens display vibrant colours with seemingly more range than the Oculus/Meta Quest 2. At first, I mistook this for deeper blacks, but it’s more a range of contrast that makes the blacks appear deeper in vibrant scenes.

This is my first experience with pancake lenses; all of my previous VR experiences have been with Fresnel lenses (I can’t recall what the 1990s Virtuality machines used). The pancake lenses are thin! So much so that, unlike with all previous headsets, I can comfortably wear my glasses, and because of the recess between the lens and the lens bevel, I can do so without the lenses of my glasses rubbing against the HMD lenses. That said, as I’m using glasses, I’m unable to use the silicone blinkers and full VR gasket accessories. For that reason, I’ll still get prescription lens inserts if they’re released in the future.

The lenses themselves drive more pixels to the centre of the FOV, which makes for higher definition where it counts. In Horizon Workrooms I was able to read small text on Whiteboard Notes whilst seated, and other colleagues could not. Speaking of FOV, it feels slightly wider, but I concede I might be interpreting the fairly seamless bleed from meatspace in my periphery as an expanded FOV.

I’ve had a few experiences, most noticeably in Horizon Workrooms, where fixed-foveated rendering caught my eye whilst I wasn’t explicitly looking for it. It detracts a little from the immersion and is not something I’ve noticed in the same experience on the Oculus/Meta Quest 2.

Audio

The Quest Pro takes the same approach to audio as the Oculus/Meta Quest 2, but the audio seems to have more range and bass. The bass was really noticeable when I first used the headset, and I swear there was a reverberation through the unit that just added to the experience.

The Oculus Quest 1 earbuds accessory is compatible, and I’ve now taken to using that for more isolation.

Controllers

I love these controllers! Their weight and substance remind me of the CV1 controllers, which always seemed to provide a little more presence because they had some heft to them, akin to passive haptics when most games have you holding some object, tool, or weapon. The active haptics have more oomph, and perhaps more fidelity in the expressed frequencies.

The controllers themselves take a few seconds to get full tracking, which adds a little more delay when dynamically swapping between hand-tracking and controller-tracking. There can be odd immersion-breaking moments of 3DoF before 6DoF kicks in during these transitions.

I love that the halo ring is gone, and the inside-out tracking on the controllers is awesome. For the last few generations of Oculus/Meta HMDs, I’ve missed the four-camera outside-in tracking I had with the CV1. That gap is now resolved: I can overdraw bows and not lose tracking.

Whilst I use Horizon Workrooms daily and host Office Hours sessions with my colleagues, I’ve not used the stylus tips yet. I’ll update you as I get the opportunity.

Expression & Eye Tracking

This is the major new social feature that Meta is betting on, and they’re right to do so. To decouple itself from the policy decisions of upstream platforms impacting its revenue stream (*cough* Apple *cough*), Meta has to be the platform, and that platform is social eXtended Reality (XR).

Lots of privacy statements around enabling this non-default feature certainly went a long way towards addressing concerns here, and the statements are clearly worded, so I think most users will understand what is processed locally and what isn’t.

I’ve used the functionality in Horizon Workrooms and Horizon Worlds regularly. By default, there’s a series of pre-launch checks, during which the software guided me through repositioning the HMD so that it could accurately track my eyes and expression. This part of the experience could do with a little work and is affected by the glasses you’re wearing and their frames. My backup glasses have thin rectangular lenses and seem to affect eye tracking by making my avatar’s eyes look very chill and relaxed, whereas my Ray-Ban Stories seem to give my avatar slightly more alert eyes.

Expression tracking was good, but the number of expressions interpreted seems relatively small. The avatar’s face does not appear to be fully boned (yes, I’m sticking to that expression). But it was great for colleagues to see just how much I smile in real life; I felt that translated accurately. In Horizon Worlds, I’m extra conscious of the digital divide that generational features like this can create between HMDs. That said, it’s super easy to disable the feature within Horizon Worlds and Workrooms.

Colour Passthrough

It’s okay. I mean, it’s colour passthrough via what appears to be a single RGB feed overlaid on an IR composite feed. It’s prone to aberrations, but it certainly gave me a ‘wow’ moment when first setting up the Guardian. I guess I’ve just not played AR-specific content to really focus on this capability.

I’ll update this as I use the feature more.

Conclusion

TL;DR: Great hardware and platform for software to grow into.

The Meta Quest Pro does represent a generational upgrade over the Oculus/Meta Quest 2, certainly from a hardware perspective. However, the software needed to exploit the new hardware has to catch up, and catch up quickly, to continue the momentum. The software features relating to the new functions seem very Minimum Viable Product (MVP), but they give a promise of what’s to come.

I’d like to see more software improvements over time, including:

  • Eliminate the aberration between the IR and RGB camera feeds.
  • Ability to set eye-tracking thresholds, e.g. allow me to define what my “eyes-wide-open” position is, so I don’t have to exaggerate in order to have my avatar appear alert.
  • Hand tracking seems a little less accurate than on the Oculus/Meta Quest 2. My assumption is that this is due to less training data for the new camera locations. I experienced hands popping from a neutral untracked position (under my chin) to being interpreted as if I were reaching out above my head with an out-turned palm. Hilarity ensues in Horizon Workrooms.
  • Prescription lens insert options, so I can use the device without glasses; that’ll enable me to try out the blinkers and full face gasket.
  • Calibrating the headset placement for eye-tracking and expression-tracking is a terrible process. I find I’m in a never-ending loop of “raise the headset, tilt the headset down”, over and over again, until I get fed up and choose “skip”.

I’ll update this as I make more observations through daily use of the Meta Quest Pro, but feel free to trigger an opinion on anything I’ve missed by leaving a comment.
