
GamesBeat: What about the other vector to work on, taking the price lower?
O’Brien: We introduced the original Vive at a higher price point than we’re introducing this one. We’re introducing this one purposefully with—it’s not always just about price. We think price is one of the friction points, but it’s also about user experience and content consumption. If you get to a super-low price but don’t have these other pieces, it’ll still struggle to hit mass adoption.
These are the concerted areas where we felt we could make a good effort to solve these problems, while still maintaining what the Vive brand is meant to do, which is bring the best VR experience in any of the tiers it goes into. This is about maintaining and delivering our premium VR experience at the consumer level. We’ll continue to do that over time with our enterprise portfolio as well. We brought eye tracking natively to that headset this past year. You’re going to continue to see that. Whether it’s all-in-one, consumer, or enterprise, that’s our goal and our intent.
GamesBeat: Do you see an opportunity to take something into an Oculus Quest-like space, something entirely different for the lower end?
O’Brien: It’s about, can we deliver that premium experience to end users? That’s an interesting space, the all-in-one space. I think you’ll see us continue to evolve our 5G portfolio with consumer-led products in the XR space. That’s an interesting space to partner with mobile operators. You saw something like our 5G hub. We have one on the table.
We’re starting to combine those technologies with our immersive portfolio. That solves real friction points. When you’re tethered to wi-fi, that creates just as much of a problem as this cable does. You can only go so far. 5G gives us a little more ubiquity.
Bamford: To really get a small form factor at a low price and high fidelity today, you need either a PC, or in the future you’ll be able to do cloud or edge GPU rendering to get that high fidelity. That’s where we’re aiming as a brand, that premium experience. If you don’t have a 5G capability or a modem capability—we have more than 100 years of modem expertise at our company—you’re going to be pretty limited three years down the road.
GamesBeat: Is this 5G ready?
O’Brien: Not right now. This is PC-enabled. We’re talking about a modular story. This is the first mod-capable product, able to work with Steam VR and Steam VR tracking. Then our mod story will continue to grow over the next year. We’ll have more functionality.

GamesBeat: So 5G could theoretically be brought in through mods?
O’Brien: There’s a lot to come for us in what we’ll announce about what the products will be able to do over the next year. It’s pretty exciting. It gives users and developers a lot of options to say, “Okay, I can buy this one thing and it solves a lot of problems over time for me,” instead of iteratively buying everything because it has one new thing. We wanted one new thing that can do many things. It’s future-proof.
GamesBeat: How long have you been working on this?
O’Brien: Here’s the thing. When we brought out the original Vive CE, I went around and visited almost every automotive company and several aerospace companies. They told me everything I needed to do in order to serve them better. We built Pro, and then we built Pro Eye. We built a lot of SDKs and all these other bits that would make it even easier. We built a tracker so they would be able to bring other objects into their virtual space. These were very professional-driven requirements.
The consumer space drove very clear requirements around setup problems, ease of portability, ease of use. We wanted to attack those problems. At some point you have an inflection point where it’s time to bring up the resolution, time to bring up the experience, give a lighter headset to consumers. But since day one we’ve been building out the requirements of what should be in the next consumer headset. What can we and should we solve with that product?
Bamford: Some of the technical components took longer to incubate, like the inside-out tracking in the controllers. We’ve worked on that for some time. For other elements, like Dan said, we’ve been collecting the requirements over time, but the development happened over the last 12 to 18 months.
GamesBeat: The CE is knocked out here, then. Is anything else knocked out on the enterprise side?
O’Brien: No, no. What you’ll see is that Pro and Pro Eye are both Steam VR tracked products. We think those products, with Pro Eye having native built-in eye tracking, support very specific use cases that developers have been working on and our professional customers have been asking us for. They need the higher fidelity with the tracking accuracy. They want to be able to track other objects that are not in their field of view.

That has a very specific customer where we think it’s going to go. You’ll continue to see us grow that road map in what we can do for enterprise and professionals. They’re willing to bear the price of that innovation. For consumers, we’ll continue to innovate at more consumable prices in what we can bring to the consumer level.
GamesBeat: This doesn’t have the eye-tracking, then?
O’Brien: No, it’s not built in.
GamesBeat: What does the native eye-tracking accomplish?
O’Brien: For enterprise and professionals, a lot of them are using it for training. A lot of them are using it to understand what the user is actually looking at. For something like product introduction, Kellogg did a whole series on introducing a new product to a planogram and where to place it. They discovered not only where to put a new product entry on a planogram in a store, but how to drive an overall cart lift. There was no need for interviews at the end of the session; the eye-tracking data captured it.
GamesBeat: When you train people you need to know exactly where they’re looking.
O’Brien: Right. Or if you’re trying to max out the panel with foveated rendering, you can get eight times more resolution out of a foveated object, because you’re pulling rendering resources away from the parts of the panel where your pupil isn’t looking. There are benefits on the resource side. There are benefits on the data-tracking and data-knowledge side.
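As a rough illustration of the resource trade-off O’Brien describes, the sketch below estimates how much shading work a frame needs when detail is concentrated around the gaze point. The tile size, thresholds, and panel dimensions are purely illustrative and not taken from any Vive hardware.

```python
# A minimal, self-contained sketch of eye-tracked foveation: tiles near the
# gaze point keep full shading rate, tiles farther out are shaded at a
# fraction of it. All numbers below are illustrative assumptions.
import math

def shading_rate(tile_center, gaze, panel=(1440, 1600)):
    """Fraction of full resolution to shade a tile at, based on its
    distance (in pixels) from the gaze point."""
    dist = math.dist(tile_center, gaze)
    fovea = 0.10 * panel[0]      # full detail within ~10% of panel width
    mid = 0.30 * panel[0]        # transition band
    if dist < fovea:
        return 1.0               # where the pupil is looking
    if dist < mid:
        return 0.5
    return 0.25                  # far periphery: a quarter of the shading work

def frame_cost(gaze, panel=(1440, 1600), tile=80):
    """Approximate shaded-pixel cost of one eye's frame, relative to
    shading every pixel at full rate."""
    total, full = 0.0, panel[0] * panel[1]
    for x in range(0, panel[0], tile):
        for y in range(0, panel[1], tile):
            rate = shading_rate((x + tile / 2, y + tile / 2), gaze, panel)
            total += tile * tile * rate
    return total / full

# Gaze at the panel center, compared against a full-rate frame:
print(f"foveated cost: {frame_cost((720, 800)):.0%} of a full-rate frame")
```

The point of the exercise is simply that the GPU budget freed up in the periphery is what lets the region under the pupil be rendered at a much higher effective resolution.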
The Air Force is using it to understand how the human body reacts to crisis scenarios in the cockpit, and whether to throttle up the simulation or throttle it back, depending on how well the pilot is handling the situation. Some very interesting studies are happening around eye tracking. But developers have only had access to that hardware for maybe six months. We’re just starting to scratch the surface on the different use cases. The user research is awesome, because it’s very honest and truthful. You can’t fake your eye movements.

Bamford: There are incredible opportunities in user interface as well. I don’t know if you’ve seen the demos, but with eye tracking you no longer have to select with your controller. You just look at the UI and press a button. It’s an incredible efficiency gain. When I first heard that idea, I was quite skeptical, but having experimented with it, it’s a major step change.
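A minimal sketch of the gaze-plus-button selection Bamford describes, using hypothetical element names and a stand-in for whatever gaze input the runtime actually provides:

```python
# Gaze-driven UI selection: the element under the user's gaze is treated as
# the hover target, and a single button press activates it. The menu layout
# and gaze coordinates here are stand-ins, not a real runtime API.
from dataclasses import dataclass

@dataclass
class UiElement:
    name: str
    x: float       # panel-space rectangle
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def element_under_gaze(elements, gaze_xy):
    """Return the UI element the user is currently looking at, if any."""
    return next((e for e in elements if e.contains(*gaze_xy)), None)

def handle_input(elements, gaze_xy, button_pressed):
    """Hover the gazed-at element; activate it when the button is pressed."""
    target = element_under_gaze(elements, gaze_xy)
    if target and button_pressed:
        return f"activated {target.name}"
    return f"hovering {target.name}" if target else "no target"

menu = [UiElement("play", 100, 100, 200, 80), UiElement("options", 100, 220, 200, 80)]
print(handle_input(menu, gaze_xy=(150, 250), button_pressed=True))  # activated options
```

The efficiency gain Bamford mentions comes from collapsing the point-then-confirm controller motion into a single button press on whatever the eyes are already resting on.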
O’Brien: In Major League Baseball, what they attempted in the very beginning was an accessibility test. Could you select objects? Could you select things in VR with your eyes instead of a head gaze? They were able to do that, so you could just stand in the experience with a bat. The first experience they’d built used both a controller and a bat, and you had to hand things off. It was really clunky. Some really cool stuff there.
Ovation, the public speaking app, is another one, where tracking your pupils was 80 percent more accurate than head gaze. You’d be looking over here, but your eyes were heat-mapped down to the display telling you your keynote speech. Now you’re creating better oratory communication. It’s impressive stuff.
GamesBeat: Eye tracking adds a fair amount of cost that you wouldn’t want to have here, though?
O’Brien: The consumer user experiences for eye tracking aren’t there yet. A couple of years down the road, I think eye tracking becomes table stakes for AR headsets, MR headsets, or VR headsets. The use cases and the proper usage of the data will be much more well-defined at that point too.