
GamesBeat: I’m also waiting to see — how well is the platform designed to suspend everything else on the computer in order to ensure a smooth experience, even on some of those low-end machines?
Pallister: That’s part of why Microsoft has been focused on their universal app platform. It’s a quality control — a walled garden inside Windows. It’s a point of friction elsewhere in the industry, about what it prevents people from doing and innovating on, but for this particular use case, it’s understandable that they want to do a thing where they can really control what’s running, what’s not, what policies the apps are forced to conform to. But it’s a good question. What if I’ve installed a toolbar on my other browser, and it’s eating 50 percent of my CPU to mine Bitcoins and send coupons to my uncle?
GamesBeat: It seems like there’s a psychological barrier we’ll get over when headsets are sold as another peripheral that’s part of your computing experience, not a $500 Vive that you just use to play games. You get your keyboard, your mouse, your antivirus solution, and your headset.
Pallister: Even if there’s a bundle, the fact is, if I’m hearing right, as a consumer — if I look at the Rift, I’m not just buying a VR headset. I’m making a decision about whether it has the games I want. I’m getting locked into that platform of offerings. It feels like — our initial thought was there are any number of usages you can point to that started with this vertical model and eventually got standardized and went horizontal.
We’ve been saying that standards will be the way that happens. We’re on the Khronos OpenXR standards body, and we’re working with other companies to make that happen. But it feels like the industry is on the verge of making it happen anyway. Somebody enabled the path to get Steam VR titles to run on Oculus headsets. Initially, they said, “No, no, you have to go through our store,” but their users said, “We’re Steam users, and you’ll make that happen.” We recently saw an announcement from Microsoft that said, “Our headsets will also support that content.” Valve has hinted that other guys might be coming beyond HTC. The LG guys made an announcement.
It feels like, even if there’s not an outright declaration that all this stuff works on all the headsets, at least there will be a body of knowledge among enthusiast users who use Steam. “I know that I can go on the Steam discussion list and see that these five headsets support all this stuff, and I should probably buy this one.” It very much feels like we’re heading in that direction. At the very least, when the standards happen, we’ll get there, but even before that, you’ll see consumers getting more choice.
GamesBeat: VR arcades are very interesting, and so are the things IMAX is doing with VR cinema. Do you think that has legs? Is that going to keep going?
Pallister: I don’t know what the economics look like for IMAX, but I think the model absolutely has legs. Whether it’s the next arcade, looking like arcades did in the ’80s as a massive destination business, or more like arcades now, where the only thing arcades have left for certain types of games is form factor and peripherals. You sit on a motorbike or pick up a machine gun and do all this stuff you can’t do at home.
It’ll be somewhere in between, I think. Some people will say, “I’ll take an off-the-shelf Vive and deliver some flavor of experience to people who can’t afford it.” You’ll see that in China, say. And some people will say, “Even if people have a headset at home, how do I really do a next-level experience?” Something like what The Void is doing. I’ll have force-feedback vests and other peripherals. I’ll do an environment where it’s a 1:1 mapping with the VR environment. When I see stairs in the VR space, I’ll step forward, and there are stairs there. I’m mixing the real world in. Next-gen laser tag, absolutely, right?
I think people would pay a lot in some markets for — think of something between paintball and laser tag. People would pay a lot of money for a really high-adrenaline, really immersive experience. There’s an opportunity to do that. It’s got a long way to go from just dropping the PC version of a game in a Vive. That’s a good start, but it’s got room to expand. When you get good multiplayer experiences there, and people can collaborate with their friends — when they start painting a whole environment green and doing mixed-reality green-screening with four people in there and eight friends in the lobby, and it’s like a karaoke bar where you watch your friends do stuff, and you can swap stories about it, take the video home, post it on YouTube. There’s a lot of legs for stuff like that.

GamesBeat: I talked to the guys who made the Omni treadmill. They were arguing that these treadmills are much more economical because you can just have people run in place, and then, you don’t need a lot of real estate where you can never amortize the cost.
Pallister: That’s true. And certainly, for some urban environments, it might be the case. Paintball and laser tag have real space. But they have their own set of barriers. I think it’s true for some. The thing I’ll say about those — as much as they’re restrictive and sliding your feet around doesn’t quite feel like running, and you have to strap yourself into it, what I’ll give them is that every time they’re at a trade show, and they do the two-on-two shooter app, there’s always a crowd watching.
It’s part of why we got excited about VR and esports in combination. I think there’s a real opportunity for spectators. There’s a nugget of something there. If you show people a video of one person running on the treadmill, they say, “Why would anyone care about this?” But if you have the two people competing….
GamesBeat: Can you talk about the technical side of MixCast?
Pallister: Anybody who’s been looking at VR has seen some flavor of this mixed-reality green-screen use case people are doing. The initial spark of that idea came from a number of companies: the guys who did Fantastic Contraption, Owlchemy, the Job Simulator guys, a few early pioneers in this area. We highlighted this as a problem to solve two years ago. One of the hard things about selling VR, convincing people they want to try it and why it’s interesting, is that you have to do it one user at a time.
I personally spent months running around Intel cornering executives and saying, “I want to talk to you about this thing.” They said, “I’m not wearing that on my head.” I’d have to talk them into putting it on their face, and then, they got it. You can’t get to 20 million units selling things like door-to-door vacuum cleaners, one at a time. So how do you convey that? This is one piece of the equation people came up with: Here’s a way to convey the experience this person is having. You composite a video of them doing the thing with a video of the in-game simulation, mixing the two together. We said, “That’s really cool. Somebody needs to make that easy to do.”
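To make that compositing idea concrete, here is a minimal sketch of the mix he describes, assuming a green-screen camera frame and a matching game render as inputs. It uses OpenCV and NumPy; it is not MixCast’s actual pipeline, and the chroma thresholds and frame names are placeholders.

```python
# Minimal sketch of the green-screen compositing idea described above.
# Not MixCast's actual pipeline; frame sources and thresholds are hypothetical.
import cv2
import numpy as np

def composite_mixed_reality(camera_frame, game_frame):
    """Overlay the real player (shot against a green screen) onto the game render."""
    # Classify green-screen pixels in HSV space; the hue range is a guess
    # that would need tuning for a real lighting setup.
    hsv = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2HSV)
    green_mask = cv2.inRange(hsv, (40, 60, 60), (80, 255, 255))
    player_mask = cv2.bitwise_not(green_mask)

    # Keep the player from the camera feed, and the game render everywhere else.
    player = cv2.bitwise_and(camera_frame, camera_frame, mask=player_mask)
    background = cv2.bitwise_and(game_frame, game_frame, mask=green_mask)
    return cv2.add(player, background)

# Example usage with two same-sized frames grabbed elsewhere:
# mixed = composite_mixed_reality(webcam_frame, unity_render_frame)
# cv2.imwrite("mixed_reality_frame.png", mixed)
```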
Then, we ran into these guys at Blueprint in Vancouver. The application they developed is called MixCast. It’s a Steam VR application that makes it easier to set up and do green screen. We contacted them and met up at GDC and said, “Hey, we have a bunch of things we want to do here. We want to make it easier to get this out to people. You guys are already doing this. Can we collaborate on stuff and make the product better and get it out to people?” They thought that was great, so we did this technical collaboration.
The first thing we tackled was making it easier to do. This was an idea we discussed, although it’s largely their technical development. They took it from an SDK that a developer could integrate into their app, which they still have, to an application that hooks into any Unity app. Any Unity app that was built for VR but doesn’t necessarily have MixCast support, they can hook into it and let someone use it anyway. A reviewer or YouTuber or streamer on Twitch can make this work, even if the game wasn’t built from the ground up to work with it. Check, more people can use this thing.

We looked at it with a very default Intel point of view and said, “How can we apply performance and make this run better?” We’ve been working with them on taking our media SDK and doing a highly threaded video codec, so you can do up to 4K video and spread that workload across cores. While you’re on the same PC that’s doing your VR and rendering the user’s view and the MixCast view, you’ll be able to do a video encode and not be constrained by low-resolution video. Check, that’s another goal. It’s not yet shipping, but their app will take that technical proof of concept, and we’ll roll it into some drop of the application.
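The Media SDK work itself isn’t spelled out here, but as a rough illustration of the same idea, letting a software encoder fan a 4K encode out across all available cores, a sketch along these lines would do it. Here ffmpeg and libx264 stand in for the actual codec, and the file names are placeholders.

```python
# Generic illustration only: the Intel Media SDK integration described above is
# their own work. This just shows the general idea of letting a software encoder
# spread a 4K encode across all available cores.
import subprocess

def encode_4k_multithreaded(raw_capture="mixcast_capture.mp4",
                            output="mixcast_4k.mp4"):
    subprocess.run([
        "ffmpeg", "-y",
        "-i", raw_capture,
        "-c:v", "libx264",
        "-preset", "fast",
        "-threads", "0",   # 0 lets the encoder pick a thread count based on the cores available
        output,
    ], check=True)
```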
Once you have that video on the computer, then, what are some things you can do with that information to make it better? What kind of compute can you do on the video workload to make the end result better? We’re looking at things like adding support for RealSense cameras, so you can have a depth view of the user. That means you can do things like light them correctly for the scene, or maybe one day actually use background segmentation and remove them from the background without a green screen there. Then, the end user doesn’t have to put up bed sheets on the walls to try to do this. They could just say, “Sort out anything between this and that depth.”
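A minimal sketch of that depth-threshold idea, assuming a depth map already aligned to the color frame, as a RealSense-style camera can provide; the near/far distances and names are hypothetical.

```python
# Sketch of the depth-based segmentation idea: keep only pixels whose depth falls
# between a near and far plane, so no green screen is needed. The depth map is
# assumed to come from a RealSense-style camera, already aligned to the color
# frame and expressed in meters; distances and variable names are hypothetical.
import cv2
import numpy as np

def segment_by_depth(color_frame, depth_m, near_m=0.5, far_m=2.0):
    """Return the color frame with everything outside [near_m, far_m] blacked out."""
    in_range = (depth_m >= near_m) & (depth_m <= far_m)
    mask = in_range.astype(np.uint8) * 255

    # Clean up speckle from depth-sensor noise before using the mask.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    return cv2.bitwise_and(color_frame, color_frame, mask=mask)

# The segmented player could then feed the same compositor sketched earlier,
# in place of the chroma-keyed frame.
```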
That’s the nature of the collaboration with them. We’re trying a little science experiment right now where we’re using an AI-driven drone as a pilot to control the camera position. If you want to have a moving camera and you’re doing VR at home with this green screen, you need to get someone to hold the camera and film you [and] make sure they get certain things in the frame. We’d like that to be handled by a drone. We don’t think an end user is necessarily going to do this any time soon, but for a professional setup, it’s interesting. It’s an example of how we can start to think about expanding the ways developers work with this stuff.
GamesBeat: When it comes to content, are you doing any direct funding to get it built and get more applications out there?
Pallister: We’re not a game publisher: “Here’s money, go build this game.” We have done deals where we’ve said, “We want to show how you can use compute to add more bells and whistles to apps and make them look better.” Sometimes, that will involve funding or co-marketing or whatever makes business sense. We did things with Star Trek Bridge Crew and Arizona Sunshine. We’ll do more things like that going forward.
Our Intel Capital Group has looked somewhat into the consumer space but a lot in the commercial space as far as where VR is applicable. Who are interesting companies that aren’t just getting off the ground but that we think are sound investments and a good use of Intel Capital’s money? Situations where applying capital toward them can bolster the rapid growth of VR in different areas.
I was telling someone earlier about a company called InContext Solutions. They do specialized applications for retailers. If you’re laying out your new store and you want to put shelves in and decide how high things should be and what displays should look like, that’s what they do. They saw VR coming and said, “Hey, we can let people see what the store looks like before they build a 15,000-square-foot space. We can put focus groups in VR and see where their eyes are drawn when they walk in the store.”
That’s an example of a company where they’re doing great stuff. They’re propagating VR into a particular commercial segment. We’ve done things with them. We’ve done things with medical and architectural and a bunch of other sectors. We did a project with the Smithsonian art museum. Something like 90 percent of what the museum owns can never be viewed just because of space considerations, so we captured a wing of the museum and took it a step further. When you approach a painting of the Aurora Borealis, you can actually step through that painting and be surrounded by the landscape. For people who can’t travel, for teaching kids, that’s one element that’s been very cool.