IBM Immersion Room at the Watson Experience Center.

How Oblong helped IBM build its ‘immersion rooms’ with giant displays

VentureBeat: Do you have an idea of how much money they’ve put into each one of these?

That’s a good question. I can tell you that from a timeline standpoint, it takes anywhere from four to six months to create the material for each one of these experiences. We spend two months in an early fact-finding and research phase, helping both the designers and the engineers better understand the industry. They bring in subject matter experts from both inside and outside IBM to help us better understand how we might create a narrative or design interface within that context. Our engineers get comfortable with the services and what they’re capable of doing on the Watson side.

For the latest module we’re taking on insurance claims, assessing whether the photos submitted with a claim are viable evidence. We’re training our own model on various data sets to see if we can get accurate results in identifying whether a photograph of a vehicle shows severe, moderate, or minor damage. It’s a lot of fun. Rather than working with canned algorithms that have already been proven in the field, we’re creating our own for some of these experiences.
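For readers curious what that kind of experiment looks like in code, here is a minimal sketch of a three-way damage classifier. It is illustrative rather than Oblong’s actual pipeline: it assumes PyTorch and torchvision are available, invents a hypothetical photos/ directory with severe/, moderate/, and minor/ subfolders, and fine-tunes an off-the-shelf pretrained backbone on those three labels.

```python
# Minimal sketch of a three-class vehicle-damage classifier, in the spirit
# of the module Hawkes describes. The folder layout, labels, and
# hyperparameters are illustrative assumptions, not Oblong's or IBM's.
# Expects: photos/severe/*.jpg, photos/moderate/*.jpg, photos/minor/*.jpg
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Standard ImageNet-style preprocessing so pretrained weights can be reused.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# ImageFolder maps each subdirectory name to a class index automatically.
dataset = datasets.ImageFolder("photos", transform=preprocess)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Start from a pretrained backbone and swap the final layer for a
# three-way head: severe / moderate / minor damage.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 3)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # a handful of epochs is enough for a first pass
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

Fine-tuning a pretrained network this way is the usual shortcut when labeled photos are limited, and it matches the spirit of what’s described here: training your own model on your own data rather than relying on a canned, field-proven algorithm.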

Increasingly, they’re starting to showcase case studies where Watson is being used out in the world. They call it Watson in the Wild. It’s everything from engaging visitors at a museum in São Paulo to financial services. They can’t talk about many of the customers using Watson directly, because those customers are heavily NDA’d. Oil and gas is another field that leverages these pretty regularly.

IBM helps people understand machine learning in “immersion rooms.”

VentureBeat: Is IBM your biggest customer or user here?

Underkoffler: I think you might recall, there’s two sides to the house at Oblong. One is the Mezzanine product side. There, we have more than 150 Fortune 500-style customers all over the planet, on six continents. Then the team that Pete runs is the parallel universe version of that, for customers like IBM that already have and use a large number of Mezzanine systems, but want to go further. They need custom experiences. They need to be with us on the absolute cutting edge. It’s fair to say that IBM is our largest customer on Pete’s side of the house, what we call client solutions.

Hawkes: We do work with other customers as well, though. It’s not exclusively IBM on the custom side.

VentureBeat: When does everybody else get this sort of thing, then?

Hawkes: [laughs] To IBM’s credit — and a lot of this came from a few visionary folks within that organization who understood our take on interface, and from some people in research who had experienced firsthand how well we could implement these things — they recognized that the large creative agencies that typically set up these spaces don’t really know how to operate at a live software level. Those agencies might have some sharp folks around, but their standard mode is generating high-end, commercial-like material: large videos and other static assets. The nice part about this engagement with IBM is that they’ve not only given us license to explore new modes, they’ve also pulled their content and their use of the spaces in that direction.

Underkoffler: For us, what’s exciting is that our work with IBM and other similarly minded customers lets us push forward the state of the art around the idea of massive-scale interaction. Cinema really is a pretty good analog. What can you do if the screen is 30 feet high and 55 feet wide? Things work differently at that scale. The work we do with IBM is breaking all kinds of new ground in that mode of interaction, in that kind of interactive space.

But to your question about when we all get it, we want the answer to be “as soon as possible.” For example, Pete runs monthly meetings here where the local hacker and graphics community is welcome to come in. Everyone’s invited. People just try things out on the giant wall. It’s like handing people a film camera, a flatbed editing rig, and a Rialto theater once a month to see what they can dream up when they have access to this large format. You should come by someday.

This is a 300-degree immersion room at IBM.

Hawkes: There’s actually one tonight. It’s called “Slim Shader.” [laughs] Shader language is great because it cuts across an interesting swath of engineers, game designers, VR developers, mobile developers, and designers. We get a really interesting mix of folks. Our team has also invested a lot of time in building accessible tools for non-technical people. We write a lot of difficult C++ software to drive these spaces in a way that makes sense.

One of the big hurdles with new tech is that we’re so eager for it that we push it out the door before it’s really ready. That’s the big problem with VR: motion sickness because frame rates are poor, or because the display technology isn’t quite ready. We invest a lot of time making sure we have a high-fidelity experience, stuff that’s very solid from a UX standpoint, but we also create a lot of accessible tools so that folks can bring their own skill sets to the table. Designers can iterate rapidly at scale without having to filter their ideas through an engineer, for example. That’s important to us.

We also have a strong relationship with many of the local schools. We’re right in the middle of UCLA, USC, ArtCenter, CalArts, and Otis. Not only do we employ designers and artists from those schools, but other artists and engineers at Oblong and I teach at UCLA and USC. We do mentorships with ArtCenter. I’m teaching a course next term in the film school at USC, in Media Arts and Practice. I’ve been teaching there for the last two or three years now. We teach technology — Arduino and Processing — for artists and designers. The class spends its last few weeks at our warehouse downtown interacting with many of these same data sets. We clean and scrub them down into a format the students can play with, and we encourage them to create tangible interfaces.

This is another way of making sure that the younger generation, those who are just graduating from the top programs in the area, are familiar with this as a potential mode for creating content, for interacting with data, and for understanding what’s possible. Many of these students end up working at some of the top companies around the world. ArtCenter folks primarily go into automotive. Many of the USC folks go into entertainment and Hollywood, or to a handful of startups here in L.A. or around the Bay Area. We keep tabs on them, and they on us. These same concepts and ideas, we hope, trickle out into the world in a meaningful way.

Dean Takahashi

Dean Takahashi is editorial director for GamesBeat at VentureBeat. He has been a tech journalist since 1988, and he has covered games as a beat since 1996. He was lead writer for GamesBeat at VentureBeat from 2008 to April 2025. Prior to that, he wrote for the San Jose Mercury News, the Red Herring, the Wall Street Journal, the Los Angeles Times, and the Dallas Times-Herald. He is the author of two books, "Opening the Xbox" and "The Xbox 360 Uncloaked." He organizes the annual GamesBeat Next, GamesBeat Summit and GamesBeat Insider Series: Hollywood and Games conferences and is a frequent speaker at gaming and tech events. He lives in the San Francisco Bay Area.