Primal Space Systems has raised $8 million for its subsidiary Instant Interactive, which will use the money to build a technology dubbed GPEG, a kind of cousin of the MPEG format used to compress and stream video, but for graphics.
But GPEG, a content streaming protocol, is a different way of visualizing data, and its creators hope it could be a huge boost for broadening the appeal of games as well as making people feel like they can be part of an animated television show. Instant Interactive wants to use its GPEG technology to more efficiently stream games on the one hand, and on the other, it wants to turn passive video entertainment into something more interactive and engaging.
This may be where most people stop reading this story. But I think this technology has legs. The idea for the Geometry Pump Engine Group (GPEG) originated with cofounders Barry Jenkins (a medical doctor who became a graphics expert), John Scott (chief technology officer and formerly of Epic Games), and Solomon Luo (a medical vision expert and chairman), who thought about this challenge for years before creating the startup, Primal Space Systems, and its game-focused division, Instant Interactive.
The investors are a variety of seed and angel funders, including ImmixGroup cofounder Steve Charles. The capital will support the development and initial launch of GPEG, which can be used with game engines such as Epic’s Unreal. Instant Interactive has been working on the technology since 2015 and has just seven employees.
Jenkins has put more than a decade of work into the technology, and the company has 11 patents covering better streaming of games and ways to make TV shows into interactive entertainment. While MPEG (short for the Moving Picture Experts Group) gave us technologies for compressing video so it could be easily viewed across networks, GPEG can make some very expensive game and entertainment technologies much more practical, said Bill Freeman, president and chief operating officer of both Primal Space Systems and Instant Interactive, in an interview with GamesBeat.

“This kind of technology can enable interactive content on over-the-top programs” such as interactive Netflix shows, Freeman said.
For cloud gaming and interactive television, GPEG replaces video-based streaming through the use of pre-encoded content packets, which can be more efficiently streamed using GPEG middleware technology. The packets are prefetched to eliminate lag, while also lowering overall streaming costs. GPEG’s middleware solution is designed to be used with all existing content delivery networks and to be integrated into any game engine, including Epic’s Unreal Engine 4, providing for the efficient delivery of true in-the-moment personalized content.
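Instant Interactive hasn’t published GPEG’s packet format or middleware API, but the prefetching idea Freeman describes can be sketched in rough terms. The Python sketch below is a hypothetical illustration only, assuming a CDN that serves pre-encoded geometry packets keyed by spatial cell and a placeholder visibility predictor; none of the names (PacketCache, predict_visible_cells, the CDN URL) come from GPEG itself.

```python
# Hypothetical sketch of visibility-driven prefetching of pre-encoded geometry
# packets from a CDN. All names and the URL layout are illustrative assumptions,
# not GPEG's actual API.
import urllib.request
from collections import OrderedDict

CDN_URL = "https://cdn.example.com/gpeg/{cell_id}.pkt"  # assumed packet layout


class PacketCache:
    """Keeps recently fetched geometry packets resident so the renderer never
    has to wait on the network for visible geometry."""

    def __init__(self, capacity=256):
        self.capacity = capacity
        self.packets = OrderedDict()  # cell_id -> raw packet bytes

    def prefetch(self, cell_ids):
        for cell_id in cell_ids:
            if cell_id in self.packets:
                self.packets.move_to_end(cell_id)  # already resident, keep warm
                continue
            with urllib.request.urlopen(CDN_URL.format(cell_id=cell_id)) as resp:
                self.packets[cell_id] = resp.read()  # pre-encoded packet bytes
            if len(self.packets) > self.capacity:
                self.packets.popitem(last=False)  # evict least recently used


def cell_for(point, cell_size=100.0):
    """Map a world-space point to a coarse grid cell ID (assumed partitioning)."""
    return "_".join(str(int(c // cell_size)) for c in point)


def predict_visible_cells(position, velocity, horizon_s=2.0):
    """Placeholder predictor: fetch the cell the camera is in now and the cell
    it is heading toward over the next couple of seconds."""
    future = tuple(p + v * horizon_s for p, v in zip(position, velocity))
    return {cell_for(position), cell_for(future)}


# Each frame, the game loop would call something like:
#   cache.prefetch(predict_visible_cells(camera_pos, camera_vel))
# so packets arrive before the player actually turns the corner.
```

The design point, as the company describes it, is that rendering stays on the client while the network only has to move compact, cacheable packets that ordinary CDNs already know how to serve.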
“We think it’s possible to bring interactivity to the entertainment industry, breaking the interactivity out of the gamer silos and creating content that everyone can consume,” Jenkins said in an interview with GamesBeat.
The possibilities for interactive entertainment

If you’re wondering what this is all about, you may have heard the story that Epic Games’ Unreal Engine, used primarily for making games, was used to create cinematic special effects for the Disney+ television show, The Mandalorian. GPEG can be used on top of Unreal, and it could enable a television show where you can participate in the action, taking the story in a direction you want it to go.
With GPEG, the interactive entertainment industry, as exemplified by Netflix’s Black Mirror: Bandersnatch, where you could choose your own story, could get a big technological boost. Jenkins envisioned GPEG as a more effective way to create transmedia, or content that is used across multiple media such as comics, movies, and games.
Today’s games often have in-game visually rich cinematic sequences or “cutscenes” that are rendered in real time. Advanced real-time rendering effects can now give games a sophisticated cinematic look that was not possible before. At the same time, the art of interactive storytelling has evolved considerably since professor Henry Jenkins (then at MIT, now at USC) talked about transmedia.
Narrative-driven games such as Detroit: Become Human and Life is Strange 2 focus on allowing users to actively discover and participate in the conflict and resolution cycles that are fundamental to story.
And if you look at streaming, today’s edge-centric content-delivery network (CDN) infrastructure delivers pre-encoded video streams at a very low cost per user and at global scale. However, this infrastructure is insufficient to support video-based cloud gaming systems such as Stadia or GeForce Now, which depend on expensive game server hardware housed in specialized data centers, Freeman said.
While most game content today is delivered by CDNs, this delivery is in the form of slow game downloads, which often require users to wait minutes or hours to begin gameplay. New methods of streaming game engine and VR content to game consoles, gaming PCs, and mobile devices could eliminate these download delays and provide virtually instant access to interactive content.
Using the existing CDN infrastructure, this new type of stream could change the way games are delivered by broadband and wireless and could also enable new types of instantly interactive content for cable and over-the-top (OTT, think Netflix) audiences.
Jenkins said you should imagine animated programs or special effects sequences streamed to your game console or gaming PC using an application like “HBO Max Interactive” or “Netflix Interactive.” Such programs would convey the story-driven, “lean back” entertainment experience expected from video but would also allow users to optionally pick up a game controller and customize the characters, explore a different narrative arc, take up a brief challenge, or otherwise “lean into” the experience in ways that are more deeply engaging than Bandersnatch’s simple branching video.
Such programming would naturally appeal to gamers and could also attract non-gamers and larger mainstream audiences to the unique and compelling entertainment value of interactivity. This type of programming could augment the traditional transmedia approach by enabling convergent media experiences that combine the impact of cinematic storytelling with the kind of immersive engagement made possible by the modern game engine.
Instant Interactive wants to create a graphics revolution

Instant Interactive is pioneering the development of a game engine middleware protocol, GPEG, for streaming interactive content to game consoles, PCs, mobile devices, and next-gen set-top boxes.
Primal Space Systems itself used the technology to enable drones to stream data more efficiently when they’re flying over an area and capturing imagery and location data. The U.S. Army is using that technology. But Instant Interactive uses the technology to more efficiently stream games and turn passive video entertainment into something more interactive and engaging.
“We’ve all grown up with MPEG (a video format),” said Freeman. “This is not MPEG. It is a new way of encoding and streaming 3D data. Games have been our core, but we see GPEG going beyond gaming, bringing interactivity to what has historically been passive content and making it more lean-forward.”
Cloud gaming and interactive entertainment need the help. Putting games in the cloud so they can be processed on heavy-duty servers is theoretically a great way to run them. You can then stream a video of a game scene to a user’s computer or console. When the user interacts with a controller, the input is sent back to the data center, where the impact is computed, and then a new scene is sent in video form back to the user’s computer. This allows the heavy lifting to be done in the cloud, so a high-end game can run on a low-end laptop.
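To see where the cost of that round trip comes from, here is a toy back-of-the-envelope calculation with assumed per-frame sizes (real numbers vary by resolution, codec, and service): the controller input going upstream is tiny, while every downstream frame is a freshly rendered, freshly encoded video image.

```python
# Toy back-of-the-envelope for the video-based cloud gaming round trip described
# above. The per-frame sizes are rough assumptions, not measured figures.
INPUT_BYTES_PER_FRAME = 64        # controller state going upstream is tiny
VIDEO_BYTES_PER_FRAME = 40_000    # roughly a 19 Mbps 1080p60 stream, assumed
FPS = 60

upstream = INPUT_BYTES_PER_FRAME * FPS    # a few KB/s of input
downstream = VIDEO_BYTES_PER_FRAME * FPS  # a few MB/s of encoded video

print(f"upstream:   {upstream / 1e3:.1f} KB/s")
print(f"downstream: {downstream / 1e6:.1f} MB/s")
# Every one of those downstream frames has to be rendered and encoded on a
# server in the data center, which is what makes the approach expensive.
```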
The problem is that this streaming of data consumes a lot of bandwidth, and cable modem systems have only recently been able to transfer data at speeds high enough to enable cloud gaming services such as Google Stadia and Nvidia GeForce Now. But with GPEG, Jenkins claims that the data can be dramatically reduced and transferred using a fraction of the bandwidth needed today.