How Andy Serkis’ games studio created Planet of the Apes: Last Frontier

He seems quite human.

GamesBeat: Was there some discussion at the beginning about whether to do a more traditional sort of action game, versus convincing them to do this particular project?

Alltimes: We and Fox agreed that we didn’t want to do a movie tie-in. We wanted to do a universe game. Apart from the fact that it gives us creative freedom, these games are based on decisions that can have real consequences. We had to allow our characters to live or die. That’s why we wanted to do it.

We were able to go to Fox and say, “If you look at the success Telltale had with The Walking Dead–if you went to any other publisher with that license, it would have been a first-person shooter. Activision released a shooter. I don’t think anyone remembers it. Everyone remembers the adventure game.” Telltale has legitimized the idea that you can take a license and not default to a shooter or a fighting game. You can do things that are much more sympathetic to the source material.

GamesBeat: The animation here looks very lifelike.

Alltimes: We do use motion capture. It’s the same technology they use in movies like Avatar, or even the Planet of the Apes series. There are two systems. The first is a series of cameras that register your body movement, and the second is head-mounted, which captures your facial movements. That’s all digitized and converted into real-time 3D graphics.

We have a very small team, but we’re able to do stuff at this level and in this quantity because of performance capture. If you’re a professional animator on a movie, you would produce three or four seconds of finished animation a day. On a motion capture stage, you can capture 10 to 12 minutes of animation, but it’s actually more than that, because it’s 10 to 12 minutes times the number of characters. In one day we can capture the same amount of animation that would take 300 or 400 animators to produce. There’s no compromise in quality because you’re capturing actual performances, and unlike an animator, who works in isolation on one character, your actors are reacting to each other on the fly in the same environment.
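As a rough sanity check on those figures, the arithmetic in the quote above can be worked through directly. All inputs here are assumptions lifted from the interview (3–4 seconds of keyframe animation per animator per day, 10–12 minutes captured per stage day), and the two-characters-on-stage count is hypothetical:

```python
# Back-of-the-envelope check of the capture-vs-keyframe numbers quoted above.
# All inputs are assumptions taken from the interview, not measured data.

KEYFRAME_SECONDS_PER_ANIMATOR_DAY = 3.5  # "three or four seconds of finished animation a day"
CAPTURE_MINUTES_PER_DAY = 11             # "10 to 12 minutes" per day on the stage
CHARACTERS_ON_STAGE = 2                  # hypothetical: two actors performing together

# Capture yields that many minutes *per character* performing simultaneously.
captured_seconds = CAPTURE_MINUTES_PER_DAY * 60 * CHARACTERS_ON_STAGE

# How many animator-days of hand keyframing that single capture day replaces.
equivalent_animator_days = captured_seconds / KEYFRAME_SECONDS_PER_ANIMATOR_DAY

print(f"{captured_seconds} s of animation ≈ {equivalent_animator_days:.0f} animator-days")
# → 1320 s of animation ≈ 377 animator-days
```

With just two characters on stage, one capture day lands squarely in the “300 or 400 animators” ballpark Alltimes describes; more simultaneous performers push the multiple even higher.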

GamesBeat: So these scenes were all performed?

Alltimes: Yeah, live on a stage.

GamesBeat: How do you then make it so stylized, so it doesn’t look like a movie? This doesn’t look quite like real life. Are you essentially animating on top of the motion capture?

Alltimes: Well, we’d love it to look photoreal. That tech is probably 10 or 20 years away. Andy has a very specific philosophy, which is—he wants to capture what was on stage, and to minimize, or ideally completely eliminate, the use of hand animation. Sometimes it’s not possible to do that, because apes don’t have exactly the same face structure as a human. There has to be some manipulation behind the scenes. But we do minimize it. Otherwise—there’s an actor behind there, and what you’re doing is saying, “We’re not happy with your performance.” Andy, given that he is an actor, doesn’t believe in that, and we don’t believe in that either.

Three ape brothers have to decide whether to attack or not.

GamesBeat: I talked to [Epic Games CEO] Tim Sweeney about this. He was saying that he thinks this sort of tech is ready for prime time as far as creating digital humans.

Alltimes: I’d agree. I call Tim Sweeney my CTO. [laughs] We were going to be part of the keynote at GDC this year, but there were a few things we just couldn’t bring together in time. I don’t know if you saw it, but he gave a speech with Andy talking about what they were doing on the Tempest.

For me, when I think about real-time 3D graphics—when I first came to the Imaginarium, it was just to make video games. After we were shown all this stuff, we said, “God, we could make a movie with this.” We captured five or six hours of movie footage in six weeks. Compare that to the cycle on a Pixar movie. They’re spending four or five years doing stuff. I do think we’re close. All the real innovation is happening in real time.

The real next big step in facial animation technology is real-time facial capture. What I mean by that—at the moment, cameras only track your eyes and mouth. Everything else has to be interpolated. But the stuff that I’ve seen—they’re not just capturing your eyes and your lips. They capture the whole movement of the skin on your face. That’s going to be incredible. It means the barrier between what you see and what was performed is nonexistent.

GamesBeat: Back to the game, you have this combination of decision-making, plus the action sequences. There’s some physical skill coming into it as far as timing.

Alltimes: We deliberately wanted to make that minimal. At one point we had a tapping mechanic. Everyone on the design team said, “Okay, we get it, but that gets away from what this is about. The point is everyone in the room has a chance to vote.” There’s a time limit. We could have gotten rid of it, but in an action sequence we do want your heart to race a bit. It felt legitimate. It’s not really designed to be a skill mechanic.

GamesBeat: As far as how costly the consequences of your decision can be—Until Dawn was very complex. It had eight characters you were trying to save. There was this butterfly effect to it. If you went left or right sometimes you’d just die because you picked one instead of the other. Other times you made a consequential decision that had a big impact.

Alltimes: We don’t have things that randomly happen, but we do have a lot of consequence. Our main cast is 14 characters, and every one of them can live or die. The permutations are pretty immense. We didn’t want it to feel random. We do build up to those choices. Hopefully it’s a bit like Game of Thrones. Yeah, it might be a surprise, but you should have known it was coming.

GamesBeat: Since you rely so much on motion capture, I suppose you have to pay a lot of attention to each performance.

Alltimes: You do. The director of all the performance capture is a guy called Steve Kniebihly. He’s a French director. He came out of a French cinema school and happened to have his first job on Heavy Rain and Beyond: Two Souls at Quantic Dream. He was our number one choice to direct this.

What’s interesting—we’re the number one super user of Sequencer, Unreal’s real-time 3D editing package. It replaced their previous cinematic editor. Steve, who has no technical background outside of film—he understands actors, cameras, and lenses. But he was able to direct this on stage and then do all of the editing and camera work himself in that tool. It’s incredibly powerful. We built some special tools for him to make it even easier, but it’s amazing.

You can edit, but also, if you don’t like the blocking in a scene, you can move the characters in a 3D space to make it work. These scenes can play out sometimes three or four different ways. You have to have a start and end point for all the characters. Sometimes, even for someone as experienced as Steve, when you get into the actual scene as you edit, you want to move someone here or there. Sequencer is an incredibly powerful tool.

What I say to my guys is I want creative challenges, not technical challenges.

The humans decide whether to torture a captive ape.

GamesBeat: It’s good to see something like this. It wasn’t really on my radar.

Alltimes: What happened was, we didn’t know about PlayLink until February. We always planned to support multiplayer, and then we came to GDC this year to show it, and realized, “Hang on, we have to make sure we can support this feature. It’s going to be so important.” Having a simple interface with a controller is one thing, but if you can eliminate the controller and use a device that everyone’s got in their back pocket, and it takes two minutes to download from the app store—we really hope this initiative from Sony has traction. In so many living rooms, there’s a PlayStation or Xbox and only one person plays it.

GamesBeat: That’s what happens in my house. My daughter had five of her friends over and they’re all yelling at the one player with the controller in Until Dawn to do this or do that.

Alltimes: What’s happening organically when you’ve got people together is they want to be involved, because it’s interesting to see what happens next. We give them a way of being engaged.

Dean Takahashi

Dean Takahashi is editorial director for GamesBeat at VentureBeat. He has been a tech journalist since 1988, and he has covered games as a beat since 1996. He was lead writer for GamesBeat at VentureBeat from 2008 to April 2025. Prior to that, he wrote for the San Jose Mercury News, the Red Herring, the Wall Street Journal, the Los Angeles Times, and the Dallas Times-Herald. He is the author of two books, "Opening the Xbox" and "The Xbox 360 Uncloaked." He organizes the annual GamesBeat Next, GamesBeat Summit and GamesBeat Insider Series: Hollywood and Games conferences and is a frequent speaker at gaming and tech events. He lives in the San Francisco Bay Area.