This is the tech that will make games look better and cost less
Say hello to the Triple Indie
A couple of weeks ago I, a woman only a couple of inches over five foot, was transformed into a gargantuan hell beast inspired by Viking warriors. No, I wasn't hungry – it was all thanks to the innovative technology used by indie game developer Ninja Theory.
Ninja Theory has worked with big publishers on some huge games you'll no doubt have heard of, including Heavenly Sword and DmC: Devil May Cry, but its latest game, Hellblade: Senua's Sacrifice, is entirely its own work.
Rather than calling this an indie release, though, Ninja Theory has given Hellblade a category all of its own: Triple Indie.
What's a Triple Indie? Ninja Theory classifies it as a game that takes the incredibly realistic, polished visuals and combat style we’ve come to expect from Triple A titles and applies them to a refreshing and unusual setting, story and character that you’d perhaps be more likely to see tackled by an indie studio.
Revolving around a Celtic warrior called Senua, the game weaves together Celtic and Norse mythologies to tell a highly personal story, following her journey through the underworld and her battle with psychosis.
The once-gargantuan divide in the gaming industry between indie and Triple A titles is slowly starting to close, for a multitude of reasons. Thanks to the rise of digital stores and console manufacturers looking to foster and discover the next Rocket League, we’re seeing more indie titles released than ever before.
But, while it's now easier for them to publish on the big consoles and reach a wider audience, many indie developers still have to contend with budget, staffing and time constraints that many Triple A developers don't.
However, the fact that their games are largely digitally distributed and made on smaller budgets means indies generally face less pressure to sell in large volumes to justify their creation and make a return.
The grass isn't perfectly green on either side of the development fence, and it certainly seems like it would be a difficult task to merge the two publishing models.
Unavoidable expectations
Based on Ninja Theory’s history it’s easy to see why it's the right studio to tackle the challenge of creating a Triple Indie. Having worked on mainstream games for big publishers like Capcom for 14 years, the team understands what goes into a Triple A title.
However, it's also a small studio that’s managed to avoid being absorbed by the big publishers it's worked with – and it's now in a position where it can ride the indie wave and work directly with Sony to publish a game of its own.
That said, when it comes to making a game, having the knowledge and the idea is only half the battle. You also need the funds, and the ability to stay within budget.
Understanding how to work on a smaller scale and keep your ambitions in check is part of this, but Ninja Theory also had some seriously cool tech on its side.
Using advanced motion-capture and facial-capture technology, the team was able to create a game that looks as good as any Triple A title out there while significantly cutting both production and release costs.
We took a trip to the Ninja Theory studio in Cambridge to chat to the Hellblade team, and try out the groundbreaking technology it used to develop the game for ourselves.
Real-time cinematography
The combination of facial and body motion-capture technology that Ninja Theory used enables what the studio calls 'real-time cinematography'. Essentially, the studio is able to capture an actor's performance and transfer it, in real time, into Sequencer, the cinematic editing tool built into Unreal Engine 4.
In this editing suite the studio can edit the game footage like a film, with the added benefit of being able to change things like lighting and camera angles. In effect, it makes creating a game more like shooting a small-scale film, something Hellblade's creative director Tameem Antoniades says has made a huge difference in terms of time, cost and production value.
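To give a rough sense of what streaming a performance 'in real time' into the engine looks like under the hood, here's a minimal sketch of that kind of loop. Everything in it – CaptureStream, EngineCharacter, run_live_preview – is our own illustrative stand-in, not Ninja Theory's pipeline or Unreal's actual API: solved body and face data arrives frame by frame and is applied to the in-engine character while the scene is framed and lit around it.

```python
# Illustrative stand-ins only: not Ninja Theory's pipeline or Unreal's API.
import time

FRAME_BUDGET = 1.0 / 60  # refresh the in-engine character ~60 times a second


class CaptureStream:
    """Stands in for the mocap system delivering solved body and face data."""

    def read_frame(self):
        # In a real setup this would be a packet from the capture server.
        return {"skeleton": {}, "face": {}, "timestamp": time.time()}


class EngineCharacter:
    """Stands in for the digital character living in the game engine."""

    def apply(self, frame):
        # Drive the character's bones and facial rig from the latest solve.
        pass


def run_live_preview(stream, character, frames=600):
    """Stream roughly ten seconds of performance onto the character."""
    for _ in range(frames):
        start = time.time()
        character.apply(stream.read_frame())  # the performance streams straight in
        # Camera moves and lighting are edited separately, film-style,
        # while the performance keeps playing on the character.
        time.sleep(max(0.0, FRAME_BUDGET - (time.time() - start)))


run_live_preview(CaptureStream(), EngineCharacter())
```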
“There are many good things about the motion-capture tech,” he told us.
“Building games is hard, and each game has its own structure, so you tend to iterate a lot. Having a stage next door means we can literally just jump over and shoot scenes as we need them. It’s a much more fluid way of working, and it means you save a lot of money.”
When Antoniades says the stage is “next door” he’s not kidding. Just a few feet from where we’re sitting is a full filming studio fitted with motion-capture cameras, suits, a performance stage and an editing suite, where the game was created.
When we asked how much it cost to put the 3x3 meter stage together, the team told us it was around the price of a small smart car – a fraction of what a major studio would pay. The rigging, holding just nine motion-capture cameras, was sourced from Ikea, and the lighting squares from Amazon.
Before it had this studio, Ninja Theory's team would have had to book out slots in expensive motion-capture studios in far-flung locations, film everything it could and then work with what it had at a later date.
“It’s an expensive operation,” explained Antoniades. “Usually you shoot over five weeks for a game and then we wouldn’t see the final thing until around six months later.
“You'd see stages, but you wouldn't see anything final for months. With this stuff the only thing you don't see in real time is the final touches an animator puts in to tweak expressions. Other than that you're seeing it within milliseconds, which is amazing.”
Rather than requiring a crew of 20 to 30 people and a lot of expensive pre-planning, Hellblade was extremely small-scale. “Our crew on this was myself as director and cameraman, Melina as our video editor and actress, the cinematics team, technical artist and the sound guy,” says Antoniades. “That's it, we shot everything.”
Lower production costs mean, of course, that savings can be passed on to the player, and Antoniades told us this has directly resulted in Ninja Theory being able to release Hellblade at the relatively low price of £24.99 / $29.99 / AU$44.95.
Savings made and shared
However, he was also keen to point out that while costs might be reduced, quality isn't compromised: “I think the quality we’ve achieved is beyond what we’ve done in big studios in the past [...] in fact it’s one of the best-looking games we’ve made.”
Though this is partly because the scope of the game is relatively small, focusing on one actor and using one camera with no scene cuts, it's also because, as Antoniades says, “the nuance of the technology has just come on”.
To bring in the expensive tech, Ninja Theory worked with a number of outside collaborators – the motion-capture cameras were provided by Vicon, and the motion-capture suits by a company called Xsens.
Facial motion capture is an even more complex process, one that involved getting physical equipment from Cubic Motion and using facial-scanning software from 3Lateral, a highly specialized company based in Serbia.
The technology is already being used at larger scales by big filmmakers with big budgets, but the real test was whether it could be used affordably on a smaller scale.
“Yeah, you can totally scale it up,” Antoniades acknowledged. “The costs grow linearly. More cameramen and more actors means more money, of course. When we were shooting previously we'd normally have three cameramen and five actors on set. We can scale up to that, but this game was our proving ground really.”
Certainly, bringing this advanced technology to smaller studios in both film and gaming is likely to yield interesting and progressive results.
Ninja Theory told us Vicon had provided it with the cameras for the project because “they're interested in trying to make performance capture something that's not just in the reach of really high-end movie makers and game developers, but something that's in reach of smaller developers like us”.
Big things in small packages
Collaborating with third parties hasn't just been useful for Ninja Theory – the studio told us the relationship has also been helpful for Vicon, because it's meant the two companies have been able to experiment with new techniques together.
“Operating independently [without the input of a major publisher] as we do, we can be really flexible with these things. The presentations at GDC and SIGGRAPH – these things came about because we could say ‘let's just go for it’ and really push the tech. I think despite starting off as an innovative way to save costs we've actually been able to push the technology even further.”
Some of the tech being used to create Hellblade really is at the cutting edge, particularly in the area of facial capture.
“A really key piece of tech is the head camera that Melina [Juergens, the lead actress] wears for facial capture,” Antoniades tells us.
“People have been doing body motion-capture for upwards of 10 years, it’s quite established. But to capture a face in real time alongside it is brand new. No one’s done that before, but it makes a big difference.”
In order to effectively track Melina's face using Cubic Motion's head rig, the team had to fly out to Serbia to see 3Lateral, whose technicians used some 100 cameras to photograph Melina's face performing more than 100 expressions. From these they were able to create a highly accurate 3D model of Melina's face, the files from which could be used in conjunction with Cubic Motion's tech to capture a real-time performance.
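Ninja Theory didn't walk us through how those scanned expressions are used once they're back in the engine, but a common way to picture it is a blendshape model: the animated face is the neutral scan plus a weighted mix of the expression scans, with the weights updated every frame by the facial solve. Below is a minimal sketch of that idea using made-up vertex data – an assumption for illustration, not a description of 3Lateral's or Cubic Motion's actual tools.

```python
import numpy as np

# Made-up data: a 'mesh' here is just three vertices. A real face rig has
# thousands of vertices and on the order of 100 shapes, mirroring the 100+
# expressions that were scanned.
neutral = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
smile   = np.array([[0.0, 0.1, 0.0], [1.0, 0.2, 0.0], [0.0, 1.0, 0.0]])
blink   = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.8, 0.0]])

shapes = np.stack([smile, blink])   # scanned expression meshes
deltas = shapes - neutral           # offsets from the neutral face


def blend(weights):
    """Neutral mesh plus a weighted mix of expression offsets.

    In a live pipeline the weights would come from the head rig's
    computer-vision solve, updated every frame.
    """
    w = np.asarray(weights).reshape(-1, 1, 1)
    return neutral + (w * deltas).sum(axis=0)


print(blend([0.5, 0.0]))  # a half-smile
```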
While flying to Serbia just to get an image of a face may sound like an expensive hassle, it certainly sounds more time- and cost-efficient than the process that came before it.
“When we did our first game you’d have to take the little balls like the ones that are used on the body and stick them on the face of the actor. I can’t remember how many exactly, but I think you had to have around 80 or 90 every morning, and it would take two or three hours to glue on every single one.”
When we asked how using real-time facial capture made the process of making Hellblade different from that for other games, Antoniades told us: “In most games and animated films they capture face and body separately. They have actors doing the body, animators doing the face and someone else doing the voice.”
The problem with this is that it’s hard to get a cohesive performance, and often a director is left unable to tell if a body performance is completely right because they have no facial expressions to accompany it.
Creative cohesion
“You need to capture body, face and voice to get a full performance. For us, the suits with the markers capture the body, and the cameras all around the room are just triangulating the position of the dots so that you can move around. It captures with sub-millimeter precision, so it's super accurate.
"Usually when we’re shooting we have wireless microphones attached to the head piece to capture the voice. The facial capture head rig has two computer vision cameras that capture the face in stereo and using computer vision tech it can read where the eyes are. It captures the mouth, the eyes, the facial expressions, everything, just by looking at the face. It translates that in real-time so the face in the game is a 3D digital scan of Melina.”
What this means is that the game's lead actress is able to put on all the equipment, perform a scene from the game in its entirety – body, face and voice together – and see it rendered in the game world in real time.
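Antoniades's mention of the cameras "triangulating the position of the dots" maps onto standard multi-view triangulation: each camera sees a marker as a 2D dot, and combining those views pins down its 3D position. The snippet below is a minimal illustration of that idea with two made-up calibrated cameras – not the solver Ninja Theory's system actually uses.

```python
import numpy as np


def triangulate(projections, pixels):
    """Linear (DLT) triangulation of one marker seen by several calibrated cameras.

    projections: list of 3x4 camera projection matrices
    pixels:      list of (u, v) observations of the same marker
    """
    rows = []
    for P, (u, v) in zip(projections, pixels):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.stack(rows))
    X = vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean


# Made-up example: two cameras a meter apart looking at the same marker.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])    # shared intrinsics
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])              # camera at the origin
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])  # camera 1m to the right

marker = np.array([0.2, 0.1, 3.0, 1.0])                        # true 3D position
obs = [(P @ marker)[:2] / (P @ marker)[2] for P in (P1, P2)]   # what each camera sees
print(triangulate([P1, P2], obs))                              # recovers ~[0.2, 0.1, 3.0]
```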
“At the end of the day you're doing a scene with Melina performing and me as a cameraman,” explained Antoniades.
“We’re just working the scene like any movie or theater production. It’s just that instead of the stage being made out of cardboard and curtains, it’s digital. Other than that it’s just shooting, that’s what’s magical about it. It’s just storytelling with lots of wires.”
We'll admit we'd always thought that one day technology might get to the point where real-life actors wouldn't be required to bring a game character to life at all, and an emotional performance could be rendered solely by a computer – but Antoniades was very supportive of the continued use of real actors in games.
“I think you always need actors,” he said.
“Actors are easier to direct than a robot for one thing. Acting is storytelling, an art form in itself and I don't think there’s any computer or robot that’ll get close to that in our lifetime. Until then if you want to bring about emotion you can’t really do it any other way.”
When we asked whether this technology might attract more high-profile actors to games as a performance medium, in a manner similar to the recent upswing of big names on TV, the creative director didn't think it unlikely – though he suspected it would be more suited to theater actors than to those who work in film.
“On our first game, Heavenly Sword, we collaborated with Andy Serkis, and he's very much an actor's man. He believes the performance has to come before anything else, and that anything that destroys the performance is bad for the film or the game. The whole idea behind this tech is to preserve the actor's performance on set – when you know it's right, that's what you carry forward to the game.”
One of the most interesting things about the body motion capture is that any body can be mapped onto any digital character.
Though the team told us they tried to stay true to Melina, giving Senua her exact proportions and a similar face, that doesn't have to be the case.
We saw that for ourselves when we jumped into our own suit. Despite being of a similar build to Melina, we were able to become a monster twice her size and control its movements seamlessly in a way that looked completely natural. This creates a real freedom in casting parts and could lead to some incredible in-game performances.
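The studio didn't spell out the retargeting math for us, but the usual reason one performer can drive a completely different body is that what gets transferred is joint rotations, while the target skeleton supplies its own bone lengths. Here's a toy two-dimensional sketch of that idea – hypothetical numbers, purely for illustration:

```python
import numpy as np


def pose_chain(bone_lengths, joint_angles):
    """Forward kinematics for a planar joint chain: returns each joint's 2D position."""
    positions, angle, point = [np.zeros(2)], 0.0, np.zeros(2)
    for length, theta in zip(bone_lengths, joint_angles):
        angle += theta  # rotations accumulate down the chain
        point = point + length * np.array([np.cos(angle), np.sin(angle)])
        positions.append(point)
    return np.array(positions)


# The same captured rotations (shoulder, elbow, wrist, in radians)...
captured_angles = [0.4, 0.6, -0.2]

# ...drive a performer-sized arm and a monster arm twice the length.
performer_arm = pose_chain([0.30, 0.25, 0.10], captured_angles)
monster_arm   = pose_chain([0.60, 0.50, 0.20], captured_angles)

print(performer_arm[-1])  # fingertip of the performer-sized arm
print(monster_arm[-1])    # same pose, twice the reach
```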
Freedom to perform
It's extremely strange to see yourself essentially become a frightening creature before your eyes – like looking into a mirror born of the imagination of Stephen King. However, it's easy to see how this technology could drastically improve the quality of the performances we see in games, as well as dramatically reduce the time it takes to create them.
Excitingly, the technology isn’t limited to creating film and game content for later consumption – it can be used to create live interactive experiences.
The team has already proven how these can work by taking the tech to GDC 2016. On stage, Ninja Theory showed what appeared to be footage from the game, which then turned out to be an entirely live performance by the game's lead actress backstage.
They also took the technology to Facebook Live, where lead actress Melina donned the motion-capture tech and answered fan questions live on video as Senua.
Antoniades told us that the technology could be used both in VR and on stage:
“We’ve taken some of our motion capture scenes, we’ve done live demos to people where Melina is Senua in VR and you speak to her in VR in the game world and she can come and touch you. It’s really strange. You could have it in VR theme parks with monsters running around. It’s like having digital costumes so any world which is digital you can have live actors interacting with you.”
Getting a little ahead of ourselves, we asked how possible it would be to use the equipment as full-body controllers for games. Imagine, if you will, an MMORPG world like World of Warcraft where everyone is able to don a suit and a headset and dive in.
Fortunately, Antoniades was just as excited by this idea as we were: “Yeah absolutely! I think people will just get together and create fantasies with this kind of technology.”
Of course, it’ll have to get more affordable first.
Though it’s not absolutely perfect just yet, the motion capture technology used by Ninja Theory in the development of Hellblade: Senua's Sacrifice is some of the most exciting in game development at the moment.
What’s more exciting, though, is that it’s smaller independent developers that are pushing its boundaries and trying new things.
If this trend continues, the lines between indie and Triple A titles will increasingly blur; before too long we could find ourselves with a larger number of high-quality games that don't cost the Earth, and gaming could finally achieve the recognition it deserves as a powerful storytelling medium.
Emma Boyle is TechRadar’s ex-Gaming Editor, and is now a content developer and freelance journalist. She has written for magazines and websites including T3, Stuff and The Independent. Emma currently works as a Content Developer in Edinburgh.