In the age of Unity and Unreal, game developers came together during the current generation to crack the problem of how wild ideas become a tangible product.
The solution has been third-party tools, used by studios of all sizes. Few people grow their own tech, and the biggest hits these days are often built on someone else’s.
As the next generation of games hardware vaguely looms over the current fractured market – a console-smartphone-handheld-browser-PC-tablet mix few would have foreseen three years ago – most of us are expecting more of the same once that new hardware arrives. Studios can’t create technology for all of this alone, right?
Square Enix thinks otherwise. At E3 it unveiled its long-in-gestation Luminous Studio engine, a new proprietary technology built in-house.
Everything about Luminous seems to run contrary to the direction triple-A games are going. It’s not for sale to third parties. Square Enix thinks it can use it to form a cross-studio internal technology base, and reckons it can do so while keeping team sizes the same – or at least while empowering its artists and coders in the face of spiralling asset demands. Oh, and it will also rescue its ailing Final Fantasy series by living up to the dream offered by the franchise’s luscious cut-scenes.
The first demo for Luminous, called Agni’s Philosophy, is very slick. It tells a brief but classic Final Fantasy story about priests, magic and transforming demons.
It revels in every detail, introducing a dusty town built into a mountain-top, populated with characters who are strikingly beautiful and realistic, but also ugly, with stray hairs and odd-looking facial nuances.
When he sits down with Develop to talk us through replay after replay of the real-time CG effects, Square Enix’s Japanese CTO Yoshihisa Hashimoto pours out the details, his labour of love broken down into impressive stats and humble demonstrations of technical genius. Hashimoto was director of the Sonic games for ten years, before moving to Square Enix to head up its next-gen technology efforts.
One scene in his tech demo includes the following: seven hugely detailed character models including Agni herself, whose outfit is built from hundreds of feather assets, an evocative temple environment, plus a giant contorting beast that is evolving from over a hundred thousand smaller insects, themselves made up of thousands of polygon meshes layered with detailed particle effects.
Oh, and it’s all running in real-time.
At one point Hashimoto pauses the action and spins the camera around to reveal the warts-and-all textures hidden off camera, proving the content is real and not a trick.
Later, during a close-up of a bearded hobo, his facial scruff dynamically generated in real-time using tessellation, Hashimoto jokes, “More beard!” and “Santa Claus!”, lengthening strands of hair and recolouring the assets on-the-fly to comical but impressively swift effect.
He explains: “This has been a very big deal for our cinematic team, who would wait hours to see the rendering result of a single frame. Now, within seconds, they can edit characters and change appearances.”
He adds: “If an artist came up with the elaborate design [for Agni’s costume] for a game on PS3 or 360, we would really scold them. But this kind of design can be achieved with the powerful GPUs on the market now, and we can make it look realistic.”
Part of this seems to be thanks to a clever, if undisclosed, way the engine is running live with Autodesk’s Maya to allow for live editing of assets in the game renderer – although Square doesn’t go into details.
All in all, it is very slick, instantly catapulting Square Enix into a select league of studios that have dared to show super-HD games for PC and next-gen consoles before the latter are even announced.
At E3, Square Enix contemporaries LucasArts, Epic Games and Ubisoft Montreal were the only others happy to break away from Sony and Microsoft’s sealed-lips policy to show equally marvellous videos for Star Wars 1313, Unreal Engine 4 and Watch Dogs.
However, any developer that has built a tech demo will tell you that they are much of a muchness. Smoke and mirrors. Yes, Agni’s Philosophy’s mirrors are probably among the most detailed, reflective surfaces yet seen. And if it were for an actual game, the fanboys would be frothing. But the fact is that CG showboating is still showboating.
What really intrigues about Luminous Studio is the impact Square Enix hopes it will have on the craft of making games.
Fittingly for a Final Fantasy concept, Agni’s Philosophy began with a CG cut-scene. Since Final Fantasy VII, the series has been notorious for almost jarring switches from beautiful rendered cinematics to gameplay featuring vastly downgraded, less expressive character models.
The dramatic jump between movie and gameplay may have been levelled out over the years as game technology has improved, but Luminous goes further, blending the fields of traditional CG and in-game rendering.
That might seem like a bit of a non-revelation, but to Square Enix and the many other triple-A studios that still fall back on video teams to generate the unforgettable CGI segments that prop up a game, it offers a unique change to the workflow and to how teams can be structured.
“To create this demo, we requested the visual arts team create a pre-rendering CG first, and that was ported into our engine,” explains Hashimoto, showing the two running side-by-side. The comparison of the two inevitably shows the tech demo outshining the original and comparatively rough source material. It’s a production draft, after all.
But the marvel here is that it has taken high-end assets and swiftly translated them into game-engine material. When it comes to things like lighting, the look and feel of the art-driven source material has been perfectly replicated in the demo.
“The same people who worked on the CG then worked on the lighting and design of the actual demo,” adds Hashimoto, extolling the inherent values of mixing the game and CG video workforces.
“If you want to fine-tune pre-rendered CG, rigging the lighting and changing the expressions takes time – but in our real-time technology it’s just a few minutes of tweaking. We’ve been able to easily change and address many of the subtleties quickly.
“Ultimately, what we have done is create a movie as a visual work, and it was ported into real-time. And we felt we became more flexible editing that using Luminous Studio compared to just creating it as a pre-rendering.”
CHASING THE DRAGON
In terms of gameplay, much of the motivation here centres on what Square Enix has always championed: characterisation.
“Square Enix highly values in-game characters – so the hair, skin, face and details. That’s what we focus on,” says Hashimoto.
You don’t have to look hard for examples of Square’s eternal search to get this right. The expensive foray into CG movies with Final Fantasy: The Spirits Within, or the emphasis on faces in the publisher’s 2011 Deus Ex sequel, are just two. This new footage of Agni’s dragon pursuit seems to be, finally, the dream realised.
Yet the actual humans the company employs to make its games might in fact be the ones escaping from the nightmare trappings of development. Team sizes could actually fall in this new era of high-end technology, says Hashimoto – and if not, the shift will at least allow artists more creative freedom.
“If you are making games like those we’ve seen on PS3 or 360, I can say you will be able to reduce your staff numbers with technology like this,” he says.
“However, this level of quality game [in Agni’s Philosophy] will require a lot of additional material, so you’ll probably end up employing more artists and designers. So that would offset reducing any staff.
“But things that previously used to require a lot of manual input, such as smoothing out errors or flaws, will switch to procedural or automated computer function.”
The emphasis in the next generation, he says, will shift to aesthetics, not programming. He doesn’t mention it, but it’s no secret that Square Enix for one has struggled with the demands of the current generation. A project like Final Fantasy XIII needed hundreds of programmers and artists, and even then was beset by delays.
Hashimoto won’t go on the record as saying his technology can kill off the cut-scene, though. For a start, things like “tiny changes to costumes” on the fly could cause headaches for designers and load for GPUs and CPUs – so whenever Agni shows up in a real Final Fantasy, it’s likely her adventure will still be punctuated by static video clips bookending gameplay segments.
“But these days you want seamless movement between cut-scenes and gameplay – that’s why we highly value having such high quality gameplay that looks like CG,” he says.
“We are hoping that, as a game engine, Luminous means we will not have to reduce any of the capabilities shown by the on-screen characters.”
It is progression, at the very least. The difference between cut-scene and gameplay – even cut-scenes produced in-engine but still rendered out as movies – has been a concession developers have always had to make, often with only minimal satisfaction. Gamers might not notice it outright, but Luminous’ efforts to make the fantastical feel consistently real probably have a bigger pay-off in the long term.
Provided there is a long-term future for this kind of game, that is.
In the face of a fractured market, with many in the industry predicting the death of consoles and triple-A losing its lustre as a result, cynics might say that Square Enix is setting itself up for disappointment. Luminous Studio won’t be sold to third parties either, so it could be an expensive disappointment at that.
But Hashimoto is resolute, and Square Enix says that high-end triple-A isn’t going extinct any time soon.
“We can confidently say that the quality of the graphics at least will be graphically improved, the AI and animation will be dramatically better too, whatever the next formats are,” he says.
“We wanted to deliver a message to the games industry and gamers: Square Enix is ready for these high quality games, using Luminous as our game engine.
“More realistic and more lifelike characters will appear in games,” he adds, in a matter-of-fact manner.
“[Agni’s Philosophy] is a Final Fantasy, there will be a Final Fantasy that looks like this,” he adds for emphasis.
“We wanted to make sure that expectations were rising. This is a glimpse of a game coming up pretty soon, it’s not that far from now. Technologically, we are now ready for any platform. And we want to inspire other game creators to make something like we have too.”
Certainly, Hashimoto is seeing that happen in-house. Key elements of the environment in the Agni demo were actually designed by Tomb Raider developer Crystal Dynamics. It’s the perfect metaphor for Square Enix: a character born in Japan, finding its way in Western surroundings.
“We all share, and are very friendly,” says Hashimoto, talking about sister studios IO Interactive (Hitman) and Eidos Montreal (Deus Ex), when Develop broaches the topic of how the Japanese fear of sharing technology and secrets seems to be evaporating as the country’s console market comes under increasing pressure.
Ultimately, what Square Enix is building in Luminous isn’t just an impressive and powerful game engine, but a bridge. Between its talent. Between its studios. Between its ambitions and reality. Between console generations. And perhaps even across the uncanny valley, territory forever unconquered by developers.
Not everyone will agree that this is the way forward. In an age of Unreal and Unity, Luminous strives to be realistic and divisive. Maybe that is the secret to the next generation.