Few things get the industry buzzing so much as news of a product launch. Yesterday, it was Sony that lit up feeds with the announcement of its much-awaited PlayStation 5.
Four years in the (very secretive) making, the new console is set to boast a range of features, including a high-end custom SSD, 8K graphics, and backwards compatibility.
But it’s the inclusion of ray-tracing technology in a gaming console - for the first time - that has our ears pricked at Jumbla.
Shedding light on a new era of technology
Real-time ray tracing is the most realistic form of light rendering technology currently available.
Its algorithm determines where a scene’s natural light and shadows fall in relation to a viewer and their surroundings. This data is continually updated as the viewer moves or as objects in their environment interact.
Say you’re playing a video game. Ray tracing ensures that ancient chest you see a hundred metres away is lit according to the environmental conditions around it. It also ensures any light bouncing off the chest is rendered accurately, based on calculations of the light’s colour, the distance it travels, and how much of it survives each ‘bounce’.
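To make the idea concrete, here’s a minimal, hypothetical Python sketch of the core calculation: fire a ray at an object, find where it hits, and shade that point by how directly the light strikes it. None of this reflects any particular engine’s code - it’s the textbook ray-sphere intersection test paired with simple Lambertian (diffuse) shading.

```python
import math

# Illustrative sketch only: trace one ray against a single sphere
# ("the chest") and shade the hit point with Lambertian lighting.

def sphere_hit(origin, direction, centre, radius):
    """Return the distance along the ray to the sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, centre)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c          # direction is unit-length, so a == 1
    if disc < 0:
        return None                  # the ray never touches the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, centre, radius, light_dir):
    """Brightness seen along a ray: falls off with the angle between the
    surface normal and the light direction (Lambert's cosine law)."""
    t = sphere_hit(origin, direction, centre, radius)
    if t is None:
        return 0.1                   # background / ambient term
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, centre)]
    diffuse = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return 0.1 + 0.9 * diffuse

# A ray fired straight down the z-axis at a sphere 5 units away,
# lit from behind the camera: the nearest point faces the light head-on.
brightness = shade((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0, (0, 0, -1))
print(round(brightness, 2))  # 1.0
```

A real renderer repeats this for millions of rays, against far more complex geometry, and follows each bounce recursively - which is where the processing cost discussed below comes from.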
Today, this technology is the sole domain of PC hardware. But Sony is set to revolutionise the gaming industry by becoming the first manufacturer to offer real-time ray tracing technology in a console. And it’s going to lead to some impressive visuals.
As Jumbla 3D artist Richard Shilling explains, having access to real-time ray-tracing is like bridging the gap between taking a picture on a phone and shooting a professional photo with a DSLR.
“A high-quality camera will capture the true depth of field and accurate colours, but with a phone, you have to simulate these kinds of things,” Richard said.
“The advantage of the smartphone is its accessibility. The final image is instantly viewable, but its quality will never be as good as the photo taken with a top-line camera - there’s always a compromise.
“In that sense, adding real-time ray tracing to a console is like replacing its internal smartphone camera with a professional DSLR. It produces instant results - like the smartphone - but without compromising image quality. It’s going to lead to some seriously awesome graphics.”
This stands in contrast to current real-time light rendering technology (simulated ray tracing, performed in a screen-space environment), which estimates how light and shadows fall in a scene and compensates for missing data - the smartphone camera of Richard’s analogy, and the approach current-generation consoles take.
While 3D artists have had access to true ray-tracing technology for some time now, Josh and Richard say its use has been restricted to specialised rendering software, designed for traditional film and animation media.
How does ray tracing work?
“When calculating ray tracing, the machine performs a huge amount of math to see how light bounces off a scene, the material composition of objects within it, and how it all interacts with the light,” Jumbla Motion Designer Josh Le Good explained.
“This massive load means it can take anywhere between several minutes and several hours to render a single frame, depending on the complexity of the scene.
“So it’s pretty incredible that games can now use real-time ray tracing to render light in milliseconds as viewers move through an environment.”
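Josh’s ‘huge amount of math’ is easy to appreciate with a back-of-the-envelope count of how many ray-object intersection tests a single 4K frame might involve. Every figure below (samples per pixel, bounce count, object count) is an illustrative assumption, not a measurement from any real engine:

```python
# Rough sketch of the per-frame workload. All numbers are assumptions.

width, height = 3840, 2160        # 4K resolution
samples_per_pixel = 4             # rays fired per pixel for anti-aliasing
bounces = 2                       # each ray may bounce off two surfaces
objects_in_scene = 1_000          # candidate objects to test per ray

primary_rays = width * height * samples_per_pixel
total_rays = primary_rays * (1 + bounces)          # primary + bounce rays
intersection_tests = total_rays * objects_in_scene # naive brute force

print(f"{total_rays:,} rays, {intersection_tests:,} tests per frame")
```

Even this toy scene lands near a hundred billion naive tests per frame, which is why real renderers lean on acceleration structures such as bounding volume hierarchies - and why dedicated ray tracing hardware matters.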
To illustrate the significance of this achievement, Rich points to how long it currently takes to render a single second of footage of a bee flapping its wings.
For simulated ray tracing? It’d only take a couple hundred seconds.
With traditional renderers? One hour.
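The gap between those two figures comes straight from per-frame render times multiplied out over a second of footage. Assuming a 24 fps frame rate and illustrative per-frame times chosen to line up with the numbers above:

```python
# Back-of-the-envelope arithmetic; the per-frame times are assumptions.

frames_per_second = 24                 # a common animation frame rate

simulated_per_frame = 8.5              # seconds per frame (assumed)
traditional_per_frame = 150            # seconds per frame (assumed)

simulated_total = frames_per_second * simulated_per_frame      # ~200 s
traditional_total = frames_per_second * traditional_per_frame  # 3600 s

print(f"simulated: {simulated_total:.0f} s, "
      f"traditional: {traditional_total / 3600:.0f} h")
```

True real-time ray tracing collapses that per-frame cost to milliseconds, which is what makes the shift described next so significant.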
Now imagine you no longer had to ‘simulate’ ray tracing, but could achieve true ray tracing in real time. With the latest developments from software companies such as Epic Games and Unity Technologies, and hardware developer NVIDIA, it’s possible.
This is a game-changer for an industry bound by hard deadlines.
Borrowing another trade’s tools
Perhaps it’s no surprise that animation studios and filmmakers like Neill Blomkamp of District 9 fame have been borrowing game engines for their projects.
“This is because game engines are so good at working and rendering in real time, at high quality,” Josh said.
“It’s great for us, because it helps us see our changes instantly, rather than having to wait, review it in low-res, and then make another change.”
Richard says Unreal Engine and Unity are common gaming engine tools the animation industry uses to access real-time ray tracing.
“In addition to the speed the engines offer,” he added, “they’re also accessible for smaller studios. There’s no up-front cost attached to Unity or Unreal Engine, for example.”
For Josh, the democratisation of new technology like real-time ray tracing presents huge opportunities for animation and gaming studios alike.
“The technology’s been around for ages,” he said. “It’s just that up until recently, it’s been too taxing for standard game console hardware to deal with.
“As it’s become more accessible, smaller animation studios have been able to implement stunning visual effects like ray tracing without needing massive amounts of processing power, render farms or huge overnight renders to get there.
“This is all achievable with consumer-grade graphics cards that create amazingly beautiful, accurately lit scenes in real time.”
Entering the uncanny valley
So what does all of this mean for Sony’s latest offering?
“For the video game industry and PlayStation owners, we’re going to see clearer visuals, higher resolution textures and surfaces, and much more realistic lighting and shadows,” Josh said.
“You’ll be able to see objects and scenery from much further away, and from a dev perspective, it’ll mean reduced load time between game sections.”
For Rich, there’s also an expectation the growing accessibility of real-time ray tracing technology will influence the way animated short films are produced.
“Given that Unity and Unreal Engine are so accessible, I wouldn’t be surprised if small studios used that technology to create their own pieces entirely within these engines,” he said.
“Once we’ve seen enough examples of real-time ray tracing used, it will become a standard tool in a studio’s creative arsenal.
“This will enable smaller, boutique studios to create potentially award-winning work, without the overheads that come with, say, operating a render farm to achieve similar results.”