By Holger Gruen, posted May 12 2016 at 10:42AM
Authors: Jeroen Soethoudt (Nixxes), Jurjen Katsman (Nixxes) and Holger Gruen (NVIDIA)
This blog explores how we added support for true HDR to a real-time 3D application, focusing on the challenges tied to game development.
With the advent of Ultra High Definition (UHD) television, displays are taking a substantial step forward compared to the standards developers have been used to for the past two decades. While UHD as a specification was originally developed for televisions, the technologies involved will start to impact many classes of displays soon.
The two dimensions of improvement in these new displays are often thought of as wider gamut and higher dynamic range (HDR). Let’s concentrate on HDR for this blog post.
Many games already render to a higher dynamic range internally; tone mapping has been applied for many years to map that range down to standard dynamic range (SDR) displays.
HDR-capable displays now put higher demands on tone mapping functions, as the desired output is no longer just SDR. In fact, NVIDIA’s driver allows for displaying linear-space, floating-point frame buffers, so tone mapping functions need to evolve to cater for this. As a great introduction to this topic, please make sure to check out Evan Hart’s blog ‘Preparing for Real HDR’.
One key element that greatly increases the realism of HDR is physically-based content and lighting models (PBR). NVIDIA’s new Pascal GPUs offer unprecedented capability to support physically based rendering approaches.
Working in close collaboration, NVIDIA, Crystal Dynamics and Nixxes implemented true HDR display support in ‘Rise of the Tomb Raider’ as a way to push the graphic detail to unprecedented heights.
Entering the Tomb
Featuring epic, high-octane action moments set in the most beautiful, hostile environments on earth, Rise of the Tomb Raider delivers a cinematic, survival action adventure where you will join Lara Croft on her first tomb raiding expedition as she seeks to discover the secret of immortality.
Screenshot 1 – A cinematic scene with great HDR potential
Amongst other things, the great cinematic quality of this fantastic game is what got us at NVIDIA interested. Also, an analysis of many captures of the game showed that the pre-tone mapping data had truly amazing scope for HDR. Specifically, we were looking for plausible, real-world levels of illumination and realistic relative luminance magnitudes. In other words, the engine isn’t cheating with the luminance values of light sources and isn’t faking SDR brightness effects. These realistic luminance levels allow for strong highlights that will take advantage of HDR displays. Finally, the advanced HDR-ready engine technology was another great fit because it largely used PBR throughout, allowing easy integration of true HDR support. All these findings led us to conclude that Rise of the Tomb Raider would become an even more immersive experience with true HDR.
NVIDIA technology can help reduce the development effort required to make your game HDR ready. To that end, we provide interested developers with an API for dealing with HDR displays as well as C++/shader source for integrating filmic ACES tone mappers.
A few prerequisites should be met to make sure you get the best HDR experience out of a game:
Make sure your rendering pipeline supports PBR and floating point surface formats right until the end of the frame.
Remove all clamping or saturating operations from shaders that run after the tone mapping pass.
Ideally move all of your post-processing effects to run before tone mapping happens.
Switch over to a tone mapper that properly understands output display luminance. We’ve found an ACES-derived pipeline to work well.
Let’s now talk about what was done to get HDR in ‘Rise of the Tomb Raider’ up and running.
Switching your Display to HDR mode
Before you can display any linear space HDR surface on your display, you need to switch it to HDR mode. This is where NvAPI comes into play. It offers functions for:
Querying the HDR capabilities of an attached display
Enabling HDR output and for configuring the HDR display for HDR output
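Put together, the query-and-enable flow might look roughly like the sketch below. This is a non-runnable illustration, not production code: it assumes the NvAPI SDK headers, and the exact structure versions, field names and enum values should be verified against the SDK documentation for your driver release.

```cpp
#include "nvapi.h"

// Sketch: query the display's HDR capabilities, then request HDR10
// (ST.2084) output. Names follow the NvAPI SDK headers; verify against
// the SDK version you ship with.
bool TryEnableHdr(NvU32 displayId)
{
    NV_HDR_CAPABILITIES caps = {};
    caps.version = NV_HDR_CAPABILITIES_VER;
    if (NvAPI_Disp_GetHdrCapabilities(displayId, &caps) != NVAPI_OK)
        return false;                       // query failed
    if (!caps.isST2084EotfSupported)
        return false;                       // display can't accept HDR10

    NV_HDR_COLOR_DATA hdrData = {};
    hdrData.version = NV_HDR_COLOR_DATA_VER;
    hdrData.cmd     = NV_HDR_CMD_SET;
    hdrData.hdrMode = NV_HDR_MODE_UHDA;     // enable HDR output
    return NvAPI_Disp_HdrColorControl(displayId, &hdrData) == NVAPI_OK;
}
```

Even when these calls succeed, the display itself may not have switched modes, which is exactly the problem the debug bands below are designed to catch.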
The experience at Crystal Dynamics/Nixxes is that it isn’t always easy to find out whether your display has truly switched to HDR mode. To verify HDR mode is really on, a special debug mode was added to the game’s tone mapping shader. This mode (see Screenshot 2) clamps all color components to 1.0 in every second horizontal band. If the display for some reason doesn’t properly switch to HDR mode, these bands will not be visible, as everything above 1.0 gets clamped to SDR white anyway. So if you have HDR content on screen and you can see the bands, you can be sure your display has truly switched to HDR mode. If you don’t see any bands, chances are high that the display has erroneously stayed in SDR mode.
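The band test can be sketched as a small post-tone-map helper. The game’s version lives in the tone mapping shader; here it is rendered as C++ with hypothetical names for clarity:

```cpp
#include <algorithm>

struct Rgb { float r, g, b; };

// Clamp the color to SDR white (1.0) in every second horizontal band.
// On a true HDR display the unclamped bands appear visibly brighter;
// in SDR mode everything above 1.0 is clamped anyway, so the bands vanish.
Rgb DebugBands(Rgb c, int pixelY, int bandHeight)
{
    bool clampBand = (pixelY / bandHeight) % 2 == 1;
    if (clampBand) {
        c.r = std::min(c.r, 1.0f);
        c.g = std::min(c.g, 1.0f);
        c.b = std::min(c.b, 1.0f);
    }
    return c;
}
```

Only pixels brighter than 1.0 make the bands visible, so make sure genuinely HDR content is on screen when running this test.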
Screenshot 2 – Debug bands. Overall colors are incorrect here because the range had to be adapted to show this screenshot in SDR; this is why the debug bands look dark grey instead of SDR white.
When displaying an SDR image in HDR mode without any HDR content, most HDR displays tend to boost the luminance of the SDR image to make it look brighter and more compelling. So if you wish to compare an SDR image and an HDR image side by side with the monitor in HDR mode, the SDR image will look very dark in comparison. We have found that in order for that comparison to make sense we needed to boost the brightness of the low dynamic range (LDR) image by some factor to mimic what the display does for SDR images.
Once you are sure you have your display truly in HDR mode, it is time to improve your tone mapping shaders to ensure that you get the best user experience from the extended brightness range that can be shown. Often games tend to use a Reinhard-style tone mapper to deal with mapping to the SDR range. This was also the case for ‘Rise of the Tomb Raider’.
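A Reinhard-style operator, in its classic generic form, maps each input value x to x/(1+x); the game’s exact variant may differ, but the sketch below shows why this family of curves is a poor fit for HDR output:

```cpp
// Classic Reinhard tone mapping: monotonically maps [0, inf) into [0, 1).
// Highlights compress ever closer to 1.0, which is exactly why SDR output
// can never show a 10x brighter highlight as 10x brighter.
float ReinhardTonemap(float x)
{
    return x / (1.0f + x);
}
```

Everything above roughly 10x mid-grey lands in a tiny sliver just below 1.0, throwing away the highlight detail an HDR display could otherwise reproduce.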
Let’s discuss now how to integrate filmic ACES tone mappers – again check out Evan Hart’s blog ‘Preparing for Real HDR’ for a great intro.
Integrating ACES filmic tone mappers
Using the C++/shader source for setting up ACES tone mapper shader constants as well as the shader sources provided by NVIDIA Developer Technology, it was easy to add support for a range of ACES tone mapping curves to the game (see Figure 1). All that needed to be done was to utilize the available C++ classes to set up constant buffer data that was then fed into the provided ACES tone mapping shader.
Figure 1 – Tone mapping functions (horizontal axis: input color/luminance, vertical axis: output luminance in nits)
In the resulting images, it was immediately visible that the ACES 1k nits tone mapper made a big difference in bringing out more vibrant colors in the darker part of the image when compared to the Reinhard tone mapper originally used by the game.
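To give a feel for the curve shape, the snippet below uses Krzysztof Narkowicz’s well-known analytic approximation of the ACES filmic curve. Note this is an SDR-targeted fit for illustration only, not the NVIDIA-provided ACES shader source, whose HDR variants parameterize the output luminance in nits:

```cpp
#include <algorithm>

// Analytic fit of the ACES filmic tone curve (after Narkowicz). Compared
// to Reinhard it keeps a steeper slope through darks and mids (more
// contrast and color) while still rolling off highlights smoothly.
float AcesFilmApprox(float x)
{
    const float a = 2.51f, b = 0.03f, c = 2.43f, d = 0.59f, e = 0.14f;
    return std::clamp(x * (a * x + b) / (x * (c * x + d) + e), 0.0f, 1.0f);
}
```

The steeper mid-tone slope is what produced the more vibrant colors in the darker parts of the image compared to the original Reinhard mapping.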
Yet, everything was quite dark and didn’t adapt to the overall change in brightness in the scene. It then dawned on us that we hadn’t yet hooked up the auto-exposure data that gets fed into the original Reinhard tone mapper of the game. So let’s now discuss how that was done.
Feeding auto-exposure data into the ACES tone mapper
In reality, all that was left to do was to hook up the auto-exposure that the game applies before feeding color into the ACES tone mapper. To get this working, the ACES shader was changed to do exactly what the original tone mapper does for auto-exposure.
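Conceptually, the hookup amounts to scaling the scene color by the auto-exposure value before the tone curve runs, just as the original Reinhard path did. A minimal sketch with hypothetical names (using a simple Reinhard curve as a stand-in; the game feeds the exposed color into the ACES shader instead):

```cpp
struct Rgb { float r, g, b; };

// Stand-in tone curve for illustration purposes.
float Tonemap(float x) { return x / (1.0f + x); }

// Apply the engine's auto-exposure before tone mapping, so the tone curve
// sees the same adapted luminance the original path did.
Rgb ExposeThenTonemap(Rgb c, float exposure)
{
    return { Tonemap(c.r * exposure),
             Tonemap(c.g * exposure),
             Tonemap(c.b * exposure) };
}
```

With exposure applied, the image brightness tracked the scene again instead of staying uniformly dark.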
We wanted to be able to compare SDR and HDR, and thus a split screen mode was implemented for debugging purposes. As mentioned above, we have found that in order for that comparison to make sense one needs to boost the brightness of the LDR image by some factor to mimic what the display does for SDR images.
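A minimal sketch of such a split-screen compare follows, with hypothetical helper names; the boost factor is display dependent and deliberately left unspecified here, as in the text:

```cpp
struct Rgb { float r, g, b; };

Rgb Scale(Rgb c, float s) { return { c.r * s, c.g * s, c.b * s }; }

// Split-screen HDR vs. SDR debug view: the left half shows the HDR result,
// the right half shows the SDR result boosted by sdrBoost so it isn't
// unfairly dark next to HDR output while the display is in HDR mode.
Rgb SplitScreenCompare(Rgb hdr, Rgb sdr,
                       int pixelX, int screenWidth, float sdrBoost)
{
    return (pixelX < screenWidth / 2) ? hdr : Scale(sdr, sdrBoost);
}
```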
In general, sRGB specifies a standard level of luminance, but user settings and device differences can lead to varying results on displays in LDR mode. When displaying in HDR mode, the SDR part of the contents written to the screen will be displayed at a standard level.
After bringing up the split-screen view, we saw a truly vibrant difference, as shown in Screenshots 3–5. (Clearly, it is tricky or impossible to truly show the improvement without an actual HDR setup. Instead, we took photos of the HDR monitor displaying HDR in split-screen mode to be able to show the difference.)
Screenshot 3 – HDR (left) vs SDR (right)
Screenshot 4 – HDR (left) vs SDR (right)
Screenshot 5 – HDR (left) vs SDR (right)
As many games do, ‘Rise of the Tomb Raider’ employs a 3D LUT to perform color grading and correction after tone mapping. This is something that doesn’t immediately translate into tone mapper output in HDR space. Let’s now discuss what was done to address this remaining issue.
Color grading in HDR space
We needed to decide what to do about the color grading that the game applies via a 3D texture lookup. As the color channels may now produce HDR values well above 1.0, the original lookup table simply can’t be used anymore.
Our first attempt was to extrapolate how hue and brightness changes introduced by the SDR color grading LUT would translate into HDR space. Unfortunately this resulted in undesirable hue changes in some of the HDR colors and thus this approach got dropped. Instead a combination of simple gain, bias, contrast enhancement, and saturation functions were used to match the desired look.
Please note that, in general, LUTs still work with HDR, but the original LUTs were designed specifically for SDR output after tone mapping. Applying the grading functions prior to tone mapping allows you to modify the color independently of the target display.
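The replacement grading can be sketched as a chain of the simple operators mentioned above, applied before tone mapping. The parameterization below is hypothetical (the contrast term is written as one common power-curve formulation), and unlike an SDR 3D LUT it works on unbounded HDR values:

```cpp
#include <algorithm>
#include <cmath>

struct Rgb { float r, g, b; };

// Simple HDR-safe color grading: gain, bias, contrast (power curve) and
// saturation (lerp between luma grey and the graded color).
Rgb Grade(Rgb c, float gain, float bias, float contrast, float saturation)
{
    auto one = [&](float x) {
        x = x * gain + bias;                        // gain + bias
        return std::pow(std::max(x, 0.0f), contrast); // contrast
    };
    Rgb g { one(c.r), one(c.g), one(c.b) };
    float luma = 0.2126f * g.r + 0.7152f * g.g + 0.0722f * g.b;
    return { luma + (g.r - luma) * saturation,
             luma + (g.g - luma) * saturation,
             luma + (g.b - luma) * saturation };
}
```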
If this blog got you interested in adding HDR support to your game read on for our ‘Call to Action’.
Call to Action
HDR is here to stay, and more and more displays will support HDR. As described in this blog, adding HDR support to your game is reasonably straightforward, and there is a lot of help from NVIDIA in the form of NvAPI and source code for ACES tone mappers.
If you are still early enough in the development cycle of your game make sure that color grading isn’t abused to fix lighting or other artefacts in your rendering pipeline. Instead try to restrict your color grading operations to combinations of simple gain, bias, contrast enhancement, and saturation functions to achieve the desired look.
Thank you to Jurjen Katsman and Jeroen Soethoudt for all their help and engineering support.
2016 marks the 20 Year Celebration of the original “Tomb Raider” game, so it seemed fitting that Crystal Dynamics, Nixxes and NVIDIA collaborated to add true HDR to “Rise of the Tomb Raider”. Congratulations Lara Croft, you’ve been an icon for 20 years and look better than ever.