Here's how AMD will make games, movies and photos look better than ever in 2016
AMD's Radeon Technologies Group (RTG) just laid out its roadmap for improving visual technologies in 2016. What's in store? Better, brighter, and higher resolution displays that are truer to life than ever.
Higher resolution displays are not the be-all-end-all of image quality. (Image Source: AMD)
Here’s some news – more pixels isn’t always the way to go if you want better image quality. This may seem counterintuitive in a time when display manufacturers are shoving a growing legion of 4K displays in your face, but AMD thinks that the push to cram more and more pixels into displays can only go so far.
It may just be on to something here. At the recent Radeon Technologies Group (RTG) Summit, AMD set out its plans for improving its visual technology offerings in 2016. Part of that was the announcement that FreeSync would be available over HDMI in 2016, but a far meatier portion of the session was dedicated to its roadmap for enhancing the image quality on displays on both the software and hardware front.
According to AMD, we need more than just displays with more pixels. More than anything, we need “better” pixels. That’s quite a vague term, but what AMD really means is that both displays and GPUs need to support higher contrast ratios, a higher peak luminance, and larger color spaces.
Brighter displays and higher contrast ratios
If you’ve looked at monitor specifications while out shopping, you’ve probably seen brightness quantified in cd/m² (candelas per square meter), or nits. Most mainstream displays top out at around 250cd/m², and even high-end displays like the Dell UltraSharp U3014 and ASUS PG279Q only reach 350cd/m².
You may think that this is plenty bright – and it probably is – but there’s still a lot that you’re missing out on. For one, the human eye can perceive a far wider range of brightness than today’s top displays can reproduce. As the slide below shows, the color and contrast range that Standard Dynamic Range (SDR) can display is just a fraction of what we can actually see.
SDR doesn't do justice to what the human eye is capable of. (Image Source: AMD)
In comparison, High Dynamic Range (HDR) displays are a much better match for what our eyes are capable of, resulting in more vibrant images that are truer to life. As it stands, current displays don’t have a wide enough brightness range to properly display HDR content, which is defined by the range of brightness and contrast ratios within an image.
Sufficiently high contrast ratios are really only possible with LCDs that have LED local dimming, or with self-illuminating OLEDs – both allow control over individual sections of the screen, so one portion can stay bright while another goes dark. RTG says we can expect to see HDR-capable displays on the market in 2016: LED-backlit screens with local dimming are expected to hit up to 2,000cd/m² by the end of the year, while their OLED counterparts will reach 1,000cd/m².
The first step to HDR-capable displays is the production of higher luminance screens. (Image Source: AMD)
Displaying more colors
Unfortunately, display specifications are largely out of AMD’s control, and it can only hope that the hardware keeps pace with its roadmap. Of course, bringing HDR to PCs requires more than just brighter displays – it also calls for a larger color space and changes to how colors are stored. This is one area where the RTG can play a more active role.
For one, it is taking steps to support a larger color space, which essentially means the ability to display more colors. The graph below shows the different color spaces, and unsurprisingly, human vision is the largest, encapsulating all the others. After all, digital reproduction is still playing catch-up with the human eye.
The BT.2020 color container is required to support HDR content. (Image Source: AMD)
Most monitors today use the traditional sRGB color space, which is unsuitable for true HDR because it is not large enough to properly represent colors at the extreme ends of the brightness curve. While support for AdobeRGB goes some way toward improving on this, RTG’s ultimate goal is to support the BT.2020 color space, the standard the consumer electronics industry has agreed on for HDR content. 4K Blu-ray content will be mastered in BT.2020, and other content is likely to follow. As the graph shows, BT.2020 is the color space closest to the human visual range.
In addition, the larger BT.2020 color space will require changes in how colors are encoded. The problem is that today’s consumer content is designed around the Rec.1886 electro-optical transfer function (EOTF), which models the response of CRT displays dating back to the 1930s and accounts for brightness levels only up to 100cd/m². This ends up conflating different black levels and results in a loss of detail in dark areas – hardly ideal for HDR content.
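To get a feel for why a CRT-era gamma curve struggles in the shadows, here’s a minimal Python sketch. It assumes a simplified Rec.1886-style power curve with gamma 2.4 and the 100cd/m² peak mentioned above; the function name and exact model are illustrative, not AMD’s.

```python
# Illustrative sketch of a simplified Rec.1886-style gamma-2.4 EOTF
# with a 100 cd/m2 peak, showing how coarse shadow steps are in 8-bit.

PEAK_NITS = 100.0  # SDR reference white assumed in the text above
GAMMA = 2.4        # CRT-derived power curve

def luminance(code: int, bits: int = 8) -> float:
    """Map an integer video code value to display luminance in cd/m2."""
    v = code / (2**bits - 1)   # normalize code to the 0..1 range
    return PEAK_NITS * v**GAMMA

# Relative jump between the two darkest non-black 8-bit codes:
ratio = luminance(2) / luminance(1)   # = 2**2.4, roughly 5.3x
print(f"code 1 -> 2 luminance ratio: {ratio:.2f}x")
```

A relative jump of over 400% between adjacent shadow codes is far above the contrast step the eye can distinguish, which is why dark gradients band visibly – and stretching the same curve over a much brighter HDR range only makes the problem worse.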
New encoding schemes are required to keep up with the larger color space. (Image Source: AMD)
While the prevailing 8-bit-per-color encoding scheme is technically capable of carrying BT.2020, the larger color space would spread its 256 shades of each primary color even further apart, making the resulting banding all the more glaring.
As a result, BT.2020 must be paired with a higher bit depth, up from the current 8 bits per color (bpc) to a minimum of 10bpc. This quadruples the number of shades of each primary color to 1,024, allowing an HDR image to be displayed accurately.
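The arithmetic behind those shade counts is simple enough to verify in a few lines of Python (the function name is my own):

```python
# Shades per primary color channel at a given bit depth, and the
# total palette when the three channels are combined.

def shades_per_channel(bpc: int) -> int:
    """Number of distinct levels one color channel can encode."""
    return 2**bpc

for bpc in (8, 10):
    per_channel = shades_per_channel(bpc)
    total = per_channel**3  # R x G x B combinations
    print(f"{bpc}bpc: {per_channel:,} shades/channel, {total:,} colors")

# Going from 8bpc to 10bpc quadruples the shades per channel:
assert shades_per_channel(10) == 4 * shades_per_channel(8)
```

In total terms, that is a jump from roughly 16.8 million displayable colors at 8bpc to about 1.07 billion at 10bpc.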
Fortunately, the RTG doesn’t have to start from scratch as 10bpc rendering is already being done on PCs, although it is still limited to professional applications and graphics cards. But if AMD has its way, BT.2020 and 10bpc rendering will be a consumer feature sooner rather than later.
Furthermore, AMD’s existing Radeon 300 series of graphics cards are already capable of 10bpc rendering, but they will need to do this over HDMI 1.4b or DisplayPort 1.2 on a compatible monitor. However, the higher bit depth requires more bandwidth, so you won’t be able to have all the benefits of HDR, 4K, and a 60Hz refresh rate with the 300 series cards because of the bandwidth limitations of DisplayPort 1.2.
The solution? The DisplayPort 1.3 standard and new HDR-compliant monitors. More on those below.
DisplayPort 1.3: Higher resolutions and refresh rates
Moving forward, AMD’s 2016 GPUs will support the new DisplayPort 1.3 standard, which increases the bandwidth over DisplayPort 1.2 by 50% to 32.4Gbps, up from 21.6Gbps.
AMD's 2016 GPUs will support the new DisplayPort 1.3 standard, which brings significant improvements in available bandwidth. (Image Source: AMD)
This allows for a combination of higher resolutions and refresh rates: the next generation of AMD GPUs will be able to play back 2160p60 content at 10bpc over DisplayPort 1.3, which means HDR at 4K and 60Hz (or 4K at 120Hz with SDR). The maximum supported SDR resolution also increases to 5,120 x 2,880 pixels at 60Hz, assuming full 4:4:4 chroma (i.e. no subsampling). DisplayPort 1.3 uses existing cables and connectors, so there’s no need to run out and get new ones.
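A quick back-of-the-envelope check in Python shows why DisplayPort 1.2 runs out of headroom. This is a sketch, not an official calculation: the figures are raw pixel-data rates that ignore blanking intervals (real requirements run roughly 10–20% higher), and they assume 8b/10b link encoding, which leaves about 80% of the link rate usable as payload.

```python
# Approximate uncompressed video bandwidth vs. DisplayPort payload.
# Raw pixel rate only: real links also carry blanking intervals, so
# actual requirements are roughly 10-20% higher than these figures.

def raw_gbps(width: int, height: int, hz: int, bpc: int) -> float:
    """Uncompressed 4:4:4 pixel-data rate in Gbps (3 channels)."""
    return width * height * hz * bpc * 3 / 1e9

# Usable payload after 8b/10b encoding (~80% of the link rate):
dp12 = 21.6 * 0.8   # ~17.3 Gbps
dp13 = 32.4 * 0.8   # ~25.9 Gbps

print(f"4K@60Hz 10bpc: {raw_gbps(3840, 2160, 60, 10):.1f} Gbps")
print(f"5K@60Hz  8bpc: {raw_gbps(5120, 2880, 60, 8):.1f} Gbps")
print(f"DP1.2 payload ~{dp12:.1f} Gbps, DP1.3 payload ~{dp13:.1f} Gbps")
```

A 4K HDR stream needs about 15Gbps before blanking overhead, which pushes it past DisplayPort 1.2’s ~17.3Gbps payload once real timings are applied, while 5K at 60Hz exceeds it outright at around 21Gbps; both sit comfortably inside DisplayPort 1.3’s ~25.9Gbps.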
DisplayPort 1.3 will enable displays with higher resolutions and refresh rates. (Image Source: AMD)
And unlike the 300 series cards, next year’s GPUs will also directly support the HDCP 2.2 copy-protection standard, which will be required for protected 4K/HDR content. The upshot is that only the 2016 GPUs will be able to play HDR movies, while current cards will be limited to HDR gaming and photos.
Only 2016 AMD GPUs will support HDR playback for movies. (Image Source: AMD)
There are just a couple of final hitches to address. For instance, the Windows operating system isn’t capable of HDR rendering, so RTG currently has to circumvent the OS’s color management and render HDR only in an exclusive fullscreen context. RTG is working with Microsoft to resolve this, and it will need to be addressed if HDR adoption is to truly enter the mainstream. Software and game developers will also need to make changes, because game engines currently assume that their output will be shown on an SDR display. Many engines already render internally in high dynamic range (a technique known as high dynamic range rendering, or HDRR), but they then use tone mapping algorithms to compress the image back down to SDR, resulting in blown-out colors.
The industry is still built to cater primarily to SDR, but if all goes as planned, AMD's RTG will be leading the shift to HDR in 2016.
AMD has set its sights on HDR in 2016. (Image Source: AMD)