Monitor Technology: Don’t ditch your display … just yet

It’s been an exciting time of the year for graphics enthusiasts. With both AMD and Nvidia launching entirely new product lines, we’ve been spoilt for choice. The only problem is that you now probably want a new monitor to take advantage of their new features. Whether it’s gaming at 4K at 60 frames per second, or checking out the stunning colour and contrast offered by High Dynamic Range (HDR) displays, these new graphics cards offer a huge new range of features and connection options.

Unfortunately, most of today’s displays haven’t quite caught up with the gold mine of new features now offered by the latest graphics cards. In fact, once you discount the performance improvements, much of what makes these new graphics cards so special is currently going to waste, as we don’t have the displays to make the most of it. A handful of extremely expensive displays do offer these features, but for the most part it will be at least a year before they become the norm. With that in mind, let’s take a look at what we can expect to see in PC displays over the next year or two. 

HDMI 2.0b

It seems like only yesterday that we finally got HDMI 2.0, the first version of HDMI to support 4K resolution at a 60Hz refresh rate; prior to that, we were limited to 30Hz, which isn’t an ideal experience for gamers. It was fine for movie viewing, but as most PC gamers know, 30 frames per second is a stuttering mess, which is why most gamers target 60 frames per second as the minimum.

HDMI 2.0b builds on HDMI 2.0’s total bandwidth of 18Gbps, up from 10.2Gbps on HDMI 1.4. So what does this actually mean in real life? For starters, it’s fully compliant with HDR video technology – which we’ll get to in a little bit, and which is one of the most significant advancements in display technology in years. It also supports up to 32 audio channels for elaborate surround-sound systems, with an audio sample frequency of up to 1536kHz. 
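To see why the extra bandwidth matters, here’s a rough back-of-the-envelope calculation of the raw pixel data rate for 4K at 60Hz with 8-bit colour. This is a simplified sketch – it ignores blanking intervals and encoding overhead, which push the real requirement higher still:

```python
# Raw data rate for uncompressed 4K (3840 x 2160) video at 60Hz,
# using 24 bits per pixel (8 bits each for red, green and blue).
width, height = 3840, 2160
fps = 60
bits_per_pixel = 24  # 8-bit colour across three channels

raw_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Raw 4K@60 data rate: {raw_gbps:.1f} Gbps")  # ~11.9 Gbps

# HDMI 1.4 tops out at 10.2 Gbps, so even the raw pixel data
# doesn't fit - hence the 30Hz limit. The 18 Gbps of HDMI 2.0/2.0b
# leaves headroom for blanking, audio and deeper colour.
```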

The good news is that 2.0b is backwards compatible – if you’re running a display with HDMI 2.0, it should be possible to update the firmware to 2.0b, though updating a monitor’s firmware is rarely a simple task. HDMI 2.0b also supports the 21:9 aspect ratio favoured by some cinematic releases. There’s just one slight issue with this new format: despite being supported by both Nvidia’s new GeForce 10-series range and AMD’s new Radeon RX series, the number of displays that support HDMI 2.0b can be counted on one hand. In fact, on no hands – we couldn’t find any that support it. 

DisplayPort 1.4

Just as we see a new version of HDMI, along comes its competitor DisplayPort with an update of its own. Version 1.4 retains DP 1.3’s maximum bandwidth of 32.4Gbps – still considerably more than HDMI 2.0b’s 18Gbps – and can be run over the USB Type-C connector. This means it brings full HDR support to the table, along with 8K video at 60Hz (using Display Stream Compression) or 4K at 120Hz, and it can even deliver twin 4K streams simultaneously. Considering 4K support is still struggling, it’s obvious this 8K capability is designed for well into the future. Like HDMI 2.0b, DP 1.4 supports 32 simultaneous audio channels with a sample rate of up to 1536kHz, covering all known audio formats. 

As with HDMI 2.0b, the biggest improvement in DP 1.4 is its support for HDR, the next big technology being pushed by display makers.


Thunderbolt 3

We have to admit that Thunderbolt 3 is one of the more confusing new connection types around. It uses the USB Type-C connector, and a Thunderbolt 3 port also carries USB 3.1 – effectively two connections in one. The reverse isn’t always true, though: some USB Type-C ports don’t support Thunderbolt at all, so double-check before shelling out for your new product. Unlike earlier revisions of Thunderbolt, Thunderbolt 3 requires the new USB Type-C connector to function – it won’t work over Mini DisplayPort connections. 

Thunderbolt 3 can drive twin 4K panels at the same time, and it has the highest bandwidth of all the new connections, at 40Gbps. Unlike its competitors, Thunderbolt 3 can be used to daisy-chain devices, so you can run a display, a hard drive and a router off a single port. Gamers will appreciate its ability to connect external graphics cards to laptops to boost gaming performance. It’s also useful for a whole lot more than just display connections, with various hubs and cables transforming what it can do. However, once again, finding a PC display with a Thunderbolt 3 connection is next to impossible, unless you’re using a Mac, which has supported Thunderbolt for several years already. 

Aspect Ratios

Over the last year, we’ve started to see the dominance of 16:10 and 16:9 displays slowly shrink, as ultra-wide 21:9 monitors grow in popularity. These generally have a resolution of 2560 x 1080 or 3440 x 1440. In the past, the reliance upon TN panels meant that ultra-wide screens suffered from colour shift and image issues as the user looked towards the edges of the screen. However, as curved IPS 21:9 screens become more common, these issues have largely been solved. The increase in immersion is immediately apparent, as more of the user’s field of view is filled by the screen, and the use of IPS means there’s no washing out of colours or contrast towards the edges. We think curved 21:9 screens will really take off in the near future as they become more affordable. 


4K

It’s hard to believe that you can now buy a 4K TV for less than a thousand dollars, considering their incredibly high prices just a few years ago. 4K PC display prices haven’t fallen quite as far, although there are now entry-level 4K displays on the market for as little as $600. Obviously 4K displays at this price come with compromises, such as average colour and contrast performance and slow grey-to-grey response times. Still, for around $750 it’s possible to pick up a quality 28-inch 4K display; there are just two issues with this. The first is content: only a couple of hundred video titles are available in true 4K resolution, there are still only two 4K Blu-ray players on sale in Australia, and even Netflix’s 4K library is limited. The second is gaming performance: even on the fastest single-card solutions you’ll need to drop your detail settings to medium or high, or shell out for a second GeForce GTX 1080. We’ve tested a PC with twin GeForce GTX 1080s running at Ultra detail at 4K, and even that struggled to maintain 60 frames per second during especially hectic moments in graphically detailed games. Hopefully DX12 will help to improve this performance, but for now playing at 4K with a minimum of 60 frames per second is still a dream.

Refresh rates

If there’s one thing pro gamers love, it’s high refresh rate monitors. In the last year we’ve seen Asus introduce gaming displays that hit over 144Hz when their special overclocking modes are enabled. Anybody who has gamed at 120Hz knows just how much smoother it feels than 60Hz, and these 120Hz+ displays are becoming much more affordable. Expect 120Hz to become the new standard in the near future, but bear in mind you’re going to need some rather spiffy hardware to drive games at these kinds of speeds.
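The smoothness difference comes down to how long each frame stays on screen. A quick calculation of frame times at common refresh rates makes the gap obvious:

```python
# Frame time in milliseconds at common refresh rates - the shorter
# the gap between frames, the smoother motion appears.
for hz in (30, 60, 120, 144):
    frame_time_ms = 1000 / hz
    print(f"{hz:>3}Hz -> {frame_time_ms:.1f}ms per frame")
# 30Hz holds each frame ~33.3ms; 144Hz holds it just ~6.9ms.
```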


OLED

It’s time to say goodbye to LCD and LED-backlit screens and welcome OLED as the display panel of choice. Those who have seen OLED displays can testify to their amazing ability to portray true blacks amongst rich, vibrant colours, all while delivering incredible contrast ratios. This is because they don’t have a backlight – each pixel lights itself up, so when you turn that pixel off, it really is black. OLED displays also tend to be thinner and lighter than LCD panels, and they’re flexible, which is why we’re seeing more curved OLED screens. OLED also offers wider viewing angles than most LCD panels, rivalling IPS at around 170 degrees. 

In the past, OLED screens were incredibly expensive due to the small volumes being produced. As production has ramped up, though – especially for televisions – the technology is becoming much more affordable. Dell has a 30-inch OLED screen on the way that supports a 120Hz refresh rate and 4K resolution, and it’s this kind of monitor that we can expect to become mainstream in the next year or two. Sure, the upcoming Dell display is around US$5,000, but over time these prices will drop sharply, just as they did with OLED TVs. 

G-Sync vs FreeSync

The great green versus red battle continues when it comes to adaptive sync technology. Both technologies remove the issues of V-sync by tying the monitor’s refresh to each frame output by the graphics card; they deliver basically the same outcome via different methods. Until recently, Nvidia’s G-Sync has been the solution of choice for gamers, as Nvidia’s stricter quality control has led to a more polished experience. AMD has made up a lot of ground over the last year, though, working with its partners to ensure FreeSync works over a wider range of refresh rates, with fewer issues once a game exceeds those parameters. The fact that FreeSync is based on Adaptive-Sync, which is baked into the DisplayPort standard and doesn’t need proprietary hardware, also means there are far more FreeSync displays on the market than G-Sync ones. They also tend to be substantially cheaper, thanks to the lack of the proprietary scaler that G-Sync requires… and yet we think AMD still has a lot of marketing to do to get gamers on board. The fact that Nvidia products dominate the market hasn’t helped sales of FreeSync displays – we can only hope that Nvidia finally decides to play nice and adopt Adaptive-Sync, so that we’re all winners. 

High Dynamic Range

We’ve saved the best until last. High Dynamic Range (HDR) panels are the talk of the town, from GPU makers like Nvidia and AMD through to panel makers like Panasonic and Samsung. At this year’s Computex, it was impossible to talk about a graphics card without mention of HDR support, and it’s built into both the new Radeon RX and GeForce 10 series. It’s only possible thanks to the new HDMI 2.0b and DisplayPort 1.4 connections, which have the bandwidth to deliver the extra information HDR requires. 

In mainstream terms, HDR delivers an image that is much more true to life, with richer colours, deeper blacks and a picture that pops. Like any new panel technology, it’s appearing in the TV world first, where the larger volume of panels being built helps justify the cost. However, we should start to see HDR become commonplace in PC displays over the next year or two, and the improvement in image quality is breathtaking; some would argue it’s a bigger leap than the one from 1080p to 4K. But as with many new technologies, there are two competing standards, with slight differences in their technical specifications. 

You’ll recognise most HDR-compatible devices by their Ultra HD Premium label, created by the UHD Alliance, a group of TV makers, technology companies and content creators. Then there’s the competing spec, Dolby Vision, but both do basically the same job. 

The key specs for both call for a minimum resolution of 4K, but colour depth is the real difference. Ultra HD Premium requires a 10-bit colour depth, whereas standard Blu-ray supports only 8 bits – an uplift from around 16 million colours to over a billion, allowing much smoother transitions between shades. Dolby Vision goes further, requiring a 12-bit colour depth.

An Ultra HD Premium display also has to reproduce at least 90% of the colours defined by the DCI-P3 colour space, a standard that defines the colour information in a video stream. The higher the percentage, the richer and more accurate a TV’s colours. 

Contrast ratios are also incredibly important to HDR, and there are two options. The first requires a screen to deliver a peak brightness of at least 1,000 nits with a black level below 0.05 nits. The second allows a lower peak of 540 nits, paired with a much darker 0.0005-nit black level. This is why OLED TVs tend to qualify under the second option: their ability to produce near-perfect blacks makes them far better suited to it. 
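Dividing peak brightness by black level shows why the second option suits OLED so well – despite its lower peak, the near-zero black level yields a vastly higher effective contrast ratio:

```python
# Effective contrast ratio (peak brightness / black level) for the
# two Ultra HD Premium certification options described above.
option1 = 1000 / 0.05    # LCD-oriented: bright peak, modest blacks
option2 = 540 / 0.0005   # OLED-oriented: dimmer peak, near-perfect blacks

print(f"Option 1: {option1:,.0f}:1")  # 20,000:1
print(f"Option 2: {option2:,.0f}:1")  # 1,080,000:1
```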

It seems that Ultra HD Premium is the standard most TV makers are going with, but the good news is that the two standards are compatible with each other. The bad news is that HDR PC displays are thin on the ground. Plenty are planned for release in the coming months, but for now HDR is only available in prosumer displays costing around twenty grand. Expect this price to plummet – it’s already possible to buy an HDR TV for around five grand.


As you can see, there aren’t many new panel types on the way – LCD and OLED will still lead the charge. However, it’s what these panels can now do, such as ultra-high refresh rates, new aspect ratios and HDR, that means right now might not be the best time to upgrade your display. If you’ve got one of the new GPUs from AMD or Nvidia, it’s a shame that the display technology designed to match their outputs isn’t quite ready yet, but the fact that the features are there at all means they should come into play soon. 
