Sunday 23 August 2015

Sync or swim


Bennett Ring examines how Variable Refresh Rate tech has changed over the last year

It’s been over a year since we covered the first Variable Refresh Rate (VRR) technology to hit PC displays, in the form of NVIDIA’s G-Sync. A lot has changed since then, first with the Video Electronics Standards Association (better known as VESA) announcing that its own version of the technology, Adaptive-Sync, would become an optional part of the DisplayPort 1.2a specification. AMD then utilised Adaptive-Sync to launch FreeSync, its direct competitor to G-Sync.


While we covered both technologies at launch, many of the issues that plagued Adaptive-Sync and FreeSync seem to be getting solved, and there are now roughly twice as many FreeSync displays on the market as G-Sync displays. There’s also the confusing issue of where Adaptive-Sync fits into the picture – will it replace FreeSync, and will NVIDIA ever support it? So we caught up with staff from AMD, NVIDIA, VESA, AOC and BenQ to see just where this refresh rate rebellion stands at the moment. As you’ll see, the debate between AMD and NVIDIA is fierce, so we’ll try to read between the lines to tell you what’s really going on.

VARIABLE REFRESH RATE THEORY 101


Before we begin, we’ll give you a very quick primer on what VRR actually means, in case you missed our earlier features. Before VRR came along, gamers had to make do with Vertical Synchronisation, aka V-Sync, either enabled or disabled. Enabling it ties the GPU’s update rate to the screen’s refresh rate, usually 60Hz, so that only single, full frames are drawn on screen. The catch is that the GPU then has to stick to a framerate that matches the display’s refresh rate or an even division of it. For example, on a 60Hz screen the game runs at either 60fps or a division of that, such as 30fps. The benefit of V-Sync is that it removes screen tearing, the problem that occurs when two different frames are drawn during a single screen refresh, which looks like there’s a line running across the screen. However, V-Sync incurs the framerate penalty we just described, and also adds latency and stuttering, as it sometimes needs to pump out the same frame twice.
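
To picture why the framerate snaps to those neat divisions, here’s a minimal Python sketch of the arithmetic (our own illustration, not driver code). Under double-buffered V-Sync, a finished frame that misses one refresh interval has to wait for the next one, so the effective rate can only be the refresh rate divided by a whole number.

```python
import math

def vsync_effective_fps(refresh_hz: float, render_ms: float) -> float:
    """Effective framerate under double-buffered V-Sync.

    If a frame takes longer than one refresh interval to render, it has to
    wait for the next vertical blank, so the rate snaps to an even division
    of the refresh rate (60 -> 30 -> 20 -> 15 ...).
    """
    interval_ms = 1000.0 / refresh_hz                        # 16.7ms at 60Hz
    intervals_needed = max(1, math.ceil(render_ms / interval_ms))
    return refresh_hz / intervals_needed

print(vsync_effective_fps(60, 16.0))   # 60.0 - the frame fits inside one refresh
print(vsync_effective_fps(60, 17.0))   # 30.0 - just misses the vblank, rate halves
print(vsync_effective_fps(60, 40.0))   # 20.0 - needs three refresh intervals
```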

Disabling V-Sync removes this stuttering and latency, and also allows the GPU to spit out as many frames as it likes. However, this reintroduces the issue of screen tearing, which is ugly to say the least.

VRR removes this trade-off by forcing the screen to update only when the GPU has a frame ready for it. The screen’s refresh rate dynamically alters to match the framerate of the game, eliminating screen tearing along with V-Sync’s added latency and stuttering. It also makes lower framerates look smoother, even when a game is only running at 35 or 40fps.
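
To make the contrast concrete, here’s a deliberately simplified Python sketch of the three options (the function and mode names are ours, purely for illustration, and real drivers and scalers are far more involved). With V-Sync off a finished frame is scanned out straight away and can tear; with V-Sync on it waits for the next fixed refresh tick; with VRR the refresh simply follows the frame.

```python
import math

def present(frame_ready_ms: float, refresh_hz: float, mode: str):
    """Toy model of when a finished frame reaches the screen.

    Returns (on_screen_ms, tears).
    """
    interval_ms = 1000.0 / refresh_hz
    if mode == "no_vsync":
        # Shown immediately, possibly mid-refresh: lowest latency, but it can tear.
        return frame_ready_ms, True
    if mode == "vsync":
        # Held back until the next fixed refresh tick: no tearing, added latency and stutter.
        return math.ceil(frame_ready_ms / interval_ms) * interval_ms, False
    if mode == "vrr":
        # The display refreshes the moment the frame is ready (within its supported range).
        return frame_ready_ms, False
    raise ValueError(f"unknown mode: {mode}")

for mode in ("no_vsync", "vsync", "vrr"):
    print(mode, present(frame_ready_ms=25.0, refresh_hz=60, mode=mode))
```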

So that’s how VRR works, but the issue we’re now faced with is that AMD and NVIDIA have both approached the problem differently. As a result, if you’re in Team Green, your only VRR choice is G-Sync. If you’re in Team Red, you have both Adaptive-Sync and FreeSync available to you. Let’s take a look at where these technologies sit today in the Australian market.

DISPLAYPORT ADAPTIVE-SYNC – ONE STANDARD TO RULE THEM ALL?


Adaptive-Sync is one of the main reasons you’re reading this article, as we noticed there was a huge amount of confusion about what it is, and how it relates to AMD’s FreeSync. In the past we’d thought the two terms were interchangeable, but that doesn’t appear to always be the case. AMD’s FreeSync is built using the Adaptive-Sync spec, and AMD actually suggested Adaptive-Sync to VESA, but they’re not identical entities.

According to Syed Athar Hussain, Display Domain Architect at AMD and VESA Board Vice Chairman, “The VESA Adaptive-Sync provides a flexible framework that allows the video source device to control the display’s frame rate”. It’s an open specification that is an optional part of the DisplayPort 1.2a standard, which means that monitor makers don’t have to pay a licensing fee to use it. However, it does not specify the overall supported refresh rates, or the quality of the experience. Mr Hussain continues, “While Adaptive-Sync is a fully described display protocol specification published by VESA, it does not establish limits on parameters such as allowable frame rate and other related attributes. Adopters of Adaptive-Sync select parameter limits based on system and display capabilities.”

Mr Hussain then explained that VESA expects Adaptive-Sync to become a brand unto itself. “VESA envisions Adaptive-Sync as a branded capability in the future, but at this point in time only AMD supports this feature on the Source side, so they are mostly promoting their brand…”. Now, we know that FreeSync is built on Adaptive-Sync, so what’s the difference between the two?

According to AMD’s Rob Hallock, Head of Global Technical Marketing, “Adaptive-Sync is the standard that allows a graphics device to control the refresh rate of a display. But by itself Adaptive-Sync doesn’t actually do anything for the user. It is merely a capability. A vendor still needs to come along and build a solution like FreeSync that actively provides an end-user benefit that incorporates the spec.”

Despite this, Mr Hallock then confirmed that any monitor that is flagged as Adaptive-Sync compatible - yet does not have AMD’s FreeSync branding - will indeed work with FreeSync. He says, “Any DisplayPort Adaptive-Sync device will be compatible with an AMD graphics card, because we all use the DisplayPort Adaptive-Sync specification”.

As a direct competitor to NVIDIA’s G-Sync technology, which the company spent several years developing, it’s no surprise that Team Green doesn’t have the greatest things to say about Adaptive-Sync. Tom Petersen, Distinguished Engineer at NVIDIA, claims that, “…there’s not much to that spec. It’s pretty much a spec that says how a monitor can tell a GPU what its range is. It doesn’t deal with ghosting, or frame doubling or tripling, it’s strictly a communication protocol for setup.”
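
As Mr Petersen describes it, the spec’s core job is simply to let a display advertise the refresh range it supports, which the GPU then has to work within. The Python sketch below is purely our own illustration of that idea; the names and structure are hypothetical, not taken from the DisplayPort specification.

```python
from dataclasses import dataclass

@dataclass
class DisplayCapabilities:
    """What a hypothetical Adaptive-Sync display might advertise to the GPU."""
    min_refresh_hz: float   # e.g. 48, chosen by the monitor maker
    max_refresh_hz: float   # e.g. 75 or 144

def can_sync_to(caps: DisplayCapabilities, render_fps: float) -> bool:
    """True if the display can simply refresh in step with this frame rate.

    Everything else - what happens below the minimum, how ghosting is
    handled - is left to the vendor's solution, not the spec itself.
    """
    return caps.min_refresh_hz <= render_fps <= caps.max_refresh_hz

panel = DisplayCapabilities(min_refresh_hz=48, max_refresh_hz=75)   # the range of LG's 34UM67
print(can_sync_to(panel, 60))   # True  - inside the advertised range
print(can_sync_to(panel, 35))   # False - up to the vendor to decide what happens next
```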

The benefit of Adaptive-Sync is that it’s an open standard, with no licensing fees. However, the lack of quality control across the ecosystem means the Adaptive-Sync experience will likely be highly variable, at least until the technology matures. Now that you know where Adaptive-Sync fits into the picture, let’s take a look at the technology that leverages it, AMD’s FreeSync.

AMD’S FREESYNC – A ROCKY START


To say FreeSync had a bit of a tough beginning is something of an understatement. Initial FreeSync displays only supported VRR across a narrow band of refresh rates; LG’s 34UM67, for example, only offered it between 48Hz and 75Hz. Outside of that range, the display would kick back to bog-standard V-Sync on or off. Ghosting was also an issue, with screens such as BenQ’s XL2730Z displaying motion trails or blur behind moving parts of the scene. It’s likely that these issues were a result of AMD’s hands-off approach when compared to G-Sync, where NVIDIA does QA testing across every component involved in a G-Sync display, before tailoring its proprietary scaler for each display. Mr Hallock clarified what this means. “The difference between us and NVIDIA is we let the manufacturer choose their low refresh rate and apply their own QA (Quality Assurance) process to determine whether or not LCD flicker is acceptable. Most of the manufacturers in the FreeSync ecosystem have said no, which explains the 30Hz bottom refresh rate.”

Thankfully, it seems these issues are rapidly being solved. Nixeus is about to release the NX-VUE24 monitor, which supports FreeSync between 30Hz and 144Hz, while BenQ has fixed its ghosting issues with a firmware update that allows its Overdrive function to operate in FreeSync mode.

The major benefit of FreeSync is that it leverages Adaptive-Sync, which has no licensing costs. It also doesn’t require proprietary hardware. AMD’s Mr Hallock explained that many existing premium scalers were essentially already capable of VRR. “What is true is that many of the scalers that were already being used by these manufacturers were in fact compatible with a dynamic refresh technology like DisplayPort’s Adaptive-Sync. The critical missing component was a specification and a software standard that exposed these latent capabilities.” As a result, FreeSync shouldn’t add anything to the cost of a display, though we have seen very slight price increases on FreeSync displays in Australia. LG’s non-FreeSync 29UM57-P 29-inch display sells for $429, while the FreeSync-enabled version sells for $465, an increase of just $36.

As a result of its low cost and open standard, the number of FreeSync panels has exploded, with 21 units on the market as of the end of July. Not bad considering FreeSync has only been around for four months, while NVIDIA’s 20-month-old G-Sync technology is currently limited to just 13 displays. Still, there’s the issue of what happens once frame rates fall below 30fps, the lowest refresh rate currently supported by FreeSync displays. AMD has plans to resolve this, according to Mr Hallock. “We intend to talk more about our low render rate solutions soon. Regardless of the display, however, it’s widely understood that <30 FPS gaming is unplayable regardless of the display. There simply aren’t enough frames to convey smooth motion, so chasing solutions for these cases seems like an exercise in futility that can be solved through a more reasonable GPU/game pairing.” We have to agree, as even G-Sync starts to feel sluggish once the frame rate drops below 30fps.

While FreeSync is currently shaping up to deliver a much better experience across a wider range of displays, Mr Hallock would be happy for FreeSync to disappear, and Adaptive-Sync to become the industry standard. “That effectively means that the efforts we worked very hard on in the display ecosystem have come to fruition and have won the day, that everybody is now working with an open, interoperable standard rather than haves versus have nots. I think that would be a good thing for everybody in the industry, especially consumers.” Let’s see what NVIDIA has to say about that.

NVIDIA’S G-SYNC – A PRICEY, PROPRIETARY YET PREMIUM EXPERIENCE


Unlike Adaptive-Sync and FreeSync, NVIDIA’s G-Sync requires proprietary hardware to operate, in the form of a custom scaler in the display. While the cost of this is rumoured to be around US$100 to US$150, it has led to an even higher premium in Australia. According to Josh Edwards, Sales and Marketing Coordinator at BenQ Australia, Aussie gamers have to pay $200 to $300 extra for a G-Sync display. In fact, the cost was so high at G-Sync’s launch that, combined with the exchange rate at the time, BenQ Australia initially held off releasing G-Sync panels locally. Mr Edwards explained, “…when we were looking at bringing in the XL2420G originally… it would just end up costing too much. So we had to get support from our headquarters in order to bring it here due to demand.”

From a manufacturing perspective, the other major difference with G-Sync is that NVIDIA conducts Quality Assurance on all G-Sync displays to ensure a rock-solid experience. NVIDIA’s Mr Petersen detailed what this means. “So that means we do the monitor, the module, and we test all that stuff, from driver through GPU through monitor. We know that if there’s a problem, it’s on NVIDIA.” This is in stark contrast to the hands-off Adaptive-Sync approach, while FreeSync seems to sit somewhere in the middle.

A key difference in the actual technology of G-Sync is its ability to handle lower frame rates, which is why it was the preferred choice when the first FreeSync panels struggled with these same sub-40 frame rates. LCD panels start to flicker when their screens aren’t updated frequently, usually below 30Hz or so. G-Sync solves this issue by frame doubling, tripling or quadrupling once the game’s output drops below the monitor’s designated minimum refresh rate. This minimum varies from display to display, but let’s use a display with a minimum 30Hz refresh rate as an example. Once the game starts outputting at 29 frames per second, G-Sync shows each frame twice, driving the screen at 58Hz, so flickering is never an issue. Mr Petersen suggests that this gives G-Sync a huge edge over FreeSync. “If you’re FreeSync and the render rate drops below the minimum refresh rate of the monitor, it shifts to a traditional mode where they’re synchronised to the refresh rate, like 60Hz, and they’re either going to tear or stutter.” However, we’re now seeing FreeSync displays that operate across the full 30Hz to 144Hz range, and we’d argue that even G-Sync doesn’t offer a very good gaming experience below 35Hz.
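
The arithmetic behind that doubling is straightforward to picture. The Python sketch below is our own back-of-the-envelope illustration of the idea, not NVIDIA’s actual module logic: each rendered frame is simply repeated enough times to keep the panel driven above its minimum refresh rate.

```python
import math

def panel_refresh_hz(render_fps: float, panel_min_hz: float, panel_max_hz: float) -> float:
    """Repeat slow frames enough times to keep the panel inside its supported range."""
    if render_fps >= panel_min_hz:
        # The panel can follow the game directly (capped at its maximum refresh).
        return min(render_fps, panel_max_hz)
    multiplier = math.ceil(panel_min_hz / render_fps)        # 2x, 3x, 4x ...
    return min(render_fps * multiplier, panel_max_hz)

print(panel_refresh_hz(29, 30, 144))   # 58 - each frame is shown twice
print(panel_refresh_hz(14, 30, 144))   # 42 - each frame is shown three times
print(panel_refresh_hz(45, 30, 144))   # 45 - inside the range, no repeats needed
```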

NVIDIA spent several years developing G-Sync, so it’s no surprise that it doesn’t intend to support Adaptive-Sync in the near future. When asked if NVIDIA could roll out Adaptive-Sync support to existing products, Mr Petersen stated, “I’m not honestly certain of that. I can speculate and say that as far as I can tell after looking at the tech, what Adaptive-Sync is, it is a method for a monitor to communicate to a GPU, and it’s very likely that we can change our GPUs to understand that communication, but there’s a lot more going on.” However, he claims that without the G-Sync scaler and NVIDIA QA, it wouldn’t live up to NVIDIA’s reputation. “The experience that an Adaptive-Sync monitor would deliver even when attached to an NVIDIA GPU, is likely not at the quality level that NVIDIA wants.”

CONCLUSION


To sum it up, here’s where VRR displays stand today. NVIDIA doesn’t think Adaptive-Sync or FreeSync is a threat to its proprietary G-Sync standard. Yet FreeSync has been adopted by a wider range of displays in a much shorter window than G-Sync, though it’s anybody’s guess how FreeSync and G-Sync sales actually compare. The initial problems with Adaptive-Sync and FreeSync displays seem to be well on the way to being resolved as monitor makers get their heads around better quality scalers and panels, and we can expect Adaptive-Sync branded displays to hit the market soon, which will work fine with FreeSync. We’ve got a hunch that Adaptive-Sync will take over as the dominant standard, especially if Intel decides to support it, but the million-dollar question is if and when NVIDIA will offer Adaptive-Sync support. We get the feeling it would be possible to enable it on existing products via a firmware or driver update, but we don’t expect NVIDIA to give up on G-Sync for at least a year. In the meantime, your choice is most likely to be determined by the GPU you own, rather than the VRR technology you prefer.