It started at CES, nearly 12 months ago. NVIDIA announced GeForce Experience, a software solution to the problem of choosing optimal graphics settings for your PC in the games you play. With console games, the developer has already selected what it believes is the right balance of visual quality and frame rate. On the PC, these decisions are left up to the end user. We’ve seen some games try to solve the problem by limiting the number of available graphical options, but other than that it’s a problem that hasn’t seen much widespread attention. After all, PC gamers are used to fiddling around with settings - it’s just an expected part of the experience. In an attempt to broaden the PC gaming user base (likely somewhat motivated by a lack of next-gen console wins), NVIDIA came up with GeForce Experience. NVIDIA already tests a huge number of games across a broad range of NVIDIA hardware, so it has a good idea of what the best settings may be for each game/PC combination.

Also at CES 2013 NVIDIA announced Project Shield, later renamed to just Shield. The somewhat odd but surprisingly decent portable Android gaming system served another function: it could be used to play PC games on your TV, streaming directly from your PC.

Finally, NVIDIA has been quietly (and lately not-so-quietly) engaged with Valve in its SteamOS and Steam Machine efforts (admittedly, so is AMD).

From where I stand, it sure does look like NVIDIA is trying to bring aspects of console gaming to PCs. You could go one step further and say that NVIDIA appears to be highly motivated to improve gaming in more ways than pushing for higher quality graphics and higher frame rates.

All of this makes sense, after all. With ATI now fully integrated into AMD, and Intel finally taking graphics (somewhat) seriously, NVIDIA needs to do a lot more to remain relevant (and dominant) in the industry going forward. Simply putting out good GPUs will only take the company so far.

NVIDIA’s latest attempt is G-Sync, a hardware solution for displays that enables a semi-variable refresh rate driven by a supported NVIDIA graphics card. The premise is pretty simple to understand. Displays and GPUs update content asynchronously by nature. A display panel updates itself at a fixed interval (its refresh rate), usually 60 times per second (60Hz) for the majority of panels. Gaming-specific displays might support even higher refresh rates of 120Hz or 144Hz. GPUs, on the other hand, render frames as quickly as possible, presenting them to the display whenever they’re done.

When a new frame arrives in the middle of a refresh, the display ends up drawing parts of multiple frames on screen at the same time. The result is a visible seam, or tear, separating the individual frames. You’ll notice tearing as horizontal lines/artifacts that seem to scroll across the screen. It can be incredibly distracting.
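
To make the mechanics concrete, here's a minimal sketch (not anything NVIDIA ships) of where a tear line lands when an unsynchronized buffer swap happens mid-scanout. The refresh rate, line count, and swap cadence are illustrative numbers, not measurements from any particular setup.

```python
# Illustrative only: where a tear appears when a buffer swap lands mid-refresh.
REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ        # time to scan one frame out, top to bottom
PANEL_LINES = 1080

def tear_line(swap_time_ms):
    """Scanline at which the displayed image switches from the old frame to the new one."""
    t = swap_time_ms % SCANOUT_MS     # where within the current refresh the swap lands
    return int(PANEL_LINES * t / SCANOUT_MS)

# A GPU presenting at ~90 fps against a 60Hz scanout swaps roughly every 11.1ms:
for i in range(1, 5):
    print(f"swap {i}: tear at scanline {tear_line(i * 11.1)}")
```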

You can avoid tearing by keeping the GPU and display in sync. Enabling vsync does just this. The GPU will only ship frames off to the display in sync with the panel’s refresh rate. Tearing goes away, but you get a new artifact: stuttering.

Because the content of each frame of a game can vary wildly, the GPU’s frame rate can be similarly variable. Once again we find ourselves in a situation where the GPU wants to present a frame out of sync with the display. With vsync enabled, the GPU will wait to deliver the frame until the next refresh period, resulting in a repeated frame in the interim. This repeated frame manifests itself as stuttering. As long as you have a frame rate that isn’t perfectly aligned with your refresh rate, you’ve got the potential for visible stuttering.
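
To see that quantization in numbers, here's a hedged sketch: with vsync on, a frame that misses a refresh boundary has to wait for the next one, and the previous frame stays on screen in the meantime. The render times below are made up, and buffering/pipelining details are ignored.

```python
import math

# With vsync on, presentation snaps to refresh boundaries, so variable render
# times turn into uneven (and occasionally doubled) display intervals.
REFRESH_MS = 1000 / 60
ready_ms = 0.0
for i, render_ms in enumerate([14, 18, 15, 22, 16, 17]):   # hypothetical frame times
    ready_ms += render_ms
    shown_ms = math.ceil(ready_ms / REFRESH_MS) * REFRESH_MS
    print(f"frame {i}: ready at {ready_ms:5.1f} ms, shown at {shown_ms:5.1f} ms")
```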

G-Sync purports to offer the best of both worlds. Simply put, G-Sync attempts to make the display wait to refresh itself until the GPU is ready with a new frame. No tearing, no stuttering - just buttery smoothness. And of course, only available on NVIDIA GPUs with a G-Sync display. As always, the devil is in the details.

How it Works

G-Sync is a hardware solution, and in this case the hardware resides inside a G-Sync enabled display. NVIDIA swaps out the display’s scaler for a G-Sync board, leaving the panel and timing controller (TCON) untouched. Despite its physical location in the display chain, the current G-Sync board doesn’t actually feature a hardware scaler. For its intended purpose, the lack of any scaling hardware isn’t a big deal since you’ll have a more than capable GPU driving the panel and handling all scaling duties.

G-Sync works by manipulating the display’s VBLANK (vertical blanking interval). VBLANK is the period of time between the display scanning out the last line of the current frame and drawing the first line of the next frame. It’s called an interval because no screen updates happen during this period; the display remains static, showing the current frame, until it begins drawing the next one. VBLANK is a remnant of the CRT days, when the electron beam needed time to return to the top of the display before scanning out the next frame. The interval remains in today’s LCD flat panels, although it’s technically unnecessary. The G-Sync module inside the display modifies VBLANK to make the display hold the present frame until the GPU is ready to deliver a new one.
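
For a sense of scale, the split below uses generic 1080p60 timing (1080 active lines plus 45 blanking lines per frame), not this panel's exact 144Hz mode: with a fixed refresh, VBLANK is a short, fixed tail on every refresh, and G-Sync's trick is to stretch it.

```python
# Illustrative split of one 60Hz refresh into active scan-out and VBLANK,
# using generic 1080p60 line counts (not this monitor's 144Hz mode).
ACTIVE_LINES, BLANK_LINES, REFRESH_HZ = 1080, 45, 60
line_time_ms = 1000 / (REFRESH_HZ * (ACTIVE_LINES + BLANK_LINES))
print(f"active scan-out: {ACTIVE_LINES * line_time_ms:.2f} ms")   # ~16.00 ms
print(f"VBLANK:          {BLANK_LINES * line_time_ms:.2f} ms")    # ~0.67 ms
```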

With a G-Sync enabled display, when the monitor is done drawing the current frame it waits until the GPU has another one ready for display before starting the next draw process. The delay is controlled purely by playing with the VBLANK interval.

You can only do so much with VBLANK manipulation though. In present implementations the longest NVIDIA can hold a single frame is 33.3ms (30Hz). If the next frame isn’t ready by then, the G-Sync module will tell the display to redraw the last frame. The upper bound on refresh rate is set by the panel/TCON at this point, with the only G-Sync monitor available today going as high as 144Hz (a 6.94ms minimum interval between refreshes). NVIDIA made it a point to mention that the 144Hz limitation isn’t a G-Sync limit, but a panel limit.
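
Here's a hedged sketch of that hold-and-redraw behavior under the limits quoted above. It ignores the scan-out time of the forced redraw itself and the polling overhead discussed later, and the frame times fed in at the end are invented.

```python
MIN_INTERVAL_MS = 1000 / 144   # ~6.94ms: the panel/TCON ceiling (144Hz)
MAX_HOLD_MS     = 1000 / 30    # ~33.3ms: longest a single frame can be held

def display_intervals(render_ms):
    """Display intervals used while waiting for one GPU frame that takes render_ms."""
    intervals, waited = [], 0.0
    # If the GPU hasn't delivered within the hold limit, redraw the previous frame.
    while render_ms - waited > MAX_HOLD_MS:
        intervals.append(MAX_HOLD_MS)
        waited += MAX_HOLD_MS
    # A new frame can't be accepted faster than the panel's minimum interval.
    intervals.append(max(render_ms - waited, MIN_INTERVAL_MS))
    return intervals

for ms in (5, 12, 25, 40, 70):   # hypothetical GPU frame times
    print(f"{ms:>2}ms frame -> display intervals (ms): "
          f"{[round(i, 1) for i in display_intervals(ms)]}")
```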

The G-Sync board itself features an FPGA and 768MB of DDR3 memory. NVIDIA claims the amount of on-board DRAM isn’t much greater than what you’d typically find on a scaler inside a display; part of the reason for using multiple physical DRAM devices is to provide more memory bandwidth, not just more capacity. NVIDIA uses the memory for a number of things, one of which is to store the previous frame so that it can be compared to the incoming frame for overdrive calculations.
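
As an aside on why the previous frame is worth keeping around: response-time compensation (overdrive) picks a drive level based on both where each pixel is coming from and where it needs to go. The function below is only a generic illustration with an invented boost factor; NVIDIA hasn't published its actual overdrive tables or how it adapts them to a variable refresh interval.

```python
def overdriven_level(prev, target, boost=0.25):
    """Overshoot the target slightly in the direction of the transition so the
    liquid crystal settles near the target within one (now variable) frame."""
    driven = target + boost * (target - prev)
    return max(0, min(255, round(driven)))

print(overdriven_level(prev=50, target=200))   # rising transition: drive above 200
print(overdriven_level(prev=200, target=50))   # falling transition: drive below 50
```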

The first G-Sync module only supports DisplayPort 1.2, though there is nothing technically stopping NVIDIA from adding HDMI/DVI support in future versions. Similarly, the current G-Sync board doesn’t support audio, but NVIDIA claims it could be added in future versions (NVIDIA’s thinking here is that most gamers will want something other than the speakers integrated into their displays). The final limitation of the first G-Sync implementation is that it can only drive panels over LVDS. NVIDIA plans on enabling V-by-One support in the next version of the G-Sync module, although there’s nothing stopping it from enabling eDP support as well.

Enabling G-Sync does have a small but measurable performance impact on frame rate. After the GPU renders a frame with G-Sync enabled, it will start polling the display to see whether it’s in a VBLANK period, to ensure that the GPU doesn’t start sending a new frame in the middle of a scan out. The polling takes about 1ms, which translates to a 3 - 5% performance impact compared to vsync on. NVIDIA is working on eliminating the polling entirely, but for now that’s how it’s done.
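
As a back-of-the-envelope check on that figure: a fixed ~1ms cost per frame eats a larger share of short frames than of long ones. The naive model below simply adds 1ms to every frame time and ignores how the real polling overlaps other work, so treat the exact percentages loosely.

```python
POLL_MS = 1.0
for fps in (30, 40, 50, 60):
    frame_ms = 1000 / fps
    overhead_pct = 100 * POLL_MS / (frame_ms + POLL_MS)
    print(f"{fps:>2} fps -> roughly {overhead_pct:.1f}% of each frame spent polling")
```

Across typical 30 - 60 fps frame times that works out to roughly 3 - 6%, in the same ballpark as NVIDIA's quoted number.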

NVIDIA retrofitted an ASUS VG248QE display with its first-generation G-Sync board to demo the technology. The VG248QE is a 144Hz 24” 1080p TN display, a good fit for gamers but not exactly the best-looking display in the world. Given its current price point ($250 - $280) and focus on a very high refresh rate, there are bound to be tradeoffs (the lack of an IPS panel being the big one here). Despite NVIDIA’s first choice being a TN display, G-Sync will work just fine with an IPS panel, and I’m expecting to see new G-Sync displays announced in the not too distant future. There’s also nothing stopping a display manufacturer from building a 4K G-Sync display. DisplayPort 1.2 is fully supported, so 4K/60Hz is the max you’ll see at this point. That being said, I think it’s far more likely that we’ll see a 2560 x 1440 IPS display with G-Sync rather than a 4K model in the near term.
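
The 4K/60Hz ceiling follows from DisplayPort 1.2's link budget: four HBR2 lanes carry roughly 17.28 Gbit/s of payload after 8b/10b encoding. The quick estimate below assumes 24-bit color and a rough 5% blanking allowance rather than an exact timing calculation.

```python
DP12_PAYLOAD_GBPS = 17.28   # 4 lanes x 5.4 Gbit/s, after 8b/10b encoding

def video_gbps(h, v, hz=60, bpp=24, blanking_overhead=1.05):
    """Approximate uncompressed video bandwidth with a rough blanking allowance."""
    return h * v * hz * bpp * blanking_overhead / 1e9

for name, (h, v) in {"2560 x 1440": (2560, 1440), "3840 x 2160": (3840, 2160)}.items():
    print(f"{name} @ 60Hz: ~{video_gbps(h, v):.1f} Gbit/s "
          f"(DP 1.2 payload ~{DP12_PAYLOAD_GBPS} Gbit/s)")
```

Doubling the 4K figure for 120Hz would blow past that budget, which is why 60Hz is the practical ceiling over this interface.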

Naturally, I disassembled the VG248QE to get a look at the extent of the modifications required to get G-Sync working on the display. Thankfully, taking apart the display is rather simple. After unscrewing the VESA mount, I just had to pry the bezel away from the back of the display. With the monitor on its back, I used a flathead screwdriver to begin separating the plastic using the two cutouts at the bottom edge of the display. I then went along the edge of the panel, separating the bezel from the back of the monitor until I had unhooked all of the latches. It was really pretty easy to take apart.

Once inside, it’s just a matter of removing some cables and unscrewing a few screws. I’m not sure what the VG248QE looks like normally, but inside the G-Sync-modified version the metal cage that’s home to the main PCB is simply taped to the back of the display panel. You can also see that NVIDIA left the speakers intact; there’s just no place for them to connect to.

It looks like NVIDIA may have built a custom PCB for the VG248QE and then mounted the G-Sync module to it.

The G-Sync module itself looks similar to what NVIDIA included in its press materials. The 3 x 2Gb DDR3 devices (768MB in total, matching the capacity quoted earlier) are clearly visible, while the FPGA is hidden behind a heatsink. Removing the heatsink reveals what appears to be an Altera Arria V GX FPGA.

The FPGA includes an integrated LVDS interface, which makes it perfect for its role here.

 

Comments

  • just4U - Friday, December 13, 2013 - link

    The only stumbling block I really have is being tied to one video chip maker because of the Monitor I buy. That's a problem for me...Stuff like this has to be a group effort.
  • butdoesitwork - Friday, December 13, 2013 - link

    "The interval remains today in LCD flat panels, although it’s technically unnecessary."

    Technically unnecessary from whose perspective? Anand, I'm sure you meant the monitor's perspective, but this otherwise benign comment on VBLANK is misleading at best and dangerous at worst. The last thing we need is folks going around stirring the pot saying things aren't needed. Some bean counter might actually try to muck things up.

    VBLANK most certainly IS "technically" needed on the other end ---- every device from your Atari VCS to your GDDR5 graphics card!

    VBLANK is the only precious time you have to do anything behind the display's back.
    On the Atari VCS, that was the only CPU time you had to run the game program.
    On most consoles (NES on up), that was the only time you had to copy finished settings for the next frame to the GPU. (And you use HBLANK for per-line effects, too. Amiga or SNES anyone?)

    On most MODERN consoles (GameCube through XBox One), you need to copy the rendered frame from internal memory to some external memory for display. And while you can have as many such external buffers as you need (meaning the copy could happen any time), you can bet some enterprising programmers use only one (to save RAM footprint). In that case VBLANK is the ONLY time you have to perform the copy without tearing.

    On any modern graphics card, VBLANK is the precious time you have to hide nondeterministic duration mode changes which might cause display corruption otherwise. Notably GDDR5 retraining operations. Or getting out of any crazy power saving modes. Of course it's true all GPU display controllers have FIFOs and special priority access to avoid display corruption due to memory bandwidth starvation, but some things you just cannot predict well enough, and proper scheduling is a must.
  • DesktopMan - Friday, December 13, 2013 - link

    G-Sync isn't V-blank, so, yeah, if you have G-Sync you don't need V-Blank. You can take your time rendering, not worried about what the monitor is doing, and push your updated frame once the frame is ready. This moves the timing responsibility from monitor to the GPU, which obviously is a lot more flexible.

    If you need time to do GPU configuration or other low level stuff as you mention, then just do them and push the next frame when it's done. None of it will result in display corruption, because you are not writing to the display. You really can rethink the whole setup from bottom up with this. Comparing to systems that are not this is kinda meaningless.
  • mdrejhon - Friday, December 13, 2013 - link

    Although true -- especially for DisplayPort packet data and LCD panels -- this is not the only way to interpret GSYNC.

    Scientifically, GSYNC can be interpreted as a variable-length VBLANK.

    Remember the old analog TV's -- the rolling picture when VHOLD was bad -- that black bar is VBLANK (also called VSYNC). With variable refresh rates, that black bar now becomes variable-height, padding time between refreshes. This is one way to conceptually understand GSYNC, if you're an old-timer television engineer. You can theoretically do GSYNC over an analog cable this way, via the variable-length blanking interval technique.
  • mdrejhon - Friday, December 13, 2013 - link

    Yeah, put this into perspective:
    "Refresh rates" is an artificial invention
    "Frame rate" is an artifical invention

    We had to invent them about a century ago, when the first movies came out (19th century), and then the first televisions (early 20th century). There was no other way to display recorded motion to framerateless human eyes, so we had to come up with the invention of a discrete series of images, which necessitates an interval between them. Continuous, real-life motion has no "interval", no "refresh rate", no "frame rate".
  • darkfalz - Friday, December 13, 2013 - link

    Will this polling performance hit be resolvable by future driver versions, or only by hardware changes?
  • Strulf - Friday, December 13, 2013 - link

    While you can still see the effects at 120 Hz, tearing or lag is hardly visible at all at 144 Hz. At this point, I can easily do without G-Sync.
    G-Sync is certainly a nice technology if you use a 60 Hz monitor and an NVIDIA card. But nearly the same effects can be achieved with lots of Hz. A 27" 1440p@144Hz monitor might be quite expensive though. ;)
  • emn13 - Friday, December 13, 2013 - link

    The article states that at any framerate below 30fps G-Sync doesn't work since a panel refresh is caused on at least a 30Hz signal. That conclusion doesn't make sense; unlike a "true" 30Hz refresh rate, after every forced refresh, G-Sync can allow a quicker refresh again.

    Since the refresh latency is 1s/144 ~ 7ms on this panel, and a 30Hz refresh is ~ 33ms, that means that when the frame rendering takes longer than 33ms - but shorter than 40ms - it'll finish during the forced refresh and will need to wait for that refresh to complete. Translated, that means that only if the instantaneous frame rate is between 25 and 30 fps will you get stuttering. In practice, I'd expect frame rates to rarely be that consistent; you'll get some >30fps and some <25fps moments, so even in a bad case I'd expect the stuttering to be reduced somewhat; at least, were it not for the additional overhead due to polling.
  • mdrejhon - Friday, December 13, 2013 - link

    And since the frame scan-out time is 1/144sec, that's one very tiny stutter (6.9ms) in a forest of high-latency frames (>33ms+)
  • 1Angelreloaded - Friday, December 13, 2013 - link

    Just a thought, but the one thing you really didn't take into account was NVIDIA's next GPU series, Maxwell, which supposedly will have VRAM right next to the die and will share memory with direct access from the CPU. If you take that into account along with G-Sync, you can see what will happen to framerates if they can pull off a die shrink as well.
    At this point I think monitor technologies are so far behind, and so profit-milked to the point of stifling the industry, that 1080p has had a far longer cycle than we could have expected, partially due to mainstream HDTVs. We should have had the next jump in resolution as a standard over a year ago in the $200-$300 price range, with standard IPS technologies with lower response times in that range as well, or been introduced to other display types. LED backlighting carried a heavy price premium when in reality it is considerably cheaper to produce, since it takes hazardous-waste CCFL bulbs (which cost more and carry disposal fees) out of the equation.
