NVIDIA Montreal Event - Live Blog

by Anand Lal Shimpi on 10/18/2013 9:49 AM EST

  • liahos1 - Friday, October 18, 2013 - link

    Anand, you are a true multitasker.
  • noeldillabough - Friday, October 18, 2013 - link

    I *just* bought a GTX 780, bummer lol.
  • Guspaz - Friday, October 18, 2013 - link

    Man, Anand is here in Montreal, and I'm stuck at work, unable to get an autograph :P
  • lm1988 - Friday, October 18, 2013 - link

    Speaking as someone who's only ever bought a GT210, I've lost all face here.
  • tipoo - Friday, October 18, 2013 - link

    Aww, I wanted them to call it N-Sync.

    And seriously, this guy took like 20 minutes repeating what the problem is before getting to the solution.
  • nathanddrews - Friday, October 18, 2013 - link

    G-Sync... I've always wondered why GPUs and monitors can't talk to each other in better ways - maybe use a webcam to sync settings or something. While I do not like the thought of more proprietary hardware/protocols, I DO love the net result.

    PLEASE let this start a new wave of HFR glory!
  • nathanddrews - Friday, October 18, 2013 - link

    AT, I assume that when you talk about "3D" and G-Sync, you mean any 3D game being played, not stereoscopic 3D, correct? I imagine it works with both modes, but I would like clarification...

    OK, so now my FW900 needs to last long enough to see one of these in action...
  • RealiBrad - Friday, October 18, 2013 - link

    I like the idea, but I fear the price premium.
  • AP27 - Friday, October 18, 2013 - link

    Since they seem to be working with 3D mode for G-Sync, maybe they can implement the LightBoost effect as well. Their focus seems to be on 120/144Hz monitors as it is.
  • B3an - Friday, October 18, 2013 - link

    Too much proprietary crap.... it's getting out of hand. I like the sound of G-sync a lot but we have AMD's Mantle, Nvidia's PhysX and now G-sync. All proprietary.

    Microsoft should do something to sort out this mess. Create something like G-sync and make it a DirectX 11.3 feature or something, while also trying to lower the DX overhead to make any performance advantage Mantle has insignificant. Everyone would benefit.
  • jjj - Friday, October 18, 2013 - link

    Any chance you can ask if they plan to integrate G-Sync into mobile devices (make the display driver if they have to) or suggest that they should try to?
  • DukeN - Friday, October 18, 2013 - link

    Finally, G-Unit and N-Sync merged.

    It's a great day for the world.
  • Kiste - Friday, October 18, 2013 - link

    For some reason I have the strong suspicion that the G-Sync module will go into TN-panel "gaming monitors", which are useless to those of us who actually care about image quality.
  • Sabresiberian - Friday, October 18, 2013 - link

    My thoughts, too. None of the so-called gaming monitors have received good reviews for their screen quality. 120Hz gets raves, but the TN panels used, not so much.
  • repoman27 - Friday, October 18, 2013 - link

    So the G-Sync chip is either replacing or directly driving the T-Con in the display... And we've totally thrown CVT out the window, so what protocol are the GPUs even using to connect to the display? Something proprietary over a DisplayPort/HDMI physical link?
  • Ikas - Friday, October 18, 2013 - link

    Would be great to see G-Sync available on laptops as well. Any word on whether it's coming to laptops, or is it PC monitors only?
  • dew111 - Friday, October 18, 2013 - link

    I am very skeptical about this implementation. If you know anything about how an LCD display works, you know that all the pixel data is swept out to the panel, kind of (but not really) like on a CRT. And you have to keep sweeping, because if you stop, the pixels start to wander back to a neutral position. Like JHH said, it would be immensely complicated to get around this. Interrupt the sweep, and some pixels may start to wander too far. Wait for the sweep to finish, and you have stutter.

    So the big questions are: how expensive is it? Is a new kind of LCD needed? Will there be versions on quality monitor types, or just TN panels (not interested)?

    And of course, an open standard would be nice, if it truly is as revolutionary as this press release makes it sound. Then again, they said the same thing about 3D Vision, and look where that is. At least on the face of it, it smells like another vendor lock-in scheme (although maybe it is also a really good technology).
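[The tradeoff dew111 describes - interrupt the sweep or wait for it and stutter - can be sketched numerically. A toy model, not NVIDIA's implementation; the only assumption is a fixed ~16.7 ms refresh interval at 60Hz:]

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60Hz refresh interval, ~16.7 ms (assumed)

def vsync_display_time(render_done_ms):
    """With a fixed sweep, a finished frame waits for the next refresh boundary."""
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS

def gsync_display_time(render_done_ms):
    """With variable refresh, the frame is scanned out as soon as it is ready."""
    return render_done_ms

# A frame that finishes just after a refresh boundary (17 ms) stalls
# almost a full extra interval under fixed vsync:
print(vsync_display_time(17.0))  # ~33.3 ms
print(gsync_display_time(17.0))  # 17.0 ms
```

[That near-full-interval stall, repeated unevenly frame after frame, is exactly the stutter in question.]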
  • dew111 - Friday, October 18, 2013 - link

    Also, comments like, "I was actually surprised at how my mind interpreted it," do not inspire much confidence. If you are conscious of how your mind interprets it, that seems like a bad thing. Maybe you just had your reviewer hat on at the time.
  • mdrejhon - Friday, October 18, 2013 - link

    Actually, LCDs today are able to sync to any refresh rate (60Hz through 144Hz). My existing 144Hz monitor can use any refresh rate, as long as it's unchanging. The monitor maker has successfully been able to drive pixels at any refresh rate as long as that rate is constant.

    What appears to be happening now is that the monitor is always scanned at a speed of 1/144 sec, but asynchronously of the 144Hz intervals. Modern LCD pixels are now stable, as LCD pixel transitions on modern 120Hz monitors are slowly starting to resemble square waves (small ripples during the first few milliseconds, then a flat plateau until the end of a refresh). Basically a 1ms monitor (usually ~4-5ms actual) leaves plenty of time (16.7ms) between refreshes at 60Hz. Because the transition curves are becoming cliffs on faster LCD monitors compared to the length of a refresh (16.7ms), LCD pixels are now stable enough to support asynchronous refresh rate operation.

    An excellent animation of LCD pixel transitions slowly starting to resemble square-waves:
    http://www.testufo.com/#test=eyetracking&patte...
    View this on a modern 120Hz or 144Hz monitor.

    This is proof that modern LCD panels are now ready for asynchronous refresh rates. They might need a "re-refresh" after maybe a tenth of a second or so, but GPU game frame rates are much higher than that.
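[The "re-refresh after maybe a tenth of a second" point can be sanity-checked with simple arithmetic: if pixels hold stably for ~100 ms, any frame rate above 10 fps delivers new frames before a forced re-refresh is ever needed. A back-of-the-envelope sketch; the 100 ms hold time is the commenter's rough estimate, not a panel spec:]

```python
# If an LCD pixel holds its state for ~100 ms before drifting, what is
# the minimum frame rate that never requires a panel self-refresh?

HOLD_MS = 100.0  # assumed stable hold time (a tenth of a second)

min_fps = 1000.0 / HOLD_MS  # frames must arrive at least this often
print(min_fps)  # 10.0

def needs_re_refresh(fps):
    """True if frames arrive too slowly and the panel must re-refresh itself."""
    frame_interval_ms = 1000.0 / fps
    return frame_interval_ms > HOLD_MS

print(needs_re_refresh(30))  # False: a 33.3 ms gap is well within the hold time
print(needs_re_refresh(5))   # True: a 200 ms gap exceeds it
```

[Since game frame rates are almost always above 10 fps, the re-refresh path would be a rare fallback rather than the common case.]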
  • drinkperrier - Friday, October 18, 2013 - link

    Montréal power!
  • mdrejhon - Saturday, October 19, 2013 - link

    I've developed a new method of combining G-Sync and LightBoost, without creating uncomfortable flicker at lower framerates:

    http://www.blurbusters.com/faq/creating-strobe-bac...

    Eventually, I hope variable-rate LightBoost becomes included so we don't have to choose between either G-Sync or LightBoost.
  • wumpus - Saturday, October 19, 2013 - link

    First, it's good to note that somebody realized that CRTs aren't being used anymore and thought about what that might mean. Now can we *please* have sub-pixel resolution (i.e. calculate the R-G-B values where each sub-pixel actually sits, instead of splattered together the way you had to with CRTs)?

    Second, they had Carmack there and he didn't give a peep about how critical G-Sync improvements are to the Oculus Rift? I'd assume that's the elephant in the room, much like Steam is for Mantle. Getting the updates timed right has to be at least as important as getting the frame rate up (to a certain point). I have to suspect that much past 60fps, the main advantage is that the margin of error between the time the frame was accurate and the time it was displayed goes down. G-Sync will make this more accurate.

    What isn't shown is any indication that G-Sync will triple-buffer or do anything silly like introduce intentional latency to nail down "smoothness". For less "twitchy" applications it might well make sense to know exactly when a frame will be needed, work on the next frame, and hold it a few milliseconds until it's time to show it.
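[The "margin of error" argument above can be made concrete: a frame's staleness is the gap between when its simulation state was sampled and when it reaches the screen, and fixed vsync can add up to a full refresh interval of waiting on top of render time. A toy sketch under that simple model - not a claim about how G-Sync or the Rift actually schedule frames:]

```python
# Worst-case "staleness" of a frame: render time plus any wait for the
# next scan-out. Fixed vsync can add up to one full refresh interval;
# variable refresh shows the finished frame immediately.

def worst_case_staleness_ms(render_ms, refresh_hz, variable_refresh):
    """Time from sampling the game state to the frame appearing on screen."""
    wait_ms = 0.0 if variable_refresh else 1000.0 / refresh_hz
    return render_ms + wait_ms

# A 10 ms frame on a 60Hz panel:
print(worst_case_staleness_ms(10.0, 60, variable_refresh=False))  # ~26.7 ms
print(worst_case_staleness_ms(10.0, 60, variable_refresh=True))   # 10.0 ms
```

[Under this model, raising the frame rate past 60fps shrinks staleness only modestly, while removing the vsync wait removes the largest remaining term - which matches the comment's intuition.]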
  • earthzero - Wednesday, October 23, 2013 - link

    So when do we see G-Sync on TVs? I would assume that, if the push into the living room continues, this would be a critical next step for gaming going forward. How will this technology compare with the high quality scalers that already exist in some TVs and in living room home theater equipment like mid to higher end AV receivers? How can HT equipment possibly benefit? These are the questions I would like to hear answers to soon.
  • skumdogg - Wednesday, March 12, 2014 - link

    So all of our DVD and Blu-ray players are going to need NVIDIA GPUs? I can see this getting really expensive really fast. Besides, Blu-ray and OTA TV display at a constant frame rate, so there isn't anything to "keep up" with.
