217 Comments


  • Insanejew - Friday, October 18, 2013 - link

    ITS OVER, AMD IS FINISHED Reply
  • mwildtech - Friday, October 18, 2013 - link

    Says the Jew... Reply
  • Insanejew - Friday, October 18, 2013 - link

    I am so confident that this will be the best thing ever, I am moving all my jew gold that I put in AMD stock for mantle back to Nvidia. Reply
  • mwildtech - Friday, October 18, 2013 - link

    Love this guy :) Reply
  • yannigr - Saturday, October 19, 2013 - link

    You still have AMD shares? I thought Jews were better at economics. You should have sold 2 days ago. Now you are more than 10% down :p

    On the other hand, you do realize that something that needs a specific monitor + a specific GPU + extra hardware to buy is not going to see wide adoption. Right?
    Reply
  • lmcd - Sunday, October 20, 2013 - link

    If this is compatible with older Nvidia cards (well, the 660 Ti for me) then I think my monitor purchasing criteria just changed quite a bit... Reply
  • ninjaquick - Thursday, October 24, 2013 - link

    lol, clearly you don't know how long-term investments work. When the price drops for a company like AMD, buy as much as you safely can. They own the next generation and have an incredible roadmap into mobile as well. Reply
  • andrerocha05251975 - Saturday, November 02, 2013 - link

    Wrong. If it works (if it works!), everybody will want it, just because it's a step up: an enhancement, something that makes the tech better. Reply
  • Spunjji - Tuesday, October 22, 2013 - link

    Amazing Reply
  • Da W - Friday, October 18, 2013 - link

    Lol.
    Just a matter of throwing my 3 brand new IPS screens out the window.
    Reply
  • MethylONE - Friday, October 18, 2013 - link

    I just shorted AMD based solely on your comment. Reply
  • Retired Budget Gamer - Monday, October 28, 2013 - link

    Thank you so much. Your advice paid my mortgage for the next 2 months. It confirmed that Nvidia is the only company innovating in this industry. AMD is probably going to capture the low end but the low end will instead be usurped by ARM, since soon there will only be basic mobile chips and server farms. Reply
  • Byte - Sunday, October 20, 2013 - link

    I've moved all my computers over to Radeon to mine litecoins, but drivers are just terrible on Radeons: stuttering, CrossFire sucks, and litecoins are dropping faster than AMD stock. People say Radeon and GeForce are neck and neck; I think they are delusional. It's time to move back to GeForce. Reply
  • sajara - Friday, November 08, 2013 - link

    I totally agree. I have always had better everything with GeForce over the years: 3 ATI/AMD cards here versus 2 GeForces, and the GeForces lasted longer in terms of usable longevity. Drivers were always better on GeForce too. But I have to admit that my laptop's 6570M (a rebranded 5730M) is pretty good bang for the buck almost 3 years later; it can still deliver 60fps in just about any game at High settings. Then again, 1366x768 is a pretty low resolution for the Madison chip. Reply
  • TempAccount007 - Saturday, November 09, 2013 - link

    ATI?

    How retro.
    Reply
  • Retired Budget Gamer - Monday, October 28, 2013 - link

    Have you ever heard of triple buffering? Reply
  • stangflyer - Friday, October 18, 2013 - link

    If you have 3 monitors that have G-Sync, will this work for multi-monitor single-card and SLI gaming? That is the next question I would love to see answered. Reply
  • GmTrix - Friday, October 18, 2013 - link

    I would guess that it would. My question is: would you get different frame rates on the different monitors as the buffer was readied for each, or would the GPU wait till all three buffers were ready before notifying the monitors to refresh? Reply
  • inighthawki - Friday, October 18, 2013 - link

    There may be a technical limitation that settles that, but I really don't see any reason it couldn't do either. The drivers are already capable of synchronizing the rendering and presentation of two cards. If there is really a software component involved, I don't see why it couldn't just insert a wait until all frames are ready and broadcast the vsync to all monitors at once. Reply
  • invinciblegod - Friday, October 18, 2013 - link

    Between G-Sync, Mantle, and TrueAudio, the various proprietary GPU features are really making the choice of a next-gen video card difficult (though all 3 are unsupported at this very moment). Reply
  • SetiroN - Friday, October 18, 2013 - link

    Not really.
    We have two AMD features that are most likely going to be used a couple of times at best, and that only amount to better performance, against a "game changer", to quote Anand.
    Even if Mantle were to give me +40% fps in Battlefield, it doesn't even remotely compare to perfect smoothness in every game without the additional vsync input lag.
    Reply
  • Raviolay - Saturday, October 19, 2013 - link

    Any gamer worth his salt is running a 120/144htz screen and turns off V-sync or anything else like TXAA that causes a mushy mouse and fuzzy images. Lightboost is a must, however, and that hack now works with AMD. G-Sync is of little import to someone like me; more frames mean a better response time with my mouse. Screen tear is negligible at 120/144htz even without any synchronization, rendering G-Sync moot. That is, unless you want to run at 4K @ 60fps; and given that the new consoles target 1080p, the improvements are going to be neutered by low texture quality, and I don't see developers making 4K textures for PC ports any time in the next 4 years. Reply
  • treeroy - Saturday, October 19, 2013 - link

    "Any gamer worth its salt is running a 120/144htz screen"
    What absolute rubbish. 120Hz monitors weren't even a thing a few years ago, yet it was perfectly possible to be a core gamer then. I use a 60Hz monitor because I don't have $500 to throw on a brand new screen, yet I would definitely describe myself as a proper gamer.

    I also find your use of the term "htz" quite hilariously ironic. It's like saying, "Any driver worth his salt speeds at 80msph". If you're going to make yourself out to be a tech badass, at least learn that it's Hz not htz.
    Reply
  • nathanddrews - Saturday, October 19, 2013 - link

    120Hz monitors weren't a thing? Ever heard of these things called CRTs? The 60Hz prison of LCDs is the worst thing to happen to gaming in the last decade. I feel a bit strongly about this topic... XD Reply
  • jasonelmore - Saturday, October 19, 2013 - link

    The Asus VG248QE used in this demo is only $299 on newegg

    http://www.newegg.com/Product/Product.aspx?Item=N8...
    Reply
  • dylan522p - Saturday, October 19, 2013 - link

    Oh really? So as a gamer having 3 27" 1440p Ultrasharps makes me not a real gamer.... Reply
  • lmcd - Sunday, October 20, 2013 - link

    I don't know gamer or not but it definitely makes you a target for envy! Reply
  • sajara - Friday, November 08, 2013 - link

    Yeah. I've only got 768p and I'm a gamer. And I was a hardcore gamer in 1985 with 256i, but I did not use a mouse back then... :p Reply
  • treeroy - Saturday, October 19, 2013 - link

    Mantle has the potential to be extremely interesting, considering the next-gen platforms are all AMD-based. I'm not convinced about trueaudio, but I'm also not sure I care about g sync. I don't want to buy a new monitor just to utilise a solution to a problem that I haven't got in the first place. Reply
  • ninjaquick - Thursday, October 24, 2013 - link

    http://www.hardwaresecrets.com/article/Introducing...

    This is an example of a *very* similar standard VESA already has. Modifying it to suit the needs of a dynamic-sync, variable-refresh solution would be extremely easy.

    Did Nvidia beat AMD to the punch? Sure. However, if you think that Nvidia will magically be the exclusive provider of this experience, well... that is just silly. This is the type of thing that will become an industry standard very quickly. The same can be said about Mantle and TrueAudio, of course... though the likelihood of Microsoft giving developers Mantle levels of hardware access is pretty iffy at best.
    Reply
  • TheJian - Saturday, October 19, 2013 - link

    Incorrect. Mantle and TrueAudio will need heavy investment from a broke AMD. Much like OpenCL... why code for it? Just to be open? That coding isn't free, and CUDA is already there for every app, funded by billions from NV over 7 years. On the other hand, G-Sync will be used and is desired by everyone who is tired of tearing/stutter. It's a lot easier to push something that we all want, and will likely be asked to pay a premium for (thus companies see more profit, and it pushes new product purchases), than something that just costs more to implement for a small subset of AMD's small discrete market share. You can't charge $80 for a Mantle-optimized game; people will still only pay $60 (not so with enhanced monitors etc). Mantle died the second it said it would help people get off consoles, which of course caused the console makers to freak. MS had an immediate response and Sony will do the same (BLOCK IT). If your games run great elsewhere you don't need my consoles, thus I block your tech forever. $8 million for DICE to use Mantle on Battlefield 4: at that price AMD can't afford another game or two, let alone dozens yearly. Mantle has been slain, like it or not. TrueAudio is the same story: same subset, doesn't allow higher-priced games, just costly to implement. I already have a sound card that works OK.

    NV wins the proprietary war (good or bad, those just seem to be the facts). The choice is over, if it's based on the tech and on the chances of that tech being implemented everywhere. :) I will gladly pay $100 for years of great gaming across titles, with nothing needed but NV driver work, rather than praying for every dev to code for it. My current 24in/22in combo (Dell 24 + LG 22) is 5+ years old (the Dell is over 6). Less than $20 a year for a vastly improved experience all the time. Sign me up.

    "I can't stress enough just how smooth the G-Sync experience was, it's a game changer."
    All I need to know as I'm planning a Maxwell purchase and a new monitor (27 or 30in to replace my 22in) next year. They'll get a Tegra 5/6 tablet or a Shield rev 2 or 3 out of my wallet eventually to keep it all tied together (much like the pull of the Apple ecosystem). I'm not seeing how AMD's new tech will stop me from switching, knowing it probably won't be supported unless paid for by AMD.

    To get the tablet sale from me they'll need to put Tegra 5/6 in a 1080p 13 or 20in though. 2560x1600 just makes games too slow until 16/14nm. My Radeon 5850 has to be jacked all around just for my 1920x1200 Dell 24, and when that fails I go to the 22in at 1680x1050 for happiness... LOL. Expecting a tablet to push great games at 2560x1600 in a 6-8W envelope or so is just stupid. My dad's Nexus 10 looks great, but gaming is pretty crap at that res for a lot of stuff, and I have no need for a res higher than 1080p for movies in that small a form factor (nor games; my Dell looks great at 1920x1200). In a 13-20in, 1080p is fine for me and a perfect fit for movies on the go etc.
    Reply
  • SlyNine - Saturday, October 19, 2013 - link

    You talk and talk, but it's all your opinion or complete conjecture. You sound very arrogant. Like anyone really believes you have some magical crystal ball...

    Mantle is pretty interesting; we will see what happens. Don't kid yourself into thinking you know.
    Reply
  • Klimax - Saturday, October 19, 2013 - link

    All he needs is verified facts for a base, and then to build up the case. Which he did.

    You'll need a bit more than cheap ad hominem and misdirection.
    Reply
  • SlyNine - Saturday, October 19, 2013 - link

    What facts did he use again? Reply
  • TheJian - Monday, October 28, 2013 - link

    Paul Graham's Hierarchy of Disagreement
    https://en.wikipedia.org/wiki/Ad_hominem
    You need to read this :)

    The facts I gave were that TrueAudio and Mantle require INVESTMENT from AMD, as devs don't just pony up their own funds for free. Writing for Mantle for 1/3 of the market (and less than that, since not all AMD cards support it anyway, so you're writing for a niche) will be on TOP of writing for everyone else (AMD cards that don't support it, Nvidia, Intel, etc). They will just write for ALL unless AMD pays them. That is a FACT. AMD paid $8 million for BF4, get it? You can't charge more for a MANTLE-optimized game, right? So it gains devs nothing financially; so again, AMD, pay me or buzz off with your extra Mantle code I need to write... get it?

    On the other hand we ALL want stutter free, tear free gaming. I can sell that for more money (if I'm a monitor maker etc). So I'm inspired to pitch that as it adds new products and EXTRA profits over regular models. NV doesn't have to pay me to want more money, I just get to charge it anyway...LOL. FACT. Easy to push something like that vs. something that gets a dev NOTHING. Understand the difference? EA can't charge more for BF4 because it works better on mantle cards, so they gain nothing but extra dev cost, which they won't pay, hence the check from AMD.

    Each game needs Mantle/TrueAudio help from devs (extra work), but once I get a monitor with G-Sync I'm gold for years. Which is easier to push? FACT: the one that takes less work and makes a lot of people extra cash (G-Sync).

    Also, with G-Sync devs eventually get something they've wanted for a long time (they don't have to use it, but they can once it's ubiquitous). When you are above the performance you need in a portion of the game (say when you're hitting 100fps), they can code the game to spend the extra fps on extra graphics. So when you have extra power, the game AMPS up on the fly, so to speak. It's NV's job to get that on-the-fly part right (or AMD's, if they end up on board or make their own compatible G-Sync equivalent). I'd much rather be dependent on drivers than on 100 devs to figure out if I can run full blast here or there; it puts more work on NV rather than on a dev. Devs are tired of writing for 30/60fps and being limited by the lowest point in a game; they want to take that extra power and amp up when it's available.

    Also as the OP stated I added to my base statements with a quote from Anand himself:
    "I can't stress enough just how smooth the G-Sync experience was, it's a game changer."
    Then I explained why it affects my purchase and why most will do the same. They get me for the monitor because I just want that stutter-free smooth stuff and better games later; then they get me for a card, probably a tablet, etc, just like the Apple ecosystem. Then I note that AMD has no way (currently?) that I can see of stopping me from getting pulled in by an ecosystem.

    BASE (central point, whatever)
    Supporting statements
    Conclusion

    You countered with nothing. 'you have an opinion and have no crystal ball'...Pfft. Best you got?

    It's not conjecture to see Apple's ecosystem suck people in and keep them. This is the same strategy. AMD has a strategy too; it's just not the right one to win, for all the reasons I pointed out (they have to PAY people to do it for no return on the time, no higher-priced games, etc). IF they could get away with charging $80 instead of $60 because it's a "special Mantle game" then maybe devs would bite, but in reality that's a tough sell. So if you can't get your tech ubiquitous then it won't sell more of your cards/tech. AMD's ecosystem is a tough sell. NV's isn't, and it makes money for companies, pushes products, and raises prices (at least until G-Sync is a commodity item at some point). You don't like the facts, so you choose to ignore them. I never said Mantle isn't interesting, and I don't even care if it's AWESOME (it could be, but who cares? Only AMD). It's not EXTRA profitable for game devs to write for it, so it's useless to them. Understand? Code will ALWAYS have to be written for everyone else anyway (until AMD owns 90% of the market... LOL), so this will never be adopted. Remember Glide? Did you watch the Carmack/Andersson/Sweeney video discussing this? They all said this is NOT the way to go, with more APIs, and they hope NV doesn't pull this too. Nuff said. No love from the top three devs; they only hope it leads to ideas in the mainstream (like stuff added into OpenGL or something), not that they want it to live. Carmack came right out and said we already have this with Nvidia extensions; if you want direct-to-hardware access it's pretty much already there. Translation: he's not impressed.
    Reply
  • JacFlasche - Thursday, November 14, 2013 - link

    It seems to me that your entire harangue is based upon a faulty premise. One of the main objectives of Mantle, as recently stated in the interview on TH, is that companies that code games for consoles will be able to port low-level work directly to a PC release that can use it as-is. To my mind this entirely negates your assumption that it would add expense for game studios. As for your economic forecasts, Forbes disagrees with you: http://www.forbes.com/sites/jasonevangelho/2013/10... At any rate, they will be able to sell twice as many of these if they are eventually AMD-compatible. I don't see it as an either/or situation. I want an AMD card with Mantle and a G-Sync-type monitor, and there will be other types of G-Sync tech, in my OPINION. At any rate, from reading about both techs, I still think Mantle is the more significant development, and if things stand as they presently are, I will upgrade with Mantle and wait to upgrade my monitor until a head-mounted display like the Rift is perfected in high res (way better and cheaper than a monitor), hopefully with a G-Sync-like tech; but if not, Mantle will still work on it. My next monitor purchase will be an OLED monitor, which I have been waiting on for a decade now; when they hit $2k in price, I will buy. Until then I will make do with my old Samsung and a new Oculus Rift, when they are more perfected. How is G-Sync going to help me on a Rift? It won't, unless it is included in the hardware, which I doubt. What would really be nice is a G-Sync module that you could jack into any display device. Then it would be killer. Reply
  • SlyNine - Saturday, October 19, 2013 - link

    And I didn't come to any conclusion; I simply countered with: you don't know any more than anyone else. Which is true, is it not? Reply
  • treeroy - Saturday, October 19, 2013 - link

    Next-gen games are already $80, so your argument that "people will only pay $60" is a joke.
    Mantle games are not going to cost AMD $8 million each; it was only that high because BF4 is the launch title, and it had more to do with marketing than with investment in the game. Mantle will not cause games to go up in price; that's completely illogical. Game prices have been frozen for the past 6 years or so, and there has been PLENTY of new technology introduced in that time.

    And the notion that consoles will block Mantle is crazy. The PS4 for one is being as open as possible, so Sony isn't going to say "Oh actually you can't make that game on our platform because it will move people to PC", and I imagine it's the same on the green side of the consoles. Moreover, AMD has complete dominance in the next-gen console generation, so even if Mantle doesn't take off (I think we're all sceptical of it), multiplatform games will still get optimised for AMD technology, which for many people is going to keep them/move them into the AMD side of the graphics war.

    You seem to be an nvidia fanboy. You should probably stop.
    Reply
  • medi02 - Sunday, October 20, 2013 - link

    If mantle is close to API's exposed to console devs, it will surely take off. Reply
  • ninjaquick - Thursday, October 24, 2013 - link

    lol, DICE's repi has already said that while Mantle adds some overhead, it really has been designed to be practically copy-paste. The only reason it exists at all is because developers want it. 8 Million for DICE to use Mantle? lolwut?

    Then there is TrueAudio, which supports FMOD and WWise, the two most popular audio engines, through a plugin. Again, minimal developer input is required to use the hardware. It isn't some mystical X-Fi style solution either; it simply takes FMOD and WWise instructions and runs them through a dedicated compute path, away from the CPU.

    Lastly, Microsoft cannot actually block Mantle; they simply do not support it (as in, provide customer support for it) on the Xbone. Sony has no reason to decline Mantle support, as they are not trying to force developers to use Direct3D code.

    You think Mantle will fail, which is funny, since Maxwell is including an ARM co-processor to do basically the exact same thing: provide an alternative programming path for developers. Except that one is fully proprietary and only guaranteed to work on Maxwell cards, whereas Mantle will work on any GCN cores.
    Reply
  • ninjaquick - Thursday, October 24, 2013 - link

    G-Sync will most likely spawn a VESA-standard counterpart, which will be a driver update away for AMD cards. Hell, even TrueAudio and Mantle are something Nvidia could get into if they wanted. Leave it to Nvidia to create a closed-source solution where their competitor has been doing open-source, dev- and consumer-friendly work all along. Reply
  • LedHed - Monday, October 28, 2013 - link

    "where their competitor has been doing open-source, dev- and consumer-friendly work all along."

    I hope you are joking... Mantle could have changed the PC community as a whole if AMD had simply opted to license the API out; NVIDIA could port it through CUDA easily. Instead AMD is holding on to it like a newborn baby and will essentially squander the opportunity to create an almost seamless transition from console to PC. Mantle is extremely proprietary (the opposite of open source), and TrueAudio is also proprietary, so how you came to the conclusion that anything they are doing is open source is beyond me. In my opinion, Mantle will be implemented in games (the few AMD can afford; look at their current stock trend) in a similar fashion to how we are seeing PhysX implemented (Batman Origins). No one is going to want to recode their entire game around a new API that only a tiny percentage of gamers can use (compare Mantle-compatible GPUs to all the rest currently in use according to the latest Steam survey), so I believe they will code specific elements of a game, just as you see with PhysX effects. I honestly doubt we are going to see anything groundbreaking with BF4 and Mantle, considering BF4's engine was already well established before AMD even approached DICE with Mantle. Obviously this is all my opinion, but unlike most posters in here, I am an author for a well-known GPU site.
    Reply
  • Exodite - Friday, October 18, 2013 - link

    Did you get to see how this impacts mouse response?

    The reason I personally skip V-Sync when gaming is that I consider even minimal mouse lag far, far worse than any amount of tearing and the like. Since G-Sync has to be messing with what comes from the computer, I'm, needless to say, concerned that it'll introduce gameplay issues (with the mouse controls) while providing better image quality.
    Reply
  • Maxwell_88 - Friday, October 18, 2013 - link

    I doubt it will impact mouse response in any way. As far as I understand it, the monitor only scans out from the framebuffer when the GPU has a frame ready. Reply
  • Exodite - Friday, October 18, 2013 - link

    That sounds encouraging, I hope you're correct. Reply
  • nathanddrews - Friday, October 18, 2013 - link

    Sounds about right. Reply
  • inighthawki - Friday, October 18, 2013 - link

    It will still have the same impact if the game renders faster than the display can refresh. The actual cause of input latency with vsync enabled is the game rendering more quickly than 60fps. To avoid artificially limiting performance, GPU rendering of frames is queued asynchronously: games do not start rendering a frame at vsync, they start as soon as the previous frame is done. This means that if you have a 16ms frame interval (60Hz) and you finish rendering in 1ms, the game proceeds immediately to render frame 2, then frame 3, and so on (up until a point where the OS blocks you from queuing more work). What you end up with is that you've rendered 3 frames in 3ms, but they're queued up and you have to wait ~50ms to see the last result (3 frames later). A variable framerate doesn't change this, it simply makes things smoother. If you have G-Sync on a 60Hz display and you continue to render at 200Hz, you will see the same thing happen. G-Sync is only beneficial for the flip side, where the rendering cannot keep up with the display rate. It solves the common issue of the framerate halving with vsync enabled when a frame takes longer than the vsync period, but it won't solve the issues with mouse latency. Reply
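The queue-induced latency described above can be sketched with a toy model (made-up numbers: a 1 ms render time, a 60 Hz panel, and a 3-frame queue; not NVIDIA's or Windows' actual pipeline):

```python
# Toy model of vsync frame queuing: the game renders each frame in 1 ms,
# the display scans out one frame per 16.67 ms slot (60 Hz), and the OS
# allows up to 3 frames to be queued ahead.
REFRESH_MS = 1000 / 60   # one scanout slot at 60 Hz
RENDER_MS = 1.0          # time the game takes to render a frame
MAX_QUEUED = 3           # frames allowed in flight ahead of the display

def simulate(num_frames):
    latencies = []
    render_done = 0.0
    for i in range(num_frames):
        # Frame i is shown at its scanout slot, one per refresh interval.
        display_time = (i + 1) * REFRESH_MS
        # The game may render ahead, but only MAX_QUEUED frames deep:
        # frame i can't start before frame (i - MAX_QUEUED) has scanned out.
        earliest = 0.0 if i < MAX_QUEUED else (i - MAX_QUEUED + 1) * REFRESH_MS
        render_done = max(render_done, earliest) + RENDER_MS
        # Latency = when the frame is seen minus when it was produced.
        latencies.append(display_time - render_done)
    return latencies

lat = simulate(20)
print(round(lat[-1], 1))  # settles near 3 refresh periods (~49 ms here)
```

At these numbers the queue fills immediately, and every displayed frame is roughly three refresh periods old, which is the mouse-lag effect being described.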
  • nafhan - Friday, October 18, 2013 - link

    It sounds like this should and will be coupled with high refresh rate monitors... which pretty much negates everything negative you said. Reply
  • inighthawki - Friday, October 18, 2013 - link

    While that would be ideal, I see no reason why this technology would be limited to high-refresh-rate monitors. 60Hz monitors can certainly make very good use of this technology.

    Higher refresh rates will certainly help, but do not necessarily "negate" the issue. There are and still will be three frames of latency; they will just be smaller frames.
    Reply
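The "smaller frames" point is just arithmetic: three queued frames cost three refresh periods at whatever rate the panel runs. A quick sanity check:

```python
# Worst-case latency from 3 queued frames on a 60 Hz vs a 144 Hz panel.
queued_latency_ms = {hz: 3 * 1000 / hz for hz in (60, 144)}
for hz, ms in queued_latency_ms.items():
    print(f"{hz} Hz: {ms:.1f} ms of queued latency")  # 50.0 ms vs ~20.8 ms
```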
  • datex - Friday, October 18, 2013 - link

    As an indie game dev, what can I do to take advantage of G-Sync? Is there a developer FAQ or anything?

    Now looking into ways to not buffer 3 frames in advance...
    Reply
  • inighthawki - Friday, October 18, 2013 - link

    I do not know about OpenGL, but on DirectX, you can use the following function to specify the maximum frame latency per device:
    http://msdn.microsoft.com/en-us/library/windows/de...

    It's targetable on Windows 7 and above, or Vista with the platform update; if you're already using DX11, querying support and setting the value can be added in just a few lines of code, with no need to even check the OS version.

    Windows 8.1 also goes a bit further using a "low latency present" API, but you would need to be more careful with this one:
    http://msdn.microsoft.com/en-us/library/windows/ap...

    Remember, though, that queuing frames ahead is a method of providing tolerance over the course of a few frames. Imagine your game runs at 100fps. Over time it'll queue a few frames ahead, and if suddenly you take 25ms to render a frame, you already have 2 frames of tolerance queued up so that the presentation remains smooth. Doing a completely serialized render loop (1 frame at a time, never more) can lead to a lot of stuttering when this kind of scenario occurs.
    Reply
  • inighthawki - Friday, October 18, 2013 - link

    To clarify, this is not how to take advantage of G-Sync, just in response to your last comment about how to not buffer 3 frames at a time Reply
  • Maxwell_88 - Friday, October 18, 2013 - link

    Exactly what Nathan said. Nowhere in this presentation have they mentioned that G-Sync monitors will be capped at 60Hz; they will be capped at whatever refresh rate the monitor supports. And as I see it, that is not a limitation of G-Sync but of the monitor itself. Reply
  • inighthawki - Friday, October 18, 2013 - link

    I didn't mean to imply that 60Hz was a cap at all. It was just an example using the standard refresh rate of 99% of monitors ;).

    The input latency introduced by vsync is entirely dependent on the maximum refresh rate of the display device.
    Reply
  • Sancus - Friday, October 18, 2013 - link

    It's not messing with what comes from the computer, it's telling the monitor that it can only refresh when a frame is ready. The reason that vsync increases input lag is because it forces the *video card* to wait for the monitor. This means that if the video card isn't able to output 60fps, it drops even more frames trying to sync with the monitor. G-sync alleviates this because it allows the video card to tell the monitor to only refresh when it has a frame ready, which means the video card no longer needs to drop frames waiting for sync. There should not be vsync-style input lag introduced by g-sync. Reply
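Sancus's distinction can be pictured with a simplified model (just the timing math, not the actual G-Sync protocol): under fixed-rate vsync, a finished frame waits for the next 16.67 ms tick, while a variable-refresh panel scans out as soon as the frame is done.

```python
import math

REFRESH_MS = 1000 / 60  # fixed tick of a 60 Hz panel

def effective_fps(render_ms, variable_refresh):
    if variable_refresh:
        # Panel refreshes the moment the frame is ready.
        return 1000 / render_ms
    # Classic vsync: round frame time up to a whole number of refresh ticks.
    ticks = math.ceil(render_ms / REFRESH_MS)
    return 1000 / (ticks * REFRESH_MS)

# A 20 ms frame (50 fps worth of GPU work) halves to 30 fps under vsync:
print(round(effective_fps(20, variable_refresh=False)))  # → 30
print(round(effective_fps(20, variable_refresh=True)))   # → 50
```

This is the framerate-halving effect vsync causes whenever the GPU can't quite hit the full refresh rate, and the case variable refresh fixes.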
  • Exodite - Friday, October 18, 2013 - link

    Thanks for elaborating on the subject, that sounds reasonable. Reply
  • inighthawki - Friday, October 18, 2013 - link

    The video card does not drop frames waiting for vsync. It queues up frames ahead, and that is where the latency comes from. By default, Windows lets DirectX render 3 frames at a time, unless requested otherwise. This means that it's capable for a game to render three frames ahead, meaning you will only see the results of what you render three frames later. THAT is where most of the latency comes from. Reply
  • ninjaquick - Thursday, October 24, 2013 - link

    That sounds an awful lot like: http://www.hardwaresecrets.com/article/Introducing... Reply
  • Threnx - Friday, October 18, 2013 - link

    This bothers me as well, but I've found that turning v-sync off in game, and enabling adaptive v-sync with nvidia's drivers gets rid of mouse lag every time. Reply
  • SetiroN - Saturday, October 19, 2013 - link

    They (NV) specifically said that G-Sync eliminates the need for buffering, so the natural conclusion would be that we should expect latencies comparable to vsync turned off. Reply
  • inighthawki - Saturday, October 19, 2013 - link

    That's impossible; there is ALWAYS buffering if you are rendering faster than the display's refresh rate. If the display maxes out at 144Hz, then you cannot render faster than 144Hz without queuing frames ahead. The only thing that reduces input lag here is that frames are shorter at 144Hz, so there is less time to drain a queue of frames, leading to a better worst case. Reply
  • Kevin G - Friday, October 18, 2013 - link

    I'm wondering how G-Sync timing is sent to the display. I presume it is over DisplayPort as it has a dedicated AUX channel that could be used for this.

    The other thing that springs to mind is the idea that this is a variant of panel self refresh (PSR). This would allow the panels to refresh on their own independent of the host GPU's timing. PSR was designed as a power saving technique as it would allow the GPU to power down in mobile devices. My guess is that nVidia thought of a nice high performance usage for it by coordinating the refresh rate. This would explain the presence of the 1.5 GB buffer on the G-Sync card (though I'm thinking that the actual amount is 192 MB by way of six 256 Mbit chips).

    I would also fathom that this technology can be used to shave off a few hundred microseconds in the process by preemptively starting the display refresh before the new frame buffer is finished. This is what causes tearing with V-Sync disabled but here with a software driven timing algorithm, the swap to a new frame buffer could happen when it is >90% complete.

    I'm also enthusiastic about G-Sync becoming part of the Oculus Rift. This seems like a solution to what Carmack and others have been searching for in VR. In fact, I'm kinda surprised that Jen-Hsun didn't reference Carmack's new position at Oculus when talking about G-Sync. Sure, he's more famous as a programmer, but the Oculus side of his work is where G-Sync will have the most impact.
    Reply
  • errorr - Friday, October 18, 2013 - link

    This was exactly what I thought when I saw this. Reply
  • repoman27 - Friday, October 18, 2013 - link

    I highly doubt the AUX channel will have anything to do with timing. DP is packet based, and as Anand pointed out NVIDIA is just manipulating the timing of the v-blank signal within that packet stream to indicate a refresh.

    Also, there is clearly 756 MB of DDR3L SDRAM on that module if you look at that picture closely. H5TC2G63FFR = http://www.skhynix.com/products/consumer/view.jsp?...
    Reply
  • Kevin G - Sunday, October 20, 2013 - link

    Good catch on the RAM parts. Earlier shots of that board weren't high res enough to make out the part numbers. It could be either 768 MB or 1.5 GB if the back side is populated too (which I haven't seen a picture of).

    The Saturday update to this article does point toward this being a twist on PSR, as the modification goes pure DisplayPort. I wouldn't have expected the PSR buffers to need to be so large. Even at cinematic 4096 x 2304 resolution, that's 21 times the amount necessary for a 32-bit frame buffer. 768 MB has enough space for a couple of 8K buffers at 64-bit color depth.
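    The arithmetic above is easy to sanity-check (a quick sketch, assuming "8K" means 7680 x 4320):

```python
mib = 2 ** 20
# 32-bit "cinematic 4K" frame buffer: 4096 x 2304 pixels, 4 bytes each
fb_4k = 4096 * 2304 * 4 / mib          # 36 MiB per frame
print(round(768 / fb_4k))              # ~21 such buffers fit in 768 MiB
# 64-bit-color 8K frame buffer: 7680 x 4320 pixels, 8 bytes each
fb_8k = 7680 * 4320 * 8 / mib          # ~253 MiB per frame
print(768 >= 2 * fb_8k)                # room for a couple of them: True
```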

    My guess is that this could be the first phase of the technology's introduction. Phase 2 would likely have the monitor accept multiple DisplayPort inputs from different nVidia cards in SLI. The monitor would buffer the frames from either video card and pick what is best based upon its own timing algorithm. This would essentially allow the video cards to operate asynchronously with respect to the monitor's refresh and even each other.

    Other possibilities include multi-monitor setups that have one GPU dedicated purely to rendering what is on one screen. Rendering in that fashion is notoriously hard to load balance, as what appears on one screen may be easier to render than the others. This produces an asynchronous result on a per-monitor basis, but if the monitors themselves are handling the refresh, then this problem may become a non-issue. The other downside of per-GPU, per-monitor rendering is that it'll likely be buggy under current APIs due to how things are rendered as a single logical surface (i.e. the frame buffer covers all three displays).
    Reply
  • repoman27 - Tuesday, October 22, 2013 - link

    So clearly when I said 756 MB, I really meant 768 MB (d'oh!). But anyways, I strongly suspect that only one side is populated and that's all she wrote. Either way, it's quite a bit of RAM.

    However, if this works as I surmise and just negotiates the fastest possible DisplayPort main link rate and then uses bit stuffing between frames, this could be a way of buffering that entire firehose. DP 1.2 can shift 2 GB/s, which would make 768 MB a 373 ms buffer if it was able to be used in its entirety. Or to put it another way, at 30 fps the G-Sync chip may need to ingest 68.7 MB of DP data to get one displayable frame.
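    A quick sketch of that arithmetic (assuming DP 1.2 HBR2: 4 lanes x 5.4 Gbit/s with 8b/10b encoding, i.e. 80% payload efficiency):

```python
payload_Bps = 4 * 5.4e9 * 0.8 / 8       # ~2.16e9 bytes/s of DP 1.2 payload
buffer_bytes = 768 * 2 ** 20            # the 768 MiB module
# How many milliseconds of full-rate link data the buffer can hold:
print(round(buffer_bytes / payload_Bps * 1000))    # -> 373 ms
# How much link data arrives per displayed frame at 30 fps:
print(round(payload_Bps / 30 / 2 ** 20, 1))        # -> 68.7 MiB
```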
    Reply
  • Kevin G - Wednesday, October 23, 2013 - link

    I was thinking that the tech could scale to support 120 or 144 Hz refresh rates at 4k resolution. While DP 1.2 is fast, it isn't fast enough to handle that resolution and refresh rate simultaneously. To side step the issue, the monitor uses two DP links. Each frame from each link gets buffered but the display itself picks what would be the most appropriate based upon when they arrive. The only means of getting such high frame rates would be to use SLI. This allows each video card to feed the host display. If the display itself is picking what frames are displayed, then the synchronization between each video card can be removed on the host PC.

    Though by my calculations, a 3840 x 2160 frame would consume 33.2 MB of data. At 60 Hz, the 768 MB of memory would provide roughly a 385 ms buffer.
    Reply
  • PPalmgren - Friday, October 18, 2013 - link

    Excellent idea, but it's going to be really, really, really hard to sell to the people who need it.

    G-sync targets systems that regularly dip below the 60hz threshold, and costs over $100 at the moment. When you're buying/building a system, do you spend that extra money on a better card so you never have the problems it corrects, or do you buy g-sync?

    Maybe it will get life from the big spenders in the high refresh rate monitors, but its gonna take a lot of production to push that price down. The people who need this the most are the ones that are least likely to afford it.
    Reply
  • kyuu - Friday, October 18, 2013 - link

    Yeah, that's my thought. People with cheap graphics cards aren't likely to spend an extra couple hundred dollars on a monitor. People with more expensive graphics cards, who might have the money, likely aren't having issues staying at or above 60 FPS.

    For me, spending a couple hundred dollars on a new monitor that's going to have a mediocre TN panel and getting locked into one vendor for graphics cards = no go.
    Reply
  • kyuu - Friday, October 18, 2013 - link

    I should say a couple hundred dollars over and above the normal price of the monitor. Reply
  • Braincruser - Friday, October 18, 2013 - link

    Also, people with cheap graphics cards will be excluded from the GTX-and-above club that Nvidia has set. Reply
  • RoninX - Friday, October 18, 2013 - link

    The point is that you only need to buy the monitor once.

    So, for example, I have a moderately high-end (but not crazy high-end) GTX 680. When it came out, it could pretty much hit 60 Hz with all of the games turned up to max settings. Now, there are a few where it might dip into the upper 50s. The problem is that if you have V-Sync on, 59 Hz is actually just 30 Hz. And if V-Sync is off, you have tearing.

    G-SYNC solves this problem, and you only have to pay the price once, instead of increasing your GPU update schedule from, say, once every two years (my schedule) to every year.

    Of course, I wouldn't settle for a cheap TN panel. I'm currently using a Dell Ultrasharp U2410 IPS display, and I would only buy a G-SYNC monitor that could match or better this picture quality.
    Reply
  • RoninX - Friday, October 18, 2013 - link

    I should clarify what I mean. 59 Hz is obviously better on average than 30 Hz, even with V-Sync, but you will get stuttering that drops the update rate to 30 Hz between some frames. Reply
  • medi02 - Sunday, October 20, 2013 - link

    @The problem is that if you have V-Sync on, 59 Hz is actually just 30 Hz.@

    Oh dear...
    Reply
  • nathanddrews - Friday, October 18, 2013 - link

    I believe you're looking at this the wrong way. The problem that I see with your reasoning is the 60Hz threshold, which many gamers with expensive systems consider the bottom of the barrel. I could go on regarding lightboost and Catleap and custom overdrive boards, but this solves all that.

    Vsync on causes lag. Vsync off causes tearing. This will let the GPU take full advantage of frequencies all the way up to 144Hz with no lag or tearing. It's the best of all worlds, IMO.

    Now we just need 120+Hz 4K+ monitors...
    Reply
  • GiantPandaMan - Friday, October 18, 2013 - link

    I tend to agree with PPalmgren. The people who would most benefit from this would be better off with buying a more expensive video card.

    There is a different class of user, which you mention, that overdrives monitors and goes 120+Hz. Sure, this will benefit them, but the size of that group is vanishingly small. First, they have to have money. Second, they have to prioritize framerate (TN monitors) over display quality. (Good IPS monitors don't usually support framerates higher than 60 Hz. I hesitate to put overclocked Catleaps into the high quality IPS display category. Not that they're bad mind you.) Third, to drive such massive displays at high framerates it's almost required to go multi-GPU. Multi-GPU configurations have their own problems with keeping smooth framerates.

    Lastly: this has to be an open standard. If Intel or AMD GPU's can't run it, then it'll be stuck to a very small niche market. Personally I'd hope nVidia goes the open route and turns it into a cheap and almost standard feature on monitors. I'm not keeping my fingers crossed, though.
    Reply
  • nathanddrews - Friday, October 18, 2013 - link

    To be fair, there are some very good TN panels out there, but it's a null issue to me. Frame rate is king for me. I guess we'll just have to wait and see what happens in the coming year(s) with GSync. I'm hopeful that this will push some new approaches to TN panels as well as IPS. OLED? <RandyMarsh>"Nyomygot"</RandyMarsh>

    I just got done reading the QA live blog and it sounds like NVIDIA has left this open to licensing this tech beyond their own hardware, which is great news! Of course, the cost will have to come down a lot for the hardware and licensing, whatever those costs are.

    "Carmack: is G-Sync hardware licensable?
    NV: we haven't precluded that"

    Also:
    "Carmack: I've got a lightboost monitor on my desktop, it'll probably move to G-Sync"

    If it's good enough for Carmack, it's good enough for me. ;-)
    Reply
  • Yojimbo - Saturday, October 19, 2013 - link

    A device doesn't have to just integrate into the current market, it can alter the market. Developers develop games with the situation at hand to produce the best experience they can. If the main reason for not rendering less than 60fps is that it causes visual artifacts due to asynchronicity between the monitor and the video card (I don't know if this is the case), then removing that issue will allow them to create games where they allow frame rates to drop below 60fps for WHATEVER hardware profile they might be targeting, even the high end profiles. This means that they can include more features, because it effectively shifts the target frames rate downward, and so equivalently makes everyone who has a compliant device have a virtual upward shift in ability. Of course there would need to be widespread adoption in the marketplace for such a shift to be meaningful to the developers at the high end. Reply
  • n0b0dykn0ws - Friday, October 18, 2013 - link

    But will they get 23.976 right? Reply
  • Ryan Smith - Friday, October 18, 2013 - link

    It can't drop below 30Hz, so you'd technically have to run at double that. Reply
  • MrSpadge - Friday, October 18, 2013 - link

    Yeah, if they're spending >100$ to make the refresh rate flexible they should also unstutter movies. With a lower limit of 30 Hz (which makes sense) they could display them as 2*23.976 Hz. Reply
  • madwolfa - Saturday, October 19, 2013 - link

    I really stopped caring about it since madVR and its "Smooth Motion" function... Reply
  • looper - Friday, October 18, 2013 - link

    Sounds very interesting.

    For the last month or so, I have been learning about nVidia cards using LightBoost with a group of 120 hz TN panel monitors. ( Esp. the Asus mentioned above, and the BenQ XL2420TE) The result is a big improvement in smoothness/accuracy, particularly in 'FPS' style games.

    How does this new 'G Sync' feature come into this equation?
    Reply
  • monitorsrock - Friday, October 18, 2013 - link

    It looks like Overlord has been trying to get G-Sync expanded beyond just TN 1080 panels to their 1440 IPS 120hz monsters. GAWD. That is a wet dream for me - 1440 IPS 120Hz goodness with G-Sync? Yummy! Reply
  • repoman27 - Friday, October 18, 2013 - link

    "NVIDIA demonstrated the technology on 144Hz ASUS panels, which obviously caps the max GPU present rate at 144 fps although that's not a limit of G-Sync."

    Well, unless G-Sync can make use of DisplayPort 1.2 HBR2, the maximum refresh rate for 1920x1080, 24 bpp under DP 1.1a is 149 Hz, or for any practical purposes 144 Hz. And for 2560x1440, 24 bpp, you're looking at a max of 88 Hz, or 85 Hz in practice.

    I'm guessing that the GPU simply negotiates the maximum bitrate possible for the DP link, and then bit stuffs to accommodate lower frame rates.
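    Ignoring blanking overhead, the raw DP 1.1a ceiling (HBR: 4 lanes x 2.7 Gbit/s, 8b/10b encoding) can be sketched in a few lines; actual CVT-RB blanking is what brings these down to the ~149 Hz and ~88 Hz figures above:

```python
payload_bps = 4 * 2.7e9 * 0.8          # 8.64 Gbit/s of usable DP 1.1a bandwidth
pixel_rate = payload_bps / 24          # at 24 bpp -> 360 Mpixels/s
for w, h in [(1920, 1080), (2560, 1440)]:
    # Max refresh rate with zero blanking; real timings are lower.
    print(w, h, round(pixel_rate / (w * h), 1))
```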
    Reply
  • Hrel - Friday, October 18, 2013 - link

    This is cool, just like their streaming thing. But I don't care about vendor specific features. We shouldn't be trying to further segregate the market, we should be working on open source, multi-platform tools. Like OpenGL. How have people not learned that proprietary features are bad for the industry at large? It's so obvious. Reply
  • A5 - Friday, October 18, 2013 - link

    Proprietary features like this pop up pretty regularly. I'll be surprised if this particular version gains any traction, but maybe there will be a standard-ish version of it in a future revision of the DisplayPort spec... Reply
  • willis936 - Friday, October 18, 2013 - link

    So will this be completely transparent if you have a supported monitor and GPU? As in will this benefit media playback that's traditionally at 23.976 fps?

    Moreover what is the GPU requirement? Does this need kepler (600+ series)? Could this be used in conjunction with wireless streaming services (shield included) for living room (aka TVs that won't support G-Sync)?

    Also how much will that mod board cost? I'd love to throw that in my Dell IPS and it'd be a good learning experience. I'm on a 500 series GPU so if it requires kepler there'd be no reason to mod before I upgrade (planning maxwell). By that point there should be many mod guides around.
    Reply
  • Friendly0Fire - Friday, October 18, 2013 - link

    The live blog posted earlier said this was Kepler and above only. I'm guessing this is largely targeted at games, but could be adapted to GPU-accelerated movie playback.

    I very much doubt this tech would do anything for wireless streaming; there's an order of magnitude more latency in sending the video feed over wifi than anything related to v-sync.
    Reply
  • SleepyFE - Friday, October 18, 2013 - link

    That's all fine and good (didn't read all the comments, but there are few bad ones on AnandTech) but it costs as much as another graphics card, so is it worth it?
    And why don't they try to match the GPU processing to the monitor frame rate and see how that goes (and you only need to update the driver)?
    Reply
  • ssiu - Friday, October 18, 2013 - link

    I feel the same way -- sounds good but too expensive. amazon says the list price of the current Asus VG248QE is $279. (http://www.amazon.com/VG248QE-24-Inch-Screen-LED-l... If a refreshed VG248QE with this technology built-in is $399 that is a $120 premium. I'd rather spend $120 on faster video card or other upgraded components.

    Get this down to ~$30 and it could be great.
    Reply
  • SleepyFE - Saturday, October 19, 2013 - link

    The article says they're trying to get it down to $100, so $30 will never happen. Reply
  • Th-z - Saturday, October 19, 2013 - link

    That can be even harder and impractical, as you are not only talking about the limitations of the GPU, but also the CPU that pushes the GPU, and the game code. Imagine texture quality shifting from low to high, effects turning on and off, draw distance going from far to close, 3D models from high to low, enemies appearing and disappearing, etc., all on the fly just to match the monitor's refresh rate while you play, assuming it can work at all. Reply
  • SleepyFE - Saturday, October 19, 2013 - link

    Why wouldn't it? If the GPU can reach 300 fps, I'm sure they can drop it to 60 and that's that. Reply
  • Gigaplex - Sunday, October 20, 2013 - link

    That's what v-sync does. Reply
  • MrSpadge - Friday, October 18, 2013 - link

    A far better solution could be to render the UI elements at native resolution and everything else with dynamic resolution. Scene getting intense? Just calculate a few more pixels and keep the frame rate up nicely. Could be used to fix the refresh rate.. but would provide less incentive to upgrade GPUs. So.. Intel would be about the only ones really interested in this, since they're the only ones not offering enough GPU performance for games. Reply
  • YazX_ - Friday, October 18, 2013 - link

    Why in every damn article do people keep bringing up Mantle?!

    Don't you realize that it's dead before it even started? Most of the people who keep bringing it up don't know what it is. Just shut the fuck up and know that DirectX and OpenGL won't support it, so end of story, it's dead.
    Reply
  • Gigaplex - Sunday, October 20, 2013 - link

    Why would DirectX and OpenGL need to support it? Perhaps you're confused about what it is. It aims to completely replace DirectX. Reply
  • JacFlasche - Thursday, November 14, 2013 - link

    It was wonderful seeing that not everyone on this board is above embarrassing themselves. Reply
  • SydneyBlue120d - Friday, October 18, 2013 - link

    Two questions:
    1) Is G-Sync a "monitor only" thing, or could we see it in large LCD/Plasma/OLED/VPR sets?
    2) Why is DisplayPort still missing in action outside the monitor world?
    3) Will the Kepler-based Tegra 5 support G-Sync?
    Reply
  • SydneyBlue120d - Friday, October 18, 2013 - link

    Not two but three, sorry :D Reply
  • djscrew - Friday, October 18, 2013 - link

    yay! finally some innovation in the video/display field! Good job NVIDIA, now partner with a vendor to get me a 27 inch 1440p 120 hz monitor with gsync for $395 and I will buy two. This will be hard for AMD to counter, I wonder if they will be licensing this tech from NVIDIA for the next 50 years. Reply
  • chizow - Friday, October 18, 2013 - link

    Looks interesting, any chance this G-Sync expansion board is compatible with the VG278H? I'm not sure if it has that expansion slot or not. Reply
  • Nathanielvk - Friday, October 18, 2013 - link

    Seems a pointless piece of tech :( Reply
  • Alexvrb - Friday, October 18, 2013 - link

    Yeah, if monitor vendors sell a final product that will (eventually) work on non-Nvidia graphics hardware, I'll shop for such a monitor in the future. Otherwise... nope. Reply
  • mattyc1 - Friday, October 18, 2013 - link

    Haters gone hate.. Reply
  • spiked_mistborn - Friday, October 18, 2013 - link

    GSync was already talked about back in March of this year. There is prior art on this idea, so hopefully AMD will support this also and it will become a standard. http://techreport.com/discussion/24553/inside-the-... Reply
  • ninjaquick - Thursday, October 24, 2013 - link

    Or VESA could make an update to the DisplayPort standard similar to this: http://www.hardwaresecrets.com/article/Introducing... Reply
  • Desensitized Lemons - Saturday, October 19, 2013 - link

    Yes, Nvidia! You're moving in the right direction. Now ditch the Nvidia Shield, nobody wants that thing! Maybe one day Nvidia will make their own exclusive, specialized, completely bezel-less 144Hz Nvidia-branded monitor that would go with a GTX card like peanut butter & jelly. "Smooooth". Reply
  • chadwilson - Saturday, October 19, 2013 - link

    Oh look, more proprietary tech that requires replacing existing hardware. Sorry, for those of us invested in high-end IPS monitors, without a consistent add-on product (preferably with the GPU) this is a non-starter. Reply
  • MaineG - Saturday, October 19, 2013 - link

    Take my money. I'm ready for this technology; this is very game-changing and what I've been waiting for. I don't need to use LightBoost anymore. All I want is smooth, buttery gameplay with no input lag, and this is it. Reply
  • SleepyFE - Saturday, October 19, 2013 - link

    The problem is output "lag". The input (mouse & keys) works fast enough (unless you are slow) and the rest is processing so... Reply
  • SlyNine - Saturday, October 19, 2013 - link

    So you're telling me that you could tell the difference between 48 fps (144/3, what a V-synced 50 would be) and 50 fps, but couldn't tell the difference between 45 and 50.

    Now I would agree that 36 is a bit of a dip from 45. But the great thing about 120+ Hz screens is that with V-Sync off it's hard to see the tearing at 120 Hz (8 ms before the tear vanishes).
    Reply
  • Mondozai - Saturday, October 19, 2013 - link

    YazX_, fanboys like you crack me up. Reply
  • soryuuha - Saturday, October 19, 2013 - link

    Not going to change my 40-inch TV just to enjoy the G-Sync feature. Reply
  • lilkwarrior - Saturday, October 19, 2013 - link

    This might sound stupid, but would tech like this help out Oculus Rift be rid of its motion sickness problem? Reply
  • bji - Saturday, October 19, 2013 - link

    I'm no expert on the topic, but it was my understanding that the Oculus Rift motion sickness problem is caused by a disconnect between the visual display of motion and the body's not sensing that motion. I expect it's like how some people (including myself, unfortunately) start to feel sick when reading in the car. When you read in a car, your eyes see a stable picture (unmoving words on a page) and yet your body feels movement as the car bounces around. This mismatch between inputs for some reason causes a feeling of sickness. With the Oculus Rift, the eyes see motion but the body feels nothing, which leads to the same mismatch and the same feeling of sickness.

    Improving the smoothness of the motion displayed to the eyes, as this G-Sync technology promises to do, will not alter the fundamental problem, unfortunately. Your eyes will see smoother motion but your body will still revolt because it doesn't feel that motion.
    Reply
  • valinor89 - Saturday, October 19, 2013 - link

    I think you have the cause of motion sickness in the Oculus Rift right, but you draw the wrong conclusion. As you said, it is a problem of lag: you move and the image your eyes see stays static for a short time while your inner ear detects movement, and that causes motion sickness. This is why you want to get rid of this perceived lag, so this tech helps in the sense that it eliminates part of that lag without displaying image defects, and therefore the time in which the lag is apparent can be reduced.
    What I believe is that when your inner ear detects movement and your eyes don't detect that same movement, you get sick, but the reverse is not true. I can watch a movie which moves the camera constantly and not get sick, or play an FPS, controlling the point of view with my mouse without moving my head, and not get sick.
    I hope I make my point clear.
    Reply
  • Gigaplex - Sunday, October 20, 2013 - link

    In your examples of static images only part of what you see is moving. Your peripheral vision is still static (screen bezel, walls etc behind screen). I don't think that is enough experience to assume you won't get sick with a visor image moving with a static head. Reply
  • lilkwarrior - Saturday, October 19, 2013 - link

    Thanks for the thorough analysis; I've never had motion sickness in the contexts you've described, so it's interesting to hear about it from you.

    Reply
  • imaheadcase - Saturday, October 19, 2013 - link

    Ryan Smith..any reason this can't come to other monitors with a display port 144hz already like the BenQ XL2420TE with modification? Or is this just another exclusive for asus and others are going to have to buy new monitors? Reply
  • Ryan Smith - Saturday, October 19, 2013 - link

    We don't know enough about how this works to say for sure. The difference may be something as simple as requiring a different layout for the Tcon board, or as complex as requiring a different strategy for pixel refreshing for the specific panel they're using. Reply
  • jerrylzy - Saturday, October 19, 2013 - link

    It seems some of you have been arguing over G-Sync versus Mantle. I just have one question: if G-Sync really can solve low-framerate smoothness, why should people spend a couple hundred dollars on a GTX 780, and why would Nvidia release the GTX 780 Ti? If dropped frames don't affect smoothness at all, what's the reason to spend a couple hundred more dollars on a high-end card? Just for benchmarks? Reply
  • cspringer1234 - Saturday, October 19, 2013 - link

    I would imagine to improve image quality. A higher performance card could display a nicer looking picture with more graphical features enabled than a lower performance card. Reply
  • surt - Saturday, October 19, 2013 - link

    You seem to have a misunderstanding. Higher-end video cards don't drop fewer frames, they drop more. Reply
  • surt - Saturday, October 19, 2013 - link

    By which I mean, if your video card was generating 120 frames on a 60 hz monitor, it dropped 60 frames. With GSync, it can drop 0. The improvement in animation smoothness will be very noticeable, even if there is no longer any difference in frame to frame smoothness. Reply
  • valinor89 - Saturday, October 19, 2013 - link

    That is not by virtue of the GSync specifically but the monitor that supports more than 60hz. If you put GSync in a 60 hz monitor you will still drop 60 frames.

    The true benefit of GSync is when the GPU generates less than the max supported by the monitor, it helps syncronizing the monitor with the GPU, so that the monitor will wait for the GPU.
    Reply
  • tackle70 - Saturday, October 19, 2013 - link

    ASUS pretty please release that 39" 4k VA panel of yours with G-Sync for $1500 or less... I have money ready to throw at you if yes! Reply
  • wwwcd - Saturday, October 19, 2013 - link

    G-Sinc sound like G-Spot. Oh, I needs of money. Reply
  • hfm - Saturday, October 19, 2013 - link

    Can't wait until we see gaming notebooks with this installed... Reply
  • SleepyFE - Saturday, October 19, 2013 - link

    An ultrabook with G-Sync. That will bring the price right into Apple territory. Good luck selling those. Reply
  • Gigaplex - Sunday, October 20, 2013 - link

    Ultrabook prices are already in Apple territory Reply
  • surt - Saturday, October 19, 2013 - link

    If you do the math on this, the minimum latency improvement is enough to make it impossible for pro gamers to compete without this. That is going to be a huge marketing advantage. Reply
  • SleepyFE - Saturday, October 19, 2013 - link

    If you use logic on this: no one can react fast enough to make a difference. Reply
  • surt - Saturday, October 19, 2013 - link

    Actually, they can. Human reaction times are nearly an order of magnitude below current frame latencies. Reply
  • SlyNine - Saturday, October 19, 2013 - link

    Source please. Reply
  • SleepyFE - Sunday, October 20, 2013 - link

    I double-clicked a stopwatch and it took a minimum of 13 milliseconds. Reply
  • SlyNine - Saturday, October 19, 2013 - link

    Nonsense. Just turn off V-Sync. Plus, we don't know yet whether this introduces any latency on the monitor side. If it does, that would hurt things more than help.

    Point is, before you get all excited, let's just wait and see what happens. Let's also hope a solution comes that supports all video cards, not just Nvidia's.
    Reply
  • wojtek - Saturday, October 19, 2013 - link

    If they had noticeable stuttering on 120/144 Hz v-sync, they either did it wrong or made it look wrong. The latter is more likely. :P Reply
  • SlyNine - Saturday, October 19, 2013 - link

    I agree. I think they nerfed the V-Sync method. I mean, 50 fps on a 144 Hz monitor is still 48 fps, and it sounded like that was much worse than the 45 fps G-Sync solution. Reply
  • Soulwager - Sunday, October 20, 2013 - link

    Say you have a constant 50 fps framerate and a 144 Hz monitor. With V-Sync on, the game engine thinks each frame is on screen for exactly 20 ms, but the actual time each frame is on screen is either 21 ms or 14 ms. Most of the frames will be 21 ms, but whenever the frame rate catches up to the scan line you'll get 7 ms of judder. This means motion that is smooth from the game engine's perspective will appear to hiccup forward on the monitor. With 45 fps G-Sync, the game engine thinks each frame takes 22 ms, and that's what it looks like on the monitor. Reply
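    That judder pattern can be simulated in a few lines (a sketch, assuming frames complete exactly every 20 ms and are shown at the next 144 Hz refresh tick):

```python
import math
from fractions import Fraction

refresh = Fraction(1000, 144)                    # ms per 144 Hz tick, kept exact
ready = [20 * n for n in range(30)]              # frame n finishes rendering (ms)
# Each frame appears at the first refresh tick after it is ready.
shown = [math.ceil(t / refresh) * refresh for t in ready]
durations = [round(float(b - a), 1) for a, b in zip(shown, shown[1:])]
print(sorted(set(durations)))   # mostly ~20.8 ms frames with occasional ~13.9 ms hiccups
```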
  • SlyNine - Sunday, October 20, 2013 - link

    I thought that was the point of the V-Sync dropping to a multiple of the refresh rate. So the frames could be timed correctly? So you're saying that it continues to render at 50fps but showing at 48fps until a frame is dropped to catch up? Reply
  • Soulwager - Sunday, October 20, 2013 - link

    All v-sync does is limit when buffer swaps can take place to avoid tearing, but if you're using double buffering with v-sync, that can cause your framerate to drop to 1/2 or 1/3 of your refresh rate. In which case you'd have 48 fps instead of 50 fps. Triple buffering means you can keep the GPU busy while a completed frame waits for the next display refresh, which is generally good, because if your framerate is 50 fps on a 144 Hz monitor, you're quite close to the 1/3 threshold and would likely dip into 1/4-refresh-rate territory occasionally, and that's worse than occasionally bumping up into 1/2-refresh-rate territory. There may be additional frame metering going on, but that's more complicated. Reply
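    The 1/3-refresh-rate arithmetic is simple to check (a sketch, assuming a constant 20 ms render time and double-buffered v-sync):

```python
import math

refresh = 1000 / 144                   # ms per 144 Hz refresh tick
render = 20.0                          # ms per frame, i.e. a GPU capable of 50 fps
# Double-buffered v-sync: the GPU stalls until the swap, so each frame
# effectively occupies a whole number of refresh ticks.
interval = math.ceil(render / refresh) * refresh   # 3 ticks = ~20.8 ms
print(round(1000 / interval))          # -> 48 fps, i.e. 1/3 of 144 Hz
```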
  • wojtek - Sunday, October 20, 2013 - link

    It is negligible and unnoticeable to the human eye. Do the math: 60 Hz is 16.7 ms per frame and is considered smooth; 7 ms is not even half of that value. Reply
  • Soulwager - Monday, October 21, 2013 - link

    60hz is "smooth" if it's consistently 60fps, but you can still tell the difference between 60hz and 120hz. 45fps on a 60hz display doesn't look as good as a rock solid 30fps on a 60hz display. This is because your eye, brain really, is far better at detecting inconsistent movement than it is at noticing missing information. Reply
  • wojtek - Monday, October 21, 2013 - link

    Let's put it this way to be more clear: if you have a stable 60 Hz, your brain does not notice anything that is or isn't added between two consecutive 16 ms frames. In fact, an old 50/60 Hz TV/CRT just showed a black screen there.
    Now, this 7 ms drift means that in most cases you will have one additional frame between those 60 Hz frames, and in rare cases, when you miss the v-sync, this additional frame is skipped and drawing falls back to the 60 Hz regime.
    There is no way you can see any stuttering, just as there was no way to see black phosphor while the cathode ray was scanning somewhere else on a CRT screen.
    Reply
  • Meaker10 - Monday, October 21, 2013 - link

    CRTs were never black between frames; the phosphor on the surface of the display keeps glowing for a short period, becoming dimmer over time until the beam hits it again. Reply
  • wojtek - Monday, October 21, 2013 - link

    And even more: 120 Hz was added to LCDs to do 3D with shutter glasses, not to make motion smoother. Back in the days of CRT dominance, refresh rates higher than 60 Hz were used only because they were less tiring on the eyes, not because they were smoother. And even then the dominant comfortable refresh rate was about 85 Hz. There were rare 100 Hz CRTs and nothing like 120 Hz.
    Remember, on a CRT you in reality always have a black screen with a tiny lit point that scans the entire area, and this scan fades on your retina so slowly that you have the impression of a stable picture.

    Nowadays we have LCDs with liquid crystals that are much less responsive than phosphor. The best have a full black-to-black transition in about 12 ms and gray-to-gray in 4 ms with overdrive. That means the picture you are looking at stays on screen, and on your retina, much longer than that single lit point on a CRT screen.
    This simply leads to a situation where noticing any change in such a sub-120 Hz scenario is even more impossible than on a regular CRT.

    So if you have a 120/144 Hz LCD, any stuttering you see is caused by a software hitch, not v-sync desynchronization.
    Reply
  • Soulwager - Monday, October 21, 2013 - link

    I'm not talking about stuttering, I'm talking about judder, which is when an object that should be moving at a constant rate appears to change speed due to asynchronous clocks. What's important isn't the presence of an additional frame, it's that the additional frame isn't where you expect it to be. Here's an illustration: http://i.imgur.com/i690R1p.png Reply
  • wojtek - Monday, October 21, 2013 - link

    This chart is just plain wrong. First, it should be discrete, not continuous. Second, what does "position" mean: pixels, inches, meters? Third, it shows the drift I was talking about, and I've been telling you that your retina is not able to adapt fast enough to notice this position discontinuity. And last but not least, triple buffering adds significant latency that you want to avoid as a gamer. Reply
  • Soulwager - Monday, October 21, 2013 - link

    It doesn't matter whether you measure position in inches or pixels, what matters is relative velocity. As in, the slope of the line connecting any two dots. That slope is what your brain interprets as velocity, and it's harder to see that on a scatter plot. Latency is distance between two lines on the X axis.

    Second, you don't know how triple buffering works if you think it adds any more latency than double buffering. All triple buffering does is let the GPU start rendering the next frame while Vsync is blocking a buffer swap.

    Third, I have no idea why you're trying to convince me that there's no visible difference, because this is demonstrably false. Watch your mouse cursor as you move it quickly across the screen, you see it in several distinct locations corresponding to where it was when your monitor refreshed. When your monitor is updating fast enough to outpace what the eye/brain registers, those distinct locations will appear to blur into a line. This doesn't happen at 60hz, it doesn't happen at 120hz, and I'll be surprised if it happens at 240hz. There's a huge difference between "faster than the eye can detect" and "good enough that we need to prioritize other aspects of image quality". If you can't tell the difference between two monitors, that doesn't mean your experience transfers to other people, hell, about 10-15% of people can't even tell the difference between a 60hz and 120hz display.
    Reply
  • wojtek - Monday, October 21, 2013 - link

    OK, you may believe in any voodoo you want, even that your retina is a next-gen one. But that is not science, it's just your religion. As for triple buffering, it depends on how it operates on the buffers (yes, there are a few strategies). If you want real smoothness then additional latency is added; if you prefer less latency then judder may occur: http://en.wikipedia.org/wiki/Multiple_buffering Reply
  • Soulwager - Monday, October 21, 2013 - link

    There have been blind studies on 60 Hz vs 120 Hz panels, and that's where I got the 10-15% figure. The vast majority of people can feel the missing frame, even though it's only an 8.33 ms difference in how long the frame lasts. Yes, a millisecond is a small measurement of time; a thousandth of an inch is a small measurement of distance, but you can still feel it with a fingertip. Relative values are more important than absolute values as far as human perception goes, because our instincts make logarithmic comparisons. It's far easier to detect a 6 ms variation on a 12 ms frame time than it is to detect the same 6 ms variation on a 24 ms frame time.
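The ratio argument can be put in numbers (a minimal sketch; the 6/12/24 ms figures are the comment's own):

```python
# Perception compares ratios rather than absolute deltas (a Weber's-law
# style argument), so the same 6 ms of jitter is a much bigger fraction of
# a short frame than of a long one.
def relative_variation(jitter_ms, frame_ms):
    return jitter_ms / frame_ms

print(relative_variation(6, 12))  # 0.5  -> half of a ~83 fps frame time
print(relative_variation(6, 24))  # 0.25 -> a quarter of a ~42 fps frame time
```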

    You're also confusing triple buffering with frame metering, which may be used in conjunction with triple buffering. Backpressure also impacts latency, both perceived and real.
    Reply
  • Soulwager - Monday, October 21, 2013 - link

    And to clarify, even with the D3D implementation of triple buffering, you don't get extra latency until your framerate starts getting up near your refresh rate, which wasn't relevant in the graph I posted earlier. Reply
  • wojtek - Tuesday, October 22, 2013 - link

    OK, and how is that related to the much higher latency of a low-framerate g-sync solution (because of frame timing)? :P Reply
  • Soulwager - Tuesday, October 22, 2013 - link

    I'm not sure what you're asking about, so I'll just elaborate in general:

    There's no reason to use triple buffering if you're using G-sync, no-sync, or if you can maintain a consistent framerate equal to your refresh rate. Triple buffering is only useful if you're using v-sync with a framerate below your refresh rate, in which case it will keep your GPU active, and allow framerates between thresholds. With the d3d implementation of triple buffering, you get latency problems when you fill both back buffers before you're ready to scan, because it won't drop the older frame, it will just sit there with both back buffers full until the next refresh. Now, keep in mind you can only fill both back buffers if you have a framerate higher than your refresh rate(or bad frame metering with a multi-gpu setup, but that's a separate issue). If double buffering was used instead in that graph, the delays in the V-sync graph would cause more dropped frames instead of the drift(every frame that took over 16.6ms would be delayed, instead of just when all the delays accumulate).

    G-sync is basically a double buffered V-sync that pushes a frame to the monitor whenever the frame is done. This means when your framerate would exceed your max refresh rate, it's basically equivalent to current double buffered v-sync, but when your framerate drops below the max refresh, you don't get any tearing, latency increase(beyond that caused by increased render time), or judder, it just delays the monitor's refresh until the frame is finished. This means you don't get a big penalty for slightly late frames, like you do with V-sync. If your frame takes 17 ms, you see it at 17 ms instead of waiting until the next refresh. A 1ms late frame is 1 ms late with g-sync, instead of being 8.33 ms late on a 120hz display or, 16.6ms late on a 60hz display. As a bonus, when you have a non-maxed framerate the buffer flips immediately when the frame is finished, so it doesn't cause your frame rate to drop to the next lower threshold, and it doesn't introduce latency by waiting for the next refresh.
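A minimal sketch of that timing difference (the 60 Hz panel and 144 Hz maximum refresh are just illustrative assumptions):

```python
import math

REFRESH_MS = 1000 / 60  # a 60 Hz panel, for illustration

def vsync_display_ms(render_done_ms):
    # Double-buffered v-sync: a finished frame waits for the next refresh tick.
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS

def gsync_display_ms(render_done_ms, min_frame_ms=1000 / 144):
    # G-Sync as described above: the panel refreshes as soon as the frame is
    # done, limited only by the panel's maximum refresh rate.
    return max(render_done_ms, min_frame_ms)

# A 17 ms frame misses the 16.67 ms tick by ~0.33 ms, yet with v-sync it is
# held back a full refresh; with G-Sync it is 0.33 ms late and no more.
print(vsync_display_ms(17.0))  # ~33.33 ms
print(gsync_display_ms(17.0))  # 17.0 ms
```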

    If you normally game with v-sync disabled, g-sync means you get rid of tearing, without increasing latency. Basically, when you're at the point of the scan where the buffer flips, and you would be halfway through scanning the old frame onto the panel, you instead scan the entire newly completed frame onto the panel.

    Oh, and here's a scatter plot version of that graph, in case you're interested: http://i.imgur.com/1Ev18XX.png

    As for whether you specifically can see a difference between G-sync and v-sync, I don't know, I guess you'll just have to wait until you can see it in person, or maybe watch a high speed video comparison. The only people that won't see a benefit are those that can consistently maintain a framerate equal to their refresh rate. But not all games can manage that, even on a high end system.
    Reply
  • wojtek - Wednesday, October 23, 2013 - link

    "As for whether you specifically can see a difference between G-sync and v-sync, I don't know, I guess you'll just have to wait until you can see it in person"

    Yes, and that is the point. Once the technology is released we should do double-blind tests to see if anyone can really see the difference between g-sync and 120/144 Hz monitors. But ahead of time I am sure g-sync is incompatible with comfortable backlight strobing (which we discussed earlier) when it drops below 60 Hz (or even 70 Hz). 120/144 Hz backlight strobes will remain consistent and comfortable.
    Reply
  • wojtek - Wednesday, October 23, 2013 - link

    BTW, temporal resolution, the speed of your retina's response, is 100 ms for rods (cells that see black-and-white light) and 10 to 15 ms for cones (cells that see color), so there is no chance of seeing an 8 ms timing difference, but a great chance of seeing a lot of blur on objects moving at framerates lower than 60 FPS. http://webvision.med.utah.edu/book/part-viii-gabac... Reply
  • Soulwager - Wednesday, October 23, 2013 - link

    All that means is that your eye adds latency to whatever is coming in. If you have a fully white screen that turns black for 5ms and back to white, you probably won't notice(if anything it would just look like a slight change in brightness).

    The eye doesn't have a scan rate, your rods can fire independently of each other. Say you have a white ball moving left to right across a black background on a 500hz display. If that ball jumps up an inch for a single frame and then back down like nothing happened, you'll still see that it happened, because those rods were triggered, it just takes them some time to report that they were triggered, and reset. When your eye is tracking a moving object, the exact same thing happens in front of and behind the object if it isn't moving as expected.

    http://www.100fps.com/how_many_frames_can_humans_s...
    Reply
  • wojtek - Wednesday, October 23, 2013 - link

    No, as you describe later, latency is one side effect; another is sub-10 ms awareness, which does not exist. That is why strobing at 100 Hz and above is irrelevant, and may be used to mask shifting-picture distortion. In essence what it does is a kind of low-pass filtering, where blur is caused by the high-frequency visual noise of overlapping images. Reply
  • Soulwager - Wednesday, October 23, 2013 - link

    10 ms of what? Latency? Strobe interval? Frame-time variance? Just because you can't see a light flickering above X Hz when it's stationary doesn't mean you can't see it flickering when it's moving. If you don't believe me, try it: build a timer circuit that flashes an LED 100 times a second, or however fast it takes to give the illusion of a solid light, then wave it around in a dark room. You WILL see it flashing until you get up to frequencies that would sound absurd for a computer monitor. Reply
  • wojtek - Wednesday, October 23, 2013 - link

    This is what I'm talking about with strobing, but you are watching a stable monitor, or a constant place in motion, where awareness is focused in one place! Take this flashing circuit, focus on it, and move along with it. You will see no flashing at all. Reply
  • Soulwager - Wednesday, October 23, 2013 - link

    There is a LOT more to it than just getting rid of flicker. It's like comparing pure audio reaction time(~150ms) to synchronizing to a beat(<20ms).

    When I'm moving the mouse around in a game, it's moving at something like 5k pixels per second. This means ±8 ms of variance in frame time corresponds to a mouse inaccuracy of something like 80 pixels. To get around this you need a ton of practice with the mouse, so you know how far the cursor is going to move without seeing it.
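A quick check of those numbers (both figures are the comment's own assumptions, not measurements):

```python
mouse_speed_px_per_s = 5000  # a fast flick across the screen
frame_jitter_s = 0.008       # +/- 8 ms of frame-time variance

# The position uncertainty spans the full +/- window:
uncertainty_px = mouse_speed_px_per_s * (2 * frame_jitter_s)
print(uncertainty_px)  # 80.0 pixels
```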

    The equivalent of latency would be the LED appearing to trail its actual position as you move your arm around. Variance would be its apparent position relative to its actual position changing as it's moved at a constant rate. Even if you can't see it watching non-interactive content, you can certainly feel it when looking around in a FPS or moving a mouse anywhere.
    Reply
  • wojtek - Wednesday, October 23, 2013 - link

    When you move the mouse cursor/screen at a framerate below 60 Hz you see much more latency (usually more than 20-30 ms) that is _clearly visible_, and an additional +/- 8 ms of latency is an almost negligible component of it, taking into account the additional motion blur it produces. This is what I'm trying to say, but maybe you must try it for yourself.
    See how much more blurred _and_ shifted back the 30 FPS animation is compared to the 60 FPS one.
    http://www.testufo.com/#test=framerates
    Now we are talking about animation below 60 FPS on a 120 Hz screen, which causes more visual blur than 60 FPS, and the timing inaccuracy you are so afraid of is no bigger than half of the 60 FPS blur on a 120 FPS move! This sub-60 FPS animation, and the object you track, is already shifted back and blurred more than the 60 FPS one, no matter how accurately you sync it with the buffer!
    Reply
  • Soulwager - Wednesday, October 23, 2013 - link

    Absolute latency has zero impact on how smooth the animation looks; it has nothing to do with strobing, and it has nothing to do with stutter. It's just a delay. You can't see it unless the system is reacting to your input, which is what happens in a game. If latency is consistent, you can adapt to it, by leading your target for example. If it's inconsistent, you can't predict where something is going to be nearly as accurately, because you don't know how far the animation is lagging behind the game engine.

    http://www.youtube.com/watch?feature=player_embedd...

    Here, you can SEE THE DIFFERENCE between 10ms latency and 1ms latency. Now imagine that instead of smoothly following on the 10ms clip, that the white blob is some random point between where it is and where it would be on a 20ms latency test, and that that white blob is what you have to use to figure out where the finger is now. That is why I have more of a problem with variance in frame time than in absolute latency(which is also bad). I'm not just theorycrafting here, I've actually experienced what this problem feels like, and the only solution I've found so far is to play on low enough settings that you NEVER drop below your v-sync framerate threshold. If G-sync fixes this, which it sounds like it will, I'll buy it in an instant.
    Reply
  • wojtek - Thursday, October 24, 2013 - link

    What you see in this video is the _input_ latency; the refresh rate of the screen is still 60 Hz! You see your finger _and_ the painted square at the same time, compared simultaneously. Even if you see it 10 seconds later you will see the same shift between your finger and the square, and that is independent of the reaction you will perform. That's why modern gaming mice have >1000 Hz sampling. Contrary to what you said, the real problem is _when_ you see it, and in practice how fast you can react. In fact any latency bigger than 20 ms gives noticeable lag between your move and the observed action on the screen, and is very uncomfortable, especially for hardcore gamers, for whom such latency is unacceptable. A sub-60 Hz experience should be avoided in hardcore gaming at all costs, even if it means temporarily disabling some visual improvements! This technique is already used on consoles (e.g. temporal upscaling of lower-resolution rendering). Reply
  • Soulwager - Thursday, October 24, 2013 - link

    The "screen" in the video I posted is a research projector; it sure as hell isn't running at 60 Hz. Gaming mice have high sample rates because gamers click their target before they see their cursor in the right place on screen. If the mouse interface was interrupt based, it wouldn't need a high poll rate. With G-Sync, you're putting the monitor's interface on an interrupt instead of cranking up its poll rate.

    Also, going from a normal 125hz USB mouse polling rate to 1khz? 7ms max benefit in input latency, average of half that. The difference is people had PS/2 to compare it to, which is interrupt based.

    I think you misunderstood what exactly it was about human vision that was being measured in that research paper you posted.
    Reply
  • wojtek - Thursday, October 24, 2013 - link

    Then ask yourself why you have no such effect on your screen. I have a 60 Hz display, a normal mouse and a gaming mouse. No visible lag with either device. I can even put my mouse on the screen and compare it directly with the cursor. Nothing! Magic?

    You clearly do not understand that process.

    The problem with touch displays is the sample rate of the touch input, nothing more. The touch samples are taken in order, then the draw code is executed for each sample and the screen is drawn on the closest refresh. All processing from touch to display usually takes about ~40 ms with current touch panels. That is the shift you are observing.
    If the input took 1 ms, and the code executed in under 16 ms, then you would see the square as fast as the closest frame refresh on a 60 Hz display, which is the case, for example, on my monitor with my hardware, and I believe for most current PC users.
    So no. It's not magic.
    Reply
  • Soulwager - Thursday, October 24, 2013 - link

    I DO have that effect on my screen, and so do you; you're just so familiar with it that you don't notice. If you actually dropped your mouse sensitivity so it was 1:1 to the pixel density of your monitor and moved the mouse around on your monitor AT THE SPEEDS USED IN A COMPETITIVE GAME, you would realize that what you said is bullshit.

    Even now I don't know what point you're trying to make. There are obvious tradeoffs with V-sync and latency that you simply don't have to make with G-sync. You said yourself no competitive gamer would tolerate a framerate below 60fps, Take a moment to think about WHY that's true, and then re-read everything I've said.
    Reply
  • wojtek - Thursday, October 24, 2013 - link

    "I DO have that effect on my screen, so do you"

    No, I don't. You must have some hardware or software issue then. It's clearly noticeable when you run v-sync below 60 Hz, or with most triple-buffering options in games, but not when it is perfectly synchronized at >=60 Hz. Believe it or not, I have explained it to you. I can't help you more.
    Reply
  • Soulwager - Thursday, October 24, 2013 - link

    Most systems(assuming no drastic bottleneck, CPU reasonably matched to GPU), have between 25-50ms of latency when running at 60fps (because of pipelining of frames in the game and graphics engine, plus the latency added by the monitor and mouse). This is measured by the time between mouse click and muzzle flash on panel. Please tell me how you managed to eliminate this latency from your system, and why you think this is any different than the effect demonstrated in the video I posted? Reply
  • wojtek - Thursday, October 24, 2013 - link

    Search for hardware mouse cursor acceleration on graphics cards. The technology is so old that this has not been an issue for quite some time.
    http://stackoverflow.com/questions/6957039/what-is...
    So yes, you may have decent responsiveness even at 60 Hz if you have fast enough input.
    Reply
  • Soulwager - Thursday, October 24, 2013 - link

    Yes, it helps with an RTS, but not an FPS. Reply
  • wojtek - Monday, October 21, 2013 - link

    And for your funny mouse example: on an LCD you see multiple cursors at once during a fast move - and surprise! - guess why? Yes, your LCD fades more slowly than the new image appears (hence also the blur if it runs slower). But if you try a 1000 Hz mouse on a 100 Hz CRT display you see one precise moving image. Try it for yourself. :P Reply
  • Soulwager - Monday, October 21, 2013 - link

    Even on a CRT I can see where the cursor was during each refresh, though mine was running at 85 Hz. It sounds like your LCD might have an overdrive calibration problem. Reply
  • wojtek - Monday, October 21, 2013 - link

    You need to do double-blind tests on moving objects to prove that. Otherwise you don't even realize which frame you actually failed to perceive. It's the same with 24/192 and 16/44.1 music. Some people claim they hear the difference, but then they cannot prove it in double-blind tests. Reply
  • mdrejhon - Monday, October 21, 2013 - link

    Here's a better explanation, from the human eyeball perspective:
    http://www.blurbusters.com/gsync/how-does-gsync-fi...
    Reply
  • wojtek - Monday, October 21, 2013 - link

    The higher the FPS, the less distance per shutter interval, which is all I am talking about. And from the human-eye perspective, interpolated high-FPS motion blur is actually smoother than an accurate low-framerate crisp image. It does, however, trade latency for smoothness, which is always the case in this matter.
    Some game engines even work that way: they calculate key frames and interpolate pictures between them. Look, for example, at top-notch Crysis, where the better your graphics settings, the smoother the motion blur calculations performed!
    It is also a must-have in the movie industry, where you operate at a constant 23.976/24 FPS.
    Reply
  • wojtek - Monday, October 21, 2013 - link

    And by the way, it shows only 0.15 seconds of animation, where 2 of 9 frames were late by ~1/120 sec. And it is no surprise that they show such a microscopic period, because if they plotted a full second with all 50-70 frames you wouldn't even see the discontinuity on the chart (the scale is too large)! So it is pure commercial bullshit, nothing more. Reply
  • mdrejhon - Monday, October 21, 2013 - link

    Are you familiar with strobe backlights?
    G-SYNC includes a strobe backlight mode (sequel to LightBoost):
    http://www.blurbusters.com/confirmed-nvidia-g-sync...

    The backlight is turned off while waiting for pixel transitions (unseen by human eyes), and the backlight is strobed only on fully-refreshed LCD frames (seen by human eyes). The strobes can be shorter than pixel transitions, breaking the pixel transition speed barrier! It eliminates the sample-and-hold effect.

    I don't see any stuttering or motion blur with LightBoost at 120fps @ 120Hz; it looks like a CRT.
    I can even read the street names in the TestUFO Panning Map Test at 1920 pixels/sec:
    http://www.testufo.com/photo&photo=toronto-map...

    You can read text in fast-moving images on a CRT or with LightBoost (or G-SYNC's optional strobe mode)
    Reply
  • mdrejhon - Monday, October 21, 2013 - link

    Ooops, the link to TestUFO Panning Map Test is wrong. The correct link is:
    http://www.testufo.com/photo#photo=toronto-map.png...
    It displays a map that pans sideways at high speed, at a framerate matching the refresh rate.
    Reply
  • wojtek - Monday, October 21, 2013 - link

    Do not forget that low-frequency strobing is very tiring and unhealthy for the human eye. That is why CRTs, which by nature always strobe, ran at refresh rates higher than 60 Hz, i.e. 75 or 85.
    Now you are saying that we should actually do the opposite, and this is progress? How ridiculous is that!
    Reply
  • mdrejhon - Tuesday, October 22, 2013 - link

    Three things:

    - There's no way to have low persistence without strobing, unless we have 1000fps@1000Hz to gain the motion clarity of CRT without flicker. We're trading off flicker for motion blur.

    - The feature can be turned ON/OFF. A CRT couldn't let you turn off its flicker. LightBoost and GSYNC allow you to do so.

    - Some of us get motion blur headaches. Eizo FDF2405W uses a strobe mode in their 240Hz monitor to reduce motion blur eyestrain. LightBoost gives me less eyestrain (it emulates a 120Hz CRT) because my eyes are no longer tired by motion blur.
    Reply
  • wojtek - Tuesday, October 22, 2013 - link

    In fact it is the opposite! Motion blur _is natural_ for human eyes; if it were not, you would get motion blur headaches in the cinema as well, which I believe is not the case. The technology you mention has nothing to do with _proper motion blur_; it just tries to reduce LCD unresponsiveness, and in fact it is old technology that many people simply dismissed. Look for example at the BenQ FP241WZ http://www.tftcentral.co.uk/articles/content/benq_... with Black Frame Insertion. It was present in LCDs 7 years ago.

    Search Google for people's opinions; it was barely accepted. It may be acceptable at higher refresh rates, as on a CRT, but at 24-60 FPS it is just an eye nightmare. We have already learned that lesson. No reason to repeat it.
    Reply
  • mdrejhon - Tuesday, October 22, 2013 - link

    Human motion blur should be 100% natural. Displays should not force extra motion blur upon eyes above-and-beyond human limitations. Sometimes we want the display to do it for artistic purposes. Otherwise, Holodeck displays will never arrive.

    Multiple things:
    -- Flicker DOES increase eyestrain.
    -- But trying to use focussing muscles to focus on motion blurry images (e.g. www.testufo.com/photo) for long periods DOES increase eyestrain too.

    Some people are less bothered by one than the other.
    For example, see the testimonials of people who love it, and get less strain:
    http://www.blurbusters.com/lightboost/testimonials...
    Some people stopped FPS gaming when switching from CRT to LCD because of the motion blur problem, and have only returned to FPS gaming because of strobing.

    Remember, everybody's vision is different.
    Some people are color blind. Some people are not.
    Some people hear better. Some people don't hear as well.
    Some people notice certain image artifacts. Others don't.
    Some people see flicker better. Others don't.
    etc.
    Reply
  • mdrejhon - Tuesday, October 22, 2013 - link

    Also see: http://www.eizo.com/global/products/duravision/fdf...

    "Blur Reduction with 240 Hz Refresh Rate
    The monitor converts 120 Hz input signals to 240 Hz to reduce ghosting and blurring that is caused during frame changes. This greatly improves sharpness and visibility and reduces eye fatigue that occurs when viewing scrolling or moving images."

    And it uses strobing, according to page 15 of its manual:
    http://www.blurbusters.com/eizo-240hz-va-monitor-u...

    Fortunately, the strobing is optional, but vision research has shown that strobing DOES alleviate motion blur strain FOR SOME PEOPLE (MAYBE NOT YOU) because eye-focussing muscles don't have to struggle as much because you can't undo unnatural display-forced-upon-you motion blur with your human eyes' focussing muscles trying to hunt back-and-forth through the motion blurry images.
    Reply
  • wojtek - Tuesday, October 22, 2013 - link

    I completely agree on reducing the motion blur imposed by the LCD; in fact it is not natural motion blur, it is just blending blur, similar to some old game techniques that simulated motion blur by blending a queue of frames. This is a bad idea.

    Everything I am saying is that we should improve high-framerate displays to eliminate ghosting and have good, practical, unnoticeable margins with stable v-sync and stable picture reproduction, rather than doing the opposite: trying to accommodate the variable low framerate produced by a slow GPU, which makes no sense because of flickering and all the additional effort needed to make it sane.
    You are simply trying to solve a problem that does not exist in the high-framerate world, which can reproduce stable low-framerate conditions for human eyes!

    Proper motion blur, with correct optical parameters, should be calculated on the fly by the game engine and presented on a display that has no additional flaws. That is the way to go. And it is orthogonal to the refresh rate itself, which is just a parameter in such calculations. If you have enough frame slots (120/144 FPS) you don't need to redesign anything; just make sure it is presented with accuracy higher than 60 Hz, which gives a natural impression of smoothness.

    Now, I understand NVIDIA and its "inventions", because this is business and they care about money. They need to convince people to buy their hardware, and the more unique features they provide, the more advantage they may gain. The only problem is the merit of their inventions, and in the g-sync case I see no merit at all, just as I see no merit in buying expensive HDMI cables that improve nothing, or in listening to 24/192 music that actually produces additional distortion in most receivers because of their nonlinearity.

    That is all my point. Nothing more.
    Reply
  • mdrejhon - Tuesday, October 22, 2013 - link

    > I completely agree on reducing the motion blur imposed by the LCD; in fact it is not natural
    > motion blur, it is just blending blur, similar to some old game techniques that simulated
    > motion blur by blending a queue of frames. This is a bad idea.

    That's not the cause of LCD motion blur on modern LCD's.
    See http://www.testufo.com/eyetracking for what really causes motion blur on modern LCD's.

    GtG is only a tiny fraction of a refresh; most motion blur is now caused by persistence. LCD pixel transitions now more resemble a square wave: http://www.testufo.com/mprt ... (otherwise the checkerboard pattern illusion is impossible) .... GtG is a tiny fraction of the refresh; persistence/stability is most of a refresh.

    This motion blur unfortunately also happens on OLED's too, so they have to strobe OLED's:
    http://www.blurbusters.com/faq/oled-motion-blur/

    Motion blur occurs on all flickerfree displays, even if they have instant 0ms GtG pixel transitions. As you track moving objects on a flickerfree display, your eyes are blurring each continuously-displayed refresh. Your eyes are in a different position at the beginning of a refresh than at the end of a refresh.

    To have 1ms of persistence without using flicker, we need 1000fps@1000Hz to solve strobing AND motion blur. You have to fill all the black gaps with 1ms frames, and all frames have to be unique. 1ms of persistence translates to 1 pixelwidth of tracking motion blur (forced upon you) for every 1000 pixels/second.

    Same motion clarity of a 2ms strobe backlight = you need 500fps@500Hz nonstrobed
    Same motion clarity of a 1ms strobe backlight = you need 1000fps@1000Hz nonstrobed
    Same motion clarity of a 0.5ms strobe backlight = you need 2000fps@2000Hz nonstrobed
    etc.

    Since silly framerates are impossible, we are stuck with strobing for now if we want to eliminate motion blur (regardless of display technology, not just LCD)
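The persistence arithmetic in this comment can be written out directly (a sketch of the stated relationships, not a display model):

```python
def tracking_blur_px(persistence_ms, eye_speed_px_per_s):
    # Each refresh is held on screen for `persistence_ms`; a tracking eye
    # smears that held image across this many pixels of the retina's path.
    return eye_speed_px_per_s * persistence_ms / 1000

def flickerfree_hz_matching(strobe_ms):
    # A non-strobed (sample-and-hold) display needs a fresh unique frame
    # every `strobe_ms` to match the motion clarity of a strobe that short.
    return 1000 / strobe_ms

print(tracking_blur_px(1, 1000))   # 1.0 px of blur at 1000 px/s tracking
print(flickerfree_hz_matching(2))  # 500.0  -> needs 500fps@500Hz
print(flickerfree_hz_matching(1))  # 1000.0 -> needs 1000fps@1000Hz
```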
    Reply
  • wojtek - Tuesday, October 22, 2013 - link

    This is valid reasoning only to some extent, because even when you are tracking "moving pixels", this tracking is no more accurate/sensitive than your retina's responsiveness, which is far lower than 1000 Hz and in fact depends more on the content than on the real timing http://neuroscience.uth.tmc.edu/s2/chapter15.html

    But strobing above 100 Hz is OK, since it is not tiring for the eyes. Strobing at 24-60 Hz is tiring and unhealthy over longer exposure. That is a medical fact.
    All in all, as I said, we should focus on higher FPS rather than accurate low FPS, because the latter has no benefits.
    Reply
  • wojtek - Tuesday, October 22, 2013 - link

    Ah, and keep in mind that the motion blur you are talking about is also resolution-dependent. The higher the PPI, the less blur, because there is less of a gap between transitions; and that in turn leads again to FPS, which uses more pixels to draw movement when it is higher. Reply
  • mdrejhon - Tuesday, October 22, 2013 - link

    Also -- motion blur should be 100% natural and 100% artistic. Programmable variable-persistence displays is a good step towards this ultimate goal.

    Today's non strobed displays will never have perfect motion clarity, until there's a breakthrough for the low-persistence zero flicker display that allows fast-panning images (when the app creator wants it) to be as perfectly sharp as a piece of paper going sideways. Also, when wearing virtual reality headsets, turning your head creates panning. Panning creates motion blur on ALL flickerfree display technologies (even instant-GtG displays). So the person can't focus on objects while turning their head. nonstrobed OLED / nonstrobed discrete-LED-arrays doesn't fix the blur problem. Michael Abrash is wishing for a 1000Hz VR headset because of this problem:

    http://blogs.valvesoftware.com/abrash/down-the-vr-...

    So, strobing is a simpler method (for now).
    Reply
  • wojtek - Tuesday, October 22, 2013 - link

    No, the ultimate solution to that problem is direct retinal projection http://ftp.rta.nato.int/public/PubFullText/RTO/MP/...
    And there is already a first attempt http://reviews.cnet.com/wearable-tech/avegant-virt...
    Reply
  • mdrejhon - Thursday, October 24, 2013 - link

    Yes, agreed. Retinal projection is the ultimate, but all the same things I've said still apply _exactly_, full stop. Ultrahigh framerates *or* light modulation (e.g. laser scanning, etc.). The laser scanning is doing the strobing, CRT style. Nothing new here. Reply
  • mdrejhon - Wednesday, October 30, 2013 - link

    Eizo just released a gaming VA strobe-backlight monitor, the Foris FG2421.
    Here's a white paper on Eizo's method of strobing:
    http://gaming.eizo.com/wp-content/uploads/file/tur...
    Great diagrams.
    Reply
  • poohbear - Sunday, October 20, 2013 - link

    meh, please show me this technology with IPS monitors, otherwise who cares. Color quality is king for me and many others. Also, what's stopping AMD from developing something similar for their cards? Reply
  • junky77 - Sunday, October 20, 2013 - link

    I don't get it. Why does it have to be 'a card'? Why wasn't some standard developed and applied to GPUs and screens right away? They are selling us another thing which should have been there in the first place. Reply
  • mdrejhon - Tuesday, October 22, 2013 - link

    It has to be a card. There's some really, really, really advanced processing going on.

    -- Advanced LCD overdrive algorithms that change on the fly
    -- It's hard to have consistent color/gamma at all refresh rates.
    -- Previous monitors often had to be calibrated differently at 60Hz vs 120Hz vs 144Hz.
    -- Now imagine trying to make 30Hz through 144Hz work dynamically, with overdrive, while keeping consistent color.

    I'm betting a lot of the 768MB on the G-SYNC board is probably for a massive 3D lookup table or processing memory to assist in accurate LCD pixel transitions.
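    As a sanity check on that guess (back-of-envelope arithmetic only; the breakdown is my speculation, not anything NVIDIA has published):

    ```python
    # Rough sizing of what 768 MB on the G-SYNC board could hold.
    width, height, bytes_per_px = 1920, 1080, 3    # the VG248QE panel, 24bpp
    frame_mb = width * height * bytes_per_px / 2**20
    print(f"one frame: {frame_mb:.2f} MB")

    board_mb = 768
    print(f"frames that fit: {board_mb // frame_mb:.0f}")

    # A full 256x256 source-level x target-level overdrive table,
    # per channel, with 2-byte entries, is tiny by comparison:
    lut_mb = 256 * 256 * 3 * 2 / 2**20
    print(f"one full overdrive LUT set: {lut_mb:.3f} MB")
    ```

    Even a separate LUT per refresh rate would only add up to a few MB, so if the guess is right, most of the 768 MB would have to be frame storage / processing memory rather than the lookup tables themselves.
    
    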
    Reply
  • ninjaquick - Thursday, October 24, 2013 - link

    or 4K 32bpp Reply
  • junky77 - Sunday, October 20, 2013 - link

    The real amazement is that it was not so till now. Reply
  • wrkingclass_hero - Sunday, October 20, 2013 - link

    Seeing so much memory on the board, and hearing about ghosting artifacts from Eurogamer, leads me to believe that G-Sync is just interpolating frames. It would be interesting to see what happens when there are variable frame rates. Reply
  • Soulwager - Sunday, October 20, 2013 - link

    I think the memory is for PSR below 30fps, if they were interpolating frames they'd just do it GPU side. Reply
  • mdrejhon - Monday, October 21, 2013 - link

    G-SYNC does not use interpolation.

    The ghosting effect is simply a side effect of framerate.
    Motion blur is directly proportional to framerate on flickerfree LCD displays (non-strobe mode).
    See http://www.testufo.com -- notice how 30fps has more motion blur than 60fps (and, if you're using a 120Hz monitor, how 60fps has more motion blur than 120fps).

    With G-SYNC, motion blur gradually ramps up/down instead of you seeing stutters.
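    The proportionality is easy to check with simple tracking-blur arithmetic (a sketch; the 960 px/s pan speed is just an example figure):

    ```python
    # On a flicker-free (sample-and-hold) display each frame is held
    # on screen for 1/fps seconds, so the blur smeared across a tracked
    # object scales inversely with framerate.
    def hold_blur_px(pan_speed_px_per_s, fps):
        return pan_speed_px_per_s / fps

    for fps in (30, 60, 120):
        print(f"{fps:>3} fps: {hold_blur_px(960, fps):>4.1f} px of blur")
    ```

    30fps blurs twice as much as 60fps and four times as much as 120fps, which is exactly the ramp you see on testufo.
    
    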
    Reply
  • wojtek - Tuesday, October 22, 2013 - link

    It also shows how a lower framerate always produces a relative positional shift compared to a higher framerate, despite actual sync (in this case sync is perfect because 30 divides 60). Take it into account when you're talking about object position on screen relative to sync. Reply
  • mdrejhon - Thursday, October 24, 2013 - link

    Yes, with G-SYNC, everything is always matched to refresh rate: 45fps@45Hz, 67fps@67Hz, etc. Varying framerates now always look like framerate=Hz, provided screen object position corresponds to real time (to stay in sync with eye tracking position).

    Eye tracking diagrams for G-SYNC are found at:
    http://www.blurbusters.com/gsync/how-does-gsync-fi...
    Reply
  • Veroxious - Monday, October 21, 2013 - link

    Yeah, like I am going to risk that amount of cash on a new GPU and a G-Sync enabled monitor when nobody has a clue what the adoption rate will be. The cost is simply prohibitive and foolish of Nvidia. Also there is no mention of which cards will support this. I'm all for smoother gaming, but the vast majority of us gamers have limited budgets, and that is a serious chunk of GPU budget. In single card configs I buy whatever gives me the best bang for the buck, be it green or red. If the monitors were priced the same then it would be a game changer. For what it is worth, stutter has never been an issue for me on either platform, and on the rare occasions it has occurred it did not spoil the experience for me. As long as I can game at 1080p with medium to high settings I am a happy camper, period. Reply
  • polaco - Monday, October 21, 2013 - link

    well I don't blame you, I think the same. Why on earth should I spend that amount of money to get rid of stuttering that is only noticeable in a frozen-screen test? When would I want to read text on a moving pendulum? The demo done with Tomb Raider showed little benefit. I prefer to invest in a better GPU or CPU, since that will let me play at decent frame rates and will make my investment last longer. This G-Sync feature is totally hyped! Reply
  • Kookaburra8su - Wednesday, October 30, 2013 - link

    Agreed, double dip expense.

    There was a company called Hi-Toro that developed patents for frame rate/refresh matching from graphics processor to monitor/TV Hz rate in the 1980's.

    This Nvidia tech is ass about.
    Reply
  • jasonelmore - Monday, October 21, 2013 - link

    I wish I knew if the 27" Asus version of this screen (ASUS VG278HE) will support the add-on card. I'd rather have the larger screen, and they are both 144Hz. Reply
  • Toblerone - Friday, October 25, 2013 - link

    This is great.

    Please stop making things like this proprietary. Create an industry standard with AMD and Intel so it can gain mass adoption.
    Reply
  • Kookaburra8su - Wednesday, October 30, 2013 - link

    This whole concept is ass about; there is no need for this tech. Process the scene, send at the processed frame rate, capture, reprocess for a new frame rate, display -- too much double handling, and a waste of energy. It's like having a graphics card in 2 parts, and having to pay twice to get both. Easier to process, match the frame rate of the monitor, and send to the monitor. Lots cheaper too.

    These refresh issues have been solved before, by having the graphics processor match the refresh rate of the monitor. Build a variable version into today's graphics cards and you can output smooth frame rates to any monitor out there today -- no need to buy a new monitor, no need to buy Nvidia hardware twice.

    Unfortunately for Nvidia, I don't believe the patents for refresh matching from graphics processor to monitor are owned by them. It was developed by a tech company that matched their graphics processor to the Hz rate of TVs in the 1980s. I remember it was a company called Hi-Toro.

    Anyone know who owns those patents now?
    Reply
  • soxfan - Wednesday, November 13, 2013 - link

    See US8334857 - assigned to nvidia on its face. Issued in 2012. I'm sure there are others but I'm not going to waste time finding them. Reply
  • soxfan - Wednesday, November 13, 2013 - link

    I did an assignee search for patents assigned to "hi-toro" and found nothing. In any event, the technology you describe (matching frame rate output of a graphics card to a display) is not what gsync tech does, namely to match the refresh rate of the display to the frame rate output from the graphics card. Reply
  • andrerocha05251975 - Saturday, November 02, 2013 - link

    AMD will probably have to make their cards compatible. Reply
  • Hixbot - Monday, November 11, 2013 - link

    So this allows the frame time variance to be displayed at the monitor, eliminating tearing and vsync buffering (which introduces lag).
    It will introduce the concept of displaying inconsistent frame times to the viewer, which may be seen as judder. Choose your evil, I suppose.
    Also, a variable screen refresh time means variable gamma.
    The need for strobing is still there to eliminate the persistence problem of LCDs. That strobing will need to vary with the refresh, reinforcing any judder.

    This doesn't get to the core of the problem, which is inconsistent frame times. I'd rather see a solution early in the render chain: target a specific frame time, and adjust the rendering process (quality) on the fly to keep frame times constant. This would allow PC displays to be locked to a clock in the same way consumer TVs are clocked.
    The fact that no graphics card can deliver a rock-solid 23.976Hz to my Plasma TV is a shame.
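    A minimal sketch of that idea (a hypothetical controller with made-up gains and costs, not any shipping engine's code): scale the render resolution each frame so the measured frame time converges on the target.

    ```python
    # Toy dynamic-resolution loop: nudge the render scale so measured
    # frame time converges on a fixed target (e.g. 1/60 s).
    def adjust_scale(scale, frame_time_s, target_s, gain=0.5, lo=0.5, hi=1.0):
        # Proportional controller: correct the scale by a fraction of
        # the relative timing error, clamped to a sane range.
        error = (target_s - frame_time_s) / target_s
        scale *= 1 + gain * error
        return max(lo, min(hi, scale))

    # Simulated GPU whose frame cost grows with pixel count (~ scale^2):
    def simulate_frame(scale, base_cost_s=0.020):
        return base_cost_s * scale * scale

    scale, target = 1.0, 1 / 60
    for _ in range(20):
        t = simulate_frame(scale)
        scale = adjust_scale(scale, t, target)
    print(f"settled scale {scale:.2f}, frame time {simulate_frame(scale)*1000:.1f} ms")
    ```

    In this simulation the scale settles near 0.91 and the frame time locks onto 16.7 ms; a real engine would adjust resolution, LOD, or effect quality instead of a single scalar, but the control loop is the same shape.
    
    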
    Reply
  • Felix_Ram - Monday, December 02, 2013 - link

    Would this tech solve SLI issues like microstuttering? Reply
  • Maegirom - Thursday, December 12, 2013 - link

    And what happens with 3D Vision? It's supposed to work only over a dual-link DVI connection, right? If I have only 1 DisplayPort and nothing more, will I be able to use 3D Vision? Reply
  • uscjones8 - Monday, December 30, 2013 - link

    So, I'm in the market for a new monitor now, but won't be looking to buy a new GPU for about a year...right now I have pair of GTX 580s in SLI. I understand that G-sync won't work with my GPUs, but if I bought a G-sync monitor now (or soon), would it still work at the normal 60 Hz refresh rate until I'm ready to upgrade GPUs? Also, anybody hear anything about any 27" G-sync monitors? Reply
