
74 Comments


  • Continuity28 - Thursday, July 24, 2014 - link

    2560x1440@144hz
    8-bit TN panel (rare)
    G-Sync

    It's a real winner for me. I can't wait!
    Reply
  • prime2515103 - Thursday, July 24, 2014 - link

    Where does it say it's an 8-bit panel? Reply
  • Kronvict - Thursday, July 24, 2014 - link

    Asus has already stated in quite a few places that it's an 8-bit TN panel. If it's that important to you, then do your own dirty work and research it. Reply
  • SlyNine - Thursday, July 24, 2014 - link

    They used to have that listed officially, but they've since removed it. I hope they haven't silently changed the spec. Reply
  • Fallen Kell - Thursday, July 24, 2014 - link

    You don't get 16 million colors without an 8-bit panel... 8-bit = 2^8 * 2^8 * 2^8 = 256 * 256 * 256 = 16,777,216 Reply
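The color-depth arithmetic in this thread is easy to check with a short sketch (Python, purely for illustration): each channel with N bits has 2^N levels, and the three channels multiply.

```python
# Colors a panel can natively display: 2**bits levels per channel,
# three channels (R, G, B) multiplied together.
def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(color_count(8))  # 16777216 -- the "16.7M" of a true 8-bit panel
print(color_count(6))  # 262144   -- the "262K" of a native 6-bit panel
```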
  • The Von Matrices - Friday, July 25, 2014 - link

    Just because a panel is advertised as having 16.7M colors doesn't mean it's an 8-bit panel. Every 6-bit monitor uses dithering to simulate 8-bit color, and because of that manufacturers take the liberty of advertising them as having 16.7M colors even though that is technically incorrect.

    Look at all the TN panels on the market. You won't see any model with specifications showing them as having only 262K colors.
    Reply
  • nathanddrews - Friday, July 25, 2014 - link

    Correct. While it IS technically possible to create a true 8-bit TN panel, I've never actually seen one, which makes me think it's prohibitively expensive and reserved for labs/industrial use. They are always 6-bit with FRC (Frame Rate Control, i.e. temporal dithering) or some form of spatial dithering. The former uses TN's pixel speed to rapidly alternate/flash between colors to give the perception of a different color, while the latter combines colors on surrounding pixels to the same end. The end user is typically unaware of this... until they compare it side by side with an IPS or know what to look for. Some 8-bit IPS displays also use these techniques to qualify as 10- or 12-bit.

    The more I learn about G-Sync and ASync (or whatever it's called), the less impressed I become with G-Sync. It seems like the G only operates between 25-60Hz whereas A works from like 6-240Hz (in theory). Also, A has the advantage of being rolled into the DP standard, which *may* accelerate adoption. Some people argue that over 60fps doesn't matter, but as a high-fps gamer, I want my monitor to match the framerate no matter how fast or slow the game is running - 30fps, 126fps, etc. No stutter or tearing at ALL framerates, please!
    Reply
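The FRC scheme described above can be sketched in code. This is a hypothetical illustration (the function name and the error-accumulation scheme are mine, not any vendor's actual algorithm): a 6-bit panel fakes an in-between level by alternating the two adjacent 6-bit levels across frames so the time-average lands on the target.

```python
# Hypothetical FRC (temporal dithering) sketch: approximate a fractional
# level on the 6-bit scale by alternating two adjacent integer levels.
def frc_frames(target: float, n_frames: int) -> list[int]:
    """target: desired level on the 6-bit scale (e.g. 32.25);
    returns the 6-bit level shown on each of n_frames frames."""
    lo = int(target)        # lower adjacent 6-bit level
    frac = target - lo      # fraction of frames that must show lo + 1
    frames, err = [], 0.0
    for _ in range(n_frames):
        err += frac
        if err >= 1.0:      # error accumulator says: emit the higher level
            frames.append(lo + 1)
            err -= 1.0
        else:
            frames.append(lo)
    return frames

frames = frc_frames(32.25, 8)
print(frames)                     # [32, 32, 32, 33, 32, 32, 32, 33]
print(sum(frames) / len(frames))  # 32.25 -- the eye averages the flicker
```

With a fast TN panel the alternation happens quickly enough that most viewers perceive only the averaged color.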
  • SlyNine - Friday, July 25, 2014 - link

    Wrong, Gsync doesn't have that upper limit; it can do 30-144. Plus we haven't seen input latency comparisons. My understanding is Gsync gets the frame and displays it, while Async tells the monitor what it expects future frames to be (so the video card has to render at least 2 ahead). If that is the case then Gsync would have 1 frame less latency. Reply
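The refresh-range debate above boils down to a simple clamp. This is a hypothetical model, not NVIDIA's or AMD's actual logic: a variable-refresh display redraws when a frame arrives, but the interval between refreshes is bounded by the panel's supported range.

```python
# Hypothetical variable-refresh model: the display matches the frame time,
# clamped to what the panel supports (30-144 Hz here, per the comment above).
def refresh_interval_ms(frame_time_ms: float,
                        min_hz: float = 30.0, max_hz: float = 144.0) -> float:
    shortest = 1000.0 / max_hz  # can't refresh faster than max_hz
    longest = 1000.0 / min_hz   # must self-refresh at least at min_hz
    return min(max(frame_time_ms, shortest), longest)

print(refresh_interval_ms(8.0))   # 8.0 ms -> 125 fps, inside the window
print(refresh_interval_ms(4.0))   # clamped to the 144 Hz cap (~6.94 ms)
print(refresh_interval_ms(50.0))  # clamped to the 30 Hz floor (~33.33 ms)
```

Real implementations handle the low end differently (e.g. by repeating frames), so treat this only as a sketch of the clamping idea.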
  • nathanddrews - Friday, July 25, 2014 - link

    http://www.blurbusters.com/gsync/preview2/

    I stand corrected. I didn't recall ever hearing NVIDIA mention or demonstrate anything over 60Hz. That's good news and puts them on equal footing performance-wise, but the advantage still goes to ASync since it could be used by any compatible DP monitor and GPU.

    I'm definitely not an early adopter for this. I'm waiting (and saving) for 120Hz 4K DP 1.3 with ASync and the next big GPU from NVIDIA or AMD. I'm currently quite happy with my FW900.
    Reply
  • TheJian - Saturday, July 26, 2014 - link

    Advantage to Gsync because it's actually something you can BUY now. Let me know when AMD theory becomes reality ;)

    Also notice that in AMD's slides in their demo of the tech, they just say HELPS, rather than ELIMINATES:
    http://www.pcper.com/news/Graphics-Cards/AMD-Demon...
    That's kind of like using words like "virtually gone" etc. I'll believe the tech when a retail product is tested and it gets the accolades NV has already gotten in ACTUAL gaming vs. a windmill demo.

    Also it won't be FREE. The same scaler companies NV couldn't get to budge (which is why they built the card) will have to be convinced by AMD. That R&D will be passed on to consumers, which will increase monitor prices JUST LIKE Gsync. It's comical you're waiting for something that hasn't been proven. Like Ryan says:
    "Hopefully we'll get some more hands on time (eyes on, whatever) with a panel in the near future to really see how it compares to the experience that NVIDIA G-Sync provides. There is still the chance that the technologies are not directly comparable and some in-depth testing will be required to validate."

    We need actual in-depth gameplay testing to see how well this works. Everyone playing with gsync monitors says they'll never go back. Reality and theory are two different animals until proven otherwise. Note AMD wouldn't name the monitor, wouldn't name the scaler that even worked for their demo etc. Jeez.

    "Only the Radeon R9 290/290X and R7 260/260X (and the R9 295X2 of course) will actually be able to support the "FreeSync" technology."
    You'll need a card and monitor most likely pretty much like gsync too. What part of this is free again? ;)
    "Compare that to NVIDIA's G-Sync: it is supported by NVIDIA's entire GTX 700 and GTX 600 series of cards."

    Hmmm....
    Reply
  • chenook - Thursday, July 31, 2014 - link

    I think you mean pb278q that goes for $500 Reply
  • chenook - Thursday, July 31, 2014 - link

    The only thing that gets me is that at the big reveal they said it would be available for purchase in April. Then they said May, then a problem with the heat sink design made it June, then July, then shipping issues made it August, and now I'm seeing September posts. Frustrating. Reply
  • SlyNine - Friday, July 25, 2014 - link

    I am aware. But look at the other TFT panels on Newegg. They also advertise 16.7M even though they use 6-bit + dithering.

    So again, I hope this is a true 8-bit panel; they made a pretty big deal about it and then silently took 8-bit off their specs.
    Reply
  • SlyNine - Friday, July 25, 2014 - link

    Okay. In their FAQ they have once again stated it's an 8-bit panel. So all is well. Reply
  • 2late2die - Thursday, July 24, 2014 - link

    This monitor is definitely on the expansive side, but then again new tech usually is. In any case I think this is a very impressive piece of tech and at $600 or less it would be an insta-buy for me. As is, I'm gonna have to think about it. Guess it's a good thing I still have time. Reply
  • Strulf - Thursday, July 24, 2014 - link

    *expensive Reply
  • bebimbap - Thursday, July 24, 2014 - link

    TFTcentral rates this monitor excellent in terms of color accuracy, once calibrated of course, but even out of the box it is very good. And talking about sRGB coverage: "...the monitor's colour gamut (black triangle) is roughly equal to the sRGB colour space."
    http://www.tftcentral.co.uk/reviews/asus_rog_swift...

    So the ONLY two deficiencies I can name are connectivity (DisplayPort ONLY) and the viewing angles. It's still a TN, and it's a large monitor; if you are too close to it, the colors might wash out towards the edges.

    I wouldn't agree the price is "expensive" for this type of tech. A VG248QE is about $275 USD now, plus the G-Sync kit is another $200. So if you were to take the $800 price tag and subtract $200, you are left with a $600 monitor. Previous to this, any decent 27" gaming LCD at 1440p60 with 5ms GTG was already in the $500 range. So really you are only paying ~$100 more for no motion blur, 120Hz + ULMB, or for insane motion clarity, 144Hz. Well, that's just my opinion, and I might be biased since I'll be one of the first people to buy this thing. I already own a VG248QE with G-Sync and it is amazing compared to anything I had before.

    Oh, a quick note:
    a 4K60 monitor needs about 497.7 Mpixels/sec;
    a 1440p144 monitor needs about 530.8 Mpixels/sec. I basically look to spend at least as much on my GPU as I do on my monitor, as a good rule of thumb
    Reply
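Those throughput figures are just resolution times refresh rate; a quick sketch (Python, for illustration):

```python
# Raw pixel throughput a display mode demands: width * height * refresh rate.
def pixel_rate(width: int, height: int, hz: int) -> int:
    return width * height * hz

print(pixel_rate(3840, 2160, 60) / 1e6)   # 497.664 Mpixels/sec for 4K60
print(pixel_rate(2560, 1440, 144) / 1e6)  # 530.8416 Mpixels/sec for 1440p144
```

So 1440p144 actually pushes slightly more pixels per second than 4K60.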
  • urabask - Thursday, July 24, 2014 - link

    You can't use G-Sync without DisplayPort, so it's kind of pointless to gripe about that. Reply
  • tackle70 - Friday, July 25, 2014 - link

    Means you can't hook anything up except a PC. Lots of people use their monitor for other things, especially gaming consoles. So no, it's not a pointless con, it just may not apply to you if you never envision hooking anything up to it except a gaming PC. Reply
  • agentbb007 - Saturday, July 26, 2014 - link

    Buy a $15 display port to hdmi cable then... Reply
  • tackle70 - Saturday, July 26, 2014 - link

    Um, having to do that (and then manually switch out inputs when you want to switch from PC to something else) is super inconvenient and annoying. Single input is a con for anyone who hopes to use multiple devices on the monitor. Period. Reply
  • Penti - Monday, July 28, 2014 - link

    It would need to be an active converter costing a few hundred. But there isn't much out there that can do 2560x1440 or do the conversion properly. The Atlona one isn't sold any more; the Kanex might still be available. So not $15. Reply
  • agentbb007 - Monday, July 28, 2014 - link

    Thanks Penti, I stand corrected. The $15 cable I was looking at on Amazon is not bi-directional, it's DisplayPort to HDMI only. There is an HDMI to DisplayPort from StarTech but it's $115 and only supports up to 1920x1200. Fortunately my GTX Titan has a DisplayPort and I only need one input so this monitor is perfect for my gaming needs! Wish they would just hurry up and get it on NewEgg so I can buy one!! Reply
  • chizow - Friday, July 25, 2014 - link

    More inputs also means they'd have to put in more components between input and display, I/O hardware and a video scaler board at the very least, which adds latency in LCDs. I use my PC monitors for other things too, but they'll just get routed to my 2nd panel (which will soon be my VG278H or U2410 both with multiple inputs, including HDMI). Reply
  • Kevin G - Thursday, July 24, 2014 - link

    I'm curious why the 7680 x 1440 surround resolution is in the specs since that'd actually require three of these monitors. Reply
  • DanNeely - Thursday, July 24, 2014 - link

    Because Surround is nVidia's multiple monitor ultra-widescreen gaming product (like AMD Eyefinity); 3 monitors side by side by side in landscape mode is the most common/standard configuration for it. Reply
  • Kevin G - Thursday, July 24, 2014 - link

    Umm... yeah. It still requires three of these monitors, i.e. one unit can't do it by itself, so why would it appear on the spec sheet like that? Reply
  • SlyNine - Thursday, July 24, 2014 - link

    Yeah, that's why they say the surround resolution: it's the resolution in Surround mode (i.e. multiple monitors). You basically answered your own question and are trying to be difficult. Reply
  • Flunk - Thursday, July 24, 2014 - link

    For that price I was expecting a 4K IPS. Is 120Hz and G-Sync support really worth the increase in price and decrease in display quality? G-Sync has its days numbered now that Free Sync is part of the Display Port standard and achieves basically the same result. Reply
  • chizow - Thursday, July 24, 2014 - link

    It's not just 120Hz and G-Sync, this is also the first non-Korean 120+Hz 1440p panel (meaning it'll actually be good), so you are getting all of the features of a premium TN gaming panel along with a pretty massive increase in resolution.

    For those that find these gaming-focused features worthwhile, this panel is worth it. For those that value image quality over all else, it won't. Also, from everything I've seen and read, this will be an ultra-premium panel, super thin bezel, 8-bit color (rare for TN), ROG branding and design etc., so there's a premium there.

    I am sure there will be cheaper alternatives (price and build quality) from Acer/BenQ etc in 6-12 mos. but they probably still won't compare to this panel overall.
    Reply
  • ninjaquick - Thursday, July 24, 2014 - link

    Funny, both Samsung and LG, makers of all the best panels on earth, including the panels in NEC/Dell UltraSharps, all mobile devices, etc., are Korean.

    The ASUS panel is probably Korean as well.
    Reply
  • chizow - Thursday, July 24, 2014 - link

    I was referring to the entire monitor, mainly the drive electronics. Yes, the panel substrate does come out of the same 2-3 factories, but it's the drive electronics that differentiate them. When I refer to "Korean" panels, I'm talking about the numerous black-box makes and models from off-brand Korean labels that are unofficially overclocked to 120+Hz. Sure, they may technically hit 120+Hz at 2560x1440, but the end result is going to be far inferior to this panel. Reply
  • londedoganet - Friday, July 25, 2014 - link

    According to TFT-Reviews, the panel on the PG278Q is made by AU Optronics, which is Taiwanese, and is also known to make excellent panels. Reply
  • DanNeely - Thursday, July 24, 2014 - link

    IIRC the cheapest 60hz non-TN 4k displays are still around the $2k mark.

    As for GSync being doomed, only if nVidia decides to implement FreeSync. If they insist on only supporting their proprietary version we'll be stuck with both for the foreseeable future and either be locked into the same GPU vendor for multiple years or have to pay extra for displays that support both.
    Reply
  • otherwise - Thursday, July 24, 2014 - link

    Agreed about FreeSync; I don't understand why people think it will kill G-Sync. If any company knows how important first-mover advantage is, it's probably nVidia, considering that CUDA still dominates OpenCL in terms of market share. Reply
  • RussianSensation - Thursday, July 24, 2014 - link

    The problem with G-Sync is that the minute you buy a G-Sync monitor, you are stuck with NV GPUs for the next 5-10 years that you keep the monitor. The problem with that is that NV GPUs are often far too expensive for the performance. For example, R9 290s can be had for $700, but a single 780Ti, which is far slower, is $650. In other words, there is an expensive price to pay to get the G-Sync feature -- it forces you to pay much more for NV GPUs with similar or worse performance. For that reason many people are waiting for the FreeSync alternative that will work on any GPU. Also, $800 for a TN panel is a no-go for many. I would much prefer an IPS 4K 30-32" monitor, because washed-out greys and TN's poor colour quality are a huge compromise for watching movies and 2D work. Reply
  • inighthawki - Thursday, July 24, 2014 - link

    I'm not sure where you're pulling your numbers, but last I checked, R9 290X's were not $700 (Maybe back during the bitcoining phase where there was low supply?) and a 780Ti has superior performance in 95% of scenarios. Reply
  • chizow - Thursday, July 24, 2014 - link

    "For that reason many people are waiting for FreeSync alternative that will work on any GPU."

    Yes, that is the AMD Mantra, free, wait, someday. Maybe?

    There's no guarantee Nvidia supports FreeSync btw, so you'd be in the same vendor lock-in situation, just with an inferior solution that hasn't even yet been officially announced as being supported by any hardware vendors.

    Nvidia products are more expensive sure, but G-Sync is part of the reason why. You get premium features and support when the products are still relevant during their lifecycle over the endless waiting game that comes with AMD solutions and support.

    Also, the price disparity between R9 290/x and 780/Ti is only a recent development once AMD got their supply issues sorted. We all know that for most of their first 5-6 months on the market, the roles were reversed and AMD was the one charging a huge premium for their products ($550 290 and $750 290Xs).
    Reply
  • Death666Angel - Thursday, July 24, 2014 - link

    "AMD was the one charging a huge premium for their products" You are confusing retailers with AMD. I have seen no mention of AMD charging more for their chips during the Coin-Craze. Not a lot AMD could do about retailers charging as much as customers were willing to pay. Reply
  • chizow - Thursday, July 24, 2014 - link

    No, I'm not confusing anything. AMD wanted you to believe they were being held captive by the invisible hand just like everyone else, but in reality, they were selling their parts at what the market would bear and that price increase was reflected in the prices charged by retailers. There were plenty of indicators however, including from AMD's own board partners, that it was an unexpected supply constraint coupled with increased demand from bitcoin mining that led to the huge increases in price.

    But this is all readily obvious if you look at AMD's Q2 financials that captured all of this action, just as I predicted when this topic was raised here months ago. Decrease in total GPU revenue, units shipped, increase in margins, profit and ASP. Basically, that means AMD was selling fewer cards at a higher price, margin, and profit. Conclusion = price hikes were being driven by AMD. By the time they got their supply in order and cryptocoin demand dried up, there was a glut of inventory and prices bottomed out.
    Reply
  • Spunjji - Friday, July 25, 2014 - link

    You really can make innocent statistics say some funny things if you try hard enough. Reply
  • chizow - Friday, July 25, 2014 - link

    Except in this case, those "innocent statistics" bear out what I said and directly refute the notion AMD didn't profit from those price hikes. It just takes a small bit of accounting knowledge and some common sense, because as a shareholder, I'd want to know why AMD *wasn't* profiting from those price hikes while they were having inventory issues. Reply
  • chaosbloodterfly - Thursday, July 24, 2014 - link

    The Samsung U28D590D is on Amazon for $600. It does 4K @ 60Hz. Reply
  • doggghouse - Thursday, July 24, 2014 - link

    TFT Central has a thorough review of this monitor... the display is a step above normal TN because it's an 8-bit color panel instead of 6-bit. Also, they appear to be well calibrated right out of the box.

    But yes, this screen is TN, which means it will have the regular problems like contrast shifting at different angles. However, you won't find any (high quality) 120Hz IPS panels in the foreseeable future. From what I've read, overclocked IPS panels have image degradation problems. So it's a trade-off.

    Maybe in a year or two, someone will release a 4K IPS that has G/Free-Sync at 60Hz. I think Acer has their 4K G-Sync panel coming soon, but it will also be TN.
    Reply
  • Death666Angel - Thursday, July 24, 2014 - link

    "overclocked IPS panels have image degradation problems" I've not read too many people complain about their image when overclocking their Korean IPS monitors and doing a recalibration with the right tools. I'm happy with my 320€ 1440p110Hz monitor, same IQ as my Samsung 1440p60Hz before it. The ROG is great if you have money to burn and are in the nVidia camp and intend to stay there. Reply
  • SlyNine - Thursday, July 24, 2014 - link

    Because people like to justify their purchases. Unless you do some objective testing for motion resolution, I remain unconvinced. Reply
  • Spunjji - Friday, July 25, 2014 - link

    It's a fair point. Properly performed objective testing is what we're after here. Reply
  • Death666Angel - Saturday, July 26, 2014 - link

    So you make a statement, I question the validity of it because I have not encountered people claiming the same and you ask me for proof? Where is your proof, my friend. Reply
  • doggghouse - Monday, July 28, 2014 - link

    Honestly I haven't read much about overclocking IPS panels, but I could've sworn that I read something about colors getting a bit out of whack after an overclock. Though I suppose that could be fixed with calibration. One thing I wonder though, is the pixel response time actually good enough to display over 60 frames/sec? For example, I have an Alienware laptop with 120Hz TN display, but it's almost worthless (especially for 3D) because the frames blur together anyway due to slow pixel response. I imagine IPS would have the same problem...? Reply
  • bunnyfubbles - Thursday, July 24, 2014 - link

    because this monitor is built for speed

    Motion clarity is one factor in determining image quality. IPS is the reigning champ for static image quality, but IPS is also much slower than TN. That being said, ASUS is using an 8-bit 1440p TN panel (most TN panels are 6-bit and 1080p or less) and early reviews are showing it to be competitive for static image quality, all while not sacrificing speed, as it is one of the fastest monitors you can buy.

    As for why not 4K: well, again, this monitor is built for speed, and 1440p @ 120 or 144Hz is just as demanding if not more so than 4K @ 60Hz; either format requires DisplayPort 1.2-level bandwidth or better (this monitor is so demanding that a triple-screen Surround setup requires at least tri-SLI, because each monitor demands its own DP1.2 connection).

    There are already plenty of 4K monitors on the market, ASUS even produces a 28" 4K 10bit TN panel monitor that goes for ~$650, but there are zero monitors that offer what the Swift does.
    Reply
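The bandwidth claim above can be sanity-checked with rough numbers. DP 1.2 over four HBR2 lanes carries 21.6 Gbit/s raw, about 17.28 Gbit/s after 8b/10b coding; the ~20% blanking overhead below is my own rough assumption for illustration, not a real timing calculation.

```python
# Rough check that 1440p144 and 4K60 both need a DP 1.2-class link.
DP12_EFFECTIVE_GBPS = 17.28  # 21.6 Gbit/s raw minus 8b/10b coding overhead

def link_demand_gbps(w: int, h: int, hz: int,
                     bpp: int = 24, blanking: float = 1.2) -> float:
    # blanking=1.2 is an assumed ~20% overhead for blanking intervals;
    # real timings (e.g. CVT reduced blanking) differ per mode.
    return w * h * hz * bpp * blanking / 1e9

for name, mode in [("1440p144", (2560, 1440, 144)), ("4K60", (3840, 2160, 60))]:
    need = link_demand_gbps(*mode)
    print(f"{name}: ~{need:.1f} Gbit/s, fits DP 1.2: {need <= DP12_EFFECTIVE_GBPS}")
```

Both modes land in the 14-16 Gbit/s range, i.e. just inside DP 1.2 but well beyond DP 1.1 or dual-link DVI.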
  • SlyNine - Thursday, July 24, 2014 - link

    1440p @ 144hz with Gsync.... Yes it is. Reply
  • tackle70 - Friday, July 25, 2014 - link

    You're bat-**** crazy if you expected 4K IPS for that price. 1440p IPS has barely come down that cheap. Reply
  • Dug - Monday, July 28, 2014 - link

    They are already marketing this at high end gaming users. No one I know can game at 4k, and no one I know games on IPS. If you want 4k and IPS you aren't a gamer. Reply
  • chizow - Thursday, July 24, 2014 - link

    Yeah, the NA delay according to JJ at Asus is because they shipped NA parts via boat... saves a few bucks, but I bet we don't see the cost savings in the price tag.

    Will still be picking one up however, been waiting a few years for a true upgrade to my VG278H and 2560x1440p G-Sync in 2D along with 1440p 3D Vision will certainly fit that bill.
    Reply
  • dishayu - Thursday, July 24, 2014 - link

    Is that boat being manually rowed by one person? It doesn't take months to go from anywhere to anywhere using a cargo ship. A couple of weeks at max. Reply
  • chizow - Friday, July 25, 2014 - link

    Yeah takes about 2 weeks + time in customs then another 1-5 days for time to retailers, about the same time frame JJ laid out I think. Pre-orders in August, shipped later that month. Reply
  • ruthan - Thursday, July 24, 2014 - link

    One input; I often use one monitor with multiple devices (computers, consoles), so this is a stopper for me. Reply
  • chang3d - Thursday, July 24, 2014 - link

    Why can't we get the best of both worlds with a VA panel instead? A 27" QHD G-Sync version Eizo Foris FG2421 would be almost perfect. Reply
  • Asmodian - Thursday, July 24, 2014 - link

    It really would be. Reply
  • DanNeely - Friday, July 25, 2014 - link

    Do modern *VA panels still have a problem with black crush when viewed straight on? (This is when the darkest grays and black all look the same unless you're looking at the screen from an angle.)

    If so, I think I'd still rather have an IPS display.
    Reply
  • lmcd - Friday, July 25, 2014 - link

    I can distinguish grays and blacks pretty easily on a cheap BenQ GW2255 Reply
  • FaaR - Thursday, July 24, 2014 - link

    Nvidia G-Wank, existing for no other purpose than to milk you of more money. First on the GPU side, and then again on the monitor purchase. Money right down the drain the moment you decide to no longer be Jen-Hsun's bitch, by the way. Proprietary vendor lock-in was no good when 3DFX was still around, but since then, Nvidia have become the corporate grand masters of vendor lock-in, trying to abuse their market position at every turn.

    Thankfully this has largely failed with PhysX (which is a total dud), but thanks to NV's silly proprietaryness antics fragmenting the market there is also no universal go-to game physics library.

    A shining example why proprietary solutions and vendor lock-in is a bunch of horse hockey.

    DON'T support it. You'll just punish yourself in the long run.
    Reply
  • tackle70 - Friday, July 25, 2014 - link

    Oh, stop drinking the fanboy Kool-Aid. Without "NV's silly proprietaryness", you wouldn't get FreeSync. This type of tech is beneficial, and even if you don't like Nvidia's practices/implementation of it, it's good for everyone in the end. If you prefer the open-source alternative, just wait around for FreeSync. Reply
  • zo9mm - Tuesday, July 29, 2014 - link

    VESA's Adaptive-sync has been around since 2009, as part of the eDP standard. The application was slightly different, but the concept of variable refresh rates is the same. So by your logic, NV wouldn't have g-sync without Vesa. NV just stole the idea and is trying to keep it for themselves.... greedy. Reply
  • HollyDOL - Friday, July 25, 2014 - link

    <dreaming>
    Well, still waiting for good IPS (Eizo level) with at least 2560x1440 (ideally with one side double so 3D doesn't need to let go half of the image), 60Hz+, well made passive 3D (no glasses? fingers crossed), GSync, USB hub (pref. 3.0) and at least two digital inputs (ideally dp+dp+hdmi). Oh, and also PSU inside screen, I don't want any brick around. Given my usage scenarios and size of my desk two screens are no go.
    </dreaming>
    Reply
  • kyuu - Friday, July 25, 2014 - link

    I still don't really get why people would prefer to have the PSU built into the monitor -- thus making the monitor thicker and having the PSU's heat output to deal with -- rather than having a brick sitting on the floor. Reply
  • HollyDOL - Saturday, July 26, 2014 - link

    It's pretty simple actually: it allows much easier cable management, which helps if you live in a dusty area and need to clean dust frequently. Ofc you could solve the brick as well by adding additional mounts on a wall or desk etc. It's just way easier with the PSU inside the monitor. And having a heavier or thicker monitor? My current 21" screen with an internal PSU is still incomparably lighter and thinner than any old CRT. After all, look at modern LCD TVs... they generally have the PSU inside and are quite thin. Reply
  • agentbb007 - Saturday, July 26, 2014 - link

    I'm with kyuu; I prefer a power brick on the floor... Reply
  • hurrakan - Monday, July 28, 2014 - link

    £700 ($1200) in UK :( Reply
  • Kronvict - Friday, August 01, 2014 - link

    $649 USD confirmed via ASUS NA facebook. :D Reply
  • JabBauer - Monday, July 28, 2014 - link

    If you can't wait, you can buy the normal version of the pg278q without G-Sync for ~$500. Reply
  • ssddaydream - Wednesday, August 06, 2014 - link

    Link? Are you sure about that? Reply
  • chenook - Tuesday, August 12, 2014 - link

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    He had the wrong model number but this would be one of the closest ones.
    Reply
