ASUS ROG Swift PG278Q Monitor Released in APAC/EU, North America Coming September
by Ian Cutress on July 24, 2014 5:49 AM EST

One of ASUS’ many releases during Computex was their new ROG Swift PG278Q monitor, which boasts a number of impressive specifications all at once. The PG278Q combines a 2560x1440 panel capable of 120/144 Hz operation with support for NVIDIA G-Sync and 3D Vision, putting it firmly in gaming territory, hence the ROG moniker.
Aside from NVIDIA G-Sync, the PG278Q comes with a Turbo Key on the rear for quick switching between 60 Hz, 120 Hz and 144 Hz depending on user preference. The GamePlus hotkey provides a crosshair overlay (useful in games that do not offer a steady central crosshair) as well as timer functions. The OSD is navigated via a joystick-like nub on the rear of the monitor.
Response time is listed as 1 ms GTG, with 16.7M colors and 160-170º viewing angles. Connectivity is via DisplayPort 1.2 only, though the monitor also integrates a USB 3.0 hub. VESA 100x100mm mounting is supported, and the monitor is listed at 7.0 kg (15.4 lbs). ASUS’ press release gives a bezel width of 6 mm.
Due to the high refresh rate and inclusion of G-Sync, the Swift comes in as one of the most expensive TN panels on the market. Pricing will start at $799, varying by region, and the monitor should be available in Taiwan, APAC and EU today, with China in mid-August and North America by the end of August.
ASUS ROG Swift PG278Q
Display | 27-inch (68.5 cm) widescreen with 16:9 aspect ratio
Resolution | 2D mode: 2560 x 1440 (up to 144 Hz); 3D mode: 2560 x 1440 (up to 120 Hz); 2D/3D Surround: 7680 x 1440 (2D up to 144 Hz / 3D up to 120 Hz)
Pixel pitch | 0.233 mm / 109 PPI
Colors (max) | 16.7M
Viewing angles | 170° (H) / 160° (V)
Brightness (max) | 350 cd/m²
Response time | 1 ms (GTG)
ASUS-exclusive technologies | ASUS GamePlus Technology (Crosshair / Timer); ASUS Refresh Rate Turbo Key (60 Hz / 120 Hz / 144 Hz overclocking); ASUS 5-way OSD Navigation Joystick
NVIDIA® technologies | NVIDIA® G-SYNC™ Technology; NVIDIA® 3D Vision™ Ready; NVIDIA® Ultra Low Motion Blur Technology
Input/output | 1 x DisplayPort 1.2; 2 x USB 3.0 (1 x upstream, 2 x downstream)
Stand | Tilt: +20° to -5°; Swivel: ±60°; Pivot: 90° clockwise; Height adjustment: 0-120 mm; VESA wall mount: 100 x 100 mm
Size | 619.7 x 362.96 x 65.98 mm
Weight (est.) | 7.0 kg
Source: ASUS
74 Comments
Continuity28 - Thursday, July 24, 2014 - link
2560x1440 @ 144 Hz
8-bit TN panel (rare)
G-Sync
It's a real winner for me. I can't wait!
prime2515103 - Thursday, July 24, 2014 - link
Where does it say it's an 8-bit panel?
Kronvict - Thursday, July 24, 2014 - link
Asus has already stated in quite a few places that it's an 8-bit TN panel. If it's that important to you then do your own dirty work and research it.
SlyNine - Thursday, July 24, 2014 - link
They used to have that up officially, but they have since removed it. I hope they haven't silently changed the spec.
Fallen Kell - Thursday, July 24, 2014 - link
You don't get 16 million colors without an 8-bit panel... 8-bit = 2^8 * 2^8 * 2^8 = 256 * 256 * 256 = 16,777,216
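The arithmetic in this exchange is easy to verify; here is a minimal sketch covering both the 8-bit figure above and the 6-bit figure that comes up below (the function name is illustrative):

```python
# A panel with n bits per channel offers 2^n levels on each of the
# three subpixels (R, G, B), so the total palette is (2^n)^3.
def panel_colors(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel
    return levels ** 3

print(panel_colors(8))  # 16,777,216 -> the "16.7M colors" spec figure
print(panel_colors(6))  # 262,144    -> the "262K" native palette of a 6-bit panel
```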
The Von Matrices - Friday, July 25, 2014 - link
Just because a panel is advertised as having 16.7M colors doesn't mean it's an 8-bit panel. Every 6-bit monitor uses dithering to simulate 8-bit color, and because of that manufacturers take liberties and advertise them as having 16.7M colors even though that is technically incorrect.

Look at all the TN panels on the market. You won't see any model with specifications showing them as having only 262K colors.
nathanddrews - Friday, July 25, 2014 - link
Correct. While it IS technically possible to create a true 8-bit TN panel, I've never actually seen one, which makes me think it's prohibitively expensive and reserved for labs/industrial use. They are always 6-bit with FRC (Frame Rate Control, i.e. temporal dithering) or some form of spatial dithering. The former uses TN's pixel speed to rapidly alternate/flash between colors to give the perception of a different color, while the latter combines colors on surrounding pixels to give the perception of a different color. The end user is typically unaware of this... until they compare it side by side with an IPS or know what to look for. Some 8-bit IPS displays also use these techniques to qualify as 10- or 12-bit.

The more I learn about G-Sync and ASync (or whatever it's called), the less impressed I become with G-Sync. It seems like the G only operates between 25-60Hz whereas A works from like 6-240Hz (in theory). Also, A has the advantage of being rolled into the DP standard, which *may* accelerate adoption. Some people argue that over 60fps doesn't matter, but as a high-fps gamer, I want my monitor to match the framerate no matter how fast or slow the game is running - 30fps, 126fps, etc. No stutter or tearing at ALL framerates, please!
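For readers unfamiliar with FRC, here is a minimal sketch of the temporal-dithering idea described in the comment above: an 8-bit target level is approximated on a 6-bit panel by alternating between its two nearest native levels across frames. The simple averaging scheme is illustrative; real panels use more elaborate dither patterns.

```python
# Temporal dithering (FRC) sketch: approximate an 8-bit level (0-255)
# on a 6-bit panel (0-63) by alternating between the two nearest
# native levels so the time-averaged output lands on the target.
def frc_frames(target_8bit: int, num_frames: int = 4) -> list[int]:
    lo = target_8bit // 4             # nearest 6-bit level at or below the target
    hi = min(lo + 1, 63)              # next native level up
    frac = (target_8bit % 4) / 4      # where the target sits between lo and hi
    n_hi = round(frac * num_frames)   # show 'hi' on roughly that share of frames
    return [hi] * n_hi + [lo] * (num_frames - n_hi)

frames = frc_frames(130)              # 130/4 = 32.5, halfway between 32 and 33
print(frames)                         # [33, 33, 32, 32]
print(sum(frames) / len(frames) * 4)  # 130.0 -> perceived level on the 8-bit scale
```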
SlyNine - Friday, July 25, 2014 - link
Wrong, Gsync doesn't have that upper limit; it can do 30-144. Plus we haven't seen input latency comparisons. My understanding is Gsync gets the frame and displays it, while ASync tells the monitor what it expects future frames to be (so the video card has to render at least 2 ahead). If that is the case then Gsync would have 1 frame less latency.
nathanddrews - Friday, July 25, 2014 - link
http://www.blurbusters.com/gsync/preview2/

I stand corrected. I didn't recall ever hearing NVIDIA mention or demonstrate anything over 60Hz. That's good news and puts them on equal footing performance-wise, but the advantage still goes to ASync since it could be used by any compatible DP monitor and GPU.
I'm definitely not an early adopter for this. I'm waiting (and saving) for 120Hz 4K DP 1.3 with ASync and the next big GPU from NVIDIA or AMD. I'm currently quite happy with my FW900.
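The refresh-range debate above boils down to how a variable-refresh panel handles frames that arrive outside its supported window. A rough sketch, assuming a hypothetical 30-144 Hz panel (the thresholds and behaviors are illustrative, not a description of either vendor's actual implementation):

```python
# Illustrative variable-refresh model for a 30-144 Hz panel: it can sync
# to any frame interval inside its window, but must act when the GPU is
# faster or slower than the panel's physical limits.
MIN_INTERVAL = 1 / 144  # fastest the panel can refresh (144 Hz)
MAX_INTERVAL = 1 / 30   # longest it can hold a frame before redrawing (30 Hz)

def refresh_action(frame_interval: float) -> str:
    if frame_interval < MIN_INTERVAL:
        return "wait: frames arrive faster than the panel can refresh"
    if frame_interval > MAX_INTERVAL:
        return "repeat: redraw the previous frame to keep the panel refreshed"
    return "sync: refresh the moment the frame arrives (no tearing or stutter)"

for fps in (200, 126, 60, 24):
    print(f"{fps:>3} fps -> {refresh_action(1 / fps)}")
```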
TheJian - Saturday, July 26, 2014 - link
Advantage to Gsync because it's actually something you can BUY now. Let me know when AMD theory becomes reality ;)

Also notice in AMD's slides in their demo of the tech they just say HELPS, rather than ELIMINATES, period:
http://www.pcper.com/news/Graphics-Cards/AMD-Demon...
That's kind of like using words like "virtually gone", etc. I'll believe the tech when a retail product is tested and it gets the accolades NV has already gotten in ACTUAL gaming vs. a windmill demo.
Also it won't be FREE. The same scaler companies NV couldn't get to budge (which is why they built the card) will have to be convinced by AMD. That R&D will be passed on to consumers, which will increase monitor prices JUST LIKE Gsync. It's comical that you're waiting for something that hasn't been proven. Like Ryan says:
"Hopefully we'll get some more hands on time (eyes on, whatever) with a panel in the near future to really see how it compares to the experience that NVIDIA G-Sync provides. There is still the chance that the technologies are not directly comparable and some in-depth testing will be required to validate."
We need actual in-depth gameplay testing to see how well this works. Everyone playing with gsync monitors says they'll never go back. Reality and theory are two different animals until proven otherwise. Note AMD wouldn't name the monitor, wouldn't name the scaler that even worked for their demo etc. Jeez.
"Only the Radeon R9 290/290X and R7 260/260X (and the R9 295X2 of course) will actually be able to support the "FreeSync" technology."
You'll need a card and monitor most likely pretty much like gsync too. What part of this is free again? ;)
"Compare that to NVIDIA's G-Sync: it is supported by NVIDIA's entire GTX 700 and GTX 600 series of cards."
Hmmm....