Jarred’s Best of CES 2012

CES is all wrapped up and everyone is back home (presumably—there are probably a few who remained in Vegas to lose more money... er, gamble a bit more), and one of the questions I’ve been asked repeatedly by friends and family is, “What was the coolest thing you saw at CES this year?” Now, keep in mind that I am only one person and I saw only a fraction of the show floor, as there were plenty of meetings set up around Vegas, so this is just my perspective on the coolest technology trends at the show. You’ll also notice that there’s a common thread in what really impressed me, but this is a highly subjective topic, so take it for what it’s worth: one man’s opinion. (And note that I am specifically not speaking for the other editors; I'm sure most of them would have a different top three.)

I Have Seen the Future, and the Future Is 4K

For me, the most impressive thing at the show was the 4K displays. Several companies had such displays on hand, but I didn’t spend a lot of time with the various display/HDTV vendors, so my first real close-up encounter with a 4K display was at AMD’s meeting rooms. They had a 4K panel hooked up to a Radeon HD 7970 running an in-house demo. The demo itself wasn’t anything special, but the display… wow! I didn’t have a tape measure handy and the AMD reps I asked weren’t sure, but the panel appeared to be a 46” model (possibly 42”). I did check the native resolution, and while I’m not sure if all 4K displays will use the same resolution, this particular panel was running at 4096x2160—an aspect ratio of roughly 1.90:1, so it’s even wider than the current 16:9 (1.78:1) panels and closer to cinema resolutions. Thankfully, with 2160 vertical pixels, I’m not sure many will complain about the loss of height.

Other than the sheer size of the display, what really stood out was the amazing clarity. The dot pitch at 4096x2160—even on a 46” display!—is slightly smaller than that of a 30” 2560x1600 display. I don’t actually need a finer dot pitch, and I’ve had to increase the DPI setting in Windows to cope with my degrading vision (some text just looks too small to comfortably read from a couple feet away), but for videos and images I’m of the opinion that “more is always better” (provided you have the hardware to drive the resolution, obviously). Where I really see 4K being useful, outside of people who love high-DPI computer displays, is for home theater enthusiasts with 60” and larger displays—particularly projectors—where 1080p just doesn’t cut it.
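
For the curious, that dot pitch claim is easy to sanity check: pixel density (PPI) is just the number of pixels along the diagonal divided by the diagonal size in inches, and dot pitch (in mm) is 25.4 divided by the PPI. Here’s a quick back-of-the-envelope calculation in Python, using my estimated 46” panel size from above (an eyeball guess, remember, not a confirmed spec):

    import math

    def pixel_density(width_px, height_px, diagonal_in):
        """Return (pixels per inch, dot pitch in mm) for a display."""
        diagonal_px = math.hypot(width_px, height_px)  # pixel count along the diagonal
        ppi = diagonal_px / diagonal_in
        dot_pitch_mm = 25.4 / ppi  # one inch is 25.4 mm
        return ppi, dot_pitch_mm

    # A 30" 2560x1600 desktop display vs. the (estimated) 46" 4K panel
    for name, w, h, diag in [('30" 2560x1600', 2560, 1600, 30),
                             ('46" 4096x2160', 4096, 2160, 46)]:
        ppi, pitch = pixel_density(w, h, diag)
        print(f"{name}: {ppi:.1f} PPI, {pitch:.4f} mm dot pitch")

    # Output:
    # 30" 2560x1600: 100.6 PPI, 0.2524 mm dot pitch
    # 46" 4096x2160: 100.7 PPI, 0.2523 mm dot pitch

The two come out essentially identical, which is the point: the 4K panel packs 30”-desktop-class pixel density into a 46” screen (and if it’s really a 42” model, the density climbs to about 110 PPI).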

If you want another perspective, the consumer electronics industry is always looking for ways to get people to upgrade. When HDTV first came out, you had to choose between 720p and 1080i. A couple of years later, 1080p launched and everyone “had to” upgrade. Then of course we had the 120Hz/240Hz/480Hz offerings, and 3D displays got thrown into the mix as well. Now that 1080p 120Hz displays are going for $500-$800 for 40-52” HDTVs, for a lot of people we’re at the point where our displays are good enough to last the next decade. So how do you convince people that they need to upgrade again? You come out with an even better standard. (I also suspect we’ll see a follow-up to Blu-ray with native 4K support at some point in the not-too-distant future; that will also be when the content providers come up with a new “unbreakable” DRM standard that will cause a lot of grief and still get cracked within a year of launch.)

Now, I’m all for giant HDTVs, but even I would suggest that a 42” or 46” computer display sitting on your desk would be too much. Still, if I could get an IPS, PLS, or *VA panel and the weight was manageable for my desk, I’d be willing to give it a go. The only drawback I can really see is pricing; I don’t know what these displays will cost when they start showing up en masse at retail, but I wouldn’t be surprised to see five figures for a while. Then again, I remember when 60” plasma displays were going for >$20K about eight years ago, so given another decade we should see these panels in the <$1000 range (for 40-60”). However long it takes, when the price is right I know I’ll be eager to upgrade.

Comments

  • therealnickdanger - Wednesday, January 18, 2012 - link

    That's your problem: 42" is too small to appreciate the detail. I know; I've got a few 1080p displays (17" notebook, 42" LCD, 60" plasma) and none of them compare to my 1080p projector (120"). 4K would be great to have, though, to more accurately capture the detail inherent in 35mm and 70mm film. 8K would be great too, but that's still a ways away.

    We're "stuck" at 24fps because that's how film is shot and has been shot for about 100 years.
  • Finraziel - Wednesday, January 18, 2012 - link

    Well, I'm exaggerating my point slightly. I don't actually mean that I see no point at all in upping the resolution, and obviously on way bigger screens the advantage will be more obvious; I'm just saying that increasing the framerate might be a bigger win for a lot of people. As for being stuck at 24 fps because that's just how it's always been done: well, I guess you still go around with a horse and carriage, or take the steam train/boat for longer distances? Just because something was done a certain way for a long time doesn't mean you can't improve it. But I'm glad to see what name99 and B3an are saying below :)
  • name99 - Wednesday, January 18, 2012 - link

    You are right about frame rate, but there is a small amount of good news on that front. A few Hollywood directors who actually understand tech and are in a position to push the issue (notably James Cameron) are trying to ramp up frame rates.

    http://www.hollywoodreporter.com/news/james-camero...

    Obviously with digital cinemas this is a lot easier to push, but I expect that even if Avatar 2 is shot at 48 or 60 fps, there will be a long, long period of crossover. I mean, my god, we're still stuck with interlace on plenty of broadcast TV.
  • B3an - Wednesday, January 18, 2012 - link

    The Hobbit is being shot in 4K at 48 FPS.
  • sicofante - Tuesday, January 17, 2012 - link

    The problem with high-DPI displays for laptops and desktops is that none of the main operating systems is designed to handle resolution-independent graphics. Even OSX does it in a tricky way, and it works because they control everything (as usual). Windows or Linux should go the true resolution-independence route (not the tricky OSX way). Then, and only then, maybe, just maybe, manufacturers would consider enhancing the DPI of their screens and consumers would buy into them. As long as users get tiny text on any high-DPI display, those displays can't start showing up on ordinary computers. That doesn't happen on tablets, which is why you get high-DPI displays there.

    BTW, true resolution independence calls for hardware acceleration, but that shouldn't be an issue on laptops, much less on desktops.
  • sicofante - Tuesday, January 17, 2012 - link

    I meant "NO hires displays for computers while on Windows, OSX or LInux" for the title. Don't understsand why there's no edit button here.
  • LesMoss - Tuesday, January 17, 2012 - link

    Not to mention that many web pages break at higher resolutions.
  • JarredWalton - Tuesday, January 17, 2012 - link

    They break because the browsers aren't coded to be DPI aware right now. I think a lot of this will get fixed with Windows 8 and Metro apps; we'll find out later this year. Anyway, I'm using a 30" display with 120 dpi setting in Windows 7 and browsing the web hasn't been one of my complaints (though I wish text and Flash would scale rather than being done at a fixed pixel size). I suppose if you define "break" as "are really tiny and hard to read on a high DPI display" then I can agree with you.
  • name99 - Wednesday, January 18, 2012 - link

    Bullshit. They break on your crappy browser.

    Do web pages display fine on iPhone Safari? OK then.

    I don't understand why people feel a compulsive need to say something doesn't work when proof that it works has been shipping for almost two years.
  • Malih - Tuesday, January 17, 2012 - link

    does the Yoga 13" have some sort of Thunderbolt port?

    I wish it did; an external GPU is something I look forward to with future Ultrabooks, to make my desktop obsolete, since my work doesn't use that much CPU anyway.
