  • anactoraaron - Tuesday, December 28, 2010 - link

    It's a shame that they would pack in USB 3.0 and Blu-ray and then put in that below-average 1080p display. Not that it matters with Sandy Bridge on the horizon. The best advice is still to wait.
  • ET - Tuesday, December 28, 2010 - link

    It's nice to see 1080p becoming more prevalent in laptops of this size, but why can't we see some higher-res displays at 20"+? I had a 19" 1600x1200 CRT eight years ago; resolution hasn't gone up since then, and has even dropped from 1920x1200 to 1080p in recent times. Laptops these days have some high-DPI displays, and I'd love to see some on the desktop.
  • Ushio01 - Tuesday, December 28, 2010 - link

    1920x1080 monitors are replacing 1680x1050 TN panels in the mid-range monitor segment, just as 1680x1050 replaced 1280x1024, with the advantage of either 120Hz TN or IPS screens. 1920x1200 monitors still exist and are just as expensive as always, along with the 2560x1440 and 2560x1600 panels in the high and very high end segments.
  • jabber - Tuesday, December 28, 2010 - link

    1080p will be a curse for us all in a couple of years time.

    Never has a standard been surpassed and found wanting so quickly.

    They should have made it 1440p at least.

    Now we computer users have to suffer from the display world being lazy and sticking to a screen depth not much more than what we were used to 10 years ago.

    That's progress.
  • DanNeely - Tuesday, December 28, 2010 - link

    I think the main bottleneck for the resolution picked for the HD standard was the capacity of DTV broadcasts and Blu-ray/HD DVD discs to deliver video without obvious compression artifacts. Bumping the frame sizes up 77% would have required significantly heavier compression, and the videophiles who currently revile Netflix/Hulu/etc.'s streaming offerings for low quality would have slammed the new standards, potentially rendering them stillborn and almost certainly slowing adoption significantly.

    The other hangup would be the size of the TV screen needed to get full use of the resolution in the living room. 1080p is generally not worthwhile on less than a 40" screen, because below that size even 720p pixels are too small to resolve at couch distance. The smaller pixels of a 1080p screen won't be visible as individual pixels until about 56". At the time the standards were being written, 56" was an enormously large TV. It's still larger than most TVs sold today.
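
    That couch-distance math follows from visual acuity: 20/20 vision resolves roughly 1 arcminute. Here's a minimal Python sketch of the calculation (the acuity figure and the 16:9 small-angle model are assumptions, so treat the output as a ballpark):

    ```python
    import math

    def max_resolvable_distance(diag_in, horiz_px, aspect=(16, 9)):
        """Distance (inches) beyond which ~1 arcminute acuity can no
        longer pick out individual pixels on a panel."""
        w, h = aspect
        width_in = diag_in * w / math.hypot(w, h)  # panel width from diagonal
        pixel_in = width_in / horiz_px             # pixel pitch
        return pixel_in / math.radians(1 / 60)     # small-angle approximation

    # A 56" 1080p panel: pixels blend together beyond ~7.3 feet.
    print(max_resolvable_distance(56, 1920) / 12)
    # A 40" 1080p panel: you'd need to sit closer than ~5.2 feet.
    print(max_resolvable_distance(40, 1920) / 12)
    ```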

    Until that changes (and Blu-rays, or the bandwidth needed to stream them at full quality, become commodity items) I don't expect anything to change in the consumer video market. When it does, I expect the new standard will be one of the 4K resolutions: probably either 3996×2160 (1.85:1) or 4096×1714 (2.39:1). We'd also need a higher-density video cable standard. DP 1.2 will carry the 2D version of either signal, but would need to double again to support 3D. Hopefully Light Peak will be mainstream by then and able to carry the data.
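
    The DP 1.2 claim checks out on raw pixel rates alone: DP 1.2 carries about 17.28 Gbps of payload (4 lanes at 5.4 Gbps, less 8b/10b encoding overhead). A rough Python check, ignoring blanking intervals and assuming 24-bit color:

    ```python
    def pixel_rate_gbps(w, h, hz, bpp=24):
        """Uncompressed video data rate in Gbps (blanking ignored)."""
        return w * h * bpp * hz / 1e9

    DP12_PAYLOAD_GBPS = 17.28  # 4 lanes x 5.4 Gbps after 8b/10b encoding

    print(pixel_rate_gbps(3996, 2160, 60))   # ~12.4 Gbps: fits in DP 1.2
    print(pixel_rate_gbps(3996, 2160, 120))  # ~24.9 Gbps: frame-sequential 3D doesn't
    ```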
  • TegiriNenashi - Tuesday, December 28, 2010 - link

    2.39:1? That is insane.
  • DanNeely - Tuesday, December 28, 2010 - link

    It's the wide-wide screen mode at theaters today. It would render all but the largest desktop computer displays too short to be useful for anything except consuming content. The video industry would see this as a feature.
  • TegiriNenashi - Tuesday, December 28, 2010 - link

    I don't think letterbox has any future in the movie industry itself. Avatar 3D was rendered at 1.78:1. Let 2.39:1 die, the sooner the better!
  • Hrel - Tuesday, December 28, 2010 - link

    here here
  • JarredWalton - Tuesday, December 28, 2010 - link

    Hear, hear?
  • DanNeely - Wednesday, December 29, 2010 - link

    I wouldn't hold my breath. The theaters originally went widescreen (1.85 in the US, 1.6 in the EU) to differentiate themselves from the 1.33 aspect ratio of TV and offer something more than a giant screen to compensate for the extremely expensive food and obnoxious idiots you had to share the theater with.

    1.85 isn't much more than 1.77, and with 3D poised to invade the living room as well it won't serve well as a differentiator. Unless the studios decide to throw the theaters under a bus, I expect something wider to go mainstream even if they stop short of 2.39.
  • DanNeely - Tuesday, January 4, 2011 - link

    The end has begun: Vizio just launched a pair of 2.37:1 TVs at 2560x1080. I hope everyone is looking forward to their 2013 laptop running at 1400x600. It won't be deep enough to have a touchpad, so your lousy low-contrast ultra-superglare LCD will be covered in fingerprints from the touchscreen layer.

    http://ces.cnet.com/8301-32254_1-20027127-283.html
  • therealnickdanger - Tuesday, December 28, 2010 - link

    The 1080 resolution was a standard HD resolution in the 80s and 90s, long before flat-screen, fixed-pixel displays were even being sold.

    While you may argue that 1080p is a step backward in resolution from the 1600x1200 CRTs of yesteryear, not even my beloved (and perfectly calibrated) Sony FW900 24" CRT can hold a candle to the clarity of my 1080p LCDs. Not to mention the LCDs are thinner, lighter, and much cheaper. Plus, having true 1:1 pixel mapping for HD content is so much better. My wife is a professional video effects editor and can attest to the benefit of 1080p displays for her own reasons as well.

    That's progress.

    The only regress I can think of with modern displays is the loss of refresh rates over 60Hz. That's the only reason I keep the FW900 - for gaming w/VSYNC @85Hz and up. Analog FTW in that case. More and more 120Hz and 240Hz LCDs are coming out, but without proper mainstream connectivity, what's the point? Meh to that.
  • ET - Wednesday, December 29, 2010 - link

    I agree that in some respects current displays are better than what we had ten years ago, but some things took a step back, and even if everything else were equal, it's not such significant progress. If I want a monitor that's better than 1920x1200, I need to pay a lot more than I did for the 1600x1200 19" monitor I bought 8 years ago, and it'd be a lot larger.

    One would have thought that by now it'd be possible to display high quality text and images on a PC monitor, but somehow we've degenerated into believing video is the only application that matters.

    I agree that for standard users, who just browse the web and consume content, current monitors are a step up from what they had in past years (1024x768, 1280x1024), but ten years ago anyone more demanding could get something that was a step up yet took about the same space and didn't cost 5 times more.
  • chemist1 - Tuesday, December 28, 2010 - link

    Yup, what DanNeely said is right. Even with Blu-ray, which represents the highest data rate currently available for consumer 1080p video (roughly twice what you get with terrestrial HD broadcasts, which in turn have higher data rates than cable, satellite, Hulu, and Netflix), the signal has to be compressed an amazing ~100:1 vs. a raw video feed! Only the cleverness of the compression algorithms, combined with the fact that large parts of a typical picture don't change much from frame to frame, allows this compression to still look good, though it is still perceptually lossy on a high-end system. (I understand Joe Kane did some studies to determine what data rate you would need to avoid all perceivable compression losses, but the results were for a private client and thus not published.)
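
    The ~100:1 figure is easy to ballpark against the raw feed. A quick Python sketch (the average bitrates are illustrative assumptions):

    ```python
    # Uncompressed 1080p24: 8 bits x 3 channels per pixel, 24 fps, in Mbps.
    raw_mbps = 1920 * 1080 * 24 * 24 / 1e6  # ~1194 Mbps

    for bitrate in (40, 20, 12):  # Blu-ray peak / typical / low average, Mbps
        print(f"{bitrate} Mbps -> {raw_mbps / bitrate:.0f}:1 compression")
    # Roughly 30:1 at the 40 Mbps ceiling, approaching 100:1 at low average rates.
    ```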

    Plus don't forget that the current bandwidth limitations force compromises not just in spatial resolution, but also in chromatic and temporal resolution. Blu-ray movies today have 8-bit color (allowing for only 2^8 = 256 gradations per channel). The standard does allow for higher color depth (up to 16-bit), but that means more data, and with the current bandwidth limit that in turn would necessitate more compression. Likewise, at 60 fps we'd get more temporal resolution than we do at 24 or 30 fps, which would result in less blurring during fast action scenes. But if you go to 60 fps, you've got to give something else up.

    I.e., with the current bandwidth limitations, we're at about the limit of how much spatial resolution the system can offer, unless we want to increase compression artifacts or give up further on the already-compromised chromatic or temporal resolution.

    Don't get me wrong -- I have a 100" screen (JVC RS1 projector), and would love to see a consumer 4K format. But I'd also like to see at least a 12-bit 4:4:4 color space, and fewer compression artifacts, which is not going to happen until they can offer a bandwidth about an order of magnitude higher than what Blu-ray currently offers.

    And unfortunately, a lot of video seems to be moving in the same direction as music -- less resolution for more convenience. So I think it may be a while before we see market pressure for a higher-resolution video format.
  • DanNeely - Tuesday, December 28, 2010 - link

    We also appear to be reaching the limits of what compression can offer. Over the summer I read that the team working on the H.265 algorithm was concerned that they'd only be able to reduce bitrates to 70% of current levels while maintaining quality, vs. the 50% target they'd set when beginning the design process.
  • torgal - Tuesday, December 28, 2010 - link

    Well, and now the Dell XPS 15 no longer has the 1080p upgrade (http://www.dell.com/us/p/xps-15/fs). Or have I got the wrong XPS 15?
  • jigglywiggly - Tuesday, December 28, 2010 - link

    hai guise my name is asus we make a good laptop and then ruin it by putting a POS LCD on it.
  • Kaboose - Tuesday, December 28, 2010 - link

    I think with Sandy Bridge on the horizon, the majority of the people this laptop seems to be targeted at would be better off waiting a month or so for something more substantial for their ~$1000.
  • jabber - Tuesday, December 28, 2010 - link

    Surely it doesn't take 1 minute to wipe a product down before taking pics of it?

    Just makes it seem a little more pro.
  • JarredWalton - Tuesday, December 28, 2010 - link

    I do wipe off fingerprints, but those glossy bezels pick up every little touch and the flash photography tends to bring them out more than usual. You're not seriously going to complain about one photo (out of a couple dozen) where a few fingerprints are somewhat visible, are you?
  • therealnickdanger - Tuesday, December 28, 2010 - link

    I dunno, I took time out of my busy day at work to read an article about a laptop I didn't know existed 10 minutes ago and probably will never buy anyway because the perfect laptop that I want doesn't exist/costs too much. It really bothers me that you didn't take more time to be professional and do it perfect. Now I'm going to be tormented for the rest of the day about that photo and my overall productivity is going to suffer. Thanks a lot. BTW, Merry Christmas and Happy New Year, jerks.

    <hopefully obvious sarcasm>
  • DanNeely - Tuesday, December 28, 2010 - link

    I disagree. Years of simply saying that glossy sucks anywhere it will pick up fingerprints hasn't hammered the point home to the PHBs who write the laptop design specs. Perhaps if reviewers all start showing pictures of how disgusting it ends up looking after a week or two of use, the point will finally get through.
  • hybrid2d4x4 - Wednesday, December 29, 2010 - link

    That's actually not a bad idea, but very ballsy/risky. I could see the manufacturers getting pissed at the first site that did that and cutting off its review units, and then no other site would do it out of fear of getting the cold shoulder. Then again, they don't seem to care about reviewers ranting about these issues in text, so maybe I'm worried over nothing. More likely, though, manufacturers don't actually bother to read reviews of their own products...
  • KZ0 - Tuesday, December 28, 2010 - link

    "Mafia 2 manages 35FPS at 769p and 21.5FPS at 1080p"
    Guessing you meant 768p.

    Thanks for another good review.
  • radium69 - Tuesday, December 28, 2010 - link

    When are you going to contact MSI to review their G series? Especially the older GX740.
    Can't beat the value and the performance ;)

    It's a shame you guys seem totally ASUS-minded the last couple of months...
  • cgeorgescu - Tuesday, December 28, 2010 - link

    People... full HD on a regular 22" makes for 100ppi, which is pretty comfortable, but on 15.6" it means 141ppi, and that's a lot of pixels per inch. Don't tell me about the font scaling in Win7, because FullHD@125% displays exactly like 1600x900@100%; if all screen elements are bigger, I don't get any extra screen real estate. Plus the scaling doesn't work with all apps; there are plenty that don't scale at all.
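
    Those density figures are just the diagonal pixel count divided by the diagonal size; a short Python check confirms them:

    ```python
    import math

    def ppi(w_px, h_px, diag_in):
        """Pixels per inch for a panel of a given resolution and diagonal."""
        return math.hypot(w_px, h_px) / diag_in

    print(ppi(1920, 1080, 22))    # ~100 ppi
    print(ppi(1920, 1080, 15.6))  # ~141 ppi
    print(ppi(1400, 1050, 15))    # ~117 ppi
    ```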

    I'm very used to 1400x1050@15" (116ppi), but I wouldn't stand 141ppi all day long. Am I having problems with my eyes, or is everybody else comfortable with full HD on 15.6" (usage of 12h/day)?
  • DanNeely - Tuesday, December 28, 2010 - link

    I'm not. 1600x900 seems to be a lot rarer on 14/15" laptops than 1680x1050 was a few years ago. For that matter, has anyone reviewed the current crop of 1600x900's to see if they're good panels like most of the 1920x1080's or garbage like the 1366x768s?
  • JarredWalton - Tuesday, December 28, 2010 - link

    The two 1600x900 displays I've seen in the last year are both junk. I also think 1080p on 15.6" will be a stretch for the over-40 crowd, but I'm okay with it. Those who suggest we need 4K screens on laptops, though... I have problems with a 30" LCD at 2560x1600; what would it be like to have that resolution in 1/4 the area!?
  • DanNeely - Tuesday, December 28, 2010 - link

    Enough DPI that AA won't be needed much. GPUs capable of pushing that many pixels are some years down the pipeline, though. According to the Eyefinity lead at ATI, three 25-megapixel monitors placed to completely fill your field of view would have a high enough DPI that you'd be unable to resolve individual pixels. At typical laptop distances an 8MP screen would probably be approaching that level.
  • visibilityunlimited3 - Wednesday, December 29, 2010 - link

    I had problems with my eyes looking at a 14" XGA. I almost went blind before I got my 15.4" Inspiron 1920x1200 screen many years later. My eyes have had fewer problems after looking at that high-resolution display for many years now.
    Consider the difference between old dot matrix printers and laser printers. Is reading 1200dpi text uncomfortable? The real problem is Windows being optimized for low-res screens. There are a few configuration changes that can help. I actually dual boot Windows XP and Debian, and prefer Debian for being better equipped to manage the high-resolution display. I spent a little extra time fine-tuning the X Window System to do exactly what I wanted, and I am very comfortable now. I am in no hurry to downgrade to a 1080p display until my old Pentium M gets really tired. The display is that much more important than anything else, in my opinion. Thanks, Jarred, for recognizing the value of the premium displays.
    I would like an ebook reader with 1200dpi resolution to match my laser printer and expect that would be very comfortable to use also.
  • Davelo - Tuesday, December 28, 2010 - link

    How many laptops have gone bad because of bad BGA solder on NVIDIA chips? I've seen so many that I would not touch one.
  • DanNeely - Tuesday, December 28, 2010 - link

    Weren't those all the first generation lead-free BGA chips? I didn't think any of the newer ones were having problems.
  • JarredWalton - Tuesday, December 28, 2010 - link

    Yeah, this is old news. A few anti-NVIDIA sites made a huge deal about the failures, but I never personally had any of those chips fail on me. Of course, I wasn't playing a lot of games on laptops, so maybe that's why. Anyway, anything in the post-8000M era should definitely be fine. Actually, I think it was mostly the old GeForce Go series that had problems.
  • sucram03 - Tuesday, December 28, 2010 - link

    Somewhat on & off-topic question: So in all honesty, with such a horrendous screen, where does that leave value-minded users that want a laptop with a nice 1080p screen and a GeForce video card? The application I'm thinking of is CUDA-accelerating H.264/AVC 1080p videos.

    The XPS 15 isn't listing the B+RG LED as an option, as mentioned in the article. Has anyone else heard from Dell about reasons why/if it will come back? The Clevo seems like an OK option but... well.. it now seems like the only option.

    Any thoughts?
  • JarredWalton - Tuesday, December 28, 2010 - link

    At this point, I'd say just wait for the CES announcements and see if anything new turns up. :-)
  • Kaboose - Tuesday, December 28, 2010 - link

    Could Dell be holding back the 1080p panels for a sandy laptop in the next few weeks?
  • sucram03 - Tuesday, December 28, 2010 - link

    All signs point to yes, especially that suspect smiley face from Jarred :) Damn you, insider knowledge!
  • Kaboose - Tuesday, December 28, 2010 - link

    Thank goodness I have been saving up for the past two months in anticipation of Sandy Bridge and my need for a new portable computer. Luckily they coincide.
  • JarredWalton - Tuesday, December 28, 2010 - link

    I've got Sandy Bridge, but I have no idea if Dell is holding back panels. I sure hope so...
  • Kaboose - Wednesday, December 29, 2010 - link

    I sure hope that is why there isn't a single Dell laptop offering a 1080p screen at the moment (including Alienware-taxed items).
  • chemist1 - Tuesday, December 28, 2010 - link

    Hi Jarred,

    Thanks for the review. A friend of mine recently priced out a Sony Vaio F series laptop: 16.4" 1080p screen, Blu-ray R/W drive, NVIDIA GeForce GT 425M GPU, and an Intel quad-core i7-840QM processor (1.86GHz, turbo up to 3.20GHz). He said it was about $1300. Perhaps that is worth a review...
  • cgeorgescu - Tuesday, December 28, 2010 - link

    Very nice laptop... Check out the "premium" screens on all Vaio models: really nice, no LED backlight or other fancy stuff, but perfect viewing angles and 100% Adobe RGB. And matte.

    I've got not the F but the EC, because of the 17.3" screen instead of 16.4" and the two drive bays.
  • chemist1 - Tuesday, December 28, 2010 - link

    Correction: just checked it myself, and it's $1300 (on the Sony site) with a quad-core i7-740QM processor (1.73GHz with turbo up to 2.93GHz).

    The EC series cgeorgescu mentioned might be an even better buy. With a 1080p 17.3" screen (a bit more suitable for 1080p than the F's 16.4"), Blu-ray R/W, an ATI HD 5650 (don't know how that compares with the 425M on the F series), and a Core i5-580M processor (2.66GHz, with turbo to 3.33GHz; Core i7 not offered on the EC series), it prices out to $1200.

    And, as with the F series, if you downgrade from a Blu-ray RW to a CD/DVD RW, you can subtract $150.
  • chemist1 - Tuesday, December 28, 2010 - link

    Further, if we downgrade the EC series to make it comparable to the Asus reviewed here (Blu-ray read-only + CD/DVD RW, Core i5-460M, 1080p), the Sony site has it at $1020, nearly the same as the $1030 Asus but with what I understand is a much better screen (plus the extra drive bay that cgeorgescu mentioned, and the free Adobe Acrobat/Photoshop bundle).
  • JarredWalton - Tuesday, December 28, 2010 - link

    Don't forget that quad-core Clarksfield CPUs are horribly power inefficient, so you'd sacrifice quite a bit of battery life. Given that Sandy Bridge will address this, there's basically no point in looking at any more Core 2010 or Clarksfield laptops.
  • chemist1 - Wednesday, December 29, 2010 - link

    Understood, thanks for your reply. But that leaves unanswered the obvious follow-up question, which is that of why, given that these Vaios have been out for a while, and given that they may represent the best value available in ~$1K laptops (say, the dual-core EC series), you folks didn't include them among your recent looks at mid-range laptops (e.g., the Vaios weren't mentioned in your 11/15/10 "Holiday Buyer's Guide: Notebooks"). Did you consider them and discount them for some reason, or was it something else? Since choosing what to review from amongst a large universe of products is a significant part of what a tech journalist must do, I was just wondering what goes into these sorts of decisions.
  • JarredWalton - Wednesday, December 29, 2010 - link

    The biggest issue is that Sony basically has no interest in seeding reviewers with hardware. While you could try to buy/review/eBay laptops, I don't have enough time/money to go that route, and we've been busy with other items. We did mention the VAIO Z in the guide, but most of the time I have difficulty justifying the Sony Tax. And not all Sony laptops have good displays either -- I've looked at more than a few at Best Buy, etc. Without hands-on time or input from someone I trust, I'm not willing to recommend a laptop as having a good LCD. :-\

    I'll see if I can get Sony to be a little more forthcoming at CES, but I've gone down that road before to no avail.
  • chemist1 - Wednesday, December 29, 2010 - link

    Thanks for the explanation! Why there had been no review of this particular (and seemingly high-value) part of the Vaio line was something I'd been curious about for a while, so it's nice to understand the manufacturer's role in this (a factor I had not considered).
  • Hrel - Tuesday, December 28, 2010 - link

    You guys and ur glossy bezel on the screen. Put ur thumb on the edge of the screen to open the laptop, there, problem solved. lol. wow.

    Other than that nit-picky silliness, I was REALLY saddened to see those low scores on that Asus. I read it had the same display as the Dell used to, got all excited, and then saw those scores... I guess they had to save money somewhere to hit 1000 bucks AND have a Blu-ray drive. Honestly, I almost never use discs at all anymore and have never even touched a Blu-ray disc. Don't include any CD drive at all; put in a bigger battery, a better screen, and non-name-brand speakers that don't suck, and I'd be good. If the marketing guys insist on a CD drive, use the cheapest one you can find.
  • JarredWalton - Tuesday, December 28, 2010 - link

    Look, it's not like we *try* to put fingerprints all over the laptops. Just regular use will put them there, even if you're careful (which I am). If I walked around with white gloves on all the time, it wouldn't be a problem, but I'm not going to do that. Saying "just use your thumb" doesn't entirely fix the problem either, because you WILL overlap into the glossy area every time. A better solution, amazingly enough, is to stop using stupid piano black glossy plastic on laptops. There, problem solved, and it wouldn't cost anything extra.
  • IanWorthington - Wednesday, December 29, 2010 - link

    Got to take issue with this "brighter is better". Maybe for some stuff, but for photo editing, where you would care about color gamut, even 100 cd/m2 is likely to be too bright for accurate work.
  • JarredWalton - Wednesday, December 29, 2010 - link

    You can always turn it down if you need to, but if you're outside and can't read the display because it's not bright enough (I've had that happen with numerous laptops over the years), then brighter *is* better. Apple does this with the MacBook Pro, which gets up to 350 nits or something, but you can always set it to 50% or 100 nits or whatever if that's what you need/like.
  • blackrook - Wednesday, December 29, 2010 - link

    Now I'm just getting confused with ASUS's naming schemes. What does the N in N53 mean? The K? U? UL? G?

    It's the same with video cards, what with the GTX460 1GB/768MB/SE or the 5850 > 6850 business. Companies need to differentiate their product lines more intuitively.
  • 86waterpumper - Wednesday, December 29, 2010 - link

    I agree that displays have slipped lately. I am building a new desktop rig currently, and I hate the 16:9 displays enough that I am sticking with an old Dell 4:3 17-inch. It's pretty sad that I would need a 24" or something just to match its height. My wife has a 17.3" HP dv7 laptop, and she downloaded the Amazon Kindle software. I double-clicked it to check it out and opened up a book... It looked hilarious to see only the middle quarter of the screen being used and nothing on the sides. All of these apps and programs are going to have to start allowing for wrapping and double-width viewing if this stupid trend continues. I literally wanted to turn the screen sideways; it would have been much better.
    I'll tell you another problem with the piano black glossy finishes. Our daughter is 7, and uses the laptop sometimes for schoolwork or to look at the Disney website, etc. A very heavy laptop with such a slick surface is a pain even for me to carry. It is utter stupidity!!! I wonder how many people have dropped their expensive laptops and ruined them because of this. I always make sure my hands are 100 percent dry before carrying the thing, but it's really tough for my daughter, which is why I have started to let her use my much lighter netbook more. Anyway, a rougher matte finish would provide tons more grip and look better on the fingerprint front as well. I can imagine what a pain in the tail it must be for people doing these reviews to try to get the thing fingerprint-free under camera flash.
  • Luke2.0 - Wednesday, December 29, 2010 - link

    Hi Jarred, can the "Blu-ray Combo" do DVD-burning?
    Thank you.
  • chemist1 - Thursday, December 30, 2010 - link

    If you google the drive name listed in the spec table at the beginning of the review ("Philips/Lite-On DS-4E1S") you can get the full tech specs. But from what I vaguely recall, this drive can burn CD and DVD, but it's read-only for Blu-ray.
  • JarredWalton - Thursday, December 30, 2010 - link

    Correct: this is DVDRW and BD-ROM. I would say "BD-ROM/DVD-ROM" otherwise, but I suppose you have no way of knowing that. :-)
  • Luke2.0 - Thursday, December 30, 2010 - link

    Thank you for answering my previous question.

    Got another question though: the spec table shows only 1 HDD. Does this laptop support dual HDDs, or an SSD+HDD combo?
    IMO it'd be a shame for a 15-incher not to be capable of it.
    I have checked Asus International and it does not seem to support it, but could you please confirm?
    http://www.asus.com/product.aspx?P_ID=zzD4OFFWhspr...

    Thank you again.
  • JarredWalton - Friday, December 31, 2010 - link

    There's no room for a second HDD. If you wanted to get creative, you could try removing the optical drive and installing a second drive there, but ASUS doesn't sell the necessary caddy so you're pretty much on your own. Actually, very few 15.6" or smaller laptops have room for two drives in my experience; that's usually a feature of 17" notebooks, or special laptops that skip out on other items in order to fit two 2.5" drives. Granted, there are exceptions, but I don't think we've reviewed any in the past year at least.
  • Luke2.0 - Friday, December 31, 2010 - link

    Aww... I see. I must have had a false first impression when first looking at the Asus G51 specifications; now that one feels really huge.

    Still, when you mentioned in the review this N53J being "heavier, wider, thicker, deeper than that one which in turn slightly larger than yet another one" I had some hopes LOL. (Not blaming you for this)

    Thanks anyway.

    Happy New Year 2011~~~
  • JarredWalton - Sunday, January 2, 2011 - link

    I think the G51 is indeed heavier and larger than the N53; I was comparing the N53 to the Dell XPS 15. The G53JW in fact does support two hard drives, and it actually has some really interesting specs. When the Sandy Bridge refresh of that unit comes around, I'll be sure to hound ASUS about getting a review sample. We've looked at G73 twice, but no G53 yet.
  • Luke2.0 - Monday, January 3, 2011 - link

    Hi Jarred, I suppose you are busying yourself with the new top-notch toy named Sandy Bridge.
    However, I stumbled upon this yesterday
    http://forum.notebookreview.com/asus-reviews-owner...
    Member "mzil" of the forum suspects the brightness level was not maxed during the test due to the reason he explained in there. Perhaps you could do a short check if this is so (and thus the display might be better than as recently reviewed)?

    Let us know how things turn out, won't you? =)
    Thank you.
    (Gotta read the SNB review asap, thanks for this one as well)
  • manu12 - Wednesday, January 12, 2011 - link

    There's something strange in your tests.
    For the Dell XPS 15 laptop you specify a
    (15.6" WLED Glossy 16:9 1080p (1920x1080) - AU Optronics B156HW1) screen,
    and you list the same panel for this Asus N53JF
    (15.6" B+GR LED Glossy 16:9 1080p (1920x1080) - AU Optronics B156HW1),
    yet you have 'graphics benches' which are completely different, including big differences in contrast?!
  • klunoee - Wednesday, May 4, 2011 - link

    The only thing that's B&O is the amplifier for the speakers (B&O ICEpower technology). This means that the laptop has an efficient and powerful digital amplifier, but it tells you nothing about the quality of the speakers themselves or the audio codec delivering the sound to the amplifier.
