Benchmarking the Matrix

With the updated test suite, we’re also losing some points of reference to our back catalog of laptops. Obviously, the biggest change is in the gaming results, so we decided to take one of our recently reviewed laptops for a spin using the new benchmark suite. (We may look at adding a couple more lower-end laptops from late 2011 to the charts as well in the near future.) ASUS was kind enough to let us hang onto the G74SX until the new suite was complete, and given the reasonably high-end hardware and continued availability, it makes for a good starting point for our 2012 laptop results. We updated to the latest NVIDIA drivers (290.56 at the time of testing) and ran through all of our gaming tests. You can find the complete results in Mobile Bench, and the games are all grouped under the Mobile Gaming 2012 category; since we only have one laptop tested right now, we’ve summarized the gaming scores below.

In our 2011 gaming suite, the ASUS G74SX—and NVIDIA’s GTX 560M—proved capable of handling the majority of games at our Enthusiast settings and 1080p while still breaking 30 FPS. With some of the latest titles at similar “maxed out” settings, frame rates now drop below 30 FPS in five of the seven titles, but remember that our new Enthusiast preset is equivalent to last year’s “Ultra”. There are certainly other games that will tax the GTX 560M, and our recommendation is that you consider disabling antialiasing or dropping the quality down a notch if you want higher frame rates, but in general the GTX 560M is still a good solution for notebook gamers.

Closing Thoughts

As a sci-fi buff, it’s pretty exciting to see the rapid pace of advancement over the last few years. Today’s smartphones pack about as much power in a small portable device as the PCs we used less than a decade ago. If you’ve ever dreamed of real-world tricorders and holodecks—or maybe cyberspace and Ono-Sendai decks—they’re getting tantalizingly close. Maybe we won’t have exactly what the sci-fi writers of 20 or 30 years ago envisioned, but we’re definitely shedding the wires and I look forward to seeing where we will be in another ten years!

Back on topic, no benchmark suite can ever (reasonably) contain every performance metric, and we do understand that mobile gaming is still a small piece of the larger mobility pie. Even so, it’s still important to consider mobile GPU performance, and with the improving nature of integrated graphics we felt it was time to finally ditch the 2006-era graphics quality settings and shoot for something more visually appealing. Our mobile gaming suite now represents some of the latest DX11 titles, and even at our Value settings all of the games look quite good. If you’re looking for basic gaming capabilities, all you really need is a mobile GPU that can hit 30 FPS at our Value settings in all seven titles and you should be set. If you’re after higher quality and higher resolutions, you’ll want more than a midrange GPU, but be prepared to pay the price—both in terms of cost as well as in terms of notebook size.

With the updated laptop benchmarks now in place, we’re still early enough in 2012 that if you can make a good case for other benchmarks that we haven’t included we’re willing to consider adding a couple more tests. Remember that the goal is to provide a reasonable test suite from which you can estimate performance in other similar benchmarks, so adding three more video encoding tests isn’t really going to add much; on the other hand, if there’s a class of application you don’t feel our test suite adequately covers, sound off in the comments.

As a final thought, I’ve been the head laptop tester at AnandTech since early 2006. While we have frequently heard about the increasing importance of laptops in the overall computer market, the past two years have really shown tremendous growth. We had seven mobile articles on AnandTech in 2006, 15 in 2007 and 2008, and 32 in 2009. That’s pretty reasonable, but then in 2010 we had a whopping 107 mobile articles and 2011 eclipsed that with 166 articles. Wow! Granted, not all of the articles in the past two years are about laptops, and many of them were shorter pieces, but however you want to view it one thing is eminently clear: mobile devices are now well and truly established and our increased coverage reflects that. It’s also worth noting that Intel’s Sandy Bridge and AMD’s Llano launches were both more about the mobile sector than about desktops, and the upcoming Ivy Bridge, Trinity, and Haswell appear to continue that trend.

Here's looking forward to another awesome year in the mobile space, kicking off with CES next week. Hint: besides the usual plethora of large displays and 3D demonstrations, CES is all about smartphones, tablets, and laptops. (I almost feel sorry for Brian...almost.)

Comments

  • JarredWalton - Saturday, January 7, 2012 - link

    I don't know anyone that uses PowerDirector 10, so I'd be curious about how it's viewed (note: I'm not a video editor by any stretch). WinZip on the other hand is a far more interesting option; I'll keep an eye out for that one. :-)
  • Ryan Smith - Saturday, January 7, 2012 - link

    I'd note that we dropped our video encoding benchmark on GPU Bench midway through the year last year, because GPU-accelerated video encoding was actually CPU limited. Performance quickly plateaued at GTX 460/Radeon 5750 levels, as at that point the GPUs outran the CPU.
  • QChronoD - Friday, January 6, 2012 - link

    Would it be possible to add the screen size to the specs listed for each system in Bench? It's kinda silly to be missing since that's one of the primary criteria people use to narrow down models.
  • ArKritz - Friday, January 6, 2012 - link

    Wouldn't it make more sense to just use medium presets for the medium benchmark, high for high, ultra (or very high) for ultra and just drop the "low" benchmarks altogether?
  • JarredWalton - Saturday, January 7, 2012 - link

    Basically, we've done what you suggested, only we call the settings Low, Medium, and High rather than Medium, High, and Ultra. It just seems weird to call test settings Medium/High/Ultra--or take it to another level and test at High/Very High/Extreme--when we can call the settings Low/Med/High. It's just semantics. Anyway, the settings were selected for two things:

    1) Get reasonable quality for the target res/setting (Min/Low is often insufficient)
    2) Make sure there's a difference between the detail settings (this is why we don't test DiRT 3 at Ultra and Ultra + 4xAA for instance).

    In several games, the difference between many of the settings is negligible, both in terms of quality and in terms of performance. We don't feel there's much point in testing at settings where games run at 100+ FPS unless there's no other option (e.g. Portal 2), and likewise we didn't want to have results where it was basically the same quality at a different resolution (unless we couldn't find a better option). Batman is another example of this, as 1366x768 at Very High settings is only slightly slower than 1366x768 at Low settings. Anyway, the main thing was to let people know exactly how we plan to test in one location, so that I can just link it in future reviews.
  • Gast - Saturday, January 7, 2012 - link

    Have you looked at changing the names to provide some sort of meaning other than just the level of the test? Something along the lines of "Playable, High Quality, Max Quality"

    Changing the names from Medium, High, and Ultra will be jarring for me. When skimming I will see "Low" and think of the minimum settings needed to run the game, which is different from the "playable" or "medium" settings you are presenting.

    While I can learn to adjust to this change, irregular AT readers might not and walk away with the wrong impression of what the test was representing.
  • PolarisOrbit - Saturday, January 7, 2012 - link

    I agree that when you use the terms "low / medium / high" there is an implication that you may be referring to the in-game settings rather than your interpretation of the different settings that are worth benchmarking. A careless reader may not notice the difference.

    To me, it makes sense to compare a cheap laptop to the cheap level and an expensive laptop to the expensive level (obviously I mean an expensive gaming laptop, since this is a gaming benchmark). So I would suggest dividing it by market segment like so:
    low -> value settings
    medium -> mainstream settings
    high -> performance settings
  • JarredWalton - Saturday, January 7, 2012 - link

    Our charts will continue to list the game settings we use for testing, plus I intend to link back to this article on the gaming section so that new readers can understand exactly what we're testing. We could also call the settings "Value/Mainstream/Performance" or something, but all that really says to me is "they are using custom settings so make sure you know what is being tested". Which isn't necessarily a bad thing.

    I think at some point I need to go through the games and capture screenshots at our Low/Med/High settings as well to give a better indication of what the various settings mean -- and maybe add a "minimum" screenshot as well to show why we're skipping that in most titles. That probably won't happen until post-CES though.
  • Gast - Saturday, January 7, 2012 - link

    "they are using custom settings so make sure you know what is being tested"

    That's basically what I'm pushing for. It would be OK if your medium test was very similar to the medium setting, but since almost all of your tests have a naming conflict with in-game settings (low test = medium settings) I would find it helpful to call them something different.
  • JarredWalton - Saturday, January 7, 2012 - link

    Okay, I've gone ahead and renamed the benchmarks to Value/Mainstream/Enthusiast, as I can see where confusion might otherwise result. Hopefully I caught all the references, but if not I'm sure someone will set me straight. :-)