Taking on the Dark Lord, Mobile Style

There have been a few recent product launches, with more to come in the near future, from AMD, Intel, and NVIDIA. On the CPU side we have Intel’s Ivy Bridge and AMD’s Trinity, both arguably more important for laptop users than for desktops (and in the case of Trinity, it’s currently laptops only!). Both tout improved performance relative to the last generation Sandy Bridge and Llano offerings, and in our testing both appear to deliver. Besides the CPU/APU updates, NVIDIA has also launched their Kepler GK107 for laptops, and we’re starting to see hardware in house; AMD likewise has Southern Islands available, but we haven’t had a chance to test any of those parts in laptops just yet. With all this new hardware available, there’s also new software going around; one of the latest time sinks is Blizzard’s Diablo III, which raises a question in the minds of many laptop owners: is my laptop sufficient to repel the forces of Hell yet again? That’s what we’re here to investigate.

Before we get to the benchmarks, let’s get a few things out of the way. First, Diablo III, for all its newness, is not a particularly demanding game when it comes to graphics. Given that it comes from the same company as World of WarCraft and StarCraft II, that shouldn’t be too surprising: Blizzard has generally done a good job of ensuring their games will run on the widest array of hardware possible. That means cutting-edge technologies like DirectX 11 aren’t part of the game plan; in fact, just as with StarCraft II and World of WarCraft (not counting the DX11 update that came out with Cataclysm), DirectX 10 isn’t in the picture either. Diablo III is a DirectX 9 title, and there should be plenty of GPUs that can handle the game at low to moderate detail settings.

The second thing to bring up is the design of the game itself. In a first-person shooter, your input is generally linked to the frame rate of the game. If the frame rate drops below 30 FPS, things can get choppy, and many even consider 60 FPS to be the minimum desired frame rate. Other types of games may not be so demanding; strategy games like Civilization V and the Total War series, for instance, can be played even with frame rates in the teens. One of the reasons is that in those two titles, mouse updates happen at the screen refresh rate (typically 60 Hz), so you don’t feel like the mouse cursor is constantly lagging behind your input. We wouldn’t necessarily recommend <20 FPS as enjoyable for such games, but it can be tolerable. Diablo III takes a similar approach, and as a game played from a top-down isometric viewpoint, 30 FPS certainly isn’t required; in the course of testing for this article I have personally played through entire sections at frame rates in the low to mid teens, so it can be done. Is it enjoyable, though? That’s a different matter; I’d say 30 FPS is still the desirable minimum, and 20 FPS is the bare minimum you need in order to not feel like the game is laggy. Certain parts of the game (e.g. interacting with your inventory) also feel substantially worse at lower frame rates.
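
To put those numbers in concrete terms, the short sketch below (Python, purely illustrative) converts each frame rate into the per-frame rendering budget it implies; the conversion is simple arithmetic, and the labels merely restate the subjective judgments above.

    # The frame rate guidelines above, expressed as per-frame time budgets.
    # The budgets are plain arithmetic; the labels restate this article's
    # subjective thresholds for an isometric game like Diablo III.
    def frame_budget_ms(fps):
        """Time available to render one frame at a given frame rate."""
        return 1000.0 / fps

    for fps, verdict in [(60, "smooth; the shooter crowd's preferred minimum"),
                         (30, "desirable minimum for Diablo III"),
                         (20, "bare minimum before the game feels laggy"),
                         (15, "playable in testing, but not enjoyable")]:
        print(f"{fps:>2} FPS = {frame_budget_ms(fps):5.1f} ms/frame  ({verdict})")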

Finally, there’s the problem of repeatability in our benchmarks. Like its predecessors, Diablo III randomizes most levels and areas, so finding a section of the game you can benchmark and compare results between systems and test runs is going to be a bit difficult. You could use a portion of the game that’s not randomized (e.g. a town) to get around this issue, but then the frame rates may be higher than what you’d experience in the wilderness slaying beasties. What’s more, all games are hosted on Blizzard’s Battle.net servers, which means even when you’re the only player in a game, lag is still a potential issue. We had problems crop up a few times during testing where lag appeared to be compromising gameplay, and in such cases we retested until we felt the results were representative of the hardware, but there’s still plenty of potential for variance. Ultimately, we settled on testing an early section of the game in New Tristram and in the Old Ruins; the former gives us a 100% repeatable sequence but with no combat or monsters (and Internet lag is still a potential concern), while the latter gives us an area that is largely the same each time with some combat. We’ll be reporting average frame rates as well as providing some FRAPS run charts to give an overall indication of the gaming experience.

And one last disclaimer: I haven’t actually played through most of Diablo III. Given what I’ve seen so far, most areas don’t appear to be significantly more taxing later in the game than they are early on, but that may be incorrect. If we find that later areas (and combat sequences) are substantially more demanding, we’ll revisit this subject; alternatively, if you’ve done some informal testing (e.g. using FRAPS or some other frame rate utility while playing) and know of an area that is more stressful on hardware, let us know. And with that out of the way, let’s move on to our graphics settings and some image quality comparisons.

Update: Quite a few people have pointed out that later levels (e.g. Act IV), and even more so higher difficulty levels (Hell), are significantly more demanding than the early going. That's not too surprising, but unfortunately I don't have a way of testing later areas in the game other than playing through to that point. If performance scales similarly across all GPUs, it sounds like you can expect Act IV on Hell to run at half the performance of what I've shown in the charts. Give me a few weeks and I'll see if I can get to that point in the game and provide some additional results from the later stages.

Comments

  • mepenete - Saturday, May 26, 2012

    Thank you so much for this article. I'm looking to buy a new sub-$500 laptop and was looking at the AMD A8 processors... and then the A10s got announced.

    Regardless, I was curious how they would stack up. Really helpful seeing how midrange equipment stacks up. Looks like I'm gonna pick up an A10 laptop.
  • JarredWalton - Saturday, May 26, 2012

    Hopefully prices come down a bit; right now the only A10-4600M laptops I can find are going for over $700. They're decent chips overall, but I'm not convinced they're better than a dual-core Sandy Bridge with GT 540M. The Acer I used is clearly not the best representative of that market, as the 13.3" chassis is quite thin and just can't cool the CPU+GPU well enough to avoid throttling; pretty much any 15.6" chassis should do better.
  • frozentundra123456 - Saturday, May 26, 2012

    It seemed to me that the Llano laptops were marketed the same way. The A8 always seemed to be considerably more expensive than the A6 model, when the chips could not be that much different in cost. However, the A8 was usually better equipped with more RAM.
  • JKnows - Saturday, May 26, 2012

    Pretty cool for the A10-4600M to provide the same performance as Sandy Bridge with a GT 540M at half the energy consumption. Hasn't it been two weeks since Trinity's launch? Too bad it's still not possible to find these laptops.
  • CeriseCogburn - Saturday, June 2, 2012

    Right, and the price is not going to be nice, so later in these comments a GT 540M Optimus laptop is recommended (with a link) by the author: $599 at MC, $679 at Newegg.

    So long gone is the hope that the A10 shows up in a cheap $350 or $400 Walmart special like the (low end) Brazos.

    If AMD is going to charge high prices for their cheapo chip, the rest of the laptop had better be awesome, not some plastic creaking crud, and the screen had better be good.

    I think what we'll see instead is junky cheap builds that cost a lot.
  • QuantumPion - Tuesday, May 29, 2012

    I got the Acer laptop with the Core i5 and GT 540M at Newegg for $499 last year. That or a similar model would probably be a better bet.
  • CeriseCogburn - Saturday, June 2, 2012

    Well, that was a find; it's still hard to get at that price.
  • narlzac85 - Saturday, May 26, 2012

    For people planning to play into Hell or Inferno difficulties: be aware that elite monster packs will have 3 magical abilities on Hell and 4 on Inferno. You could also run into two or more packs of monsters at once, so you could be looking at 6, 8, or more magical effects. Also, some of the monster abilities cause them to replicate. I've definitely seen 30 or more monsters at once with the entire screen covered with fire, poison, and lightning.

    I don't play on a laptop or use an IGP, but I assume this could have a negative impact on performance. Normal mode and even Nightmare mode would probably not be too bad, though.
  • cjb110 - Saturday, May 26, 2012

    I guess this is where the Low FX setting will be most useful then.

    Certainly a playthrough on Normal doesn't seem to really get worse near the end... on average the mob size is slightly bigger, but that's about it.

    Co-op would be another interesting test; that's probably the most demanding the graphics would get.
  • Herald85 - Saturday, May 26, 2012

    I'm in Nightmare Act 4, and in the Keep (the Tristram equivalent, the home base) I get over 100 fps. Taking on average mobs outside the Arreat Gate drops the fps to 80. And that's on a 6950 at 1920x1200 with everything maxed.

    On my laptop with an 8600M GT I could play fine in Normal until I got to Act 4. I can still play, but it's annoyingly choppy when there are lots of mobs. The 8600M GT is listed as supported at 'Low' on the Blizzard site.

    I would LOVE to see a review done on Hell.
