Gaming Performance

For Z590 testing we are running Windows 10 64-bit with the 20H2 update.

Civilization 6

Originally penned by Sid Meier and his team, the Civilization series of turn-based strategy games is a cult classic, and many an excuse for an all-nighter trying to get Gandhi to declare war on you due to an integer underflow. Truth be told I never actually played the first version, but I have played every edition from the second to the sixth, including the fourth as voiced by the late Leonard Nimoy, and it is a game that is easy to pick up, but hard to master.

Benchmarking Civilization has always been somewhat of an oxymoron – for a turn-based strategy game, the frame rate is not necessarily the important thing here, and even in the right mood, something as low as 5 frames per second can be enough. With Civilization 6, however, Firaxis went hardcore on visual fidelity, trying to pull you into the game. As a result, Civilization can be taxing on graphics and CPUs as we crank up the details, especially in DirectX 12.

[Graphs: GTX 1080: Civilization VI, Average FPS and 95th Percentile]
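For reference, both numbers in these charts can be reproduced from a raw frame-time log. Below is a minimal sketch with made-up frame times, assuming the common convention of reporting the 95th-percentile frame time converted back to FPS (so it reflects the slow tail rather than the mean):

```python
# Minimal sketch: deriving average FPS and a 95th-percentile FPS figure
# from per-frame render times in milliseconds. The sample values are
# made up; a real benchmark run logs thousands of frames.
frame_times_ms = [16.1, 16.4, 15.9, 17.2, 16.0, 22.5, 16.3, 16.1, 18.9, 16.2]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# 95th-percentile frame time: 95% of frames render at least this fast.
cutoff_ms = sorted(frame_times_ms)[int(0.95 * len(frame_times_ms))]
p95_fps = 1000.0 / cutoff_ms

print(f"Average: {avg_fps:.1f} FPS, 95th percentile: {p95_fps:.1f} FPS")
```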

Shadow of the Tomb Raider (DX12)

The latest installment of the Tomb Raider franchise does less rising and lurks more in the shadows with Shadow of the Tomb Raider. As expected, this action-adventure follows Lara Croft, the franchise's main protagonist, as she muscles through Mesoamerica and South America looking to stop a Mayan apocalypse she herself unleashed. Shadow of the Tomb Raider is the direct sequel to Rise of the Tomb Raider; it was developed by Eidos Montreal and Crystal Dynamics, published by Square Enix, and hit shelves across multiple platforms in September 2018. This title effectively closes the Lara Croft origins story and received critical acclaim upon its release.

The integrated Shadow of the Tomb Raider benchmark is similar to that of its predecessor, Rise of the Tomb Raider, which we used in our previous benchmark suite. The newer Shadow of the Tomb Raider supports both DirectX 11 and DirectX 12, and this particular title has been touted as having one of the best implementations of DirectX 12 of any game released so far.

[Graphs: GTX 1080: Shadow of the Tomb Raider, Average FPS and 95th Percentile]

Strange Brigade (DX12)

Strange Brigade is set in 1930s Egypt and follows a story very similar to that of the Mummy film franchise. This particular third-person shooter was developed by Rebellion Developments, more widely known for games such as the Sniper Elite and Alien vs Predator series. The game follows the hunt for Seteki the Witch Queen, who has arisen once again, and the only 'troop' who can ultimately stop her. Gameplay is cooperative-centric with a wide variety of levels and many puzzles which need solving by the British colonial Secret Service agents sent to put an end to her reign of barbarism and brutality.

The game supports both the DirectX 12 and Vulkan APIs and houses its own built-in benchmark, which offers various options for customization, including textures, anti-aliasing, reflections, and draw distance, and even allows users to enable or disable motion blur, ambient occlusion, and tessellation, among others. AMD has previously boasted about Strange Brigade's Vulkan implementation, which offers scalability for AMD multi-graphics card configurations. For our testing, we use the DirectX 12 benchmark.

[Graphs: GTX 1080: Strange Brigade DX12, Average FPS and 95th Percentile]

Comments

  • JVC8bal - Friday, April 30, 2021 - link

I don't understand the point you're making by responding to what I wrote. This has nothing to do with AMD vs. Intel. I guess there is a MAGA-like AMD crowd on here looking for conspiracies and confrontations.

As written above, the PCIe 4.0 specification implementation first found on X570 showed up on Intel's first go-around. If anything can be said, those working on the Intel platform motherboards learned nothing from prior work on the AMD platform. But whatever, read things through whatever lens you do.
  • TheinsanegamerN - Friday, April 30, 2021 - link

I thought it was more of a BLM-like Intel crowd that looks for any pro-AMD comment and tries to railroad it into the ground while dismissing whatever merit the original comment may have had
  • TheinsanegamerN - Wednesday, April 28, 2021 - link

I'm disappointed that these newer boards keep cutting down on I/O. This board only offers 3 PCIe x16 slots; the third is only x4, and the second cuts half the bandwidth from the first slot despite multi-GPU being long dead. So if you had, say, a sound card and a capture card, you'd have to cut your GPU slot bandwidth in half AND have one of the cards right up against the GPU cooler.

IMO the best setup would have all the x1/x4 slots on the bottom of the motherboard so you can use a triple-slot GPU and still have 3 other cards with room between for breathing, with all the bottom slots fed from the chipset, not the CPU.

And for those who are going to ask: "why do you want more expansion, everything is embedded now blah blah". If you only have a GPU and don't use the other slots, that's why you have mini-ITX, or micro-ATX if you want a bigger VRM. Buying a big ATX board for a single expansion card is a waste.
  • abufrejoval - Thursday, April 29, 2021 - link

    While I am sure they'd love to sell you everything you're asking for, I'm less convinced you'd be ready to pay the price.

You can't get anything but static CPU PCIe lane allocations out of a hard-wired motherboard, with bi/tri/quad-furcation already being a bonus. You need a switch on both ends for flexibility.

    That's what a PCH basically is, which allows you to oversubscribe the ports and lanes.

In the old 2.0 days, PCIe switch chips were affordable enough ($50?) to put next to the CPU and gain multiple full x16 slots (still switched), but certainly not without a bit of latency overhead and some Watts of power.

All those PCIe switch chip vendors seem to have been bought up by Avago/Broadcom, who have racked up prices, probably less because they wanted to anger gamers and more because these were key components in NVMe-based storage appliances, where they knew how much they could charge (mostly guessing here).

And then PCIe 3.0 and 4.0 are likely to increase motherboard layout/trace challenges, switch chip thermals, or just generally price to the point where going for a higher lane-count workstation or server CPU may be more economical and deliver the full bandwidth of all lanes.

You can get PCIe x16 cards designed to hold four or eight M.2 SSDs that contain such a PCIe switch. Their price gives you some idea of the silicon cost, while I am sure they easily suck 20 Watts of power, too.

If you manage to get a current-generation GPU with PCIe 4.0, that gives you PCIe 3.0 x16 equivalent performance even at x8 lanes (see the bandwidth sketch after this comment). That's either enough, because you have enough VRAM, or PCIe 4.0 x16 won't be good enough either. At both 16 and 32 GByte/s, PCIe is little better than a hard disk when your internal VRAM delivers north of 500 GB/s... because that's what it takes to drive your GPU compute or the game.

    The premium for the ATX form factor vs a mini ITX is pretty minor and I couldn't care less how much of the tower under my desk is filled by the motherboard. I tend to go with the larger form factors quite simply because I value the flexibility and the ability to experiment or recycle older stuff. And it's much easier to manage noise with volume.
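To put rough numbers on the x8/x16 equivalence above, here is a back-of-the-envelope sketch using nominal per-lane line rates after 128b/130b encoding, ignoring protocol overhead:

```python
# Back-of-the-envelope PCIe throughput. PCIe 3.0 runs at 8 GT/s and
# PCIe 4.0 at 16 GT/s per lane, with 128b/130b encoding; protocol
# overhead is ignored, so real-world figures land a little lower.
gbps_per_lane = {"3.0": 8 * 128 / 130 / 8, "4.0": 16 * 128 / 130 / 8}

for gen in ("3.0", "4.0"):
    for lanes in (8, 16):
        print(f"PCIe {gen} x{lanes}: {gbps_per_lane[gen] * lanes:5.1f} GB/s")

# PCIe 4.0 x8 (~15.8 GB/s) is roughly PCIe 3.0 x16, which is why a Gen4
# GPU dropped to x8 loses little -- and why even 31.5 GB/s at Gen4 x16
# is tiny next to the 500+ GB/s of on-board VRAM.
```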
  • TheinsanegamerN - Friday, April 30, 2021 - link

Boards like the Gigabyte X570 Elite exist, which have a plethora of USB ports and multiple additional expansion ports, none of which sap bandwidth from the main port.

This Master is a master class in taking money for looking "cool" and offering nothing of value.
  • Spunjji - Thursday, April 29, 2021 - link

    Agreed, that layout is a big mess and rather defeats the point of having an ATX board - but then a huge number of these are just going to go into systems that have one GPU and nothing else, but the buyer wants ATX just because that's what they're used to 🤷‍♂️
  • Linustechtips12#6900xt - Thursday, April 29, 2021 - link

AGREED, my B450M Pro4 has like 4 USB 3.0 ports, 1 USB-A 10 Gbps, 1 USB-C 10 Gbps, and 2 USB 2.0. Frankly amazing I/O and I couldn't appreciate it more
  • Molor1880 - Thursday, April 29, 2021 - link

Not completely the motherboard's fault, though. There are only 20 PCIe 4.0 lanes from the CPU: 4 for I/O and 16 for graphics. There are no general-purpose PCIe 4.0 lanes off the Z590 chipset, and the DMI link is wider, but still just PCIe 3.0. When Intel starts putting general-purpose PCIe 4.0 lanes on the chipset (690?), a lot of those issues would be resolved. Otherwise, it's a bit of a wonky workaround to shift things for one generation.
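To illustrate how quickly that shared uplink fills, a rough sketch treating Z590's DMI 3.0 x8 link as eight PCIe 3.0 lanes (the load figure is hypothetical):

```python
# Rough sketch of Z590 chipset oversubscription: the DMI 3.0 x8 uplink
# is roughly eight PCIe 3.0 lanes shared by everything on the PCH.
pcie3_lane_gbps = 8 * 128 / 130 / 8          # ~0.98 GB/s per lane
dmi_gbps = 8 * pcie3_lane_gbps               # ~7.9 GB/s total uplink

# Hypothetical load: a single Gen3 x4 NVMe SSD can claim half the uplink.
nvme_x4_gbps = 4 * pcie3_lane_gbps
print(f"DMI uplink: {dmi_gbps:.1f} GB/s; one Gen3 x4 SSD: {nvme_x4_gbps:.1f} GB/s")
```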
  • Silver5urfer - Wednesday, April 28, 2021 - link

Unfortunately GB BIOS is not that stellar? And why does this mobo have a fan to cool the 10G LAN chip? I do not see that on some other boards like the X570 Xtreme, X570 Prestige Creation, and Maximus XIII Extreme.
  • TheinsanegamerN - Thursday, April 29, 2021 - link

Gigabyte BIOS is fine; the UI is a tad clunky, but hey, it's a huge leap from BIOSes of the Core 2 era. Just takes a little getting used to.
