Briefly announced and discussed during AMD’s 2015 GPU product presentation yesterday morning was AMD’s forthcoming dual Fiji video card. The near-obligatory counterpart to the just-announced Radeon R9 Fury X, the unnamed dual-GPU card will be taking things one step further with a pair of Fiji GPUs on a single card.

Meanwhile, as part of yesterday evening’s AMD-sponsored PC Gaming Show, CEO Dr. Lisa Su took the stage for a few minutes to show off AMD’s recently announced Fury products. At the end, this included the first public showcase of the still-in-development dual-GPU card.

There’s not too much to say right now since we don’t know its specifications, but for the moment AMD is focusing on size. With 4GB of VRAM for each GPU on-package via HBM technology, AMD has been able to design a dual-GPU card that’s shorter and simpler than previous dual-GPU cards like the R9 295X2 and HD 7990, saving the space that would otherwise have been occupied by GDDR5 memory modules and their associated VRMs.

Meanwhile, on the card itself we can see that it uses a PLX 8747 to provide PCIe switching between the two GPUs and the shared PCIe bus, and on the power delivery side the card uses a pair of 8-pin PCIe power sockets. At this time no further details are being released, so we’ll have to see what AMD is up to later on once they’re ready to reveal more about the video card.

Comments

  • jackstar7 - Wednesday, June 17, 2015 - link

    You're saying a lot of things with no data anywhere to support your assertions. Just so people don't confuse your conjecture for facts.
  • CPUGPUGURU - Wednesday, June 17, 2015 - link

    Get out of the cave you've been living in and OPEN your eyes to the Facts.

    The tech forums are full of gamers with mods using more than 4GB of VRAM, but you are blind to those facts. You are also blind to the Fact that NO 28nm GPU in the world is memory bandwidth bottlenecked with a properly designed GPU. YES, the next process node, 16nm, will need HBM, but not this years-old 28nm process.
  • DigitalFreak - Wednesday, June 17, 2015 - link

    Yeah, this "4GB is plenty of VRAM" crap is getting old. It's been proven multiple times that some games will use over 4GB, and not just ones that are modded. http://www.hardocp.com/article/2015/06/15/nvidia_g... is a perfect example.
  • CPUGPUGURU - Wednesday, June 17, 2015 - link

    For making a future-proof 4K gaming buying decision, 6GB is a MUST HAVE; 4GB is too little and too lame to 4K game.
  • Black Obsidian - Wednesday, June 17, 2015 - link

    Given Fury's significantly increased memory bandwidth and (alleged) additional compression algorithms, we don't know if it's subject to exactly the same limits as other cards.

    Surely we're all adults who can handle waiting a week or two for concrete data before making blanket statements.
  • masouth - Wednesday, June 17, 2015 - link

    Yes, it will be nice to actually get the benchmarks but there is some validity to people's worries so far.

    If a game requires X amount of textures to be held at certain resolutions and settings, then bandwidth means absolutely squat. 4GB can't hold more than 4GB at a given time no matter how fast it is, because speed has nothing to do with capacity... just how fast you can move information into and out of that set capacity. =/
  • CPUGPUGURU - Wednesday, June 17, 2015 - link

    masouth knows his Facts.

    Bandwidth "means absolutely squat": if more than 4GB of VRAM is needed, performance is killed, and there are NO 28nm GPUs that can utilize all the bandwidth that HBM provides. Other than a smaller form factor, which is negated if you add a massive radiator, and reduced wattage, HBM1 is just not needed on 28nm GPUs. AMD rushed into HBM thinking we would have GPUs built on a smaller process node with many more transistors, and that didn't happen with 20nm. Next year, GPUs using TSMC 16nm will need HBM2 with vastly more GB of memory available; having only 4GB is obsolete for future-proof 4K gaming.
  • eachus - Thursday, June 18, 2015 - link

    Sigh! There are two questions here, and they interact a lot. First, how good is AMD's recent compression technology? The good/bad news can be summed up by the R9 285 card. You can buy 3 Gig 280 and 280X or 4 Gig 290 and 290X cards; I think I've even seen 8 Gig 290X cards floating around. But the card manufacturers, given the option (and it is a pretty easy option) to build 285 cards with 4 Gig of memory, have in effect refused to do so. Why? Because, thanks to the compression, the 285 can compete with the (3 Gig) 280 and 280X that are still available, at least for smaller displays.

    Once DX12 comes along, I expect that for 1920x1200 or smaller displays a 2GB Tonga will be fine. For now, buy a 280X or 290X if you are at 2560x1600 or so. I hope that 4 Gig 380X cards will show up soon (the R9 380 is a renamed Tonga), but no 380X has been announced yet. I think AMD may be holding off to see the supply and demand for HBM. AFAIK, Tonga can use HBM, but there is no point at 2 Gig.

    So what happens with Fiji? The simple answer is that 8 Gig cards will come (and 16 Gig dual-GPU cards), but early next year is more likely than later this year. Practically speaking, though, a 4 Gig 390X should use about 2/3rds of the memory needed for the nVidia 980 Ti. If these benchmarks are correct: http://www.techpowerup.com/213528/radeon-fury-x-ou... the picture is very clear... At 4K, the (4 Gig) 390X outdoes the Titan X and 980 Ti. At 5K that is reversed, and at 8K resolution only the 8 Gig Nano and Titan X survive.

    When will we see 8 Gig 390Xs (and 4 Gig 380s)? Pretty soon, I think. The issue is memory availability more than anything else.
  • Mark_gb - Wednesday, June 17, 2015 - link

    Not according to AMD. And since they designed this card and know what technologies are built into the silicon and worked into the drivers, I suspect they know far better than you and I. And they say the Fiji chip will handle all 4K gaming just fine.
  • Mark_gb - Wednesday, June 17, 2015 - link

    You cannot use an Nvidia card to make your point about a brand-new GPU from AMD that was designed from the ground up to do 4K video.

    By your logic, a rocket exploded back in the 1950s... so all future rockets, no matter who makes them, will also explode.

    That doesn't work. Show us a real-use situation on a Fiji processor where 4GB is not good enough, and then we'll say it's not. But don't come walking in here with some fantasy situation like Prime95 generates on a CPU and try to make us believe it.
