AMD Kaveri Docs Reference Quad-Channel Memory Interface, GDDR5 Option
by Anand Lal Shimpi on January 16, 2014 10:51 PM EST

Our own Ryan Smith pointed me at an excellent thread on Beyond3D where forum member yuri ran across a reference to additional memory controllers in AMD's recently released Kaveri APU. AMD's latest BIOS and Kernel Developer's Guide (BKDG 3.00) for Kaveri includes a reference to four DRAM controllers (DCT0 - 3), with only two in use (DCT0 and DCT3). The same guide also references a Gddr5Mode option for each DRAM controller.
Let me be very clear here: there's no chance that the recently launched Kaveri will be capable of GDDR5 or 4 x 64-bit memory operation (Socket-FM2+ pin-out alone would be an obvious limitation), but it's very possible that there were plans for one (or both) of those things in an AMD APU. Memory bandwidth can be a huge limit to scaling processor graphics performance, especially since the GPU has to share its limited bandwidth to main memory with a handful of CPU cores. Intel's workaround with Haswell was to pair it with 128MB of on-package eDRAM. AMD has typically shied away from more exotic solutions, leaving the launched Kaveri looking pretty normal on the memory bandwidth front.
In our Kaveri review, we asked whether any of you would be interested in a big Kaveri option with 12 - 20 CUs (768 - 1280 SPs) enabled, basically a high-end variant of the Xbox One or PS4 SoC. AMD would need a substantial increase in memory bandwidth to make such a thing feasible, but based on AMD's own docs it looks like that may not be too difficult to get.
There were rumors a while back of Kaveri using GDDR5 on a stick, but it looks like nothing ever came of that. The options for a higher-end Kaveri APU would have to be (a rough bandwidth comparison follows the list):
1) 256-bit wide DDR3 interface with standard DIMM slots, or
2) 256-bit wide GDDR5 interface with memory soldered down on the motherboard
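For a rough sense of the gap between those two options, here's a back-of-envelope peak bandwidth comparison. This is a minimal sketch; the DDR3-2133 and 5.5 Gbps GDDR5 data rates are my own illustrative assumptions, not anything out of AMD's documentation:

```python
# Back-of-envelope theoretical peak bandwidth: data rate (MT/s) x bus width (bytes).
# The data rates below are illustrative assumptions, not AMD-confirmed configurations.

def peak_bandwidth_gb_s(transfer_rate_mt_s: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return transfer_rate_mt_s * (bus_width_bits / 8) / 1000

configs = {
    "Kaveri as shipped (128-bit DDR3-2133)": peak_bandwidth_gb_s(2133, 128),  # ~34 GB/s
    "Option 1: 256-bit DDR3-2133":           peak_bandwidth_gb_s(2133, 256),  # ~68 GB/s
    "Option 2: 256-bit GDDR5 at 5.5 Gbps":   peak_bandwidth_gb_s(5500, 256),  # ~176 GB/s
}

for name, bandwidth in configs.items():
    print(f"{name}: {bandwidth:.1f} GB/s")
```

Even the quad-channel DDR3 option only roughly doubles what Kaveri ships with today, while a 256-bit GDDR5 setup would land in discrete GPU territory, which is part of why the idea of on-die or on-package memory keeps coming up.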
I do wonder if AMD would consider the first option while also tossing some high-speed memory on-die (similar to the Xbox One SoC).
All of this is an interesting academic exercise though, which brings me back to our original question from the Kaveri review. If you had the building blocks AMD has (Steamroller cores and GCN CUs) and the potential for a wider memory interface, would you try building a high-end APU for the desktop? If so, what would you build and why?
I know I'd be interested in a 2-module Steamroller + 20 CU part with a 256-bit wide DDR3 interface, assuming AMD could stick some high-bandwidth memory on-die as well. More or less a high-end version of the Xbox One SoC. Such a thing would interest me, but I'm not sure if anyone else would buy it. Leave your thoughts in the comments below; I'm sure some important folks will get to read them :)
127 Comments
chaosbloodterfly - Friday, January 17, 2014 - link
This. Imagine a slightly worse quality Razer Blade with 90% of the performance for under $1k.

robbertbobbertson - Thursday, January 16, 2014 - link
Make a video card that is just an APU with its own memory. Imagine putting 2 of them in an APU system. Extra everything... I mean, as long as I'm throwing reality out the window.

meacupla - Thursday, January 16, 2014 - link
I am not convinced that a big Kaveri is possible without a significant heatsink upgrade, seeing as the current A10-7850K is already at a 95W TDP. The PS4/XB1 utilize eight low-power, slow Bobcat cores to keep their TDP lower.

It would be interesting if AMD could do a big Kaveri while getting thermals lower. Such an APU would be amazing inside a desktop-replacement ultrabook, laptop or NUC-like computer.

Get the big Kaveri to fit into laptops without throttling and AMD could give the i7-4xxxMQ a run for its money.
jimjamjamie - Friday, January 17, 2014 - link
On the subject of big Kaveri, if AMD cannot push the frequencies high then surely it would make sense to release a 3- or 4-module version for the desktop?

This would have a potentially serious impact - enthusiasts could now get FX-class performance - with the potential of huge gains due to the GPU compute features - from an APU. This would also mean that AMD could EoL the AM3 socket and focus on just FM3 (obvious benefits here), thus migrating all their customers onto APUs. Enthusiasts are enthused, and AMD has a streamlined product line.
testbug00 - Friday, January 17, 2014 - link
The cost to design and validate those parts is not worth it. Not to mention you would need to design and validate a new socket.

It would be a good thing for overclockers, but anyone else would just have something that is more or less the speed of an 8350.
lmcd - Friday, January 17, 2014 - link
You know the Athlons that ran in FM2? Imagine that instead of just cutting the iGPU, they dumped in two more modules.

frostyfiredude - Thursday, January 16, 2014 - link
A quad-channel interface and 6MB of L3 added to the current version would have been a great enhancement, along with higher GPU clock rates in the 100W version to best make use of the die. The missing L3 in these APUs is quite puzzling to me; I would have expected the APUs and their memory starvation to be an ideal candidate for an L3.

Next year they will move to TSMC's 20nm node, I imagine; it wouldn't surprise me if AMD went with a larger 896 or 1024 SPU GPU design, added an L3 and a wider DDR4 memory interface to keep the extra cores fed.
kwrzesien - Friday, January 17, 2014 - link
Take out the GPU, put in four modules and an enormous L3 cache like you say, and have quad-channel DDR3 memory. With a discrete video card this will be a great (and competitive) desktop solution. If they have to leave a few CUs in just to support integrated desktop video then fine, but stop trying to game on it. There just isn't enough heat dissipation from the CPU to compete with a dedicated card that has its own cooling system.

FernanDK - Thursday, January 16, 2014 - link
Uh, what I would be interested in buying is a GDDR module. It would also be vital that both IGPs and traditional GPUs (even the ones made by other brands) have priority/instant access to the GDDR module; this is to ensure customizability and hardware endurance.

What would also be cool is that customers would have the opportunity to buy a larger GDDR module if they need it. 4K? Surround 4K? 8K? Surround 8K? No problem... just buy a new module with more gigabytes and be done with it.

Let's say I have an APU that is using my GDDR module, but then I decide to buy a high-end VGA with a high-end GPU inside it, and all of a sudden my GDDR module is worth nothing because the VGA comes with its own GDDR chips soldered onto the PCB. The solution to this problem would be to sell GDDR and GPUs separately, so that when someone decided to buy a high-end VGA, it wouldn't come with GDDR chips soldered onto its PCB; instead the consumer would also buy a high-end GDDR module with whatever amount of gigs they see fit.

I can't emphasize enough how much I would love to see GDDR and GPUs being sold separately... and I strongly think the majority of consumers would love it too. But this wouldn't have to be forced down people's throats; it could be an optional addition, meaning one could continue to buy standard VGAs and choose not to buy any separate GDDR modules, the regular easy choice, while the hardcore, customizable route would also be available.
jljaynes - Friday, January 17, 2014 - link
(If I'm remembering this right) the first PC I built back in the mid-to-late '90s had a graphics card that you could stick extra 1 MB chips into. It'd be cool to see something like that now.