With rumours pointing to a May/June launch at Computex, more information is emerging about Nvidia’s upcoming GTX 1070 and 1080. Expected to take on the x70 and x80 monikers, the two cards are slated to use the GP104 Pascal die and fill the roles currently held by the GTX 970 and 980. Today, another leak has surfaced detailing what the two cards will look like, and it seems the GTX 1080 will have far more memory bandwidth than the GTX 1070.
According to the leak, both the GTX 1080 and GTX 1070 will be based on the GP104 die, which will slot into the Pascal lineup just as the GK104 and GM204 did in their generations. While previous card pairs have mostly been differentiated by core specifications, this time memory bandwidth, and a lot of it, looks to be the key difference. The GTX 1080 will reportedly use GDDR5X while the 1070 will use GDDR5, potentially giving the GTX 1080 up to 100% more bandwidth along with better energy efficiency. Due to the different memory, the 1080’s package will have 20 more pins: the 1080 will use the GP104-400-A1 and the 1070 the GP104-200-A1.
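To see where a roughly 100% bandwidth gap could come from, here is a quick back-of-the-envelope sketch. The bus width and per-pin data rates below are assumptions for illustration (a 256-bit bus is typical for this class of die, GDDR5 commonly topped out around 7 Gbps, and GDDR5X was announced with data rates up to 14 Gbps), not confirmed specifications for either card:

```python
# Peak memory bandwidth: bus width (bits) / 8 * per-pin data rate (Gbps) = GB/s
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# Hypothetical 256-bit bus on both cards (illustrative assumption).
gddr5 = bandwidth_gb_s(256, 7.0)    # common top-end GDDR5 speed
gddr5x = bandwidth_gb_s(256, 14.0)  # GDDR5X's headline data rate

print(f"GDDR5:  {gddr5:.0f} GB/s")           # 224 GB/s
print(f"GDDR5X: {gddr5x:.0f} GB/s")          # 448 GB/s
print(f"Uplift: {gddr5x / gddr5 - 1:.0%}")   # 100%
```

At GDDR5X's full rated speed the uplift is the rumoured 100%; if Nvidia ships slower GDDR5X bins, the gap shrinks accordingly, which is relevant to the speculation below.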
For two similar cards based off the same GP104 die, there seems little reason to split the memory between two different types, since the required memory bandwidth should be similar. One possibility is that the GTX 1080 is using relatively slow GDDR5X that isn’t much faster than the fastest GDDR5, which would make sense if GDDR5X supply is constrained to the top-end models. Another possibility is that both were meant to get GDDR5X but supply meant only one of them could. Finally, it could simply be a way to differentiate the GTX 1080, which may remain the GeForce flagship for quite a while.
Last of all, we also have information about the connectors. Both cards will feature 3x DisplayPort, 1x HDMI, and 1x DVI for display connectivity. Power will be provided by 2x 8-pin PCIe power connectors, which is actually more than the Titan X has. Given the efficiency gains from moving to 16nm, this points to either a monster chip or a lot of double-precision hardware being left in. With only several weeks left, it will be interesting to finally see what Nvidia has cooked up for us.
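The connector claim can be put in numbers using the standard PCIe power limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin). The Titan X comparison assumes its 6-pin + 8-pin configuration; the figures are spec ceilings, not measured draw:

```python
# PCIe power-budget arithmetic using the spec's per-source limits.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_limit(six_pins: int, eight_pins: int) -> int:
    """Maximum board power (W) from the slot plus auxiliary connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_power_limit(0, 2))  # rumoured GP104 card (2x 8-pin): 375 W ceiling
print(board_power_limit(1, 1))  # Titan X (6-pin + 8-pin): 300 W ceiling
```

A 375 W ceiling does not mean the card will draw that much, but it leaves far more headroom than the Titan X's 300 W, which is what makes the connector choice noteworthy for a 16nm part.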