Cut-down GP104 might be used for Nvidia’s budget card
The GeForce GTX 1060 with 3GB of video RAM could become a more compelling budget graphics card option, at least if speculation that it will adopt the same chip as the GTX 1070/1080 is on the money.
According to the rumor mill, Nvidia is set to use GP104 chips (as seen in the GTX 1070/1080) rather than the GP106 GPU normally found in the GTX 1060, although naturally these won't be full-powered versions of the GP104.
Rather, these will be GPUs which failed quality testing and weren't up to scratch for the GTX 1070/1080. Instead of being binned as defective, they can be reused in a lesser card like the GTX 1060, with some of the cores (including the duff ones, of course) disabled.
So why the excitement? Well, there's the possibility that this fresh version of the GTX 1060 3GB could offer a little more headroom when it comes to cranking up clock speeds.
And graphics card tinkerers out there might be tempted to flash these cards in an attempt to unlock further performance, although Nvidia will likely defend against that prospect.
Indeed, as Hot Hardware, which spotted this development, notes, the 3GB version of the GTX 1060 (as opposed to the full-fat 6GB card) may have been chosen for these cast-off GP104 GPUs because that memory limitation will keep a lid on how far these cards can be pushed anyway.
Speaking of power, the other issue is whether these new variants of the 1060 will require more juice and further tax your power supply. Some folks aren't best pleased about that prospect, or indeed the fact that Nvidia isn't likely to differentiate these GTX 1060 models in terms of naming, particularly as it could be important to know exactly which model you're buying if you're running your rig close to its power limits.
Bear in mind that this is still very much a rumor at this point. However, given that this sort of repurposing of chips is common practice in the graphics card industry (and elsewhere, for that matter), it's hardly an unlikely possibility.