Power connectors do not dictate power draw. If they did, the 970 Strix would consume less power than the Gigabyte GTX 960 WF2OC and the same as the EVGA GTX 960 Gaming. It simply doesn't work that way. Do the Titan Z and R9 295X2 consume less power than the 290X Lightning? I think not. You can go into the BIOS of any GPU and set how much power can be drawn through each PCIe connector as well as the PCIe slot, but just because the capacity is there doesn't mean the card uses it fully. Those dual-GPU cards can pull way more than 300 watts each; so can a GTX 970, yet a pair of them in SLI runs around 420 watts of total system load. Sites and calculators that go by TDP and connector count are pretty useless in the face of real-world testing.
But they do dictate the maximum power that could be drawn. I prefer planning for the worst case, such as the GPU actually pulling 300 W. If the GPU doesn't pull that much, you get extra life out of the PSU, or you can expand later on/use it in another build.
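To make the worst-case reasoning concrete, here's a minimal sketch that sums the PCI Express spec limits per power source (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin); the function name and structure are just for illustration:

```python
# Spec ceilings per power source (PCI Express CEM values, in watts).
CONNECTOR_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def worst_case_power(aux_connectors):
    """Worst-case board power: slot limit plus each auxiliary connector's limit."""
    return CONNECTOR_LIMITS_W["slot"] + sum(
        CONNECTOR_LIMITS_W[c] for c in aux_connectors
    )

# A card with one 6-pin and one 8-pin connector:
print(worst_case_power(["6-pin", "8-pin"]))  # 300
```

That 300 W figure is the ceiling the connectors *allow*, not what the card actually draws under load, which is the whole disagreement above.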
As for setting the max power draw through the GPU BIOS, I'm wondering what impact that could have on performance. Got any info on it?
As long as you don't run FurMark, the GPU's power consumption isn't going to hit its peak. Then there's the fact that CPUs don't max out their power consumption under gaming loads either. An equivalent 1000 W unit would maybe last a year longer, but the 850 G2 is still going to run this build for 6+ years. Setting power limits higher mostly helps as a way around power-limit throttling. I increased my 970's 6-pin power limit from 75 to 100 watts, which extended the total power limit from 220 watts to 245.
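The arithmetic in that last sentence is worth spelling out: raising one rail's limit raises the card's total power limit by the same amount, since the total is just the sum of the per-source limits. A minimal sketch, using the hypothetical BIOS values quoted above:

```python
# Per-source power limits for the card described above (watts).
slot_limit = 75                 # PCIe slot, per spec
six_pin_old, six_pin_new = 75, 100   # 6-pin rail, before/after the BIOS edit
total_old = 220                 # card's stock total power limit

# Raising the 6-pin limit by 25 W raises the total limit by the same 25 W.
total_new = total_old + (six_pin_new - six_pin_old)
print(total_new)  # 245
```

So the card still throttles at a hard limit, just 25 W higher than stock.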
Just now came back and read more comments. Glad I did :) You 2 discussed power draw and now I have a much better idea of what I should look for when choosing a PSU. Thanks~