No, each 8-pin connector is rated for ~150 watts.
EDIT: I guess I should say 150+150+75. Technically the standard for two 8-pin PCI-E connectors is around 300 watts, but the PCI-E slot supplies an additional 75 watts. Keep in mind the rating is just a standard, not an absolute; most of us exceed the various standards when overclocking.
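The arithmetic above can be sketched as a few lines of Python. The wattage figures come from the post (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin, matching the spec ratings); the function name and structure are just an illustration, not part of any real tool.

```python
# Rated PCI-E power sources, per the figures discussed above.
PCIE_SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_budget(connectors):
    """Return the rated power budget (watts) for a card with the
    given auxiliary connectors, e.g. ["8-pin", "8-pin"]."""
    return PCIE_SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

# Two 8-pin connectors, as in the post: 150 + 150 + 75
print(board_power_budget(["8-pin", "8-pin"]))  # 375
```

As the post notes, these are spec ratings, not hard limits; overclocked cards routinely draw more through the same connectors.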
Last edited by highoctane; 03-06-2012 at 10:42 AM.
Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
3x2048 GSkill pi Black DDR3 1600, Quadro 600
PCPower & Cooling Silencer 750, CM Stacker 810
Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
3x4096 GSkill DDR3 1600, PNY 660ti
PCPower & Cooling Silencer 750, CM Stacker 830
AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
2x2gb Patriot DDR2 800, PowerColor 4850
Corsair VX450
Maybe when Nvidia says this architecture will have better performance per watt, it doesn't mean consuming less power, but making better use of the power the chip needs: consumption on average equal to Fermi's, but with better performance.
(Sorry for my poor English... I'm still practicing the language.)
Isn't that basically what happened going from 40 to 28 nm? This has always been part of every launch talk: look at the release slides for GPUs from the last six years and you'll find an "improved performance per watt" slide somewhere (AMD or Nvidia). That's just because with every new generation, AMD, Nvidia, and Intel (the same goes for ARM chip makers) try to improve performance per watt. And they normally do (Fermi was a real accident in that regard).
Improving performance per watt is part of the job for each new generation (CPU, GPU, monitor).
Last edited by Lanek; 03-06-2012 at 11:19 AM.
CPU: - I7 4930K (EK Supremacy )
GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
Motherboard: Asus x79 Deluxe
RAM: G-skill Ares C9 2133mhz 16GB
Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0
Do PCI-E v3.0 slots have the ability to provide more power to the card from the board itself vs PCI-E v2.0 ???
I thought they were limited by the board's actual physical properties (i.e., too many watts would simply fry the board), but I could well be wrong.
I've never heard of anything like that related to AIBs, especially when the power is supplied directly from the power supply via the PCI-E power connectors. I could possibly see pushing things a little far with an OC'd video card that's powered strictly from the PCI-E slot through the motherboard.
At any rate, you'd be reading about a lot of meltdowns if that were the case, especially on this forum, where OC'ing and voltage tweaking push power draw much higher than normal, along with the use of power supplies with enough current reserve to weld with.
Last edited by RPGWiZaRD; 03-06-2012 at 02:53 PM.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place
Disappointing, if true.
Intel Core i7-3770K
ASUS P8Z77-I DELUXE
EVGA GTX 970 SC
Corsair 16GB (2x8GB) Vengeance LP 1600
Corsair H80
120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
Corsair RM650
Cooler Master Elite 120 Advanced
OC: 5Ghz | +0.185 offset : 1.352v
Just saw that myself... will be interesting to see how it does with launch drivers, especially. I hope the pricing isn't stratospheric however. Who knows how accurate this is though...
With pre-launch drivers that they're still iterating on quickly, it doesn't sound disappointing to me, assuming pricing is decent. With their smaller die size, I would think they could compete on both price and performance here.
Last edited by GoldenTiger; 03-06-2012 at 02:56 PM.
and later on he said he'll keep his 7970 XFire setup ...
Originally Posted by Kyle
If I remember correctly, for PCI-E 3.0 to have provided more power they would have had to beef up the traces on the motherboard, which would have really added to the complexity and cost. From what I remember, PCI-E 3.0 provides the same amount of power as PCI-E 2.0.
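A small sketch of the point being made: slot power stayed the same between PCI-E generations while per-lane bandwidth roughly doubled. The 75 W slot figure comes from the thread; the approximate per-lane bandwidth numbers (GB/s, one direction) are standard published figures, included here only for illustration.

```python
# version: (slot power in watts, ~GB/s per lane, one direction)
PCIE_GENERATIONS = {
    "2.0": (75, 0.5),    # 5 GT/s with 8b/10b encoding
    "3.0": (75, 0.985),  # 8 GT/s with 128b/130b encoding
}

for version, (slot_w, gb_per_lane) in PCIE_GENERATIONS.items():
    print(f"PCI-E {version}: {slot_w} W slot, ~{gb_per_lane} GB/s per lane")
```

In other words, the generational change was about signaling speed, not power delivery; extra card power still has to come from the auxiliary connectors.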
So? What he said basically made the cards sound like they're going to trade blows. His system is still state of the art, so if the GK104 and 7970 really offer similar performance, he has no reason to move. Besides, the big chip isn't here yet. Let's see if he says the same when he gets his hands on it. A few years ago he didn't switch his 5870 Crossfire setup to GTX 480 SLI when it first came out either.
Matches my statement of about 1.5x over GTX580.
We all know shaders can end up idle. By allowing shaders that are being accessed to use the resources of the idle shaders, you fix that problem, greatly improving efficiency per shader.
Anyway, of course Nvidia is contemplating calling this the 680... If you could beat the opposing company's high end with what would've been a midrange chip that's smaller, and you knew your opponent would take a while to counterattack, wouldn't you do the same exact thing? I'm thinking the 6xx series will be short-lived, and they'll drop the 7xx (headed by the GK110) when AMD refreshes this fall, where we'll see the 680 slightly tweaked with higher clocks as the GTX 760 Ti.
There is absolutely NO reason to upgrade if you have XF'ed 7970s. Name a game that XF'ed 7970s can't max out at perfect frame rates! The only reason would be if a huge title comes out where PhysX makes a huge difference.
Not too disappointing. They said it beats the 7970 in games and game-related benchmarks, so it's better than AMD's best where it counts. Definitely not disappointing to me.
Of course, with this beating the 7970, that means the GK110 may be the next G80.