Originally Posted by motown_steve
Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.
Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.
315 W: 3DMark03 Nature at 1280x1024, 6xAA, 16xAF.
Highest single reading during the test.
453 W (!!!): Maximum, Furmark Stability Test at 1280x1024, 0xAA.
http://www.techpowerup.com/reviews/A...5_MARS/27.html
so yes, they did exceed 300W, not by much, but they did...
and yes, of course it's possible to use two 8-pin connectors and go for more than 300W... and in reality you don't even need two 8-pin connectors, as high-end motherboards can usually supply more than 100W through the slot, and 8+6 can deliver more than 225W. it's not like the psu will shut down if a card pulls more than 12.5A through an 8-pin VGA connector... yes, you break the spec... but who really cares? some cards are already pulling more than 100W through the PCIe slot, and the spec only allows 75W...
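just to make the budget math explicit, here's a quick python sketch with the spec numbers from above (75W slot, 75W 6-pin, 150W 8-pin; the 12.5A is just 150W at 12V). nothing official, just adding things up:

```python
# PCIe power budget using the spec figures quoted above (illustrative).
PCIE_SLOT_W = 75    # x16 slot limit per spec
SIX_PIN_W   = 75    # 6-pin PEG connector
EIGHT_PIN_W = 150   # 8-pin PEG connector (150 W / 12 V = 12.5 A)

def in_spec_budget(six_pins=0, eight_pins=0):
    """Maximum power a card may draw while staying within the PEG spec."""
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(in_spec_budget(six_pins=1, eight_pins=1))  # 8+6: 300 W total
print(in_spec_budget(eight_pins=2))              # 8+8: 375 W total
```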
nvidia said gt300 power consumption will be comparable to the current gen... that doesn't really mean much, as there isn't much headroom left to go up... as we all know, thermals are holding vgas back...
according to pcgh, nvidia told them that gt300 will go through a re-spin before it'll be sold at retail, iirc? but doing a respin in around 4 weeks? is that even possible? hot lots again?
So the PEAK was 315 W, the average something like 280 W, and that in a single benchmark run. Hence the card stays under 300 W in games.
My point is that there won't be a card that constantly uses more than 300 W in games. Most probably there won't be a retail card exceeding the Mars' power consumption. And as PCI-E standards should be backwards and forwards compatible, the 300 W limit in the standard isn't going anywhere anytime soon.
So, all in all, this means that a GF100 X2 card won't be much more of a power hog than the current X2 cards.
Nvidia itself is promoting GPU-compute software, which means the GF100 cores will be utilized at a higher percentage than in games alone. Other than that, the GF100 seems to have 8+6 instead of 6+6; if rumors are to be believed and GF100 does eat around 230W spread across the 8-pin, the 6-pin and the PCIe slot, the X2 would most likely have 8+8 instead of the 8+6 used in the GTX 295 (if a shrink is not used).
Coming Soon
Still the power consumption must remain under 300 W in PCI SIG's internal testing, otherwise the card does not meet PCI-Express standards and hence can not be sold as a PCI-Express compliant device. Oh, and the standard only talks about 6-pin + 8-pin, as far as I know.
Last edited by Calmatory; 10-03-2009 at 10:04 AM.
Well, I don't know much about hot lots other than how risky they are. Respins usually take 6-8 weeks to get back from TSMC, which means they will get it back in the middle of November.
So, best case for Nvidia, if this next spin is good to go and they start ramping production, they might get a handful of wafers ready by the end of the year; like I said before, maybe a couple hundred cards.
Again... you cannot use 8+8-pin on a "single" GPU; you will not pass PCI SIG certification.
Last edited by LordEC911; 10-03-2009 at 10:04 AM.
Using 8+8 does not mean they will use 300W+ on average. The GF100 is supposed to have 8+6 and not max the combo out; having extra power on tap does not mean the card will use it, but in super intense situations the extra power will come in handy (as with the MARS).
Nvidia Admits Showing Dummy Fermi Card at GTC.
http://www.xbitlabs.com/news/video/d...r_Q4_2009.html
Internet will save the World.
Foxconn MARS
Q9650@3.8Ghz
Gskill 4Gb-1066 DDR2
EVGA GeForce GTX 560 Ti - 448/C Classified Ultra
WD 1T Black
Thermalright Extreme 120
CORSAIR 650HX
BenQ FP241W Black 24" 6ms
Win 7 Ultimate x64
http://www.bit-tech.net/news/hardwar...-wasn-t-real/1
tim says somebody showed him a picture of what is supposed to be a real fermi card, but he couldn't make out anything 'cause it was a mobile phone pic and basically more wires than pcb... sorry, i don't buy it...
o check out our brand new fermi card! see! we DO have real next gen cards, and they are working fine!
x huh? that's not a real card!
o yes, this IS a real fermi *cough* card!
x no it's not! i hear something moving inside when i shake it!
o ohhh, i think you misunderstood me. i said it's a real fermi prototype card... the retail cards will come later...
x so you DON'T have a real fermi card right now...
o of course we do!
x well, you called me here to show them to me, so where is it?
o ohhhh, that's too bad, fedex JUST picked them up 5 mins ago... i'm so sorry...
x . . .
I agree completely. Just as you said, it's probably a lot harder for ATI to keep newer parts fed the wider their SP engine gets. I actually think their utilization goes down significantly as the SP count goes up, resulting in poorer efficiency with each subsequent generation. It's why a GTX 295 with 1.8 TFLOPs is able to beat a 5870 with 2.7 TFLOPs. The raw FLOP count matters even less than it used to, because it's all in how you use them.
A GF100 might have 1.7 TFLOPs when it gets released, which is only about 60% more than a GTX 285 at 1 TFLOP (more if you're rounding, of course). However, 1/3 of the theoretical FLOP rate in G80-architecture-based cards (the co-issued MUL) goes almost entirely unused in real-world situations. The FLOP rate on GF100 is what it can actually achieve in the real world, and on top of that, it's more efficient than G80/GT200's baseline FLOP rate.
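A rough sketch of that utilization argument; the 2/3 utilization for GT200 reflects the mostly-idle co-issued MUL, while the 5870 packing figure is purely a hypothetical guess for illustration:

```python
# Effective vs. theoretical FLOPs (illustrative numbers only).
def effective_tflops(theoretical_tflops, utilization):
    return theoretical_tflops * utilization

# GT200: the co-issued MUL (1/3 of the peak rate) is rarely usable.
gtx295 = effective_tflops(1.8, 2 / 3)  # ~1.2 TFLOPs
# Cypress: hypothetical VLIW packing efficiency, NOT a measured value.
hd5870 = effective_tflops(2.7, 0.45)   # ~1.2 TFLOPs
print(f"GTX 295: {gtx295:.2f} TFLOPs, HD 5870: {hd5870:.2f} TFLOPs")
```

With guesses like those, the two land at roughly the same usable throughput, which would explain the benchmark results.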
AMD definitely has a ways to go with their drivers, as with a VLIW architecture they rely heavily on their compiler. Has anyone done any testing of AF performance specifically?
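To illustrate the compiler point, here's a toy VLIW5 "packer"; real shader compilers are vastly more sophisticated, this just shows how dependency chains leave slots (and theoretical FLOPs) idle:

```python
# Toy VLIW5 scheduler: only independent ops can share a bundle, so a
# dependency chain forces one op per bundle and wastes the other slots.
def pack_vliw(ops, width=5):
    """ops: list of (name, depends_on_previous). Greedy packing."""
    bundles, current = [], []
    for name, depends in ops:
        if current and (depends or len(current) == width):
            bundles.append(current)  # dependency closes the bundle
            current = []
        current.append(name)
    bundles.append(current)
    return bundles

chain = [("mul", False), ("add", True), ("add", True), ("mad", True)]
used, total = len(chain), len(pack_vliw(chain)) * 5
print(f"{used}/{total} slots filled -> {used / total:.0%} utilization")
```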
Last edited by Cybercat; 10-03-2009 at 10:54 AM.
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
CPU: Intel 2500k (4.8ghz)
Mobo: Asus P8P67 PRO
GPU: HIS 6950 flashed to Asus 6970 (1000/1400) under water
Sound: Corsair SP2500 with X-Fi
Storage: Intel X-25M g2 160GB + 1x1TB f1
Case: Silverstone Raven RV02
PSU: Corsair HX850
Cooling: Custom loop: EK Supreme HF, EK 6970
Screens: BenQ XL2410T 120hz
Help for Heroes
Bullcrap. The MARS uses over 450W at peak. As mentioned, the PCI-e slot and the cables can supply way more power than they're rated for. If worse comes to worst, they just won't advertise it as PCI-e compliant.
I mean, it is THEIR product; just because it doesn't pass PCI SIG testing, how can they prevent them from releasing it? Perhaps they will just be paid off by nvidia.
That is a much larger problem for ATI than nvidia.
Last edited by 003; 10-03-2009 at 10:52 AM.
Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
—Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.
Are you an idiot, or did you knowingly miss my post, which said: "Still the power consumption must remain under 300 W in PCI SIG's internal testing. Every PCI-Express device MUST BE certified by PCI SIG, otherwise the device can not be sold as a PCI-Express device." Also, the PCI-E standard says: "A single x16 card may now draw up to 300 W of power, up from 225 W."
Whatever it is, upcoming Nvidia cards WILL NOT GO OVER 300 W on average. Period.
oh, don't get me wrong, i completely trust tim on this!!!
it's just that he said himself he couldn't really tell much from the picture he saw, and it sounds like he wasn't 100% convinced that the picture he saw was the REAL fermi card... otherwise he wouldn't even have mentioned that it was a blurry cell phone pic of lots of wires coming off a pcb... if he'd really believed he had seen fermi in that moment, he would have written just that: that nvidia DID show him a picture of the actual card...
btw, a good friend apparently managed to get a copy of the picture in question and mailed it to me
behold! this little puppy right here... this is fermi!
i don't think so either... power aside, the only way to cool 300W or even more is with a 3-slot+ heatsink or with water... and that's just too expensive...
Last edited by saaya; 10-03-2009 at 11:09 AM.
Source?
Unless it's based on the fact that when the chip size grows, fewer chips fit on the wafer, AND when the chip size grows, there is a higher chance of a defect landing on any given chip.
So actually that makes some sense. Defects per chip grow linearly with area, while chips per wafer decrease linearly. As these two compound, yields decrease exponentially. No?
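That intuition matches the classic Poisson yield model, where yield falls off exponentially with die area at a fixed defect density. A quick sketch; the defect density here is back-solved from the ~60% RV870 yield rumor mentioned below, not a real fab number:

```python
import math

# Poisson yield model: Y = exp(-D0 * A).
D0 = 0.15  # defects per cm^2, fitted to the rumored ~60% at ~333 mm^2

def poisson_yield(die_area_mm2, d0=D0):
    return math.exp(-d0 * die_area_mm2 / 100)  # mm^2 -> cm^2

print(f"{poisson_yield(333):.0%}")  # ~61% for an RV870-sized die
print(f"{poisson_yield(550):.0%}")  # ~44% for a GT300-sized die
```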
Last edited by Calmatory; 10-03-2009 at 11:20 AM.
afaik the amount of "holes" (defects) in a wafer is almost constant between a wafer with many small chips and one with a few big chips... if a small chip that gets you 400 chips per wafer has a yield of 90%, that means probably around 40+ defects per wafer (even if a chip gets struck, it may still work with some redundant logic disabled, or as a cut-down version etc)
if you go for a bigger chip that only fits around 100 times on the wafer, like gt300, having 40 defects per wafer means you will have a yield of 60%+, and a bit more than that because the bigger the chip, the higher the chance that two defects land in the same chip.
rv870 is 333mm^2 and rumored to have started with yields of ~60%, it seems.
i'm just guessing here, but 333mm^2 should mean they can get around 175 chips per wafer, so that means 105 functional chips, which means 70+ defects. gt300 should be around 550mm^2, which means around 100 chips per wafer max, and with 70+ defects, 30+ fully functional chips.
another factor is that the bigger your chip, the more wafer space is wasted on the edges, but that's not a huge difference...
wafer costs are around 3000-5000 US$, so 30 good chips per wafer = 100-166$ per gpu, in pure die costs
for rv870 it should be around 100 good chips per wafer, so 30-50$ per chip
these numbers are just examples, they aren't accurate...
but you can see, for a rough 50% transistor increase of gt300 over rv870, the costs more or less triple...
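the same math in a couple of lines of python, with all inputs being the guesses from this post, nothing measured:

```python
# Reproducing the rough per-die cost math above (guessed inputs only).
DEFECTS_PER_WAFER = 70
WAFER_COST_USD = (3000, 5000)

def cost_per_good_die(dies_per_wafer, defects=DEFECTS_PER_WAFER):
    good = dies_per_wafer - defects  # naive: one defect kills one die
    return [round(cost / good) for cost in WAFER_COST_USD]

print(cost_per_good_die(175))  # rv870-ish: [29, 48] $ per die
print(cost_per_good_die(100))  # gt300-ish: [100, 167] $ per die
```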
Last edited by saaya; 10-03-2009 at 11:38 AM.
here, too, is a comparison of crunching performance, over at guru3d:
http://www.guru3d.com/article/radeon...review-test/25
has anybody read the gt300 whitepaper?
i just saw that nvidia said enabling ecc caused a performance drop of around 20%, as a result of the bandwidth overhead? :o
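if the ecc bits live inline in the ordinary gddr (my assumption, since there are no extra ecc chips on a graphics card), both capacity and bandwidth pay the tax... a standard (72,64) SECDED layout would look like this, with the extra transactions pushing the real-world hit toward the quoted ~20%:

```python
# Inline ECC sketch: 8 check bits per 64 data bits ((72,64) SECDED),
# stored in the same GDDR devices. Assumed scheme, for illustration.
DATA_BITS, CHECK_BITS = 64, 8

usable = DATA_BITS / (DATA_BITS + CHECK_BITS)
print(f"usable capacity/bandwidth with inline ECC: {usable:.1%}")  # 88.9%
```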
I hope desktop GPUs will have ECC disabled then!
With a kit cost of ~$70-80, yeah, they are.
Compare that to Nvidia's G200 with a kit cost of ~$120-130, and GF100, which should be upwards of $150; I think I guesstimated just shy of $200.
As I have said before, DP is totally dependent on clock speeds: if they hit their clock targets, they should have a ~40% advantage over a 5870; if they end up with G200-like clocks, that can drop to a ~20% advantage.
Then what if Cypress has more units on it for the 5890, plus a higher clock speed?
That could make it even closer.
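The clock sensitivity is easy to put in numbers. The 5870's 544 DP GFLOPs is the known figure; the GT300 unit count and clocks below are the usual rumors, so treat this as a sketch:

```python
# DP throughput scales linearly with shader clock, hence the swing in
# the advantage estimate. GT300 figures are rumor/assumption.
def dp_gflops(fma_units, clock_mhz):
    return fma_units * 2 * clock_mhz / 1000  # 2 FLOPs per FMA

hd5870 = dp_gflops(320, 850)    # 544 GFLOPs (known)
target = dp_gflops(256, 1500)   # 768 GFLOPs, ~41% ahead
g200ish = dp_gflops(256, 1300)  # ~666 GFLOPs, ~22% ahead
print(f"target: +{target/hd5870 - 1:.0%}, G200-like: +{g200ish/hd5870 - 1:.0%}")
```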
Last edited by LordEC911; 10-03-2009 at 11:55 AM.