Not only ECC, but also the half-rate DP GPGPU logic.
Reducing by ~20%... isn't it great?
http://www.anandtech.com/show/3809/n...the-200-king/3
32 CUDA cores per SM -> 48 CUDA cores per SM
+ functional units
+ warp schedulers with superscalar dispatch capabilities
- ECC and FP64 hardware
= size of an SM increased by 25%
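For anyone who wants to sanity-check those numbers on their own card, here's a rough sketch (my own back-of-the-envelope code, not from the article) that queries the device and estimates peak single-precision throughput. The cores-per-SM figures are the published Fermi values (32 for CC 2.0 / GF100-class, 48 for CC 2.1 / GF104-class); the 2-FLOPs-per-core-per-clock FMA assumption and everything else is just illustrative.
Code:

#include <cstdio>
#include <cuda_runtime.h>

// Published cores-per-SM for Fermi parts; anything else is a fallback guess.
static int coresPerSM(int major, int minor)
{
    if (major == 2 && minor == 0) return 32;  // GF100-class SM
    if (major == 2 && minor == 1) return 48;  // GF104-class SM
    return 32;                                // fallback assumption
}

int main()
{
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        printf("No CUDA device found\n");
        return 1;
    }

    int cores = coresPerSM(prop.major, prop.minor);
    // clockRate is reported in kHz; assume 2 FLOPs per core per clock (FMA).
    double gflops = 2.0 * cores * prop.multiProcessorCount *
                    (prop.clockRate * 1e3) / 1e9;

    printf("%s: %d SMs x %d cores @ %.0f MHz -> ~%.0f GFLOPS (SP, peak)\n",
           prop.name, prop.multiProcessorCount, cores,
           prop.clockRate / 1e3, gflops);
    return 0;
}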
We can speculate as much as we want, but my guess is nVidia cancelled the dual-GPU GF104 solution for one of three reasons.
1) It is not that competitive compared to ATi's Dual GPU solution
2) The costs did not warrant the product going into full production.
3) The power and thermal requirements for the card might be just too high.
Either way, something is coming from the land of nVidia very soon.
http://img409.imageshack.us/img409/6...peculation.jpg
As to what it is and how it performs, nobody knows, but what I can say is that nVidia will have learned from the GTX 480 launch and will have listened to what the PC community at large wants.
I would sadly expect a smaller leap in computational capability with the next GPU (compared to the huge jump from G200 to GF100); however, I would expect performance increases in DirectX 11 and D3D10 games, as the cards are more geared towards gaming.
Don't get me wrong, it would still be a good folder and cruncher; it's just that nVidia would like to encourage people to move into the land of Tesla for their computational work.
Anyway, I wonder what the next Quadro is going to be... Hmmm, rumours suggest it would be of the GF104 flavour with 4GB of RAM and 384 SPs, but nobody has seen such a thing...
Either way, that would hint at what GPU will eventually end up in the desktop market.
John
I think they know that. Designing a chip is a bit more complex than just putting stuff on there. Keep in mind they had around four years to design Fermi; six months is only about a 10% delay. There is almost no room to screw up with a schedule that tight.
Quote:

you can't get maximum GPGPU power if you also want to maximize graphics performance

You can. G80 had more GPGPU features than R600. Shaders and GPGPU are becoming more and more similar, which is why DX11 features DirectCompute. Features like caches will help graphics more and more in the future.
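Just to illustrate the kind of GPGPU feature we're talking about, here's a toy CUDA kernel (my own sketch, nothing official, with made-up sizes) that stages data in shared memory, the software-managed cache Fermi emphasises. The same trick helps graphics-style filtering and pure compute alike, which is the convergence point above.
Code:

#include <cstdio>
#include <cuda_runtime.h>

// Toy 1D box blur that caches its working set in shared memory
// instead of re-reading neighbours from DRAM for every output element.
#define RADIUS 4
#define BLOCK  256

__global__ void boxBlur(const float* in, float* out, int n)
{
    __shared__ float tile[BLOCK + 2 * RADIUS];

    int gid = blockIdx.x * blockDim.x + threadIdx.x;
    int lid = threadIdx.x + RADIUS;

    // Stage the block's data plus a halo of RADIUS elements on each side.
    tile[lid] = (gid < n) ? in[gid] : 0.0f;
    if (threadIdx.x < RADIUS) {
        int left  = gid - RADIUS;
        int right = gid + BLOCK;
        tile[lid - RADIUS] = (left  >= 0) ? in[left]  : 0.0f;
        tile[lid + BLOCK]  = (right <  n) ? in[right] : 0.0f;
    }
    __syncthreads();

    if (gid < n) {
        float sum = 0.0f;
        for (int i = -RADIUS; i <= RADIUS; ++i)
            sum += tile[lid + i];
        out[gid] = sum / (2 * RADIUS + 1);
    }
}

int main()
{
    const int n = 1 << 20;
    float *in, *out;
    cudaMalloc(&in,  n * sizeof(float));
    cudaMalloc(&out, n * sizeof(float));
    // Input is left uninitialised; this only sketches the data flow.
    boxBlur<<<(n + BLOCK - 1) / BLOCK, BLOCK>>>(in, out, n);
    cudaDeviceSynchronize();
    printf("done\n");
    cudaFree(in);
    cudaFree(out);
    return 0;
}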
GTC: JHH's keynote starts in 50 minutes. Here
It started ...
started!!
EDIT: darn it, I was third
lol it starts
I think your thoughts & opinions have merit, but I don't want to sell nVidia totally short this generation. While I'm no fan of the green team, especially their management (starting with their CEO), I think GF104 was a halfway right move by nVidia after the GF100 debacle. They managed to increase (by nVidia standards) the efficiency of the design, but the yields still seem to be in the same gutter.
Now they need to take another half step towards success by redesigning the chip for better yields while increasing the size & raw performance. This is a must for nVidia, since I just can't believe this company will simply lie down & accept ATi's Northern Islands steamrolling them until the 28 nm node arrives. They are still the overall market leader in the 3D graphics industry and quite strong financially, so I think they can absorb the investment needed to keep their market share, fanbase faith, & shareholder trust intact.
But we'll see, especially regarding the pressure from ATi's side. IMHO, a strong Northern Islands generation will force nVidia's hand, but if that doesn't happen, my thoughts are just for naught. Regards. :)
Nvidia is gonna fail once again. Whatever BS card they release, you can bet it's either gonna be:
a) a sandwich (two GF104/6s slapped together with a sweet new name :rolleyes:)
b) a mobile heater.
Wow, that iray rendering actually looks real (after it's been rendering for a while).
I've been watching GTC and a lot of what they are talking about is extremely impressive. Not to mention I have a great new avatar. The Matlab and other stuff is really awesome. And CUDA-x86 with PGI? Wow. Iray is amazing...
You know why Nvidia is throwing sand in your guys' eyes with all this iray, Matlab, CUDA crap?
Cause they know they can't release a GPU that isn't gonna get beasted for free by ATI's next-gen lineup.
Notice how their "focus" on CUDA, PhysX, rendering, and the other garbage technology that's supposedly targeting the professional market started when ATI started beating their ass?
It's Nvidia's way of changing the conversation.
Jen-Hsun Huang is a pathetic douche. It's OK though, no amount of talking will help heal the pain once ATI unleashes in a month.