Quote Originally Posted by Sampsa View Post
First rumors from Muropaketti (me) didn't make much sense and now they are obvious?
The first 2 pieces of info were obvious, but the CUDA core count doesn't make sense imo...

Quote Originally Posted by flopper View Post
He is one of those on this forum who is confused and distorts a lot.


Quote Originally Posted by Baron_Davis View Post
Lol, I'm not biased at ALL. I buy whatever is best for me. But Nvidia has sickened me over the past couple of years. They release the same thing over and over with a different name, and worst of all, they can't even release top-of-the-line (if expensive) cards anymore.

They are talking about everything EXCEPT gaming now, because even they are smart enough to realize how terrible their recent GPUs have been.

If you really think all the mumbo jumbo they're trying to feed you now actually matters, you must not know how businesses work.

Nvidia's business plan: if you're failing at making the best gaming GPUs, start talking about other useless technologies that look pretty in a PowerPoint presentation.

Nvidia's gaming-market business plan: if you're failing at making the best gaming GPUs, start focusing on some specific advantage of your GPUs, and release benchmarks in games that take advantage of said technology.


I'll bet you guys Nvidia will NOT have a response to ATI's October lineup. If they have the audacity to release a dual-GF104 card, I will laugh, because it will be such an obvious and pathetic attempt at keeping up. It will probably consume 400 W and run at 100°C.
What else COULD they do?
They can either shut up, which isn't a good idea for a company, or they can try to focus on areas where they're still competitive... every company does that...

I don't like this marketing BS either, but that's really all they CAN do in a situation like this, and the sad part is that for 99% of the people out there it actually works... so not doing it would mean they'd lose money...

Quote Originally Posted by Manicdan View Post
agreed
we can expect about 1.5-2x perf per watt just from going to 28nm, which means it needs to get roughly 2x better than that from their architectural improvements. which should mean that a 28nm midrange chip should offer some incredible bonuses. imagine a physics card for <$100 that performs the same as a GTX 480 but uses only 60-80 W
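A quick back-of-the-envelope check of the numbers in that quote (a rough sketch; every figure comes from the posts except the 250 W GTX 480 TDP, which is my assumption):

```python
# Perf/watt reasoning from the quote above, spelled out.
node_gain = (1.5, 2.0)     # efficiency gain expected from the 28nm shrink alone
total_target = (3.0, 4.0)  # Kepler's rumored overall efficiency gain over Fermi

# Architectural improvement needed on top of the node shrink:
arch_low = total_target[0] / node_gain[1]   # 1.5x in the worst case
arch_high = total_target[1] / node_gain[0]  # ~2.7x in the best case
print(arch_low, arch_high)

# A GTX 480-class chip at 3-4x the efficiency, same performance
# (250 W TDP assumed for the GTX 480):
gtx480_tdp = 250.0
print(gtx480_tdp / total_target[1], gtx480_tdp / total_target[0])  # ~62-83 W
```

Which lands right in the 60-80 W range guessed for a sub-$100 card with GTX 480-level throughput.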
Quote Originally Posted by kaktus1907 View Post


28nm Kepler

Focus is apparently on performance/watt, with Kepler aimed to be 3-4 times more efficient...
According to the graph it's 5x, but if you check the actual numbers you'll notice that Tesla is positioned too high on the graph and Fermi too low... which artificially widens the jump to Kepler, making it look a lot more impressive than it actually is... According to the graph, Kepler will hit 5 GFLOPS/W while Fermi is supposedly around 1 GFLOP/W... In reality, Tesla C2070 cards are close to 2.5 GFLOPS/W, so Kepler is merely a doubling of that, or slightly more, but definitely not the 5x jump the graph makes it look like.
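The arithmetic behind that objection, using only the numbers mentioned above:

```python
# Comparing the jump the slide implies vs. the jump against real Fermi numbers.
fermi_on_graph = 1.0    # GFLOPS/W where the slide places Fermi
kepler_on_graph = 5.0   # GFLOPS/W the slide claims for Kepler
fermi_actual = 2.5      # GFLOPS/W a Tesla C2070 actually delivers (approx.)

print(kepler_on_graph / fermi_on_graph)  # 5.0 -- the jump the slide suggests
print(kepler_on_graph / fermi_actual)    # 2.0 -- the jump vs. measured Fermi
```

Same Kepler target, but the gain shrinks from 5x to 2x the moment you use a realistic Fermi baseline.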

Quote Originally Posted by Mechanical Man View Post
I really hope they will release two different dies, one for gaming one for gpgpu.
Me too, but I don't think so... :/ They'd have announced it, and I think the hardware would be so similar it doesn't make sense... All they really need to do is make those massive chips on a node once they AND TSMC have some experience with it, not when the node is brand new... I think their strategy of having one big chip that does it all can then actually work out... but of course it looks like they will throw a massive chip at TSMC again as soon as they announce their next node...

Quote Originally Posted by Dimitriman View Post
The question is: need there be more at this point?
If Nvidia wants to own the high end, yes...

Quote Originally Posted by Sampsa View Post
Let's see if GF110 will be a "mid-life kicker", which Huang said would be a product launching in between each new NVIDIA chip.
That's just Jensen marketing speak for "refresh" :P