Yep, those clocks really suck.
That's 24% less than the 9800 GTX.
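(Quick back-of-the-envelope check of that 24% figure; the 1296 MHz shader clock is just the rumored GT200 number, not something confirmed in this thread, while 1688 MHz is the 9800 GTX's stock shader clock.)
Code:
# Rough sanity check of the "24% less" shader clock claim.
gt200_shader_mhz = 1296    # rumored GT200/GTX 280 shader clock (assumption)
g92_shader_mhz = 1688      # stock 9800 GTX shader clock

deficit = 1 - gt200_shader_mhz / g92_shader_mhz
print(f"GT200 shader clock is {deficit:.1%} lower than the 9800 GTX")
# -> roughly 23%, in line with the "24% less" figure above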
Not to mention the 9900 GTX will probably be somewhere close to a 2 GHz shader clock with the G92b core, and will definitely OC better than GT200.
That's right guys, keep your hopes down.
The lower your expectations, the bigger the excitement when the first numbers arrive ;)
Awfully low shader domain clock...
.
.
.
Yay! More OC'ing headroom!!!
:D
lol god, this article is so full of errors... I think he wrote it while he was drunk....
Quote:
The clocks are 576MHz GPU, 999MHz memory and 896MHz GDDR3 on a 448b memory interface
uh, so we have ram with 2 clock domains... nice. :rofl:
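(For what it's worth, here's what those two quoted clocks would mean for bandwidth on the quoted 448-bit bus; the capacity guess at the end is speculation on my part, not something the article says.)
Code:
# What each of the article's two "memory clocks" would mean on a 448-bit bus.
# GDDR3 is double data rate, so effective transfer rate = 2 x clock.
BUS_BITS = 448

for clock_mhz in (999, 896):   # both figures from the quoted article
    bandwidth_gbs = BUS_BITS / 8 * 2 * clock_mhz / 1000
    print(f"{clock_mhz} MHz -> {bandwidth_gbs:.1f} GB/s")

# 999 MHz -> 111.9 GB/s, 896 MHz -> 100.4 GB/s
# My own guess, not from the article: the "896" is probably the 896 MB frame
# buffer a 448-bit board naturally carries (14 chips x 64 MB), mislabeled as a clock.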
not necessarily
Imagine how much power those cards will draw with voltmods + extreme OC,
probably >300 W.
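(Rough sanity check of that >300 W guess, using the common frequency x voltage^2 dynamic-power rule of thumb; the ~240 W TDP comes up later in this thread, while the voltages and the overclock amount below are purely hypothetical.)
Code:
# Ballpark for the ">300 W with voltmods" guess.
# Dynamic power scales roughly with frequency x voltage^2.
stock_tdp_w = 240                 # ~240 W TDP mentioned later in this thread
stock_v, modded_v = 1.18, 1.30    # hypothetical stock / voltmodded GPU voltages
oc_factor = 1.15                  # hypothetical 15% overclock

modded_power_w = stock_tdp_w * oc_factor * (modded_v / stock_v) ** 2
print(f"Estimated draw with voltmod + OC: ~{modded_power_w:.0f} W")
# -> ~335 W, so >300 W isn't a crazy guess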
Sometimes I wonder just how much about hardware the INQ actually knows...
x2
GT200 is a monster in every way...
Just smells like barbecue to me, that's the part I don't like.
And huge electricity bills...
AMD can't compete with this GPU in single-GPU; perhaps with the 55nm G92b,
but the GeForce GTX 280 can't even be cooled by air, that's how bad it gets!
Imagine: the Ultra was 185 W and NVIDIA had heat issues with G80,
so we all know 240 W is just too much for air cooling!
Aquacomputer has already shown us their offering, and that tells us a lot.
They could use this sentence to promote the card:
GEFORCE GTX 280
So Powerful it can't be cooled by Air,
Get your Nitro Kit, only $99,000. Nvidia®
ahhaha LOL :D
cheers
Hehe, look at the third quote in my sig :D, it dates back to who knows when (when the first rumors of the GT200 series emerged). With such a HUGE die area, it had better have a great air cooling solution; if not, it will be the best-selling product in the "arctic circle and surrounding areas" :D.
Benchmarkreviews tells a completely different story than The Inquirer:
Quote:
Jason Paul, the GeForce Product Manager and NVIDIA veteran, came right out and dropped the new product bomb: the GeForce GTX 200 graphics platform. Perhaps it was the off-interest discussion of CUDA which lowered the attention span, but Jason's brief discussion exposed that the new GPU plays Crysis "damn fast". There wasn't any time wasted, and Jason quickly introduced Tony Tamasi to introduce the GTX 200 compute architecture. Unfortunately, the non-disclosure agreement Benchmark Reviews honors with NVIDIA prevents me from disclosing the details for this new GPU.
So you might be wondering what Jason's holding in the image above, right? It's large, almost the size of an original Pentium processor, except that this particular item has 240x the number of cores inside of it. I would love to tell you what it is, but suffice it to say there's a good reason why Mr. Paul has a smile on his face. It put one on my face, too. Benchmark Reviews will reveal more at 6AM PST on June 17th, 2008.
http://benchmarkreviews.com/index.ph...d=178&Itemid=1
Quote:
Just wait until June 17th when the GTX 200 series of GPU's launch, and you'll start asking yourself when you last witnessed such a dramatic technology improvement. If you thought the GeForce 8 series blew the 7-series out of the water, this is going to leave you in shock. That's not my own marketing spin... Benchmark Reviews is presently testing the new GeForce video card.
LoL...
Best quote ever-
Since I know nothing of the site, I did a little investigating.
Quote:
Originally Posted by Benchmarkreviews
Out of the 17 pages of reviews, 5-10 reviews per page (most were not of GPUs), there was 1 for an AMD/ATi card...
While the G80/G92 series had multiple reviews of the same card? 10 reviews total.
Interesting...
Exactly, there's going to be a LOT of shocked faces when this thing drops. :yepp:
Seriously, I don't get why people are still underestimating this thing. NVIDIA had an entire extra year to work on this chip and tweak it to its maximum. Why do people think it's going to be slow enough for an RV670 on steroids to surpass it?
So then the performance gain from G80 > GT200 will be much higher than that of the 7900 GTX/G70 > G80! This implies a 3-times gain, which would mean this product is a must-have! Wonder if its price/performance will be greater than that of the 9600 GT.
Easy tiger.
Don't rush into things, and certainly hold your laughter for later use ;)
If we were talking about drugs, then AMD's steroids are ecstasy... while nVIDIA uses a mix of every kind of drug around.
Plus the MUL is working "properly" all the time now (one thing you're missing when comparing the "GT200" with G80).
[mode=hands_on];)[/mode]
Well, let's just hope they make the high TDP worth it.
After all, they've had a lot of time, considering how sure some members here were about G92 being the 1 TFLOP beast hitting us at the end of last year...