So is this new 8800GT supposed to beat out the current 8800GTX? I'm asking specifically for gaming.
I'm just so confused with all the cards and different revisions coming out.
I'd say it's a "duh" that an upper midrange card isn't going to hold its own at resolutions that most current cards struggle at. That isn't the point of this card, though; it's meant to compete with the 29x0Pro, which will be a handful considering it's just a neutered 29x0XT. It will be interesting to see which card in the $250 bracket comes out on top, as that'll definitely sell some cards now that the midrange has been ignored for the last year.
NO!!!
It's meant to fill the gap between the 8600GTS and the 8800GTX (the 8800GTS 320MB is discontinued and there is a new GTS).
http://www.expreview.com/img/news/07...family_fix.png
regards

Quote:
Nvidia kills 8800GTS 320MB, relaunches 640MB
We knew that Nvidia was planning to kill the 8800GTS 320MB in order to make room for the 65nm die-shrink that the world has come to know as G92. But it turns out that you cannot order 320MB versions any more either, since it has been pronounced an EOL (End of Life) product. Next in line to go through a change is the 8800GTS 640MB, which is being tweaked so it can survive alongside the 512MB and 256MB versions of G92.
Nvidia decided to raise the specs by another 16 scalar shader units, so the 8800GTS will now feature 112 scalar shaders, 16 fewer than the 8800GTX/Ultra. Clock speeds remain the same, as do thermal and other specs. But there are a lot of miffed Nvidia partners crying foul over the situation. Imagine the surprise of AIBs that have thousands of printed retail boxes with the old specs.
I have to wonder why they're not using the 8900 moniker... having so many 8800GTS variants is going to confuse most consumers. :rolleyes:
Or maybe use GTR or something...
The shader clock was increased noticeably, from 1200 to 1500MHz. So yeah, for once NVIDIA made smart choices: more SPs and a much higher shader domain clock will mean these cards do well in benchmarks against today's 8800GTS cards in modern games, which seem to keep getting more shader-heavy. The focus of the 8800GT is clear: great shader processing performance for a low price tag.
the shader clock can be overclocked with the new RivaTuner + 163.67 drivers >> http://forums.guru3d.com/showthread.php?t=238083
regards
I know; I was talking from a non-overclocking point of view. Since overclockers are far in the minority today, stock performance is always more important for success in the market than overclockability. For non-overclockers, the 300MHz higher shader clock is a great thing, coupled with the 112 SPs of course. Comparing a stock 8800GTS 320/640MB against a stock 8800GT, for example, should show a significant difference in shader-heavy games, while in older games the gap won't be that huge.
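Just to put rough numbers on that, here's a back-of-envelope sketch in Python; the GT's 112 SPs and 1500MHz shader clock are the rumored figures from this thread, not confirmed specs:
Code:
# Raw shader throughput comparison: SP count * shader clock.
# The 8800GT figures are rumored, not official.
cards = {
    "8800GTS 320/640MB (stock)": {"sps": 96,  "shader_mhz": 1200},
    "8800GT (rumored)":          {"sps": 112, "shader_mhz": 1500},
}

base = cards["8800GTS 320/640MB (stock)"]
base_throughput = base["sps"] * base["shader_mhz"]

for name, c in cards.items():
    ratio = c["sps"] * c["shader_mhz"] / base_throughput
    print(f"{name}: {ratio:.2f}x the stock GTS shader throughput")
That works out to roughly 1.46x the raw shader throughput, which is why the gap should show up most in shader-heavy games.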
I already know the rumored specs, but what I stated is that the release coincides with Crysis, and I don't expect it to be a worse performer in DX10 than the older GTS models regardless of the narrower memory bus. 65nm is just a bonus: it helps with overclocking, cooler operation, and higher clocks, giving better performance.
Someone with an Ultra and a Q6600, run 3DMark06 with 4x AA and let us see the results plz. SS or it didn't happen :up:
Well, that just depends, perky... It's got the memory bandwidth to do it, as it's not that far from the GTS in bandwidth. Kick the RAM up to 2GHz and you match stock GTS bandwidth.
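The bandwidth math behind that, for anyone who wants to check (a Python sketch; the GT's stock 1800MHz effective memory clock is the rumored figure, not confirmed):
Code:
# Peak memory bandwidth = (bus width in bytes) * (effective transfers/sec).
def bandwidth_gb_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(320, 1600))  # 8800GTS 640MB stock: 64.0 GB/s
print(bandwidth_gb_s(256, 1800))  # 8800GT rumored stock: 57.6 GB/s
print(bandwidth_gb_s(256, 2000))  # 8800GT with RAM at 2GHz effective: 64.0 GB/s
So pushing the GT's RAM to 2GHz effective lands it exactly on the stock GTS's 64 GB/s.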
Here's what we need to know...
How many ROPs does this thing have? If it matches the GTX/Ultra in ROPs, it should scale VERY well in AA/AF... no marketing tricks required here.
As for the RV670, I'm not counting on its AA prowess unless ATi learned from the R600 and allowed the ROPs to handle AA resolves. ;)
Test some real games, not ePenis Mark. This benchmark tells me nothing about how a REAL game will play. I know some of you know what a mark equates to in a certain game, but a real game and a benchmark are two different animals.
It shouldn't enrage owners. GTX owners knew they were getting an awesome card. They saw the high price and still paid it. I would call this progress. Good cards at cheaper prices. It's great for everybody!
Those bleeding-edge folks should take comfort that without them, the companies probably wouldn't have enough money later on to come out with even more stuff. Thank you all, bleeding-edge people.
I'll cheer for progress! :woot:
I'm not going to be enraged by the 8800gt.
I'll have gotten a full year of maxed out everything by the time it releases. Normally, you get 6 months before another high end comes out and slaps your card down to mid-range size.
If anything, we should be happy that our investments are paying off the way they have been. This has been the first card in a long time that still maxes out everything a year down the line without issue!
Here is proof that the 8800GT only has 96 shader units:
1. The 8800Ultra is ~20% faster than the 8800GT.
2. Assuming 112 shader units for 8800GT, the 8800Ultra would only have a 14% functional unit lead.
3. The 8800GT has a faster clock speed than the 8800Ultra, which should DECREASE the lead for the 8800Ultra.
4. Therefore, the 8800GT cannot possibly have 112 shaders. The only other choice is 96 shaders.
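Putting the same argument in numbers (a Python sketch; the GT's 1500MHz shader clock is the rumored figure, and this only counts raw SP throughput, ignoring ROPs and the memory bus):
Code:
# Ultra: 128 SPs at a 1512MHz shader clock (known spec).
ultra_throughput = 128 * 1512

# GT shader clock is the rumored 1500MHz; try both candidate SP counts.
for sps in (96, 112):
    gt_throughput = sps * 1500
    lead = (ultra_throughput / gt_throughput - 1) * 100
    print(f"GT with {sps} SPs: Ultra leads by {lead:.0f}%")
The observed ~20% gap is larger than the ~15% a 112-SP GT would allow on raw throughput alone, which is the crux of the argument.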
Hmm, I wonder where you took that number from. Some random 3DMark results? Come again, that won't necessarily reflect real-world performance.

Quote:
1. The 8800Ultra is ~20% faster than the 8800GT.
Also, you seem to forget, as astrallite pointed out, the 256-bit bus vs the 384-bit bus. To me 112 SPs still seems more logical, due to the strong source indicating that. Also, we don't know the ROP clock of the 8800GT yet.
I guess it's best to wait as usual and let time tell, but a midrange refresh sure comes in handy.
I'd rather see a regular 3DMark06 run with the card OCed.
Damn, that makes me wanna sell the GTX and pick up a couple GT's for Crysis. :D But then I'd have to get rid of the DFI...and that's not going to happen.
I don't see any reason why those with an already great graphics card like the 8800GTX should need to "upgrade" to something slower. These upcoming midrange cards, however, are VERY interesting for all those still holding on to their DX9 cards in desperation until some decent price-for-performance DX10 combo arrives; personally, I just can't afford to spend $400+ on a graphics card. I was waiting for a card that is a worthy upgrade from my 7900GTO for up to 250 EUR, which is more like my budget, and can provide at least ~80% better performance. These new midrange cards certainly seem to easily meet all those criteria.
Because 2 GTs in SLI would be quicker than 1 GTX. But of course all the headaches of a dual card system come too... and I want no more of that! Also, it seems Crysis might be the only game to push a single GTX @ 1680x1050, so no worries.
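A very rough sketch of why two GTs should beat one GTX (Python; the GT specs are the rumored ones, and the 0.8 SLI scaling factor is purely an assumption for illustration):
Code:
# Shader throughput only; ignores bus width, ROPs, and driver quirks.
gtx_throughput = 128 * 1350   # GTX: 128 SPs at a 1350MHz shader clock
gt_throughput = 112 * 1500    # GT (rumored): 112 SPs at 1500MHz
sli_scaling = 0.8             # assumed average SLI efficiency, not measured

ratio = 2 * gt_throughput * sli_scaling / gtx_throughput
print(f"2x 8800GT SLI vs 1x 8800GTX: ~{ratio:.2f}x")  # ~1.56x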
It sure does seem like a helluva value tho. Especially if they wind up on sale under $250.
yeah, those scores with AA activated are way low; I can do a nice 13.9k with a 3.2GHz quad and a stock Ultra.
But the 8800GT looks like a nice card!