More scores
Fake ?????????
more tests In games
Quote:
:up:
Nice, game benchies already! And those are games I'd think the 8800GT does relatively poorly in, since they're not Unreal 3-engine games, for example, where I think the 8800GT will do a bit better thanks to its focused shader performance increase compared even to the current 8800GTS. Based on those, it looks like the 8800GT can perhaps be overclocked to around 8800GTX performance or slightly more. If so, it's what I've hoped for. :yepp:
The reason behind those good 3DMark06 scores relative to the actual gaming benchmarks would, in that case, be great shader processing performance, I'd guess: the games benched here are more traditional titles where bandwidth plays a greater role, while complex shader rendering is where future game engines are heading.
Would love to see 8800GT benchmarks from UT3, MoH: Airborne, BioShock etc. :)
So these benchmarks above give a good view of how effectively the 256-bit bus and the 8800GT's bandwidth work. 256-bit doesn't look to be a bottleneck at all for this card, I'd say.
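To put the bus-width discussion in rough numbers, here's a quick sketch of peak memory bandwidth for the cards being compared. The clocks used are the rumored/typical figures floating around this thread, not confirmed specs:

```python
# Rough peak-memory-bandwidth comparison for the cards in this thread.
# Clocks are rumored/typical values, NOT official specs.

def bandwidth_gbs(bus_bits, mem_clock_mhz):
    """Peak bandwidth in GB/s: bus width in bytes * double data rate."""
    return bus_bits / 8 * mem_clock_mhz * 2 / 1000  # GDDR3 transfers twice per clock

gt  = bandwidth_gbs(256, 900)  # rumored 8800GT: 256-bit @ 900MHz
gts = bandwidth_gbs(320, 800)  # 8800GTS 640: 320-bit @ ~800MHz
gtx = bandwidth_gbs(384, 900)  # 8800GTX: 384-bit @ 900MHz

print(f"8800GT  ~{gt:.1f} GB/s")   # ~57.6
print(f"8800GTS ~{gts:.1f} GB/s")  # ~64.0
print(f"8800GTX ~{gtx:.1f} GB/s")  # ~86.4
```

That gap is one plausible reason the GT could trail the GTX heavily once AA (which is bandwidth-hungry) is switched on, even while matching it elsewhere.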
Consistently faster in games than the 8800GTS, faster than the 8800GTX in one of them, plus the crazy OC potential and the fact that it may perform better in newer shader-based games, all for $250 = winner
I'm really liking this card. I'm just waiting to see more benches.
The GT is definitely looking to be a serious power house for the money!!
SOB = I'm still paying on my 8800GTS I got at the end of August!
Those game benches and all are nice, but...
How the hell do the GTX and Ultra differ so much? Unless that Ultra is really overclocked and the GTX is stock, I don't see an 11 FPS difference at the same settings and resolution stock vs. stock.
Same for the GTX beating the Ultra in certain games...
All the rumors that the GTX and Ultra would remain the high end until year's end do look to be true, though...
Now that's a card that won't break my bank!
Forceware 167.26 huh... so who has the time machine?
mascaras: in Post#28 the image and the quote say two different things.
The pic says the new 8800GTS has 128 SPs and the quote says the new 8800GTS has 112 SPs.
Follow the links provided. They would give you insight into why the results turned out like that.
Originally Posted by cookerjc View Post
If you've got an 8800GTS 320-bit or an 8800GTX and want a better card, you'd better sell it and wait for G92.
if you intend to use AA and AF you should keep your GTX/Ultra
I bet it's an 8900GT.
The thing that interests me is that the 8800GT now supposedly has a higher 3DMark06 score than the 8800GTX (I have no idea how that's physically possible given its lower SP count and throughput), yet it's 20% slower than the 8800GTX in 3DMark06 if 4xAA is enabled. Even stranger, those numbers came from a slower CPU than this one, and the 8800GTX still managed 11k with 4xAA, while the 8800GT slowed down to 9k.
Really makes you wonder if these benchmarks were all manufactured from some random guy playing with MS Excel.
These benchmarks are bogus. For one, that driver doesn't exist. Two, the Ultra is slower than the GTX in 3DMark and in several games, which is BS unless they had the GTX overclocked... bogus. I'm gonna wait for the real deal next year, not a refresh with a gimped bus and less memory. Besides, I picked my 640 GTS up for $299, so I'm not complaining :D
As for the driver, it's probably the latest beta, 163.76, with the numbers mixed up a little. I've seen that happen quite a few times when people specify the driver used. :D
The author mentioned he had some problems with the 8800 Ultra card. But what matters is that the 8800GT game benchmark results look exactly like what I had expected of this card and seem logical to me, except the S.C.C.T. benchmark, where the 8800GTX for some reason didn't get as good a result as it should have (I expected around 121~123 FPS). But I've seen similar weirdness in a review of my own once, so I know such things can happen. :p: At least the Quake 4 and FEAR benchmarks make sense for all cards.
Hmm, so perhaps 167.26 exists but is only available to a few people. Too bad GPU-Z didn't show everything. Core clock overclocked to 900MHz with a 2250MHz shader? Holy smokes, if that's true this will probably be my next card. :eek: lol
Wow, a 900 core and 2250 shader clock would probably mean better performance than a GTX/Ultra.
The new high end for Nvidia will probably be ridiculous then.
BTW, they are all from the same source, Expreview, so it depends on whether they're legit or not.
BTW, these scores were obviously done on a PCI-E 2.0 motherboard, as GPU-Z indicates. Do you think PCI-E 2.0's extra bandwidth will show any difference at all, or will PCI-E 1.0 work just fine? I know this card would work in a PCI-E 2.0 motherboard without any extra power connector plugged in, due to the 150W support from the PCI-E 2.0 slot alone, but do you think it will make any difference to performance for these cards? Since there's hardly any difference between 8x PCI-E and 16x PCI-E either, I guess it shouldn't, at least not yet. (Talking mostly single card, not SLI.)
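For reference, the raw link bandwidth being asked about can be worked out from the per-lane signaling rate (2.5 GT/s for PCI-E 1.0, 5 GT/s for 2.0, both using 8b/10b encoding). A minimal sketch:

```python
# Back-of-envelope PCI-E bandwidth per direction, showing what 2.0 doubles.

def pcie_gbs(gen, lanes):
    """Usable bandwidth in GB/s for a given PCI-E generation and lane count."""
    rate_gt = {1: 2.5, 2: 5.0}[gen]      # GT/s per lane
    return rate_gt * lanes * 8 / 10 / 8  # 8b/10b encoding overhead, bits -> bytes

gen1_x16 = pcie_gbs(1, 16)  # 4.0 GB/s
gen2_x16 = pcie_gbs(2, 16)  # 8.0 GB/s
gen1_x8  = pcie_gbs(1, 8)   # 2.0 GB/s

print(gen1_x16, gen2_x16, gen1_x8)
```

Since x8 vs. x16 on PCI-E 1.0 already shows almost no difference for a single card, the jump from 4 to 8 GB/s is unlikely to matter yet, which matches the guess above.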
That's a helluva clock increase. I struggle to get my GTX past 648/15xx. :(
Quote:
The 8800GT with MeII-32, a QX6800 and 2G of DDR2-1000 ram all on a 780i board scores 14200 on 3DMark 05.
This is a little higher than the scores Theo saw, and just about even with the 8800Ultra scores at this rez.
http://www.theinquirer.net/gb/inquir...2-scores-outed
3DMark 2005??????
14K in 3DMark05 is "nothing"; maybe they meant to say 3DMark 2006, or maybe they ran 3DMark 2005 at a resolution other than the default.
INQ........................... :D
btw:
Regards
Quote:
NVIDIA Confirms 1.5GHz 8800GT With 112 Shader Processors
While AMD is working hard on putting the finishing touches on the RV670, NVIDIA isn't going to let itself be forgotten. In an NVIDIA presentation slide, the 8800GT was confirmed as having 112 shaders running at an impressive 1.5GHz. This is much better than the 96-shaders-at-1.2GHz rumor heard earlier. The actual core/memory clocks are still up in the air, though. However, the latest rumors hint that they will be around 600/900 for the core and memory, respectively. The VRAM will be 256-bit GDDR3, the amount of which will be determined by manufacturers. Expect the cards to cost between $199 and $249 USD when they hit retail channels worldwide.
Source: Nordic Hardware
http://forums.techpowerup.com/showthread.php?t=42126
:up:
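For what it's worth, the quoted specs (112 SPs at 1.5GHz) translate into a rough theoretical shader throughput as sketched below. Whether G8x's extra MUL should be counted alongside the MADD is debated, so both figures are shown:

```python
# Theoretical shader throughput from the quoted specs: 112 SPs @ 1.5GHz.
# G8x SPs are often counted as MADD (2 flops) plus a co-issued MUL (1 flop);
# how usable that extra MUL is remains debated, so both totals are computed.
sps, clock_ghz = 112, 1.5

madd_only = sps * clock_ghz * 2  # MADD only
madd_mul  = sps * clock_ghz * 3  # MADD + MUL

print(f"MADD only: {madd_only} GFLOPS")  # 336.0
print(f"MADD+MUL:  {madd_mul} GFLOPS")   # 504.0
```

Either way, that's well above the 96-SP/1.2GHz rumor the quote mentions, which is why the confirmed slide reads as good news.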