So is this new 8800GT supposed to beat out the current 8800GTX? I'm asking specifically for gaming.
I'm just so confused with all the cards and different revisions coming out.
Last edited by Danger30Q; 10-12-2007 at 07:51 AM.
I'd say it's a "duh" that an upper-midrange card isn't going to hold its own at resolutions that most current cards struggle at. That isn't the point of this card, though; this is meant to compete with the 29x0Pro, which will be a handful considering it is just a neutered 29x0XT. It will be interesting to see which card in the $250 bracket comes out on top, as that'll definitely sell some cards, since the midrange has been ignored for the last year.
NO !!!
It fills the gap between the 8600GTS and the 8800GTX (the 8800GTS 320MB is discontinued, and there is a new GTS).
regards
Nvidia kills 8800GTS 320MB, relaunches 640MB
We knew that Nvidia was planning to kill the 8800GTS 320MB in order to make room for the 65nm die-shrink that the world has come to know as G92. But it turns out that you cannot order 320MB versions any more either, since it has been pronounced an EOL (End of Life) product. Next in line to go through a change is the 8800GTS 640MB, which is being tweaked up in order to survive alongside the 512MB and 256MB versions of G92.
Nvidia decided to raise the specs by another 16 scalar shader units, so the 8800GTS will now feature 112 scalar shaders, 16 fewer than the 8800GTX/Ultra. Clockspeeds remain the same, as do thermal and other specs. But there are a lot of miffed Nvidia partners crying foul over the situation. Imagine the surprise of AIBs that have thousands of printed retail boxes with the old specs.
Last edited by mascaras; 10-12-2007 at 08:19 AM.
[Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
[Review] ASUS HD4870X2 TOP » Here!! « .....[Review] EVGA 750i SLi FTW » Here!! «
[Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
[Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «
Windows 7 Ultimate 64bit, SP1 Power By:
Asus Rampage IV Formula
Intel SBE 3930K
Gskill RipJaws Z 14900 (1866 DDR3) 16GB 10-11-10-24
EVGA 570 GTX SLI
SSD Samsung P830 240GB
Corsair AX850
I have to wonder why they're not using the 8900 moniker... having so many 8800GTS models is going to confuse most consumers.
Or maybe use GTR or something...
Last edited by mascaras; 10-12-2007 at 09:16 AM.
The shader clock was increased noticeably, from 1200 to 1500MHz. So yes, for once NVIDIA made smart choices: more SPs and a much higher shader domain clock will mean these cards do well in benchmarks vs today's 8800GTS cards in modern games, which seem to keep getting more shader-heavy. The focus of the 8800GT is clear: great shader processing performance for a low price tag.
Last edited by RPGWiZaRD; 10-12-2007 at 09:06 AM.
Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place
The shader clock can be overclocked with the new RivaTuner + 163.67 drivers >> http://forums.guru3d.com/showthread.php?t=238083
regards
I know, I was talking from a non-overclocking point of view. Since overclockers are far in the minority today, stock performance matters more for market success than overclockability. For non-overclockers, the 300MHz higher shader clock is a great thing, coupled with the 112 SPs of course. Comparing a stock 8800GTS 320/640MB vs a stock 8800GT, for example, should show a significant difference in shader-heavy games, while in older games the gap won't be that huge.
Last edited by RPGWiZaRD; 10-12-2007 at 09:38 AM.
Phenom II 940 BE / ASUS M4A79 / HD5770 Crossfire
3770mhz CPU 2600mhz NB | DDR1040 5-5-5-15 | 900/1250
I already know the rumored specs, but what I stated is that the release coincides with Crysis, and I don't expect it to be a worse performer in DX10 than the older GTS models regardless of the lower memory bus width. 65nm is just a bonus that helps with overclocking, cooler operation, and higher clocks for better performance.
Someone with an Ultra and a Q6600, run 3DMark06 with 4x AA and let us see the results, plz. SS or it didn't happen!
Well, that just depends, perky... It's got the memory bandwidth to do it, as it's not that far from the GTS in bandwidth. Kick the RAM up to 2GHz and you match stock GTS bandwidth.
Here's what we need to know...
How many ROPs does this thing have? If it matches the GTX/Ultra in ROPs, it should scale VERY well in AA/AF... No marketing tricks required here.
As for the RV670, I'm not counting on its AA prowess unless ATi learned from the R600 and allowed the ROPs to handle AA resolves.
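That bandwidth claim is easy to sanity-check. A minimal sketch, assuming the commonly cited 2007 specs (320-bit bus @ 1.6GHz effective for the GTS 640MB, 256-bit for the GT; these figures are mine, not from the thread):

```python
def bandwidth_gbs(bus_bits, effective_mhz):
    """Memory bandwidth in GB/s: (bus width in bytes) * effective clock in MHz / 1000."""
    return bus_bits / 8 * effective_mhz / 1000

# 8800GTS 640MB: 320-bit bus @ 1600MHz effective memory clock
gts = bandwidth_gbs(320, 1600)
# 8800GT with the RAM kicked up to 2GHz effective on its 256-bit bus
gt_oc = bandwidth_gbs(256, 2000)

print(gts, gt_oc)  # both work out to 64.0 GB/s
```

So at a 2GHz effective memory clock the narrower 256-bit bus does indeed match the stock GTS, exactly as claimed.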
Test some real games not ePenis Mark. This benchmark tells me nothing about how a REAL game will play. I know some of you know what a mark equates to in a certain game but a real game and a benchmark are two different animals.
C2Q Q9550 @ 4.03Ghz/4GB OCZ DDR 1066 @ 902Mhz /BFG GTX 260/Gigabyte GA-EP45-UD3P/2 X Samsung F1 1TB/Samsung F1 750GB/Samsung SH-S223Q/Corsair TX650/X-Fi Titanium Fata1ity/Swiftech H20-220/Logitech Z5300 Speakers/Samsung SyncMaster 2253BW Monitor/Win 7 Ultimate x64
It shouldn't enrage owners. GTX owners knew they were getting an awesome card. They saw the high price and still paid it. I would call this progress. Good cards at cheaper prices. It's great for everybody!
Those bleeding-edge folks should take comfort that without them, the companies probably wouldn't have enough money later on to come out with even more stuff. Thank you all, bleeding-edge people.
I'll cheer for progress!
I'm not going to be enraged by the 8800gt.
I'll have gotten a full year of maxed out everything by the time it releases. Normally, you get 6 months before another high end comes out and slaps your card down to mid-range size.
If anything, we should be happy that our investments are paying off the way they have been. This has been the first card in a long time that still maxes out everything a year down the line without issue!
Here's proof that the 8800GT only has 96 shader units:
1. The 8800Ultra is ~20% faster than the 8800GT.
2. Assuming 112 shader units for 8800GT, the 8800Ultra would only have a 14% functional unit lead.
3. The 8800GT has a faster clock speed than the 8800Ultra, which should DECREASE the lead for the 8800Ultra.
4. Therefore, the 8800GT cannot possibly have 112 shaders. The only other choice is 96 shaders.
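A quick sketch of the unit-count arithmetic in steps 1-4 (the 128-SP figure for the Ultra is the known G80 spec; 112 and 96 are the rumored GT counts under debate):

```python
ULTRA_SPS = 128  # G80 8800Ultra shader processor count

def ultra_unit_lead_pct(gt_sps):
    """Percent more shader units the Ultra has than a GT with gt_sps units."""
    return (ULTRA_SPS / gt_sps - 1) * 100

for sps in (112, 96):
    print(f"{sps} SPs -> Ultra holds a {ultra_unit_lead_pct(sps):.0f}% unit advantage")
# 112 SPs leaves the Ultra only ~14% ahead on units; 96 SPs gives ~33%
```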
oh man
"1. The 8800Ultra is ~20% faster than the 8800GT." Hmm, wonder where you took that number from. Some random 3DMark results? Come again, that won't necessarily reflect real-world performance.
Also, you seem to forget, as astrallite pointed out, the 256-bit vs 384-bit bus. To me, 112 SPs still seems more logical, given the strong source indicating that. Also, we don't know the ROP clock of the 8800GT yet.
I guess it's just best to wait as usual and let time tell, but some midrange refresh sure comes in handy.
Last edited by RPGWiZaRD; 10-12-2007 at 12:19 PM.
I'd rather see a regular 3DMark06 run with the card OC'd.
Asus Z9PE-D8 WS with 64GB of registered ECC ram.|Dell 30" LCD 3008wfp:7970 video card
LSI series raid controller
SSDs: Crucial C300 256GB
Standard drives: Seagate ST32000641AS & WD 1TB black
OSes: Linux and Windows x64
I don't see any reason why those with an already great graphics card like the 8800GTX should need to upgrade to something slower. These upcoming midrange cards, however, are VERY interesting for all those still holding on to their DX9 cards in desperation until some decent price-for-performance DX10 combo arrives, as personally I just can't afford to spend $400+ on a graphics card. I was waiting for a card that is a worthy upgrade for my 7900GTO, for up to 250 EUR (more like my budget), and can provide at least ~80% better performance. These new midrange cards certainly seem to easily meet all those criteria.
'Cause 2 GTs in SLI would be quicker than 1 GTX. But of course all the headaches of a dual-card system come too... and I want no more of that! Also, it seems Crysis might be the only game to push a single GTX @ 1680x1050, so no worries.
It sure does seem like a helluva value tho. Especially if they wind up on sale under $250.
Yeah, those scores with AA activated are way low; I can do a nice 13.9k with a 3.2GHz quad and a stock Ultra.
But nice 8800 GT card!
My recent configuration:
Asus P5K Deluxe
Intel Q6600 @ 3.2ghz (8 x 400 @ 1.37v) | Batch: L644G508
4 GB G.Skill F2-6400 CL4-GBHK @ DDR2 800(2.05v) 2T | 4-4-3-5
Samsung 400 GB Sata2
EVGA 8800 Ultra 768MB | 3D Mark 06 score: 14.400
Seventeam ST-750EAJ
Gemin2 | (evercool 120 mm Aluminium)
Dead mobos: 1
Model: Striker Extreme