Usual suspects: i5-750 & H212+ | Biostar T5XE CFX-SLI | 4GB RAndoM | 4850 + AC S1 + 120@5V + modded stock for VRAM/VRM | Seasonic S12-600 | 7200.12 | P180 | U2311H & S2253BW | MX518
mITX media & to-be-server machine: A330ION | Seasonic SFX | WD600BEVS boot & WD15EARS data
Laptops: Lifebook T4215 tablet, Vaio TX3XP
Bike: ZX6R
www.teampclab.pl
MOA 2009 Poland #2, AMD Black Ops 2010, MOA 2011 Poland #1, MOA 2011 EMEA #12
Test bench: empty
Because nVidia has said so themselves. If you wanna get right down to it, these chips have been "pin-compatible" since the release of the 6600 series.
I agree, but then again, these rumors coming from non-legit places tend to be highly inaccurate. I remember not too long ago when people swore up and down that the 8900 series was going to be released soon. In fact I was laughed at when I insisted it didn't exist. My guess is that, just like the 8900 series, these cards' pricing is incorrect.
I wouldn't be at all surprised if nVidia leaks information like this to see what the public reaction would be and then adjusts its prices accordingly; that's a pretty typical pricing tactic.
I will say this, however: if it is in fact 128-bit RAM, it had better be high-speed GDDR3 or GDDR4 to offset the loss of bandwidth. There's no way 1.3GHz 128-bit RAM on any mid-range card is going to hack it. It will have to be at least 2GHz, otherwise ATi will just make a card that uses GC20 or GC16 @ 256-bit (which I assume costs about the same as the higher-end 128-bit parts) and simply slaughter the nVidia offerings.
Oh no no, that has nothing to do with it. I can promise you the reason they are using 128-bit RAM is that they are afraid the card is powerful enough that, given 256-bit RAM, it could match the performance of many of the higher-end cards at lower resolutions (say 1024x768, etc.).
If a card at half the price of their main ones is capable of doing that, then for, I'd say, 99.9% of their customers it would be a simple decision which card to buy, i.e. *not* the 8800 @ $400 or the cut-downs at $250-300. By handicapping the RAM significantly, they put enough distance between it and the 8800s to justify people purchasing those over their crippled cousins.
In terms of sales, a 256-bit card with that kind of GPU would be disastrous for their high-end models. The only way they will make a true cut-down like the 7900GS is at EOL, when they are trying to clear inventory at any cost.
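The bandwidth argument above can be sketched with a quick back-of-the-envelope calculation. Note the clock figures are just the rumored numbers tossed around in this thread, not confirmed specs; the formula is the standard peak-bandwidth estimate (bus width in bytes × effective data rate).

```python
# Rough peak memory-bandwidth comparison for the bus widths discussed above.
# Clock figures are the rumored ones from this thread, not confirmed specs.

def bandwidth_gbps(bus_bits: int, effective_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer * transfers per second."""
    return (bus_bits / 8) * effective_clock_mhz * 1e6 / 1e9

print(bandwidth_gbps(128, 1300))  # 1.3GHz effective on 128-bit: 20.8 GB/s
print(bandwidth_gbps(128, 2000))  # 2.0GHz effective on 128-bit: 32.0 GB/s
print(bandwidth_gbps(256, 1000))  # a 256-bit bus matches that at only 1.0GHz
```

This is why the bus width matters so much: a 256-bit competitor needs only half the memory clock to reach the same bandwidth.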
Last edited by Sentential; 03-12-2007 at 07:37 AM.
NZXT Tempest | Corsair 1000W
Creative X-FI Titanium Fatal1ty Pro
Intel i7 2500K Corsair H100
PNY GTX 470 SLi (700 / 1400 / 1731 / 950mv)
Asus P8Z68-V Pro
Kingston HyperX PC3-10700 (4x4096MB)(9-9-9-28 @ 1600mhz @ 1.5v)
Heatware: 13-0-0
If you actually read the first post you will see that the 8600GTS does have 2GHz memory, so the bandwidth should not be TOO bad, but still not enough.
CPU: Intel Core i7 3930K @ 4.5GHz
Mobo: Asus Rampage IV Extreme
RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
GPU: EVGA GTX Titan (1087Boost/6700Mem)
Physx: Evga GTX 560 2GB
Sound: Creative XFI Titanium
Case: Modded 700D
PSU: Corsair 1200AX (Fully Sleeved)
Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's
G84 supports 3840x2400 resolution
Geforce 8600 GTS / GT have the power
It looks like resolutions are doomed to grow. The average gamer plays on a 19-inch 1280x1024 display, some lucky ones have 1680x1050 on their big widescreens, and only a few rich ones can afford a 30-inch display and pay up to $2000 for 2560x1600.
The new Nvidia card will let you play at 3840x2400, but only at 30 Hz; that is twice a 1920x1200 resolution in each dimension. I am not aware of any display or device that can benefit from this dual-link DVI-I output, but obviously there will be a purpose for such a resolution. It is a cool pixel count but an uncool refresh rate, as you need at least 60 Hz to keep your eyes comfortable.
In the worst case it is a nice tick-box feature that you can state on the box to impress anyone.
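The 30 Hz limit quoted above falls out of the dual-link DVI pixel-clock budget (two TMDS links at 165 MHz each, per the DVI spec). A rough check, ignoring blanking overhead for simplicity:

```python
# Back-of-the-envelope check of why 3840x2400 tops out at 30 Hz over
# dual-link DVI. Blanking intervals are ignored, so real limits are tighter.

DUAL_LINK_DVI_MHZ = 2 * 165  # two TMDS links at 165 MHz each = 330 MHz

def pixel_clock_mhz(width: int, height: int, refresh_hz: int) -> float:
    """Minimum pixel clock in MHz needed for the given mode."""
    return width * height * refresh_hz / 1e6

print(pixel_clock_mhz(3840, 2400, 30))  # 276.48 -> fits under 330 MHz
print(pixel_clock_mhz(3840, 2400, 60))  # 552.96 -> far beyond the link
```

So even before blanking overhead, 60 Hz at that resolution needs roughly two-thirds more pixel clock than dual-link DVI provides, which is why the mode is capped at 30 Hz.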
Geforce 8600GTS works at 675 MHz
80 nanometre can push it high
We found out that the chip formerly codenamed G84 will end up with much higher frequencies than we originally expected. The G84-based Geforce 8600GTS, the top of the mainstream lineup, will be clocked at 675 MHz core. We still don't know the memory clock, but we are sure that Nvidia plans to use GDDR3 memory at speeds of 1600 MHz, if not higher.
G84 will be the fastest-clocked chip from Nvidia to date, and such a high clock is possible only thanks to the 80-nanometre marchitecture. G80 works at 575 MHz by default, but G80 is 90 nanometre, not 80 like G84.
This is very close to the limit of the technology, as you cannot clock 80-nanometre chips much higher than 700 MHz. Obviously Nvidia has to think about shrinking the chip to 65 nanometre, but this won't happen anytime soon, at least not that we know of.
We already said here that the launch date is April, middle of the month.
http://www.fudzilla.com/index.php?op...id=98&Itemid=1
regards
[Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
[Review] ASUS HD4870X2 TOP » Here!! « .....[Review] EVGA 750i SLi FTW » Here!! «
[Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
[Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «