Do they have an IHS like the 8800?
Do they have an IHS like the 8800?
Will a Maze 4 fit on them? :fact:
Damn, I completely missed the 128 bit thing.
Oh well, looks like the 8800GTS 320 is the best bet for GF's upgrade.
$250 for a 128-bit card??
Someone send me some of whatever the Nvidia marketing department is smoking, because I bet I could make a fortune selling it here in Alaska.
No 256-bit cards?
That's a HUGE mid-range performance gap to leave open.
I mean, the top cards are 384-bit and 320-bit, and the mid to low range is only 128-bit??
If they really do ignore the mid range this much, it leaves a huge gap for ATI to come in with a successor to the X1950 Pro, which is already a 256-bit mid-range card, and do some major damage to Nvidia's market share.
Low to mid-range products outsell high-end products by something like 10 to 1.
If Nvidia leaves such an obvious gap in performance potential open, I hope ATI jumps all over it. They need all the help they can get these days (AMD, that is).
Yeah, for mid range I guess the 8800GTS 320MB is probably a much better solution... only $299 at Fry's here, and some sites have it close to $250... because $199 can't really buy you much.
^^ I hope ATI punches them in the gut with that move. These video cards look "cute" for some reason :p:
This is why I always go for high-end cards.
Funny prices for these low-end cards :D I think Nvidia must be dreaming, or doesn't see the street prices for the 8800GTS 320MB.
Before everyone starts freaking out, remember the 6600 was 128-bit and it did pretty well. This lineup would map well onto the 6x00 series: 8800GTX (6800 Ultra), 8800GTS (6800GT), 8800GTS 320MB (6800), 8600GTS (6600GT), and 8600GT (6600).
If the 8600GTS comes out at $250 or doesn't perform, I'll pick up the 8800GTS 320MB, but otherwise I think the 8600GTS will be my choice.
People said the same about the 7600GT, if I'm not mistaken.
That 128-bit wouldn't be enough, etc., and the 7600GT turned out to be a pretty good card.
edit:
If the 8600 really is 128-bit, it's most likely being done to lower production costs. I'm guessing they're holding off on a 256-bit mid-range part until they can make chips on 65nm.
It's also likely that they'll first pump out a cheap-to-produce card and see how well the X2600 performs. If the X2600 turns out to be better, they can follow up with a faster and better card.
Because nVidia has said so themselves. If you wanna get right down to it, these chips have been "pin-compatible" since the release of the 6600 series.
I agree, but then again, rumors coming from non-legit places tend to be highly inaccurate. I remember not too long ago when people swore up and down that the 8900 series was going to be released soon. In fact, I was laughed at when I insisted it didn't exist. My guess is that, just like the 8900 series, these cards' pricing is incorrect.
I wouldn't be at all surprised if nVidia leaks information like this to see what the public reaction would be and then changes its prices accordingly; that's a pretty typical pricing tactic.
I will say this, however: if it really is a 128-bit memory bus, it had better be high-speed GDDR3 or GDDR4 to offset the loss of bandwidth. There's no way 1.3GHz memory on a 128-bit bus in any mid-range card is going to hack it. It will have to be at least 2GHz, otherwise ATi will just make a card that uses GC20 or GC16 at 256-bit (which I assume costs about the same as the faster 128-bit parts) and simply slaughter the nVidia offerings.
Oh no no, that has nothing to do with it. I can promise you the reason they're using a 128-bit bus is that they're afraid the card is powerful enough that, given 256-bit memory, it could match the performance of many of the higher-end cards at lower resolutions (say 1024x768, etc.).
If a card at half the price of their flagship parts can do that, then for, I'd say, 99.9% of their customers it would be a simple decision which card to buy, i.e. *not* the 8800 at $400 or the cut-downs at $250-300. Handicapping the memory significantly puts enough distance between it and the 8800s to justify people buying those over their crippled cousins.
In terms of sales, a 256-bit card with that kind of GPU would be disastrous for their high-end models. The only way they'll make a true cut-down like the 7900GS is at EOL, when they're trying to clear inventory at any cost.
If you actually read the first post, you'll see that the 8600GTS does have 2GHz memory, so the bandwidth shouldn't be TOO bad, but it's still not enough.
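For anyone who wants the back-of-the-envelope numbers: peak memory bandwidth is just bus width times effective data rate. Here's a quick sketch (assuming the rumored 2000 MHz is the effective GDDR3 data rate, and using the commonly quoted stock memory clocks for the other cards mentioned in this thread):
Code:
# rough peak memory bandwidth: bus width (bits) / 8 * effective data rate (MT/s)
def bandwidth_gb_s(bus_bits, data_rate_mts):
    return bus_bits / 8 * data_rate_mts / 1000  # GB/s

cards = {
    "8600GTS (rumored)": (128, 2000),  # 128-bit @ 2000 MT/s effective
    "X1950 Pro":         (256, 1380),  # 256-bit @ 1380 MT/s effective
    "8800GTS 320MB":     (320, 1600),  # 320-bit @ 1600 MT/s effective
    "8800GTX":           (384, 1800),  # 384-bit @ 1800 MT/s effective
}

for name, (bus, rate) in cards.items():
    print(f"{name:20s} {bandwidth_gb_s(bus, rate):5.1f} GB/s")
That puts the rumored 8600GTS around 32 GB/s, roughly half of the 8800GTS 320MB, which is exactly the gap people here are worried about.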
Quote:
G84 supports 3840x2400 resolution
Geforce 8600 GTS / GT have the power
It looks like resolutions are doomed to grow. The average gamer plays on a 19-inch 1280x1024 display, some lucky ones have 1680x1050 on their big wide screens, and only a few rich ones can afford a 30-inch display and pay up to $2000 for 2560x1600.
The new Nvidia card will let you play at 3840x2400, but only at 30 Hz; that is twice 1920x1200 in each dimension. I am not aware of any display or device that can benefit from this dual-link DVI-I output, but obviously there will be a purpose for such a resolution. It is a cool pixel count but an uncool refresh rate, as you need at least 60 Hz to keep your eyes comfortable.
In the worst case it is a nice tick-box feature that you can print on the box to impress people.
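The 30 Hz figure in that article checks out if you do the DVI math: single-link DVI tops out at a 165 MHz pixel clock and dual-link doubles that to 330 MHz. A rough sketch (ignoring blanking overhead, which only pushes the numbers higher):
Code:
# pixel clock needed in MHz, ignoring horizontal/vertical blanking
def pixel_clock_mhz(width, height, refresh_hz):
    return width * height * refresh_hz / 1e6

DUAL_LINK_DVI_LIMIT_MHZ = 2 * 165  # two TMDS links at 165 MHz each

for hz in (30, 60):
    clk = pixel_clock_mhz(3840, 2400, hz)
    verdict = "fits" if clk <= DUAL_LINK_DVI_LIMIT_MHZ else "exceeds"
    print(f"3840x2400 @ {hz} Hz needs ~{clk:.0f} MHz, which {verdict} the {DUAL_LINK_DVI_LIMIT_MHZ} MHz dual-link limit")
So 30 Hz is about all one dual-link connector can carry at 3840x2400; 60 Hz would need roughly 553 MHz before blanking, far beyond the spec.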
Quote:
Geforce 8600GTS works at 675 MHz
80 nanometre can push it high
We found out that the chip that used to be codenamed G84 will end up with much higher frequencies than we originally expected. The G84-based Geforce 8600GTS, the top of the mainstream line-up, will be clocked at 675 MHz on the core. We still don't know the memory clock, but we are sure that Nvidia plans to use GDDR3 memory at speeds of 1600 MHz, if not higher.
G84 will be the fastest-clocked chip from Nvidia to date, and the clock reaches such a high speed only thanks to the 80-nanometre marchitecture. G80 works at 575 MHz by default, but G80 is 90 nanometre, not 80 like G84.
This is very close to the limit of the technology, as you cannot clock 80-nanometre chips much higher than 700 MHz. Obviously Nvidia has to think about shrinking the chip to 65 nanometre, but this won't happen anytime soon, at least not that we know of.
We already said here that the launch date is mid-April.
http://www.fudzilla.com/index.php?op...id=98&Itemid=1
regards
Looks like the mounting holes are the same as the 6x00 & 7x00 series, so is it foolish to hope that my VF900 will mount?
Also, will these new cards be DirectX 10.1 compatible? I read some news that the R600 cards WILL support 10.1. But then, I also read some news that Saddam Hussein was developing nuclear weapons...
So it's 128-bit instead of 256-bit?
I'll forget it... and wait for the 8300 :D
While on the subject of new cards, does anyone know when ATI or Nvidia will come out with physics drivers so I can use one of these crappy little cards for dedicated physics? I've been itching to know when the drivers come out because I want an actual physics card so it doesn't leech power from my processor or 8800GTX. Once I get my new monitor (30" LCD, 8ms response, 2560x1600 res) I won't have power to spare from the rest of my system, plus I'll be getting either another 8800 or an upgrade to two of the next Nvidia cards, whatever they decide to name them, just to drive that damn big monitor.
Just because it supports 3840x2400 doesn't mean it's playable; even current low-end cards support 2048x1536, but how many games can actually be played at that resolution? Anyway, I guess it's good for a big monitor in a company, but even there, instead of buying an expensive LCD that supports that resolution, they would go for a projector. :)
Quote:
Geforce 8600 GTS details out
675 MHz core / 2000 MHz memory
We have the final details about the G84-based cards. The Geforce 8600 GTS will be the fastest mainstream card from Nvidia in the sub-€/$200 price range. Of course, it is DirectX 10 compatible, and it is scheduled for a mid-April announcement.
The Geforce 8600 GTS's G84 still has a 128-bit memory controller, quite unusual for 2007, but there you go. Its GDDR3 memory works at 2000 MHz, so that is the catch. It still guarantees you some performance.
The card comes with 256 MB of memory, and we don't expect to see it at CeBIT, but there might be miracles. This will finally make DirectX 10 cards a bit more affordable, and then there is the G86, Nvidia's entry-level part, scheduled for a later date.
http://www.fudzilla.com/index.php?op...d=123&Itemid=1
regards
Well, increased bus widths DO give diminishing returns.
Clock speed increases give more or less linear returns in both overall bandwidth and latency.
With any luck, we'll end up with an 8600 Ultra or 8900-something that has a 256-bit bus AND GDDR4.
Call me crazy, but I see memory performance as being about as big a bottleneck as GPU performance, and I prefer lower latency and higher clocks over straight bus-width increases.
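To put some illustrative numbers on that (made-up configs, not real card specs): doubling either the bus width or the effective clock doubles peak bandwidth, but only the clock bump also shortens each access, since a CAS latency of N cycles works out to N divided by the clock in nanoseconds:
Code:
# two hypothetical configs with identical peak bandwidth
def peak_bw_gb_s(bus_bits, data_rate_mts):
    return bus_bits / 8 * data_rate_mts / 1000

def cas_time_ns(cas_cycles, clock_mhz):
    # time for the CAS latency alone; full access latency has more terms
    return cas_cycles / clock_mhz * 1000

configs = {
    "wide/slow   (256-bit @ 1000 MT/s)": (256, 1000, 500),   # 500 MHz command clock
    "narrow/fast (128-bit @ 2000 MT/s)": (128, 2000, 1000),  # 1000 MHz command clock
}

for name, (bus, rate, clock) in configs.items():
    bw = peak_bw_gb_s(bus, rate)
    t = cas_time_ns(9, clock)  # assume CAS 9 cycles for both
    print(f"{name}: {bw:.0f} GB/s peak, CAS ~{t:.0f} ns")
Both come out to the same 32 GB/s of peak bandwidth, but the higher-clocked narrow bus halves the CAS wait, which is the point about clocks helping latency as well as bandwidth.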
I'm still waiting for the 8600U.
I think they gave up on the Ultra naming scheme and traded it in for something more recognizable (GTX, GT, GTS).
I can't believe we're comparing the 6600 to the 8600. The 6600 had a 128-bit interface, so it's fine that the 8600 has one too? If you keep accepting the same old crap, then how are we ever gonna get anything better?
Ignoring the fact that the 8600 is a DX10 card, right? :rolleyes: