Yup... I feel bad for ya bro... the 8600GT is nowhere near where it should've been... :rolleyes:
Yeah, if only they made it run at 256-bit.. :D At least I didn't get the 8600GTS, saved some precious moolah that way.
Used RivaTuner recently. :D
I'm at 702/1674/1750 on my 8600GT. :D I just have a 17" LCD anyway. :P
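A quick, purely illustrative bandwidth calculation (assuming the 1750 figure above is the effective DDR data rate; the 256-bit case is hypothetical) shows why the 128-bit bus is the usual complaint about the 8600GT:

```python
# Back-of-the-envelope memory-bandwidth math for the 8600GT clocks quoted above.
# Assumption: the 1750 figure is the effective (DDR) data rate in MHz; the
# 256-bit variant is hypothetical -- the 8600GT only ships with a 128-bit bus.

def bandwidth_gbs(bus_width_bits: int, effective_mhz: float) -> float:
    """Peak bandwidth in GB/s = bytes moved per transfer * effective transfer rate."""
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

print(f"128-bit @ 1750 MHz effective: {bandwidth_gbs(128, 1750):.0f} GB/s")  # ~28 GB/s
print(f"256-bit @ 1750 MHz effective: {bandwidth_gbs(256, 1750):.0f} GB/s")  # ~56 GB/s
```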
Ah, where do I find that one? :D
Here is the link to the thread: http://www.xtremesystems.org/forums/...d.php?t=156499
Good luck!
thanks. will do some benching later. :D
Anyone get the impression nVidia rumoured an 8700 product name, realised people weren't that excited by a more power-efficient, lower-performance model, and are now attempting to get people going again by marketing it as an 8800 series card?
Personally, they can call it the "nVidia 1" for all I care - if it's got a good price/performance factor, that's all that matters.
Still no news on an 8800 Ultra-beater? :rolleyes:
News on next-gen high-end cards?! November or January :-S? If January, I'll go with the ATi 2900Pro 1GB GDDR4...
High-end cards won't come until next year... hence no news of R680 either...
"Things are quiet on the Eastern front" :p:
Perkam
If you have a smaller monitor, you won't need to worry much about when high-end cards are coming out. :D
:D Hehehe
8800GTX's are already overkill for my 17" LCD gaming anyway. :P
You are oversimplifying the issue. And no, Nvidia's 1+1 is not the same as ATi's 4+1. The bottom line, though, is that most current games are coded in a way that favors Nvidia's simple shader approach rather than ATi's complex shader approach. That, and what is probably a hardware bug affecting AA performance, which hopefully will be gone in RV670.
He is oversimplifying the issue. He neglects the MUL (making it 2+1, FYI), which does operations (special function, interpolation, perspective correction) that are not general shading; that is left to the MADDs. It's not part of the SP, but it does do certain jobs an SP would do if it weren't there or if those jobs weren't assigned to the MUL... AFAIK ATi does these ops with their 64 (4+1) shaders. You could say Nvidia has 384 shaders on G80, but you'd be wrong; it's much more complicated. As for the AA issue, I agree that plays a part and hopefully will be corrected.
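To put rough numbers on the MADD/MUL counting debate, here is a sketch using the published reference clocks of the 8800GTX (G80) and HD 2900 XT (R600); it only illustrates why raw "shader counts" aren't comparable, and says nothing about real-world throughput:

```python
# Why "shader counts" between G80 and R600 aren't directly comparable.
# Numbers are the published reference specs; the MADD+MUL line reflects the
# counting debate above, not measured throughput.

def gflops(alus: int, flops_per_alu_per_clk: int, clock_ghz: float) -> float:
    return alus * flops_per_alu_per_clk * clock_ghz

# G80 (8800GTX): 128 scalar SPs at a 1.35 GHz shader clock.
print("G80, MADD only  :", gflops(128, 2, 1.35), "GFLOPS")   # ~345.6
print("G80, MADD + MUL :", gflops(128, 3, 1.35), "GFLOPS")   # ~518.4 (the marketing figure)

# R600 (HD 2900 XT): 64 five-wide (4+1) units = 320 ALUs at the 0.742 GHz core clock.
print("R600, MADD/ALU  :", gflops(320, 2, 0.742), "GFLOPS")  # ~474.9
```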
@Candyman - AFAIK, it was meant to sit between the 8600 and the 8800GTS and allow a phase-out of the 8800GTS 320MB, but now Nvidia is trying to kick it up a notch to match RV670. We won't know if they succeeded until Nov 19th... or until the leaks come. :)
I know I was impatient when I bought an 8800GTX back in the start of September, but looking at all the guess-work going 'round about release-dates, I'd say I'm happy I did so... It still pretty much runs everything as well as you could wish for a card which is as of now just less than a year old... It's actually quite remarkable that a single graphics-card has been able to remain on top for that long... (no, I'm not including the ULTRA on purpose, that card doesn't have any place in my books)...
I'm happy with my choice... For now :p:
Best Regards :toast:
Fudzilla reported that G92, aka D8M, will be launched on the 29th of this month.
Well, if all these rumours are true, it's a big disappointment from NVIDIA for me :( It seems only the GF8800GT will be launched this year, which is said to perform between the GF8600GTS and GF8800GTS. A 64SP GPU is not enough against RV670, which is said to be a little faster than R600.
But if D8M is the GF8800GT, then what is D8P?? :confused:
What you said is exactly my understanding:
1. It's cheaper to produce than the 8800GTS 320MB (G80 is huge, plus the GTS uses the same, more complicated and expensive PCB as the GTX, even with less memory than the 512MB 8800GT).
2. Performance was, I believe, meant to be lower than the 8800GTS (320/640), but now I believe they are trying to match it, as obviously R600 (and presumably RV670) is on par with the GTSes for what appears to be quite a bit less money to both produce and market. The 8800GTS 640MB is still $350 and would have a market above a slightly slower 256-bit 512MB G92, but not against a $250 RV670, hence they want to squeeze everything out of their cheaper-to-produce part to be competitive with ATi's solution.
Hope that makes sense.
As for D8P, I still think it's a 96SP, 6 TCP, 8800GTS-like thing. That could clock well on both core and shaders, allow room for NVIO to be on-die, and use little enough power and produce little enough heat to jam two on a card and call it an 8950GX2, just as the 7950GX2 was two 7900GTs. Heck, it'd satisfy both the 1 TFLOP rumors and the 192SP rumors (rough math below)... and I'm all about killing as many possible scenarios with one stone as possible. :p:
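Playing along with that speculation, a back-of-the-envelope check (the shader clocks here are made up; only the 2 x 96SP configuration and the MADD+MUL counting convention come from the rumours) shows roughly what it would take to hit the 1 TFLOP figure:

```python
# Playing along with the 96SP x2 speculation above -- every number here is
# hypothetical except the MADD+MUL (3 flops/SP/clock) counting convention.

def gflops(sps: int, flops_per_sp_per_clk: int, shader_clock_ghz: float) -> float:
    return sps * flops_per_sp_per_clk * shader_clock_ghz

dual_sps = 2 * 96                        # two 96SP dies on one board -> the "192SP" rumor
for shader_clock in (1.5, 1.625, 1.75):  # hypothetical shader clocks in GHz
    print(f"{dual_sps} SPs @ {shader_clock} GHz:",
          round(gflops(dual_sps, 3, shader_clock)), "GFLOPS (MADD+MUL)")
# -> ~864 / 936 / 1008 GFLOPS: a dual-96SP card only brushes "1 TFLOP" with
#    MADD+MUL counting and a fairly high shader clock.
```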
I definitely agree that money and the market are what drive, and will keep driving, NVIDIA to do what is needed.
In this case, an 8800GT is needed because:
1) The market wants a mid-range card that can actually perform. The 8600GTS is a joke, and ATI just released the 2900PRO and will soon release the 55nm RV670. Jumping into the next high-end while leaving the mid-range market (where most of the money is made, as well as where most OEM contracts will be had) would be disastrous, esp. if you hand it away to your archrival. Keep in mind that more and more DX10 games are coming out soon (and some already have), so the market for DX10-compatible cards is growing.
and
2) The 8800GTS 320 and 640 use the same core as the GTX/Ultra, but are likely the lower-binned ones (with the weaker shader clusters disabled). Still, the original cost to make each core is the same as for the GTX/Ultra. The G80 core is HUGE and had a record-breaking number of transistors when it was released. I'm sure that sucker ain't cheap to make, and it might be smarter to move to a die-shrunk, revised core that's cheaper to produce while leaving the expensive G80 core to the expensive cards, the GTX and Ultra, and keep profits high (rough dies-per-wafer math after this list).
and
3) Nvidia has always released a die-shrunk mid-range card before using that process for the high end. This past year, with the 8600 series only a half-node shrink (80nm, barely down from the 90nm G80) and ATI tanking for most of the year, Nvidia feels it can afford to release the mid-range card it needs while using it as a testbed for the die shrink, then go for the high-end performance crown afterwards, because there is no pressure from ATI to do so sooner.
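A rough dies-per-wafer sketch for point 2, using the classic edge-loss approximation; the ~480 mm^2 G80 figure is a commonly quoted estimate, the 65nm area is purely a hypothetical "shrunk" size, and yield is ignored entirely:

```python
import math

# Rough dies-per-wafer sketch for the die-shrink argument in point 2.
# ~480 mm^2 for G80 (90nm) is a commonly quoted estimate; the 65nm area is
# purely hypothetical, and defect yield is ignored.

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: wafer area / die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print("90nm G80-class die (~480 mm^2):      ", dies_per_wafer(480), "candidate dies per 300mm wafer")
print("Hypothetical 65nm shrink (~330 mm^2):", dies_per_wafer(330), "candidate dies per 300mm wafer")
# More candidate dies per wafer (plus, usually, better yield on a smaller die)
# is the whole "cheaper to produce" argument.
```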
I might be completely wrong, but from a business perspective it can definitely make sense.