Really? Because if I remember correctly, AMD/ATi does continue to spend money on R&D, and generally speaking such funding creates new and better products :rolleyes:
Guys, please consider the possibility that these specs are fake. The eDRAM piece is a huge indicator to me that someone made these specs up.
OMG..:eek:.. this is unreal :)
I will start saving up :)
Well, of course they do. But can they keep up with nVidia's time schedule? And if they do, is there a chance they will mess up again?
If those specs are true and the card is released at the end of this year, then ATI is toast, because by then I can't see ATI coming out with anything new besides a 65nm shrink of the R600.
But then again, the specs of this 9800GTX could be fake.
We should edit the original post to let people know it's some comment left a month and a half ago, not any kind of fact...
KoHaN69 - why not name it HAL 9000 ? :D
Well, the way I remember it, AMD was toasting Intel for 3 years, but instead of working on K10 while they had the money, they've been...?
I can't remember who said it here, but something like "Looks like AMD spent their R&D budget on booze and hookers for the last 3 years". Don't get me wrong I love AMD to bits, but because of their poor foresight they're getting spit-roasted by Intel and nV at the moment :shakes:
http://redhill.net.au/c/c-8.html
Scroll down and look at the chart for a couple of minutes, and you'll realize that even though they haven't always held the performance crown, that never killed them, nor did it mean they weren't working their asses off.
Heck, their "failed" K5 was the key to their famous K6 procs, after they gleaned some alternate experience from NexGen (an acquisition the industry considered a horrid idea, but which proved to be genius). Or how about when they picked up the DEC Alpha team at huge expense, once again written off as an absolute failure, at least until K7 and K8 came out and changed the industry? And NOW AMD has bought ATi and people are screaming about what a horrid idea it is, but I honestly question anyone who doesn't look at AMD's history and see how often exactly this kind of move was perfect for their future plans.
Well, I'm no expert either, but what I understood is that the unified shader architecture took the pixel pipelines and vertex shaders and made a one-size-fits-all processor, while the ROPs and TMUs are still on their own. That seems to be the 2900XT's biggest weakness: it has fewer of them than the 8800GTX and GTS.
I see your point, and you seem to be right. I didn't know you would need that much eDRAM to make it work on higher-res screens. Surely eDRAM could be used to speed something up, but I guess it's far too "big" to fit on a die and still offer good benefits compared to more stream processors, ROPs and TMUs.
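To put some rough numbers on the eDRAM point, here's a back-of-the-envelope sketch. The per-sample sizes are my assumptions (32-bit colour plus 32-bit depth/stencil), and it assumes the whole render target lives on-die, Xenos-style:

```python
# Rough framebuffer size estimate: why on-die eDRAM gets expensive at high res.
# Assumption: 4 bytes colour + 4 bytes depth/stencil per sample, and the
# full render target held in eDRAM (Xenos-style).

def framebuffer_mib(width, height, aa_samples):
    bytes_per_sample = 4 + 4                # colour + depth/stencil (assumed)
    total_bytes = width * height * aa_samples * bytes_per_sample
    return total_bytes / (1024 ** 2)        # convert to MiB

for (w, h), aa in [((1280, 720), 4), ((1920, 1200), 4), ((2560, 1600), 4)]:
    print(f"{w}x{h} @ {aa}xAA: ~{framebuffer_mib(w, h, aa):.0f} MiB")
```

That works out to roughly 28 MiB at 1280x720 but ~125 MiB at 2560x1600, versus the 10MB Xenos actually has, which is why a desktop part aimed at big screens can't just bolt on eDRAM.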
On another note, it says "next gen unified shader"; could that fold the ROPs and TMUs into the stream processors too?
And my point is that over the past 3 years we have seen NO major architectural improvements in AMD processors at all. Higher clocks, DDR2 support, a 65nm shrink (and not a very successful one, it seems). Maybe it's too much to expect revolutionary change, but that list is evolutionary at best, imo.
K10 should have been out a year ago, giving Intel a worthy competitor to Conroe.
By the way, these specs are quite obviously made up. We go through the same thing EVERY TIME we start to smell a new graphics card release. Some jackass pieces together rubbish from the Inq and Fudzilla, adds some "common sense" evolutionary bits to the spec along with an arbitrary clock speed guess, and this makes us all shout "zomfgwtfbbq" and cream our pants.
Sorry I'm not biting on this one! :cussing:
One silly, stupid thought, but have you considered that during those 3+ years AMD may have been developing a new design to such a degree that little to none of its features showed up in existing products?
Kinda like how K6 never got K7's improved floating point performance.
we're off topic
Back on topic: when is the approximate release date for this "9800GTX"?
I can't believe people are already wetting themselves over these specs.
I'd say there's a 99.9 percent chance those specs are complete BS.
I was discussing this with Mav on MSN earlier... I thought the audio chip and the GDDR4 sounded a bit dodgy, but maybe GDDR4 will be really decent by then?
How would a sound chip's outputs be dealt with? A ribbon cable and a PCI slot for the jacks? Adjust the heatsink so there's space for them on the second half of the I/O PCI plate? (Which would strongly suggest the HSF will not disperse heat out the back of the case.)
I still don't get it. For such a beast of a card, it doesn't make sense to have the GTS version of a newer card be 256-bit when the older one is 320-bit. So either the GTS version of the 9 series is gonna be 768MB and the high end 1GB, or it just wouldn't make sense at all to get the GTS version of the 9 series!!! :mad:
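For what it's worth, the bus width and the memory size are joined at the hip: each GDDR chip contributes a 32-bit slice of the bus, so capacity follows chip count. A quick sketch (the 64MB-per-chip density is an assumption, matching the current 8800s):

```python
# Bus width vs. capacity: each GDDR chip adds a 32-bit slice of the bus,
# so memory size scales with chip count.
CHIP_BUS_BITS = 32
CHIP_MB = 64   # assumed per-chip density, as on the current 8800s

for chips in (8, 10, 12, 16):
    print(f"{chips} chips -> {chips * CHIP_BUS_BITS}-bit bus, {chips * CHIP_MB} MB")
```

That's why the 8800GTS's 320-bit bus comes with 640MB (10 chips) and the GTX's 384-bit with 768MB (12 chips); a 256-bit 9-series GTS would most naturally land at 512MB.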
They aren't the dodgy bits at all, mate... as mentioned already, the "sound" bit could just be HDMI audio pass-through as done on the HD2900, and GDDR4 is surely a logical progression with faster speeds etc.?
The eDRAM bit is probably the fishiest item on the list!
But the entire thing is made up anyway so it doesn't matter :p:
Agreed. I believe the GTS will be 384-bit. Well, that or 448-bit. Considering performance parts usually replace last-gen flagships (and are a little faster), let's see why 384-bit works (there's a quick sanity check after the list):
How many shaders does the GTS now have compared to the GTX?
A. 3/4.
Where are most ATi performance products in specs compared to the flagship?
A. 3/4.
What is 384/512?
A. 3/4
What is 12 RAM chips divided by 16 RAM chips? (4 fewer chips, saving nvidia money, and perhaps a PCB change from the current 8800s for the GTS.)
A. 3/4.
56x? = The answer to the universe
A. 3/4
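And here's that sanity check of the 3/4 pattern, Hitchhiker's joke included (the 96/128 shader counts are the current 8800 GTS/GTX):

```python
# Sanity check of the 3/4 ratios above (flagship -> performance part).
from fractions import Fraction

ratios = {
    "shaders (96/128)":    Fraction(96, 128),
    "bus width (384/512)": Fraction(384, 512),
    "RAM chips (12/16)":   Fraction(12, 16),
}
for name, ratio in ratios.items():
    print(f"{name} = {ratio}")      # each one reduces to 3/4

print(56 * Fraction(3, 4))          # 42 -- the answer to the universe
```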