Expected launch time: 1Q 2007
Quote:
R600, codename: Pele
True 512-bit memory interface (Nvidia G80 = 384-bit)
1024 MB GDDR4 memory, 2.5 GHz effective, bandwidth 160 GB/s (G80 = 86 GB/s)
80nm
Free 4xAA
Original source
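For what it's worth, the bandwidth figures in the rumor are just bus width times effective data rate. A quick sanity check in Python (a minimal sketch; the ~1.8 GHz effective GDDR3 rate for G80 is assumed here purely so the result lines up with the ~86 GB/s quoted above):
Code:
# Peak theoretical memory bandwidth: bus width (bits) / 8 * effective data rate (GT/s)
def bandwidth_gbs(bus_width_bits: int, effective_rate_gtps: float) -> float:
    return bus_width_bits / 8 * effective_rate_gtps

# Rumored R600: 512-bit bus, 2.5 GHz effective GDDR4
print(bandwidth_gbs(512, 2.5))   # 160.0 GB/s -- matches the quoted figure

# G80: 384-bit bus; ~1.8 GHz effective GDDR3 assumed to match the ~86 GB/s above
print(bandwidth_gbs(384, 1.8))   # ~86.4 GB/s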
if it's true it's a beast.
Free 4xAA is tasty. :cool:
hmm, looks a little too good, can it really be ~50% faster than G80?
If it's as fast as the specs look, nvidia is BEYOND screwed...
Damn, that's impressive. If true, I guess I will have to buy an AMD product.
If this is true, would G80 SLI be competitive with one R600? I read the G80 info and it's 128 shaders. But keep in mind that 64 R600 unified shaders would be the equivalent of 192 shaders.
If this is true it's going to be a massive performer. How much $$$?
This card will be a monster folding beast and a hell of a good video card :D
no, they are SUPER screwed lmao!!! :fact:
Quote:
Originally Posted by afireinside
if the nvidia G80 has 700+ million transistors, you're talking about a heat output of over 350 watts!!!
jeez, that means it will need to draw more than 450 watts per card...
new PSUs won't work that well
Holy crap! 192 shaders Oo!
Quote:
Originally Posted by Turtle 1
but it's unified, remember that, so it can have 16 pipes and 176 pixel shaders, or 176 pipes and 16 shaders :slobber:
Quote:
Originally Posted by fhpchris
WTF that can't be right, GDDR4 at 2.5 GHz??
With so much bandwidth I wonder if it's even necessary to have 1024 MB, that will just be pushing the price sky high, 512 MB would more than beat Nvidia's 7xx MB in performance anyway.
On a side note, if this is true, yes Nvidia is in some dookie
???
Quote:
Originally Posted by Dimitriman
GDDR4 is tested to be able to go to 3 GHz effective (1.5 GHz DDR clock);
1250 MHz is nothing!!!
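Just to spell out the clock arithmetic (a minimal sketch of the DDR doubling, not a claim about actual GDDR4 bins): the "effective" rate is twice the real memory clock.
Code:
# DDR-family memory transfers on both clock edges, so effective rate = 2 x base clock.
def base_clock_mhz(effective_rate_mhz: float) -> float:
    return effective_rate_mhz / 2

print(base_clock_mhz(2500))  # 1250.0 MHz -- the rumored R600 spec (2.5 GHz effective)
print(base_clock_mhz(3000))  # 1500.0 MHz -- the 3 GHz effective GDDR4 mentioned above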
I see, sounds nice, but I still think 1024 MB is just too expensive and unnecessary since it will perform so fast anyway.
Quote:
Originally Posted by rpg711
Agreed, especially since the X1950s already have 1 GHz GDDR4.
Quote:
Originally Posted by rpg711
They talk about free 4xAA, but apparently they don't take the eDRAM approach, so it isn't necessarily free 4xAA but rather so much RAM, so fast, that 4xAA can be considered free, from what I understood.
Doesn't look too good for nVidia now, does it :p:
Quote:
Originally Posted by flopper
As it stands at the moment, from looking at specs, nVidia has no chance. There are SO many variables though, like whether ATi has enough stock or not, and whether or not they can keep up with supply and demand. They should be able to though, given the fact that AMD owns them now and should be able to provide the necessary materials and funds to make things happen quickly. It also depends on how much heat this will put out, power consumption, real-world benching, drivers, etc.
You know, I might just wait for this, as my GeForce 7800GTX SLI setup is not doing too bad even in graphics-intensive games like Oblivion.
but ATI + Intel don't really have any future in multi-GPU, as RD600 will probably be the last chipset ATI will make for Intel.
haha dimitriman, i have links to that picture in ur avatar :P even videos
looks like the competition between nVidia and ATi isn't dead yet :D
it all depends on the GPU! 700+ million transistors is a lot of space and nvidia could have something up their sleeve. and if the R600 has 64 shaders and the nvidia has 128, then I think it is an even match or maybe even in nvidia's favor, because 128 shaders is a lot of power!!! and there is no point to having so much unused bandwidth, so only time will tell if all this bandwidth is a waste. looks to be a beast tho :D
Where are you getting your numbers from... one unified shader can't have 3 unified shaders tied to it. If it only has 64 unified shaders, it'll be just that... only 64 unified shaders. With a 500 million transistor budget, 192 shaders is a physical impossibility.
Quote:
Originally Posted by Turtle 1
Either way, it sounds like ATi isn't out of the fight yet, but nVidia is going to get the Christmas shoppers' money regardless.
2 of these, a Kentsfield, an RD600, and 4 gigs of D9 RAM, and I'm set.
post some vids, haven't seen many yet :P
Quote:
Originally Posted by theteamaqua
I think people are drawing the 192 number from the fact that the R580 has 16 pixel pipelines and 48 shaders, so they naturally assume to multiply the number by 3 to get the total number of unified shaders? Just a thought...
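To make that guess explicit (just the thread's speculation written out, not how unified shaders actually work, as the post above points out):
Code:
# Speculative derivation of the "192 shaders" figure floating around this thread:
# R580 pairs 48 pixel shaders with 16 pixel pipelines (3 per pipeline), and some
# posters scale the rumored 64 unified R600 shaders by that same ratio.
r580_pixel_shaders = 48
r580_pixel_pipelines = 16
ratio = r580_pixel_shaders // r580_pixel_pipelines   # 3

rumored_r600_unified_shaders = 64
print(rumored_r600_unified_shaders * ratio)          # 192 -- speculation, not a real spec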
yeah for about a year, then you'll have to upgrade again :p:
Quote:
Originally Posted by ettis
You the man, and you are correct.
Quote:
Originally Posted by rpg711