Kinda interesting:
http://forums.legitreviews.com/about15370.html
EDIT: just realized this is the same test mentioned in the Radeon 4000 Reveal sticky.
I'm as excited about the 4870s as anyone else, but I'm not holding my breath for a single 4870 to be faster than a 9800 GX2. Take all leaked tests with a grain of salt; anyone can make a fake graph.
The scores are looking sexy... but I can't wait until some prototypes are released to review sites for real testing =)
Those are 100% fake.
Could it be real? A single 4870 beating the 9800 GX2 by that much? No, not really.
BTW, remember the site with the supposed pre-release benchmarks (prior to ATi's HD2900XT launch) featuring the 8800GTX and the then-unreleased HD2900XT in Crysis (they claimed to have a beta) and 3DMark06, which were proven to be fake?
It was the same site.
I hate to say this, but I agree. In 3DMark that seems impossible with an 800MHz GPU (that's not even the 4850's projected clock), and GDDR5 can't make that much of a difference with only a 3GHz CPU. Those 8.5-10k SM2 and SM3 scores just look wrong.
Also, Crysis doesn't make use of all the shaders in a 3870, so a 4870 wouldn't use the bonus shaders either. This is unbelievably fake.
Wait, I thought one of the biggest reasons the HD2k and HD3k series sucked at playing Crysis was that they were limited by their texture units. nVidia has a lot more texture units, which is why it shines in Crysis (and the game has been better optimized for nVidia than for ATI's GPUs). The HD4850/4870 have twice the texture units, far more shading power, and higher clock speeds, so they're bound to be at least twice as powerful as the HD3850/3870, which would make them faster than any single nVidia GPU, though the GX2 should still be faster.
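If you want to put rough numbers on that kind of "twice the units means twice the speed" reasoning: a game only scales with whichever resource it's actually bottlenecked on. A quick back-of-the-envelope sketch in Python (all the ratios and fps below are made up for illustration, not real specs for any of these cards):

```python
# Back-of-the-envelope scaling: fps bounded by several resources
# improves only as much as the resource that improves the least.
# All numbers below are hypothetical placeholders.

def scaled_fps(base_fps: float, resource_ratios: dict) -> float:
    """Estimate new fps as old fps times the smallest improvement ratio."""
    bottleneck = min(resource_ratios, key=resource_ratios.get)
    return base_fps * resource_ratios[bottleneck]

ratios = {
    "texture_units": 2.0,   # rumored 2x TMUs
    "shader_power":  2.5,   # rumored shader increase
    "memory_bw":     1.8,   # rumored GDDR5 bandwidth uplift
}

# Hypothetical 20 fps on the old card: the gain caps out at 1.8x
# because memory bandwidth becomes the new bottleneck.
print(scaled_fps(20.0, ratios))  # 36.0
```

So "2x the texture units" only buys 2x the fps if textures stay the bottleneck the whole way.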
Too bad those aren't the real scores :mad:
NV has more filtering per shader (much higher clocks), not more TMUs; its shaders run at higher speeds, and Crysis is made for high-speed shaders (and may or may not limit the total number of usable shaders).
As of now NV and ATI have the same GPixels/s, but NV can run texture filtering off a different clock, so they get about 2x the texture filtering, since the shader clock that textures run at is roughly 3x higher than the pixel clock.
This isn't from more shading; it's from having a card full of vertex shaders that can downclock and do pixel shading, rather than a true "stream" processor. That's why the G92 isn't DX10.1.
The RV770 will counter this by upping the TMUs per ROP, but if software still has problems using more than 128 shaders, then NV-sponsored games (the Unreal Engine specifically) will still have lower FPS than non-NV games, apart from Assassin's Creed until it gets patched.
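For anyone who wants to sanity-check that clock-domain argument: theoretical texel fillrate is just texture units times the clock they run at. A quick Python sketch with made-up placeholder numbers (not the real specs of any of these cards) shows how a separate, faster filtering clock can double throughput even when the unit counts match on paper:

```python
# Theoretical texel fillrate: fillrate = texture_units * clock.
# All numbers are hypothetical, just to show the arithmetic.

def texel_fillrate_gtexels(tmus: int, clock_mhz: float) -> float:
    """Theoretical texel fillrate in GTexels/s."""
    return tmus * clock_mhz / 1000.0

# Hypothetical card A: TMUs run at the core clock.
card_a = texel_fillrate_gtexels(tmus=16, clock_mhz=775)

# Hypothetical card B: same TMU count, but filtering runs off a
# 2x faster clock domain, so fillrate doubles with identical specs
# on paper.
card_b = texel_fillrate_gtexels(tmus=16, clock_mhz=1550)

print(f"card A: {card_a:.1f} GTexels/s")  # 12.4
print(f"card B: {card_b:.1f} GTexels/s")  # 24.8
```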
Neither did I....
And I never heard that they stopped doing it.
Wow, some people are really bored, making charts and going to the trouble of putting out fake stuff.