It's easy to enhance performance if you can sacrifice other things...
All modern tests have an IQ component, so cheating on IQ is the surest way to get caught. It may be a demo bug or a driver issue, but for some it seems easier to call it cheating.
Would you give it a rest already? That's the most broken-record statement of the year. Do you need the approval of others to validate your own opinion of the R600 before mature drivers are released? Also, your statement is purely one from the Nvidia fan camp. Besides, what IQ test? Do you have a link?
I have no clue what you are talking about, so I assume you mean the Call Of Juarez test.
Non-mouse over:
http://tertsi.users.daug.net/temp/R600/iq/coj_ati.jpg
http://tertsi.users.daug.net/temp/R600/iq/coj_nv.jpg
Mouse-over:
http://tertsi.users.daug.net/temp/R6...ia_vs_ati.html
Article:
http://www.legitreviews.com/article/504/2/
The "mature drivers" thing is getting annoying. How do you define drivers to be mature? When ATI releases an official version? Why am I even asking; no one is going to agree on a definition of mature....
Don't start that crap with me. Current drivers don't show the potential of the card yet, bottom line. The term "mature" is not relative. It has a very specific meaning. Therefore, know what you are talking about before you start blathering.
As for the photos:
- He clearly states he has pre-alpha drivers (but doesn't mention which version). Again, the drivers are NOT MATURE!
- The image test looks like the mipmap detail level was set to Quality and not High Quality. I find myself having to change this with each driver update.
- He clearly states:
Quote:
When it comes to image quality we installed both drivers and without adjusting any of the settings in the control panel jumped right into the benchmark to see how they did.
But he never shows a pic of those settings and never explains what they were at the time he took those photos. Not only do we not know what those adjustments were between the R600/G80, we have no control example to benchmark against at other mipmap/AA/AF settings.
In all, this is a pure example of FUD through photos. There is no supporting documentation (regarding those photos) to explain how he arrived at that conclusion. People taking these photos at face value without asking questions first is the real problem here. Why wouldn't you ask questions? It's a freaking 3-page (short) review of the HD 2900 XT.
I'll wait for overclocked partner cards and Catalyst 8.38 before making a judgement. I'm thinking of buying the HD2900XT; it has about the same performance as the GTS and still has headroom for driver improvement, so I guess it should be a better card to buy than the GTS. This card shows potential. It is not as good as the GTX, but it can come close with driver improvements (I hope).
I have only had NVIDIA cards (GeForce 2 MX400, GeForce 5600XT and GeForce 7300LE), but now I'm thinking of switching to AMD.
I waited 5 or 6 months for this card and hoped for a lot more, but it is not that bad for the money it costs.
As for the reviews, some sound fishy to me. I don't know what to believe anymore, tbh.
Any reviews of the HD2600XT? How does it compare to the X1950Pro?
The 8.37.4.2 drivers are alpha and have numerous issues. They do give a glimpse of how ATI/AMD will optimize this card. The 8.38b2 drivers are based on them and offer further refinements, but CrossFire/OpenGL is not working right under Vista, AVIVO is not working correctly with Blu-ray/HD-DVD, and IQ is not up to par in several areas. Unless you need them to show off some nice 3DMark scores, I would wait for the next release, which is scheduled for 5/23. The 8.39s are in alpha testing now.
The card was delayed because they wanted to release all the cards at the same time.
They were spitting out 65nm cards and there was no problem whatsoever with silicon or software; that is what AMD said.
So we waited all this time and they only launched this one card anyway. People were saying it would be worth the wait because we would get something extra that everyone wanted. WTF was that? HL2?
This sux so much! Or is it AMD that sux? I'm switching sides. I'm now an official Intel and Nvidia fanboy idiot.
I received this today :
http://aycu26.webshots.com/image/166...1661763_rs.jpg
tests tomorrow
regards
No one in their right mind would upgrade until there is value in having DX10, let alone a quad scenario anyway - wait until September.
No, wait until 2008. No, fall 2009, or better yet 2010... Everyone needs to stop playing the waiting game. THERE IS ALWAYS SOMETHING BETTER A FEW MONTHS FROM NOW IN THIS HOBBY OF OURS!!!
If the x2600 uses a ring bus similar to the x2900's, has 4 memory chips rather than 8, and still uses one ring stop for every 2 memory chips, it would be more efficient than the x2900, thus making up for the large drop in bus width.
I wouldn't hold your breath on it, though.
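For rough context, here is a back-of-envelope look at the raw bandwidth gap that any ring-bus efficiency gain would have to offset. The bus widths are the commonly cited 512-bit (x2900) and 128-bit (x2600); the memory clocks are assumptions for illustration, not confirmed specs.

```python
# Back-of-envelope peak memory bandwidth (illustrative only).
# Bus widths: 512-bit (x2900) vs 128-bit (x2600); the memory clocks below
# are assumed values for illustration, not confirmed specifications.

def peak_bandwidth_gb_s(bus_width_bits, mem_clock_mhz, transfers_per_clock=2):
    """Peak theoretical bandwidth in GB/s for a DDR-style memory interface."""
    bits_per_second = bus_width_bits * mem_clock_mhz * 1e6 * transfers_per_clock
    return bits_per_second / 8 / 1e9

cards = {
    "x2900 (512-bit, 828 MHz assumed)": (512, 828),
    "x2600 (128-bit, 700 MHz assumed)": (128, 700),
}

for name, (width, clock) in cards.items():
    print(f"{name}: ~{peak_bandwidth_gb_s(width, clock):.0f} GB/s peak")
```

Even with a more efficient ring, the narrower bus starts from roughly a quarter of the raw bandwidth (less, if the memory is also clocked lower), which is why I wouldn't expect it to close the gap entirely.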
There is another difference in the 8600-8800 versus 2600-2900 relationship. The 8800 has 6 shader clusters of 16 shaders each (96 shaders). The 8600 has 2 shader clusters of 16 shaders each (32 shaders). The 2900, however, has 4 shader clusters of 16 shaders each (64 shaders), but the 2600 has a total of 24 shaders. Apparently it has 2 shader clusters of 12 shaders each, or possibly even 4 clusters of 6 each. The reduction of shaders per cluster has an impact on performance; whether positive or negative, I do not know. It could also be so small it's unnoticeable....
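As a quick sanity check on the figures quoted above (the 2600's actual cluster layout is not confirmed; both options below are just the guesses from this post), the totals work out like this:

```python
# Total shaders = clusters * shaders per cluster, using the figures quoted
# in the post above. The 2600 layouts are the post's guesses, not confirmed.

configs = {
    "8800 (as quoted)": (6, 16),
    "8600 (as quoted)": (2, 16),
    "2900 (as quoted)": (4, 16),
    "2600, guess A":    (2, 12),
    "2600, guess B":    (4, 6),
}

for name, (clusters, per_cluster) in configs.items():
    print(f"{name}: {clusters} x {per_cluster} = {clusters * per_cluster} shaders")
```

Either layout gives the 2600 three-eighths of the 2900's shader count, versus one-third for the 8600 relative to the quoted 8800 figure.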
I got my own numbers...
QX6700 @ 3.4Ghz
Asus P5K Deluxe
2x1GB Gskill PC8000HZ
Forceware 160.03 for NVIDIA card
6.87.4.2 for ATI Card
Quote:
Asus EAH2900XT
3DMark 2001: 55524
3DMark 2003: 37779
3DMark 2005: 19830
3DMark 2006: 12041
Note: AA16X = 8x AA with the Wide Tent filter (16 samples)
FEAR
1024x768 AA4X, AF 16X Tot max: 92 FPS
1024x768 AA16X, AF16X Tot max: 49 FPS
Company of Heroes
1680x1050 All max, no AA: 91 FPS
1680x1050 All max, AA16X: 55 FPS
Supreme Commander
1680x1050 high quality, AA4X AF16X: 45 FPS
1680x1050 high quality, AA16X AF16X: 26 FPS
Lost Coast
1680x1050, AA8X AF16X HDR FULL: 76 FPS
1680x1050, AA16X AF16X HDR FULL: 32 FPS
Episode One
1680x1050, AA4X AF16X HDR FULL: 128 FPS
1680x1050, AA16X AF16X HDR FULL: 39 FPS
Oblivion Indoor
1680x1050, AA4X AF16X HDR FULL IQ MAX: 54 FPS
1680x1050, AA16X AF16X HDR FULL IQ MAX: 39 FPS
Far Cry 1.4beta (Regulator)
1680x1050, AA4X AF16X IQ MAX: 77 FPS
1680x1050, AA16X AF16X IQ MAX: 28 FPS
GRAW
1680x1050, AA4X AF16X IQ MAX: 64 FPS
1680x1050, AA16X AF16X IQ MAX: 64 FPS (I guess AA16X wasn't applied correctly)
STALKER
1680x1050, AA MAX (ingame) AF16X IQ MAX HDR FULL: 28 FPS
Quote:
Asus 8800 Ultra
3DMark 2001: 58734
3DMark 2003: 43949
3DMark 2005: 19369
3DMark 2006: 13884
FEAR
1024x768 AA4X, AF 16X Tot max: 156 FPS
1024x768 AA16X, AF16X Tot max: 145 FPS
Company of Heroes
1680x1050 All max, no AA: 113 FPS
1680x1050 All max, AA16X: 86 FPS
Supreme Commander
1680x1050 high quality, AA4X AF16X: 71 FPS
1680x1050 high quality, AA16X AF16X: 62 FPS
Lost Coast
1680x1050, AA8X AF16X HDR FULL: 127 FPS
1680x1050, AA16X AF16X HDR FULL: 115 FPS
Episode One
1680x1050, AA4X AF16X HDR FULL: 178 FPS
1680x1050, AA16X AF16X HDR FULL: 128 FPS
Oblivion Indoor
1680x1050, AA4X AF16X HDR FULL IQ MAX: 113 FPS
1680x1050, AA16X AF16X HDR FULL IQ MAX: 102 FPS
Far Cry 1.4beta (Regulator)
1680x1050, AA4X AF16X IQ MAX: 130 FPS
1680x1050, AA16X AF16X IQ MAX: 112 FPS
GRAW
1680x1050, AA4X AF16X IQ MAX: 102 FPS
1680x1050, AA16X AF16X IQ MAX: 95 FPS
STALKER
1680x1050, AA MAX (ingame) AF16X IQ MAX HDR FULL: 54 FPS
Quote:
Power Consumption
GTS 320: 244W idle 437W full
GTX: 226W idle 447W full
Ultra: 259W idle 484W full
HD2900XT: 225W idle 502W full
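These read as total system draw at the wall with the overclocked QX6700 above, not card-only power (that is my reading of the figures, not something stated in the post). With that assumption, the load-minus-idle delta gives a rough per-card comparison:

```python
# Load-minus-idle deltas from the wall-power figures quoted above.
# These are whole-system readings; the delta only approximates the card's
# contribution if the rest of the system draws the same in both states.

readings_w = {  # card: (idle, full load), watts at the wall
    "GTS 320":  (244, 437),
    "GTX":      (226, 447),
    "Ultra":    (259, 484),
    "HD2900XT": (225, 502),
}

for card, (idle, full) in readings_w.items():
    print(f"{card}: full - idle = {full - idle} W")
```

By that crude measure the HD2900XT adds roughly 50 W more under load than either the GTX or the Ultra, and about 85 W more than the GTS 320, which is what people are reacting to.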
Why did you put the 2900XT up against an Ultra? We already know it can NEVER compete with an Ultra...
@KrampaK, the new Wide Tent filter gives the same image quality as traditional 8x AA, and you get fewer FPS with the Wide Tent filter. Try running the same games with traditional 8x AA and see if the FPS go UP.....
BTW: Windows Vista or XP ???
regards
Krampak, where do you get these numbers from?
Is that your own experience?
AMD Cautioned Reviewers On DX10 Lost Planet Benchmark
http://www.vr-zone.com/
Quote:
Tomorrow Nvidia is expected to host new DirectX 10 content on nZone.com in the form of a “Lost Planet” benchmark. Before you begin testing, there are a few points I want to convey about “Lost Planet”. “Lost Planet” is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for. The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity explore how the benchmark uses our hardware and optimize in a similar fashion. Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game.
I thought the delay was so they could launch the entire family at once. BTW, what happened to "we'll launch 10 DX10 GPUs"?
That power consumption is insane. Can you stop OC'ing and test the power usage then?
GRAW uses the same (type of) engine as STALKER. If you're using the "Full Dynamic Lighting" option (which I can only assume you are), then any AA settings you have in your control panel have no effect. The in-game settings only provide marginal quality improvement, so you should also run both games with AA off.
There are more software bottlenecks for the R600 than just the drivers ATI writes. The R600's shaders are so different that current DirectX can't properly utilize them.
I reckon we'll see noticeable gains as soon as Microsoft updates DirectX and, more importantly, optimises its HLSL (high-level shader language) compiler for the shaders found in the R600.
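For a sense of why shader-compiler quality matters so much here: each of R600's 64 shader units is 5 wide (hence the 320 stream processors), and realized throughput depends on how many of those five slots get filled with useful work each clock. Whether the packing happens in Microsoft's HLSL compiler or in ATI's driver back end, the arithmetic is the same; the packing figures below are invented purely for illustration.

```python
# Toy model of slot utilization on a 5-wide shader unit (as in R600),
# with invented packing figures purely for illustration.

SLOTS_PER_UNIT = 5   # each R600 shader unit can issue up to 5 scalar ops per clock
UNITS = 64           # 64 units x 5 slots = 320 stream processors

def effective_ops_per_clock(avg_slots_filled):
    """Useful scalar ops per clock across the whole shader core."""
    return UNITS * avg_slots_filled

for avg_filled in (2.0, 3.5, 5.0):   # hypothetical compiler packing quality
    print(f"avg {avg_filled}/5 slots filled -> "
          f"{effective_ops_per_clock(avg_filled):.0f} ops/clock "
          f"({avg_filled / SLOTS_PER_UNIT:.0%} of peak)")
```

Going from an average of 2 filled slots to 3.5 would be a 75% throughput increase without touching the hardware, which is the kind of headroom driver and compiler updates are being counted on to unlock.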