It's easy to enhance performance if you can sacrifice other things...
All modern tests have an image quality (IQ) part, so cheating on IQ is the surest way to get caught. It may be a demo bug or a driver issue, but for some it seems easier to call it cheating.
Would you give it a rest already? That's the most broken-record statement of the year. Do you need the approval of others to validate your own opinion of the R600 before mature drivers are released? Also, your statement is purely one from the Nvidia fan camp. Besides, what IQ test? Do you have a link?
I have no clue what you are talking about, so I assume you mean the Call Of Juarez test.
Non-mouse over:
http://tertsi.users.daug.net/temp/R600/iq/coj_ati.jpg
http://tertsi.users.daug.net/temp/R600/iq/coj_nv.jpg
Mouse-over:
http://tertsi.users.daug.net/temp/R6...ia_vs_ati.html
Article:
http://www.legitreviews.com/article/504/2/
The "mature drivers" thing is getting annoying. How do you define drivers to be mature? When ATI releases an official version? Why am I even asking; no one is going to agree on a definition of mature....
Don't start that crap with me. Current drivers don't show the potential of the card yet, bottom line. The term "mature" is not relative. It has a very specific meaning. Therefore, know what you are talking about before you start blathering.
As for the photos:
-He clearly states he has pre-alpha drivers (but doesn't mention which version). Again, the drivers are NOT MATURE!
-The image test looks like the bitmap was set to Quality and not High Quality. I find myself having to change this with each driver update.
-He clearly states:
Quote:
When it comes to image quality we installed both drivers and without adjusting any of the settings in the control panel jumped right into the benchmark to see how they did.
But he never shows a pic of those settings and never explains what those settings were at the time he provided those photos. Not only do we not know what those adjustments were between R600/G80, we have no control example to bench against at other bitmap/AA/AF settings.
In all, this is a pure example of FUD through photos. There is no supporting documentation (regarding those photos) to explain how he arrived at that conclusion. People taking these photos at face value without asking questions first is the real problem here. Why shouldn't you ask questions? It's a freaking 3-page (short) review of the HD 2900 XT.
I'll wait for overclocked partner cards and Catalyst 8.38 before making a judgement. I'm thinking of buying the HD2900XT; it has about the same performance as the GTS and still has headroom for driver improvement, so I guess this should be a better card to buy than the GTS. The card shows potential. It is not as good as the GTX, but could come close to it with driver improvements (I hope).
I have only had NVIDIA cards (GeForce 2 MX400, GeForce 5600XT and GeForce 7300LE), but now I'm thinking of switching to AMD.
I waited 5 or 6 months for this card and hoped for a lot more, but it is not that bad for the money it costs.
As for the reviews, some sound fishy to me. I don't know what to believe anymore, tbh.
Any reviews of the HD2600XT? How does it compare to the X1950Pro?
The 8.37.4.2 drivers are alpha and have numerous issues. They do give a glimpse of how ATI/AMD will optimize this card. The 8.38b2 drivers are based on that release and offer further refinements, but CrossFire/OpenGL is not working right under Vista, AVIVO is not working correctly with Blu-ray/HD-DVD, and IQ is not up to par in several areas. Unless you need them to show off some nice 3DMark scores, I would wait for the next release, which is scheduled for 5/23. The 8.39s are in alpha testing now.
The card was delayed because they wanted to release all the cards at the same time.
They were spitting out 65nm cards and there was no problem whatsoever with silicon or software; that's what AMD said.
So we waited all this time and they only launched this card anyway. People were saying it would be worth the wait because we would get something extra that everyone wanted. WTF was that? HL2?
This sucks so much! Or is it AMD that sucks? I'm switching sides. I'm now an official Intel and Nvidia fanboy idiot.
I received this today :
http://aycu26.webshots.com/image/166...1661763_rs.jpg
tests tomorrow
regards
No one in their right mind would upgrade until there is value in having DX10, let alone a quad scenario anyway - wait until September.
No, wait until 2008; no, fall 2009, or better yet 2010... Everyone needs to stop playing the waiting game. THERE IS ALWAYS SOMETHING BETTER A FEW MONTHS FROM NOW IN THIS HOBBY OF OURS!!!
If the x2600 uses a ring-bus similar to the x2900's, has 4 memory chips rather than 8, and still uses one ring-stop per 2 memory chips, it would be more efficient than the x2900, thus making up for the large drop in bus width.
I wouldn't hold your breath on it though.
There is another difference between the 8600-8800 and the 2600-2900 relationship. The 8800 has 6 shader clusters of 16 shaders each (96 shaders). The 8600 has 2 shader clusters of 16 shaders each (32 shaders). However, the 2900 has 4 shader clusters of 16 shaders each (64 shaders), while the 2600 has 24 shaders in total. Apparently it has 2 shader clusters of 12 shaders each, or possibly even 4 shader clusters of 6 each. The reduction of shaders per cluster has an impact on performance; whether positive or negative I do not know. It could also be so small it's unnoticeable....
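For anyone keeping score, here is a minimal Python sketch that just tallies the shader configurations described in the post above; the 2600 split is that poster's guess, not a confirmed spec.
Code:
# Tally of the shader configurations as stated above (the 2600's exact
# cluster split is speculation, not a confirmed spec).
configs = {
    "8800 (6 clusters)": (6, 16),
    "8600":              (2, 16),
    "2900":              (4, 16),
    "2600 (guess A)":    (2, 12),
    "2600 (guess B)":    (4, 6),
}

for card, (clusters, per_cluster) in configs.items():
    print(f"{card:18s}: {clusters} x {per_cluster} = {clusters * per_cluster} shaders")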
I got my own numbers...
QX6700 @ 3.4Ghz
Asus P5K Deluxe
2x1GB Gskill PC8000HZ
Forceware 160.03 for NVIDIA card
6.87.4.2 for ATI Card
Quote:
Asus EAH2900XT
3DMark 2001: 55524
3DMark 2003: 37779
3DMark 2005: 19830
3DMark 2006: 12041
AA16X = AA8X Filter Wide-Tent = 16 Samples
FEAR
1024x768 AA4X, AF 16X Tot max: 92 FPS
1024x768 AA16X, AF16X Tot max: 49 FPS
Company of Heroes
1680x1050 All max, no AA: 91 FPS
1680x1050 All max, AA16X: 55 FPS
Supreme Commander
1680x1050 high quality, AA4X AF16X: 45 FPS
1680x1050 high quality, AA16X AF16X: 26 FPS
Lost Coast
1680x1050, AA8X AF16X HDR FULL: 76 FPS
1680x1050, AA16X AF16X HDR FULL: 32 FPS
Episode One
1680x1050, AA4X AF16X HDR FULL: 128 FPS
1680x1050, AA16X AF16X HDR FULL: 39 FPS
Oblivion Indoor
1680x1050, AA4X AF16X HDR FULL IQ MAX: 54 FPS
1680x1050, AA16X AF16X HDR FULL IQ MAX: 39 FPS
Far Cry 1.4beta (Regulator)
1680x1050, AA4X AF16X IQ MAX: 77 FPS
1680x1050, AA16X AF16X IQ MAX: 28 FPS
GRAW
1680x1050, AA4X AF16X IQ MAX: 64 FPS
1680x1050, AA16X AF16X IQ MAX: 64 FPS (I guess AA16X wasn't applied correctly)
STALKER
1680x1050, AA MAX (ingame) AF16X IQ MAX HDR FULL: 28 FPS
Quote:
Asus 8800 Ultra
3DMark 2001: 58734
3DMark 2003: 43949
3DMark 2005: 19369
3DMark 2006: 13884
FEAR
1024x768 AA4X, AF 16X Tot max: 156 FPS
1024x768 AA16X, AF16X Tot max: 145 FPS
Company of Heroes
1680x1050 All max, no AA: 113 FPS
1680x1050 All max, AA16X: 86 FPS
Supreme Commander
1680x1050 high quality, AA4X AF16X: 71 FPS
1680x1050 high quality, AA16X AF16X: 62 FPS
Lost Coast
1680x1050, AA8X AF16X HDR FULL: 127 FPS
1680x1050, AA16X AF16X HDR FULL: 115 FPS
Episode One
1680x1050, AA4X AF16X HDR FULL: 178 FPS
1680x1050, AA16X AF16X HDR FULL: 128 FPS
Oblivion Indoor
1680x1050, AA4X AF16X HDR FULL IQ MAX: 113 FPS
1680x1050, AA16X AF16X HDR FULL IQ MAX: 102 FPS
Far Cry 1.4beta (Regulator)
1680x1050, AA4X AF16X IQ MAX: 130 FPS
1680x1050, AA16X AF16X IQ MAX: 112 FPS
GRAW
1680x1050, AA4X AF16X IQ MAX: 102 FPS
1680x1050, AA16X AF16X IQ MAX: 95 FPS
STALKER
1680x1050, AA MAX (ingame) AF16X IQ MAX HDR FULL: 54 FPS
Quote:
Power Consumption
GTS 320: 244W idle 437W full
GTX: 226W idle 447W full
Ultra: 259W idle 484W full
HD2900XT: 225W idle 502W full
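Keep in mind those figures are total system draw at the wall on an overclocked test rig, so only the idle-to-load deltas say much about the cards themselves. A minimal Python sketch of those deltas, assuming the rest of the system stays roughly constant between idle and load (PSU efficiency ignored):
Code:
# Load-minus-idle deltas from the wall-socket numbers quoted above.
# Total system draw, so the delta only approximates each card's own swing.
readings = {
    "GTS 320":  (244, 437),
    "GTX":      (226, 447),
    "Ultra":    (259, 484),
    "HD2900XT": (225, 502),
}

for card, (idle, full) in readings.items():
    print(f"{card:9s}: idle {idle} W, load {full} W, delta {full - idle} W")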
Why did you put the 2900XT against an Ultra? We already know it can NEVER compete with an Ultra...
@KrampaK, the new Wide Tent filter gives the same image quality as traditional 8x AA, and you get fewer FPS with the Wide Tent filter. Try running the same games with traditional 8x AA and see if the FPS goes UP...
BTW: Windows Vista or XP ???
regards
Krampak, where do you get these numbers from?
Is that your own experience?
AMD Cautioned Reviewers On DX10 Lost Planet Benchmark
http://www.vr-zone.com/
Quote:
Tomorrow Nvidia is expected to host new DirectX 10 content on nZone.com in the form of a “Lost Planet” benchmark. Before you begin testing, there are a few points I want to convey about “Lost Planet”. “Lost Planet” is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for. The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity explore how the benchmark uses our hardware and optimize in a similar fashion. Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game.
I thought the delay was so they could launch the entire family at once. BTW, what happened to "we'll launch 10 DX10 GPUs"?
That power consumption is insane. Can you stop OC'ing and test the power usage then?
GRAW uses the same (type of) engine as STALKER. If you're using the "Full Dynamic Lighting" option (which I can only assume you are), then any AA settings you have in your control panel have no effect. The in-game settings only provide marginal quality improvement, so you should also run both games with AA off.
There are more software bottlenecks for R600 than just the drivers ATI writes. The shaders on R600 are so different that current DirectX can't properly utilize them.
I reckon we'll see noticeable gains as soon as Microsoft updates DirectX and more importantly, optimises the HLSL (high-level shader language) compiler in it for the shaders found in R600.
Guys... ATi's R600 lacks the hardware for AA Resolve.... Sadly this means AA is likely to always be slow on the part, and no driver will fix that because it's using shader power to do this job.
http://www.beyond3d.com/content/reviews/16/16
Quote:
Originally Posted by beyond3d
http://www.anandtech.com/video/showdoc.aspx?i=2988&p=31
Quote:
Originally Posted by anandtech
Due to this, it's unlikely the R600 will ever catch up with AA involved.
Finally, someone with some sense who actually read the architecture part of the reviews. The erratic performance of the 2900XT across several different game benchmarks is to be expected. It's simply the nature of the R600 architecture design.
People will believe whatever they're willing to believe, and it takes very hard work to change that. As an advertising professional, I'm thankful for that.
Quote:
Originally Posted by Periander6
hmmm so all that bandwidth is wasted since the card is poor at doing AA anyway?! No way...
Hardware processing has always been and will always be the fastest; however, it is also the most limited. Hence we have software controlling our processors instead of the computer being hard-coded.
The 2900XT has massive bandwidth. It's not like it's all wasted doing AA, but it definitely hurts; that applies to all GPU designs, though. The real problem with R600 lacking a dedicated part for AA in the render back-ends is that it will use shader processing power for it, as pointed out by the more in-depth reviews and DilTech.
Yep, so essentially it's hurting in bandwidth (which the HD 2900XT has more than enough of) AND shader power.
Essentially, it takes two hits for turning on AA instead of one. Definitely not a good thing, and it's not something drivers can technically fix either. It's a hardware problem.
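To picture what "resolving AA in the shaders" means, here is a toy Python sketch of a plain box-filter resolve, i.e. the per-pixel averaging that a dedicated resolve unit would otherwise do for free. It is only an illustration of the extra ALU work, not how R600 actually schedules it.
Code:
# Toy MSAA resolve: each screen pixel averages its N sub-samples.
# Done in the shaders, this is N multiply-adds per channel per pixel
# that fixed-function resolve hardware would not charge you for.

def box_resolve(subsamples):
    """subsamples: list of N (r, g, b) tuples for one pixel."""
    n = len(subsamples)
    return tuple(sum(channel) / n for channel in zip(*subsamples))

# One 4xMSAA pixel whose samples straddle a black/white edge:
pixel = [(1.0, 1.0, 1.0), (1.0, 1.0, 1.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(box_resolve(pixel))  # -> (0.5, 0.5, 0.5), a grey edge pixel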
Another Call of Juarez DX10 bench:
http://www.chip.de/bildergalerie/c1_...23.html?show=4
http://aycu08.webshots.com/image/162...9256431_rs.jpg
btw:
Quote:
ATI Drivers for Radeon HD 2900 Series
The following driver version 8.37.4.3 is provided as an update to the driver software that currently ships with the Radeon HD 2900 series products. The download file contains the display drivers and Catalyst Control Center. This driver applies to the following operating systems: Windows XP Home/Professional/Media Center Edition/Professional x64 Edition and Windows Vista 32/64-bit. For more information please refer to the release notes.
Source: AMD/ATI
http://forums.techpowerup.com/showthread.php?t=31181
regards
This is all very disheartening. I can't believe that anyone who bought an 8800 six months ago still isn't going to feel any buyer's remorse anytime soon. I honestly didn't see this happening. The confirmation that the 8800 performs MUCH better and has better IQ to boot is a hard thing to read, as I have always been very pro-ATI. I waited a long time for this day when both cards would compete head to head. All this talk about R600 having inherent hardware flaws, such as a lack of ROPs and a shoddy implementation of AA, is really making me feel quite disappointed in this card as a whole. I know it is my own fault, but I wasted a ton of precious time scouring every message board I could to gain every last bit of info on this card that I thought would be the end-all, be-all. I regret wasting all that time now. What makes it worse is that the 8800GTX continually coming to mind over this card is something I thought would never happen. Much better performance and technically better AF are winning me over, higher price or not. It seems people are just happy with the low(er) price of the 2900XT instead of what it can actually offer. The BFG 8800GTX OC2 is fairly cheap and less than a vanilla EVGA 8800GTX, which sweetens the deal.
One other thing that is really bothering me that perhaps someone could answer: why did ATI get rid of the 6xAA option? It has always looked VERY good on past cards, though it was kind of unusable due to the high performance hit. I was looking forward to using 6xAA on this card while essentially getting the same performance as 4xAA on my current X1900XTX, technically bumping up IQ with no real loss in performance compared to what I get now. Getting rid of the 6xAA option and jumping straight from 4xAA to 8xAA means there is no longer a middle ground for IQ vs. performance. Jumping from 4xAA to 8xAA yields a high performance hit, much like running 6xAA over 4xAA does on the current cards. Is there any technical reason they removed 6xAA, being the only vendor to offer it, or was it simply a choice they made that we have to live with?
What about the latency of the ring-bus design with 4 bus-stops?
It has 6xAA? http://www.beyond3d.com/content/reviews/16/10
tomorrow :thumbsup:
btw:
Armed Assault benchmark:
http://3d-center.de/images/radeon_hd..._armed_sg1.png
Noobie> 6xAA is not available on the HD 2900XT.
cantankerous> Don't cry, this is X1800-like. R650 is gonna fix this.
Keep in mind an X1900 is still very good in games ;)
I like the image quality test of that review : http://3d-center.de/artikel/radeon_h..._xt/index6.php
Depends on your definition of AA. I say AA is measured by the total number of samples taken for the filter to work with, which means 6xAA = 2x + Wide Tent or 4x + Narrow Tent.
Maybe it is just my CPU that is beginning to suck, but I still find FEAR doesn't play as well as I would like. On top of that, Oblivion etc. may run on an X1900, but you can surely see what a difference a faster card makes in a game like this.
EDIT... wow, the prices are already dropping. Directcanada had a retail box HIS for $479 yesterday, it is now $456. $23 drop in 24 hours. The Diamond and Powercolor variants have also dropped in price.
2950XTX will come no later than September :)
G80 brutally wipes the floor with R600 in DX10 Lost Planet: Extreme Condition:
http://www.legitreviews.com/article/505/1/
The game is just plain bad. No AA, Medium HDR, 4x AF, Medium shadow quality (dunno about other settings, screenshot doesn't show) and an 8800 GTX still doesn't do 60 FPS in the cave at a res of 1280x720. The only reason this could be happening is because soft shadows are on (which cuts FPS in half).
Also note how the 1600x1200 res doesn't drop the FPS on legit.
Check out http://www.pcgameshardware.de/?article_id=601352 as well.
An 8800 GTS performing only 4% slower than an 8800 Ultra. CPU bound? They appear to be testing with an AMD X2 4200+. And with that system the game drops under 5 FPS in the DX10 codepath with an 8800 GTS (the average is 26 FPS at that point).
PCGH SNOW:
2900 - 19
GTS - 39
GTX - 54
Ultra - 61
PCGH CAVE:
2900 - 26
GTS - 47
GTX - 47
Ultra - 49
LEGIT SNOW:
2900 - 30
GTX - 91
LEGIT CAVE:
2900 - 21
GTX - 44
DX9 codepath (E6700 @ 3.4 GHz, 2 GB RAM, 1280x960, 4xFSAA, 8xHQ-AF):
x1950 pro - 15
x1950 XTX - 23
7900 GTX - 25
8800 GTS - 36 / 31 (DX-10)
8800 Ultra - 56 / 62 (DX-10)
Funny to see an 8800 GTS actually do worse in DX-10, whilst the Ultra does better.
BTW - Lost Planet originally appeared on X-BOX 360. The X-BOX 360 uses an ATI Xenos chipset. Specs of that chip are 500 MHz parent GPU & 48-way parallel floating-point dynamically-scheduled shader pipelines. More specs can be found @ http://en.wikipedia.org/wiki/Xenos
PS: 2 more hours until the demo arrives
Drivers drivers drivers :slapass:
Do you seriously think DAAMIT would come up with a card that can do no better than this?
That would be a REAAALY depressing story.
Anyway, I'll wait to conclude whether the 2900XT is worth its price until ATI/AMD comes up with some proper drivers, because the ones being used now are badly optimized, if optimized at all...
Even if it were just like the X1800 compared to the X1900, it would perform better than this...
Yeah, a game developed for Nvidia cards, with ATI running into serious bugs because they didn't get the chance to work with the game's developer.
So there's more than fanboy talk to talk about there.
Why didn't they run the game on the 8600/8500/8300?????
Also, when 95% of the Nvidia products being sold are 8600/8500/8300 cards that will run that DX10 game as a slideshow at 640x480, people will realise it's time to go back to the X1950 Pro and 7900 GS, which are far better in DX9 and cost $50 less.
Those cards will run DX10 about as well as an FX5200 runs DX9.
That bench only shows how 95% of the Nvidia graphics cards being sold just suck in DX9 relative to the 7900 and X1950 Pro, and suck so much in DX10 that even a highly optimized Nvidia game will run very ssssslllllloooowwwwww.
Does the HD2900XT GPU have dual-core/quad-core enhanced drivers like NVIDIA has?
No 6xAA on the 2900...now that's very disappointing.
That's one of my top 3 reasons why I came to like ATI cards better for gaming.
:(
I've had this discussion before. Last time I was stupid enough to talk. Now I present you with pics. Download http://images.anandtech.com/reviews/...ize-images.zip from http://www.anandtech.com/video/showdoc.aspx?i=2988&p=14
Pay no regard to the stupid naming scheme.
The ATI R600 has two 6xAA filters: 6xAA Narrow and 6xAA Wide. I find that 6xAA Narrow is better than 6xAA Wide. I also find that 6xAA Narrow is slightly better than the X1950's 6xAA. I also find that G80 has no 6xAA.
Filename:
R600 - 6xAA Narrow - n.6x.r600.jpg
R600 - 6xAA Wide - w.6x.r600.jpg
X1950 - 6xAA - 6xAA.x1950.jpg
I still managed to talk too much :stick:
Because 6x is not MSAA, it's "narrow CFAA" +4xMSAA, or "wide CFAA" +2xMSAA. The end result does not have the same quality, although it's quite close.
That's good news then.
Though it makes for a lot of choices, and it can be quite confusing... most people will just want to know which is best. I like 6x because it's a great quality setting without having to go to 8x and lose more performance. This also makes it impossible to directly compare to Nvidia at that setting.
Infernal LOOKS better on 2900xt than X1950 Crossfire...dunno about performance, although it felt smoother.
Someone please run the Lost Planet demo's in-game bench with the 2900.
For those interested in reading more reviews, x-bit posted a brief review.
http://xbitlabs.com/articles/video/d...hitecture.html
EDIT: Anyone able to confirm this?
Quote:
Originally Posted by xbitlabs
It is interesting to note that there are multiple versions of the card. some with the fets at the rear, some without. others have no markings on the die, but rather on the shim, and some with both markings on the die and shim.
Finally, bit-tech's review:
http://www.bit-tech.net/hardware/200...n_hd_2900_xt/1
This site's review also shows the 2900 sitting right in between the GTX and GTS:
http://www.hardware.info/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
I just wish it wasn't so power hungry, and that ATI made their version of the card instead of all the other MFG's.
To me, the quality of narrow CFAA + 4xMSAA looks better than the X1950's 6xAA (as I previously stated). The wide CFAA + 2xMSAA is worse in comparison though.
As for whether it is really 6x or not: 4xMSAA + narrow CFAA takes 6 samples to calculate the AA with, and the X1950 does the same. R600 in that mode uses 4 box samples and 2 non-box samples; the X1950 takes 6 box samples. Both have advantages and disadvantages. We just need to see it in motion in some games to determine which is better.
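Here is a rough Python sketch of the difference being described: an equal-weight resolve of the in-pixel samples versus a tent-style resolve that also blends in lower-weight samples borrowed from neighbouring pixels. The sample values and the 0.5 weight are made up for illustration; they are not ATI's actual filter kernel.
Code:
def weighted_resolve(samples):
    """samples: list of (value, weight) pairs for one pixel."""
    total_w = sum(w for _, w in samples)
    return sum(v * w for v, w in samples) / total_w

# 4 MSAA samples inside the pixel plus 2 borrowed from neighbouring pixels.
box_samples       = [0.8, 0.9, 0.7, 0.8]
neighbour_samples = [0.2, 0.3]

plain_4x = weighted_resolve([(v, 1.0) for v in box_samples])
cfaa_6x  = weighted_resolve([(v, 1.0) for v in box_samples] +
                            [(v, 0.5) for v in neighbour_samples])

print(f"plain 4xMSAA resolve:         {plain_4x:.3f}")
print(f"4x + narrow-tent (6 samples): {cfaa_6x:.3f}")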
No mention of what drivers were used... but heck, those results look very different from other reviews... New driver maybe?
Not quite sure what to make of it. They didn't seem to look at power consumption at all :/
'Bout the first site to fully support the card...
Quote:
Originally Posted by hardware.info
Got mine 10 minutes ago.
Temps idle (52°C):
http://img441.imageshack.us/img441/7351/tempsbg0.jpg
More tests later.
regards
Test it for us mascaras, hurry up! :)
Seems to be a beast in the Unreal 3 engine. And before you ask, this site is NOT biased; some of their other benches show the 2900 losing to the GTS.
http://www.matbe.com/images/biblio/a...0000057006.png
http://www.matbe.com/articles/lire/3...0-xt/page1.php
Another thing everyone should notice is that the AA issue is not universal, which should give optimists reason to believe the hardware is not broken and that it is just a driver issue. In the above review the 2900 takes a 50% hit in Tomb Raider with 4xAA, but in R6 Vegas only about a 10% hit.
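To be explicit about what that "hit" means, a tiny Python sketch (the FPS values below are placeholders, not the review's actual numbers):
Code:
# AA hit = how much of the no-AA framerate is lost when AA is enabled.
def aa_hit(fps_no_aa, fps_with_aa):
    return (1 - fps_with_aa / fps_no_aa) * 100

print(f"Tomb Raider-style hit: {aa_hit(100, 50):.0f}%")  # ~50% drop
print(f"R6 Vegas-style hit:    {aa_hit(100, 90):.0f}%")  # ~10% drop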
You've defended these cards right from the start, with good reason I think. Here is my own 3DMark03 score with 2x HD2900XT in CrossFire, with a QX6700 at 3600 MHz and the cards at default stock clocks:
http://gbwatches.com/pics/hd29003d03first.jpg
Everyone should notice that there is a less than 1 FPS drop at 2560x1900 when enabling 16xAF and 4xAA on the 2900. Everyone should remember that G80 had problems getting AA working in Vegas. Everyone should remember that TechReport (while using driver version 8.37.4.070419a-046506E) says that "the (Vegas) game engine doesn't seem to work well with multisampled antialiasing, so we didn't enable AA". Everyone should remember that MatBe uses driver version 8.37.4.2_47322. Everyone should remember that some reviews show the 2900XT coming very near (sometimes under, sometimes above) the 8800GTX. Everyone should remember that some reviews show the 2900XT performing way better than the 8800GTX at low res in Vegas.
I wish everybody would start to do HD video reviews of the benchmarks they run :(
To hell with stupid 3DMark benchmarks... Yes, ATI has a good score, but in games it is totally crappy... and in picture quality... a loser...
PS: Don't flame me about immature drivers... it is not true; what is immature is the design of the chip, not the drivers...
Bullsh*t.
Nobody's flaming about immature drivers. It's the only reason why it performs like utter crap in one game and is a very good performer in other games.
You need to stop claiming that this card sucks. If it still performs this badly when NEW DRIVERS come out, sure, you can say it sucks. But as long as there are no proper drivers, stop flaming.
Let's take a Core 2 Solo 4 GHz.
Let's take a Core 2 Duo 3 GHz.
There are some games where the Solo will win.
There are some games where the Duo will win.
Let's take a Core 2 Solo 4 GHz.
Let's take a Core 2 Quad 2 GHz.
There are some games where the Solo will crush the Quad.
There are some games where the Quad will crush the Solo.
From a post in the Bluesnews thread re. Lost Planet demo:
"This is already the second DX10 demo where nvidia owners have at least 3 times the framerate and better image quality than the HD2900XT fellows (CoJ was the first DX10 demo). I'm still waiting for the DX10 demo where ATI excels over nvidia like predicted by many ATI fans."
Um, OK. Can anyone confirm or deny this? No direct response to the guy's claim, nor any other info, is provided in that thread...
http://www.xtremesystems.org/forums/...d.php?t=144489
report results?
Neither do you, ECH, but I HAVE the card, and I say that most of the comments made by people are not true. The performance in games IS good, not bad, the QUALITY is GOOD, not bad, and each and every complaint made earlier either stems from users who were testing the cards, and were met with some issues, or from nV's FUD team. I could not be more happy with my cards.
Reviews on the web are not accurate...:fact:
It cannot be confirmed due to the generalization of the statement. For one, this is a push to downplay the HD 2900XT, because they know that sooner or later the drivers (releasing the week of 5-22-07) will mature, improving the HD overall. Although I still assume another driver release may be needed after 7.5 before we start seeing complete maturity and better results. Second, there is no indication of which G80 outperforms the HD. I've seen benchmarks that went both ways, even with immature drivers. Third, there are no photos that suggest that CoJ/LP have better image quality on G80 than on the HD, making this statement false.
You're saying all those reviews are bad/fake/wrong because you've got the card?
I don't think anyone is saying the card is bad or a waste of money, but compared to G80 it's not as good as everyone expected.
And this is their high end. When the 65nm version of this appears, that will be their next high end.
But think of G80 shrunk to 65nm; then ATI loses again.
So they need a new GPU already.
I think some people have too-high hopes for the drivers... it's like they keep pushing: if not the next driver, then the driver after that for sure!!! It's a continual push in denial. Let's face it, R600 is a bummer that's getting replaced by R650 in 2-3 months. Even AMD stated that in an interview. It should also be a big hint as to why there is no GT/XL/XTX version. It's simply a temporary "hotfix".
Got any proof that your comments are true?
Look, in a lot of situations X1950 CrossFire is faster than a single G80, and the HD2900XT is faster than X1950 CrossFire, so some conclusions can be made...
However, this GPU is superscalar. This means that the standard way of doing things does not apply any more. There are instances where a driver CANNOT HELP AT ALL... especially when it comes to DX9. DX10 can use load balancing, so I'm not so "unconfident" about how things will turn out.
And yes, Ubermann, when reviews say "colour is washed out", etc., and I do not see the same problem, then I call shens.
The fact of the matter is that I have been pretty much accurate about this card from the get-go, including the bit about us getting UFO only first, about the GPU being superscalar, about A LOT of things. This, to me, means that things are pretty "cut and dried" here, and everything else, all the complaints and worries, is FUD.
And no, I do not care about comparisons with G80. When DX10 comes out in full force, then I might... but if one card is not fast enough, I'll simply toss in another, thanks to the cost. I bought G80 when it first came out too, and the horrible driver bugs had me sell them right away. Until we have some applications that can properly measure the performance difference between G80 and R600 (in all types of rendering), no real comparisons can be made, as the fact of the matter is that because of this GPU's structure, drivers play a far more important role in DX9 than ANYONE thinks. :fact:
It clearly beats the 8800GTS in price/performance and... that's it. The high-high-end belongs to Nvidia.
And that makes you wonder how delayed the R650 is gonna be... OMG, I give up =)
But do you really understand how many cards get sold at this level? VERY few. I mean really: two companies merge, we got layoffs, etc., and a product line that has been delayed countless times due to bad choices and meddling from other companies. I think, really, AMD has done well.
BTW, this 2900XT is UFO; it is not the "top card". That card is not released yet. The fact of the matter is that even the ring bus in these cards is half disabled, due to the number of memory ICs and their size.
Oh, ECH, I never said I didn't agree with you; however, I don't see you with a card, so you are basing your comments on the same stuff that other guy was, which is why I responded. You look at the situation... well... you know what I think.