I don't mean any harm, but what you are saying is petty. The information on the video card is out there. It's clear that better drivers are needed. The G80 went through its own driver issues when it was released.
It doesn't beat the 8800GTS in price/performance. I see an eVGA 8800GTS 640MB for $329.99 after a $30 rebate; that is $100 cheaper than what the 2900 XTs are going for. And I wanted to buy a 2900 XT too, really did. Still kinda want to... but I don't want a repeat of what happened when I bought my X1800XL when it first came out - a couple of months later it was pretty much obsolete. It's as if the initial release is only to get something out the door, then they patch it up and re-release a much better and improved product. This time, I'm gonna wait and get me the 65nm version ;) (if and when it comes out... I might be waiting for a while)
The difference between 2900 & GTX in CoJ isn't that big. http://www.legitreviews.com/article/504/3/
The difference in Lost Planet is under 2x @ http://www.pcgameshardware.de/?article_id=601352
It's a bit higher @ http://www.legitreviews.com/article/505/3/
It should be noted that Lost Planet is unplayable on ATI because of disappearing objects, so what's the point anyway?
I don't even get the driver argument. Besides a few obvious bugs, its performance relative to its price is exactly where it should be. From looking at 20+ reviews, I have come to the conclusion that it sits very nicely between the GTS and GTX. And what do you know, its price is also between the GTS and GTX, leaning towards the GTS. In some cases it's as fast as the GTX, in some cases slower than the GTS, but that is expected. The OVERALL impression I got is that it performs right in the middle.
What happened was that everyone, for good reason, expected it to be a GTX killer. But even before release we knew it was going to cost around $400. I might be in the minority, but I instantly knew it was NOT going to perform at GTX levels. Why else sell it at that price? So I adjusted my expectations accordingly. The problem is that the vast majority still seem to be oblivious to what is right in front of their noses: a $400 video card, not a $550 one.
So to sum up, its performance is fine, driver bugs will be worked out, everyone be happy.
And to top it off you get three of the year's hottest games, plus a free G5 mouse you can sell on eBay for like $25, and you can sell the games for around $25-$30 too if you want.
I must confess that I was a little hesitant about image quality because of what a few reviews said (that it was not so good), but then I got my X2900 today and saw this:
X2900XT - Day of Defeat
http://img520.imageshack.us/img520/9191/dfed3.jpg
Better image than with my 8800GTS in my other system.
Temps playing DoD:
15:35:15, ASIC Temperature via LM64 on DDC3 I2C [0] (C) = 54.750, MCLK(MHz)[0] = 513.00, SCLK(MHz)[0] = 506.25
15:44:46, ASIC Temperature via LM64 on DDC3 I2C [0] (C) = 71.125, MCLK(MHz)[0] = 828.00, SCLK(MHz)[0] = 742.50
ASIC Temperature via LM64 on DDC3 I2C [0]
Minimum temperature: 54.375 C
Maximum temperature: 71.750 C
Average temperature: 58.601 C
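If anyone wants to double-check those stats against their own log, here's a minimal Python sketch; the log format is inferred from the lines above, and the "temps.log" filename is just an assumption:
Code:
import re

# Recompute min/max/average ASIC temperature from a monitoring log
# in the format posted above (the filename is an assumption).
temps = []
with open("temps.log") as log:
    for line in log:
        # Matches e.g. "... I2C [0] (C) = 54.750, MCLK(MHz)[0] = 513.00, ..."
        match = re.search(r"\(C\) = ([\d.]+)", line)
        if match:
            temps.append(float(match.group(1)))

if temps:
    print(f"Minimum temperature: {min(temps):.3f} C")
    print(f"Maximum temperature: {max(temps):.3f} C")
    print(f"Average temperature: {sum(temps) / len(temps):.3f} C")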
(btw: another free game with the X2900 :) )
regards
The HD2900XT doesn't have IQ issues. You read about it, but there are no comprehensive photo comparisons that suggest otherwise. As you can clearly see, the HD2900XT has IQ as good as or better than the 8800GTX. There is more talk than actual photo proof (using several examples) to show that IQ is a problem with the XT. The only thing that I question in your photo is that window; something about it doesn't look right. Can you check with your GTS to see if that window looks like that?
"Easy" way to resolve this is by using 3D Mark 2006 in 1280 * 1024 @ 6xAA & 16xAF mode on both cards, then letting it dump 900 individual frames using the image quality part of the prog, and compiling this into VC-1 HD movie.
Side to side comparisons would be nice =)
Btw do some benchies in Lost Planet and Call of Juarez, and some other games.
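Here's a rough Python sketch of that compile step, assuming OpenCV is installed and the dumped frames are BMPs under a "dump" folder (the naming is an assumption); most free encoders can't write VC-1, so WMV2 stands in for it here:
Code:
import glob
import cv2

# Collect the 900 frames dumped by 3DMark06's image quality tool
# (the "dump/frame*.bmp" naming is an assumption).
frames = sorted(glob.glob("dump/frame*.bmp"))
height, width = cv2.imread(frames[0]).shape[:2]

# WMV2 stands in for VC-1 here, since most free encoders can't write VC-1.
fourcc = cv2.VideoWriter_fourcc(*"WMV2")
movie = cv2.VideoWriter("iq_compare.avi", fourcc, 30.0, (width, height))

for path in frames:
    movie.write(cv2.imread(path))  # append each frame in order

movie.release()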
How do the temps scale with the clock on the R600? I saw all those stock-cooler overclocks and I'm wondering what kind of temp I can expect with, say, 850MHz on the core.
Anyone have an idea of what the Sapphire Toxic will cost?
New Sapphire X2900 with 20% OC
http://img410.imageshack.us/img410/9...ay0704lwm5.jpg
http://www.fx57.net/?p=669
and 2 Zalman VF900s???
regards
About 890MHz on the GPU? That is great for a stock card!
btw can someone translate what it says, I just don't know what language that is lol...
Awesome :D
This thread is getting filled with nvidiots jumping in and saying this card suxxors, blah blah. We all know what this card currently can and can't do. If you have anything useful to add, do so, but please keep this clean of fanboyism... Reviews, performance numbers, drivers -> discussion. If you want to start an "insult the R600" thread, go ahead and do that please.
Thank you :D
AMEN TO THAT Ahmad :)
Damn guys, I need your wisdom to help me sift through these reviews... I'm planning a new rig around August, and I need a new card. So what's the best choice: a 2900XT or an 8800GTS 640MB? I'm playing at 1680x1050, and the games I play the most are Oblivion and Neverwinter Nights 2, and I'm planning on starting STALKER too. I'm more biased towards RPGs than shooters; my girlfriend plays Half-Life 1 & 2 and Team Fortress... Looking at the pic Mascaras posted here, I must admit it looks a gazillion times better than on my 9800 Pro. So what's the best: the 2900XT for being a bit more future-proof, or the 8800GTS 640MB for a bit more power?
I feel kinda noobish right now, but can someone please explain what "IQ" is?
Hehe I mean on a graphics card, you get that, right? :D
That smells so much of fake it hurts. Also, look where the PWM should be... where is it? :rolleyes:
And that site has already made a few fakes of the R600 and others.
Also, the fan blades look to be so close they would hit one another. And without a barrier between them they would hinder each other's performance, since they would blow against one another in the middle.
The HD2900XT is currently better than the 8800GTS and will be even better once new drivers are out. It is also more future-proof than the GTS, and a better bang for the buck, in my honest opinion.
However, if you are willing to wait till August, I guess you should wait one more month after that, until September. AMD plans to launch the R650 (said to be the HD2950 series) around that time, so it wouldn't hurt to wait one more month for better performance and less heat.
Or wait 2 more months after that for the GeForce 9.
NWN2 is a problem. The only review I found for that game was on VR-Zone @ http://vr-zone.com/?i=4946&s=13, and ATI isn't doing too well in it. Ofc VR-Zone's numbers are suspect cos they used the 8-37-4-070419a drivers, so we still don't know jack about NWN2 performance on the HD2900XT.
8800GTS, no questions asked. It leads in pretty much everything once you enable AA; check out just about any review and they'll tell you the same. Also, the HD 2900XT is not "more future-proof", just to letcha know. It's not really 320 shaders, and due to how its AA works it won't be able to use its extra bandwidth: it handles MSAA resolves on the shader hardware, which means it spends shader clock cycles on AA and takes a performance hit even with its massive bandwidth (see the toy sketch below).
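To picture what a shader-based resolve means, here's a toy Python mockup (not actual GPU code, just an illustration) of averaging one pixel's MSAA sub-samples; on R600 this per-pixel blend runs on the shader ALUs, while G80 resolves in fixed-function hardware:
Code:
def resolve_msaa(subsamples):
    """Average one pixel's sub-samples into its final color."""
    n = len(subsamples)  # e.g. 4 sub-samples for 4xAA
    return tuple(sum(channel) / n for channel in zip(*subsamples))

# An edge pixel at 4xAA: two red sub-samples, two black ones.
pixel = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(resolve_msaa(pixel))  # -> (0.5, 0.0, 0.0), a blended edge color

# Doing this blend for every pixel on the shader units is ALU work that
# would otherwise go to actual shading, hence the AA performance hit.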
WTF are you talking about? The HD 2900XT gets smoked with AA. :confused:
http://www.vr-zone.com/?i=4946
http://www.anandtech.com/video/showdoc.aspx?i=2988
http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
DriverHeaven, which is known to be "very ATI friendly", even stated that they cannot recommend the card to anyone. Quote:
In the above screenshot you will see the 3DMark06 score of 10723 on the ATI Radeon HD 2900 XT, 9105 on the GeForce 8800 GTS 640 MB and 11191 on the GeForce 8800 GTX. Yes, the 3DMark06 score and “game tests” are a good deal higher with the ATI Radeon HD 2900 XT compared to the GeForce 8800 GTS and is just shy of the GeForce 8800 GTX. If you were a benchmark enthusiast you might think wow, the ATI Radeon HD 2900 XT has to be faster in games because it is faster in 3DMark! By looking at 3DMark alone you would think it is almost as fast as a GeForce 8800 GTX. Well, you would be wrong.
Despite what the numbers in 3DMark are showing our evaluation has proven that the ATI Radeon HD 2900 XT is slower than a GeForce 8800 GTS when it comes to actually gaming. Even our apples-to-apples real gaming tests confirmed that the 8800 GTS is faster than the HD 2900 XT and nowhere close to the GeForce 8800 GTX, yet here sits 3DMark showing us the opposite!
You may want to read up on what you preach.