I just wish it wasn't so power hungry, and that ATI made their own version of the card instead of all the other MFGs.
To me the quality of narrow CFAA + 4xMSAA looks better than the X1950's 6x AA (as I previously stated). The wide CFAA + 2xMSAA is worse in comparison, though.
As for whether it is really 6x or not: 4xMSAA + narrow CFAA takes 6 samples to calculate the AA with, and the X1950 does the same. R600 in that mode uses 4 box samples and 2 non-box samples; the X1950 takes 6 box samples. Both have advantages and disadvantages. We just need to see it in motion in some games to determine which is better.
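To make the sample-count comparison concrete, here's a minimal sketch in Python of how the two resolves could differ. The sample values and the 0.5 tent weight are purely illustrative assumptions on my part, not ATI's actual CFAA filter coefficients.

[CODE]
# A minimal sketch, assuming made-up sample values and a 0.5 tent weight;
# these are NOT ATI's actual CFAA filter coefficients.

def box_resolve(samples):
    # Plain box filter: every sample weighted equally (X1950-style 6x MSAA).
    return sum(samples) / len(samples)

def narrow_cfaa_resolve(own_samples, neighbor_samples, neighbor_weight=0.5):
    # 4 in-pixel MSAA samples plus 2 down-weighted samples borrowed from
    # adjacent pixels (narrow-tent CFAA) -- still 6 samples in total.
    weighted_sum = sum(own_samples) + neighbor_weight * sum(neighbor_samples)
    total_weight = len(own_samples) + neighbor_weight * len(neighbor_samples)
    return weighted_sum / total_weight

# Example: grayscale coverage of an edge pixel.
own = [1.0, 1.0, 1.0, 0.0]   # the 4 box samples inside the pixel
borrowed = [0.0, 0.0]        # the 2 extra samples taken from neighbors
print(box_resolve(own + borrowed))         # 6x box:      0.5
print(narrow_cfaa_resolve(own, borrowed))  # narrow CFAA: 0.6
[/CODE]

The point is just that both schemes average six values per pixel; the difference is where the two extra samples come from and how much weight they carry.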
No mention of what drivers were used... but heck, those results look very different from other reviews... New driver maybe?
Not quite sure what to make of it. They didn't seem to look at the power consumption at all :/
'Bout the first site to fully support the card...Quote:
Originally Posted by hardware.info
got mine 10 minutes ago
temps idle (52°C)
http://img441.imageshack.us/img441/7351/tempsbg0.jpg
more tests later
regards
Test it for us mascaras, hurry up! :)
Seems to be a beast in the Unreal Engine 3. And before you ask, this site is NOT biased: some of their other benches show the 2900 losing to the GTS.
http://www.matbe.com/images/biblio/a...0000057006.png
http://www.matbe.com/articles/lire/3...0-xt/page1.php
Another thing everyone should notice is that the AA issue is not universal, which should give the optimists reason to believe the hardware is not broken and that it is just a driver issue. In the above review the 2900 takes a 50% hit in Tomb Raider with 4xAA, but in R6 Vegas only about 10%.
You've defended these cards right from the start, with good reason I think. Here is my own 3DMark03 score with 2 x HD2900XT in CrossFire, with a QX6700 at 3600MHz and the cards at default stock clocks:
http://gbwatches.com/pics/hd29003d03first.jpg
Everyone should notice that there is a less than 1 FPS drop at 2560x1900 when enabling 16xAF and 4xAA on the 2900. Everyone should remember that G80 had problems getting AA to work in Vegas; that TechReport (while using driver version 8.37.4.070419a-046506E) says "the (Vegas) game engine doesn't seem to work well with multisampled antialiasing, so we didn't enable AA"; that MatBe uses driver version 8.37.4.2_47322; that some reviews show the 2900XT coming very near the 8800GTX (sometimes under, sometimes above); and that some reviews show the 2900XT performing way better than the 8800GTX at low res in Vegas.
I wish everybody would start to do HD video reviews of the benchmarks they run :(
Go to hell with stupid 3DMark benchmarks... Yes, ATi has a good score, but in games it is totally crappy... and in picture quality... a loser...
PS. Don't flame anything about immature drivers... it is not true; what's immature is the design of the chip, not the drivers...
Bullsh*t.
Nobody's flaming about immature drivers. They're the only reason it performs like utter crap in one game while being a very good performer in other games.
You need to stop flaming that this card sucks. If it still performs this badly when NEW DRIVERS come out, sure, you can say it sucks. But as long as there are no proper drivers, stop flaming.
Let's take a Core 2 Solo 4 GHz.
Let's take a Core 2 Duo 3 GHz.
There are some games where the Solo will win.
There are some games where the Duo will win.
Let's take a Core 2 Solo 4 GHz.
Let's take a Core 2 Quad 2 GHz.
There are some games where the Solo will crush the Quad.
There are some games where the Quad will crush the Solo.
From a post in the Bluesnews thread re. Lost Planet demo:
"This is already the second DX10 demo where nvidia owners have at least 3 times the framerate and better image quality than the HD2900XT fellows (CoJ was the first DX10 demo). I'm still waiting for the DX10 demo where ATI excels over nvidia like predicted by many ATI fans."
Um, ok. Can anyone confirm or deny this? No direct response to the guy's claim, and no other info is provided in that thread...
http://www.xtremesystems.org/forums/...d.php?t=144489
report results?
Neither do you, ECH, but I HAVE the card, and I say that most of the comments made by people are not true. The performance in games IS good, not bad, the QUALITY is GOOD, not bad, and each and every complaint made earlier either stems from users who were testing the cards, and were met with some issues, or from nV's FUD team. I could not be more happy with my cards.
Reviews on the web are not accurate...:fact:
It cannot be confirmed due to the generalization of the statement. For one, this is a push to downplay the HD 2900XT, because they know that sooner or later the drivers (releasing the week of 5-22-07) will mature, improving the HD overall. Although I still assume that another driver release may be needed after 7.5 before we start seeing complete maturity and better results. Second, there is no indication of which G80 outperforms the HD; I've seen benchmarks that went both ways even with immature drivers. Third, there are no photos that suggest that CoJ/LP have better image quality on G80 than on the HD, making this statement false.
You're saying all those reviews are bad/fake/wrong because you've got the card?
I don't think anyone is saying the card is bad or a waste of money, but compared to G80 it's not as good as everyone expected.
And this is their high end... when the 65nm version of this appears, that will be their next high end.
But imagine G80 shrunk to 65nm... then ATI loses again.
So they need a new GPU already.
I think some people have got their hopes too high for the drivers... it's like they keep pushing: if not the next driver, then the driver after that for sure!!! It's a continual push in denial. Let's face it, R600 is a bummer that's getting replaced by R650 in 2-3 months. Even AMD stated that in an interview. It should also be a big hint as to why there is no GT/XL/XTX version. It's simply a temporary "hotfix".
Do you have any proof that your comments are true?
Look, in a lot of situations X1950 CrossFire is faster than a single G80. The HD2900XT is faster than X1950 CrossFire, so some conclusions can be made...
However, this GPU is superscalar. This means that the standard way of doing things does not apply any more. There are instances where a driver CANNOT HELP AT ALL... especially when it comes to DX9. DX10 can use load balancing, so I'm not so "unconfident" in how things will turn out.
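As a rough illustration of why the driver's shader compiler matters so much on a superscalar part, here's a toy Python sketch. The 5-wide bundle, the op names and the greedy scheduler are my own simplified assumptions, not AMD's actual compiler; it just shows how cycle counts change when independent ops do or don't get packed together.

[CODE]
# A toy sketch, assuming a simplified 5-wide VLIW model and a greedy
# scheduler of my own; this is NOT AMD's actual shader compiler.

SLOT_WIDTH = 5  # assumed: up to 5 independent scalar ops per bundle

def naive_schedule(ops):
    # One op per bundle: roughly what you get when the compiler can't
    # prove independence -- 4 of the 5 lanes sit idle every cycle.
    return [[op] for op in ops]

def packed_schedule(ops, deps):
    # Greedy list scheduling: an op may join the current bundle only if
    # everything it depends on was already issued in an earlier bundle.
    # Assumes ops are listed in dependency order.
    bundles, current, issued = [], [], set()
    for op in ops:
        needs = deps.get(op, set())
        if len(current) < SLOT_WIDTH and needs <= issued:
            current.append(op)
        else:
            issued |= set(current)
            bundles.append(current)
            current = [op]
    bundles.append(current)
    return bundles

ops = ["mul r0", "mul r1", "mul r2", "add r3", "add r4"]
deps = {"add r3": {"mul r0", "mul r1"}, "add r4": {"mul r2", "add r3"}}
print(len(naive_schedule(ops)))         # 5 cycles, one lane busy at a time
print(len(packed_schedule(ops, deps)))  # 3 cycles on the same hardware
[/CODE]

If the compiler can't find that independence, the extra lanes just sit idle, which is exactly the kind of thing no amount of application-side tweaking can fix.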
And yes, Ubermann, when reviews say "colour is washed out", etc., etc., and I do not see the same problem, then I call shens.
Fact of the matter is that I have been pretty much accurate about this card from the get-go, including the bit about us getting UFO only first, about the GPU being superscalar, about A LOT of things. This, to me, means that things are pretty "cut and dried" here, and everything else, all the complaints and worries, is FUD.
And no, I do not care about comparisons with G80. When DX10 comes out in full force, then I might... but if one card is not fast enough, I'll simply toss in another, thanks to the cost. I bought G80 when it first came out too, and the horrible driver bugs had me sell them right away. Until we have some applications that can properly measure the performance difference between G80 and R600 (in all types of rendering), no real comparisons can be made, as the fact of the matter is that, because of this GPU's structure, drivers play a far more important role in DX9 than ANYONE thinks. :fact:
It clearly beats the 8800GTS in price/performance and... that's it. The high-high-end belongs to nvidia.
And that makes you wonder how delayed the R650 is gonna be... omg, I give up =)
But do you really understand how many cards get sold at this level? VERY few. I mean really, two companies merge, we got layoffs, etc., and a product line that has been delayed countless times due to bad choices and meddling from other companies. I think, really, AMD have done well.
BTW, this 2900XT is UFO; it is not the "top card". That card is not released yet. The fact of the matter is that even the ringbus in these cards is half disabled, due to the number of memory ICs and their size.
Oh, ECH, I never said I didn't agree with you; however, I don't see you with a card, so you are basing your comments on the same stuff that other guy was, which is why I responded. You look at the situation... well... you know what I think.