I don't mean any harm, but what you are saying is petty. The information on the video card is out there. It's clear that better drivers are needed. The G80 went through its own driver issues when it was released.
It doesn't beat the 8800GTS in price/performance. I see an EVGA 8800GTS 640MB for $329.99 after a $30 rebate; that is $100 cheaper than what the 2900 XTs are going for. And I wanted to buy a 2900 XT too, really did. Still kinda want to... but I don't want a repeat of what happened when I bought my X1800 XL at launch - a couple of months later it was pretty much obsolete. It's as if the initial release is only to get something out the door, then they patch it up and re-release a much better and improved product. This time I'm gonna wait and get me the 65nm version ;) (if and when it comes out... I might be waiting for a while)
The difference between 2900 & GTX in CoJ isn't that big. http://www.legitreviews.com/article/504/3/
The difference in Lost Planet is under 2x @ http://www.pcgameshardware.de/?article_id=601352
It's a bit higher @ http://www.legitreviews.com/article/505/3/
It should be noted that Lost Planet is unplayable on ATI, because of disappearing objects, so what's the point anyway?
I don't even get the driver argument. Besides a few obvious bugs, its performance relative to its price is exactly where it should be. From looking at 20+ reviews, I have come to the conclusion that it sits very nicely between the GTS and GTX. And what do you know, its price is also between the GTS and GTX, leaning towards the GTS. In some cases it's as fast as the GTX, in others slower than the GTS. But that is expected. The OVERALL impression I got is that it performs right in the middle.
What happened was that everyone, for good reason, expected it to be a GTX killer. But even before release we knew it was going to cost around $400. I might be in the minority, but I instantly knew it was NOT going to perform at GTX levels. Why else sell it at that price? So I adjusted my expectations accordingly. The problem is that the vast majority still seem to be oblivious to what is right in front of their noses: a $400 video card, not a $550 one.
So to sum up, its performance is fine. Driver bugs will be worked out. Everyone be happy.
And to top it off, you get three of the year's hottest games, plus a free G5 mouse you can sell on eBay for like $25, and you can sell the games for around $25-$30 as well if you want.
I must confess I was a little hesitant about image quality because of what a few reviews said (that it was not so good), but then I got my X2900 today and I saw this:
X2900XT - Day of Defeat
http://img520.imageshack.us/img520/9191/dfed3.jpg
better image than with my 8800gts in my other system
temps playing DOD
15:35:15, ASIC Temperature via LM64 on DDC3 I2C [0] (C) = 54.750, MCLK(MHz)[0] = 513.00, SCLK(MHz)[0] = 506.25
15:44:46, ASIC Temperature via LM64 on DDC3 I2C [0] (C) = 71.125, MCLK(MHz)[0] = 828.00, SCLK(MHz)[0] = 742.50
ASIC Temperature via LM64 on DDC3 I2C [0]
Minimum temperature: 54.375 C
Maximum temperature: 71.750 C
Average temperature: 58.601 C
(btw: another Free game with X2900 :) )
regards
The HD2900XT doesn't have IQ issues. You read about them, but there are no comprehensive photo comparisons that suggest otherwise. As you can clearly see, the HD2900XT has IQ as good as or better than the 8800GTX. There is more talk than actual photo proof (using several examples) showing that IQ is a problem with the XT. The only thing I question in your photo is that window; something about it doesn't look right. Can you check with your GTX to see if that window looks like that?
"Easy" way to resolve this is by using 3D Mark 2006 in 1280 * 1024 @ 6xAA & 16xAF mode on both cards, then letting it dump 900 individual frames using the image quality part of the prog, and compiling this into VC-1 HD movie.
Side to side comparisons would be nice =)
Btw do some benchies in Lost Planet and Call of Juarez, and some other games.
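For the frame-dump idea above, here is a rough sketch of how the compile step could be scripted (the dump path is hypothetical, and it writes an MJPG AVI rather than VC-1, since that's what OpenCV can encode out of the box):

```python
import glob
import cv2  # pip install opencv-python

# Hypothetical folder holding the frames dumped by 3DMark06's image quality tool
frames = sorted(glob.glob(r"C:\3dmark_dump\frame*.bmp"))

height, width = cv2.imread(frames[0]).shape[:2]
# MJPG-in-AVI instead of VC-1, since OpenCV can write that without extra codecs
writer = cv2.VideoWriter("iq_compare.avi", cv2.VideoWriter_fourcc(*"MJPG"), 30, (width, height))

for path in frames:
    writer.write(cv2.imread(path))  # one dumped frame per video frame
writer.release()
```

Run it once per card and play the two clips side by side for the comparison.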
How do the temps scale with the clock on R600? I saw all those stock-cooler overclocks and I'm wondering what kind of temps I can expect with, say, 850MHz on the core.
Anyone have an idea of what the Sapphire Toxic will cost?
New sapphire x2900 with 20% OC
http://img410.imageshack.us/img410/9...ay0704lwm5.jpg
http://www.fx57.net/?p=669
And two Zalman VF900s???
regards
About 890MHz on the GPU? That is great for a stock card!
BTW, can someone translate what it says? I just don't know what language that is, lol...
Awesome :D
This thread is getting filled with nvidiots jumping in and saying this card suxxors, blah blah. We all know what this card currently can and can't do. If you have anything useful to add, do so, but keep this clean of fanboyism please... Reviews, performance numbers, drivers -> discussion. If you want to start an "insult the R600" thread, go ahead and do that please.
Thank you :D
AMEN TO THAT Ahmad :)
Damn, guys, I need your wisdom to help me sift through these reviews... I'm planning a new rig around August, and I need a new card. So what's the best choice: a 2900XT or an 8800GTS 640MB? I'm playing at 1680x1050, and the games I play the most are Oblivion and Neverwinter Nights 2, and I'm planning on starting with STALKER too. I'm more biased towards RPGs than towards shooters; my girlfriend plays Half-Life 1 & 2 and Team Fortress... Looking at the pic Mascaras posted here, I must admit it looks a gazillion times better than on my 9800 Pro. So what's the best: the 2900XT for being a bit more future-proof, or the 8800GTS 640MB for a bit more power?
I feel kinda noobish right now, but can someone please explain what "IQ" is?
Hehe I mean on a graphics card, you get that, right? :D
That smells so much of fake it hurts. Also look where the PWM should be...where is it :rolleyes:
And that site has made a few fakes already with the R600 and others.
Also, the fan blades look to be so close that they would hit one another. And without a barrier between them they would hinder each other's performance, since they would blow against one another in the middle.
The HD2900XT is currently better than the 8800GTS and will be even better once new drivers are out. It is also more future-proof than the GTS, and a better bang for the buck in my honest opinion.
However, if you are willing to wait until August, I guess you should wait one more month after that, until September. AMD plans to launch the R650 (said to be the HD2950 series) around that time, so it wouldn't hurt to wait one more month for better performance and less heat.
Or wait 2 more months after that for the GeForce 9.
NWN2 is a problem. The only review I found for that game was on VR-Zone @ http://vr-zone.com/?i=4946&s=13, and ATI isn't doing too well in it. Of course, VR-Zone's numbers are shaky because they used the 8-37-4-070419a drivers, so we still don't know jack about NWN2 performance on the HD2900XT.
8800GTS, no questions asked. It leads in pretty much everything once you enable AA. Check out just about any review and they'll tell you the same. Also, the HD 2900XT is not "more future proof", just to letcha know. It's not really 320 shaders, and due to how it handles MSAA resolves, it won't be able to benefit from its extra bandwidth with AA enabled: it uses the shader hardware for the resolve, which means it spends shader clock cycles on AA and takes a performance hit even with its massive bandwidth.
WTF are you talking about? The HD 2900XT gets smoked with AA. :confused:
http://www.vr-zone.com/?i=4946
http://www.anandtech.com/video/showdoc.aspx?i=2988
http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
DriverHeaven, which is known to be "very ATI friendly", even stated that they cannot recommend the card to anyone.
Quote:
In the above screenshot you will see the 3DMark06 score of 10723 on the ATI Radeon HD 2900 XT, 9105 on the GeForce 8800 GTS 640 MB and 11191 on the GeForce 8800 GTX. Yes, the 3DMark06 score and “game tests” are a good deal higher with the ATI Radeon HD 2900 XT compared to the GeForce 8800 GTS and is just shy of the GeForce 8800 GTX. If you were a benchmark enthusiast you might think wow, the ATI Radeon HD 2900 XT has to be faster in games because it is faster in 3DMark! By looking at 3DMark alone you would think it is almost as fast as a GeForce 8800 GTX. Well, you would be wrong.
Despite what the numbers in 3DMark are showing our evaluation has proven that the ATI Radeon HD 2900 XT is slower than a GeForce 8800 GTS when it comes to actually gaming. Even our apples-to-apples real gaming tests confirmed that the 8800 GTS is faster than the HD 2900 XT and nowhere close to the GeForce 8800 GTX, yet here sits 3DMark showing us the opposite!
You may want to read up on what you preach on.
I think the R600 would be a better investment. Nvidia has already released the magic drivers for the G80, but we have yet to see them for the R600.
Hey guys,
I just installed the HD2900XT and loaded the driver from the ATI website, and this unknown device shows up. The only PCI card I have is an X-Fi, and I've already loaded the driver for that.
Any fixes? thanks
http://img477.imageshack.us/img477/6127/untitledru4.jpg
PS: The driver CD that came with the card didn't even work....
1- There are lots of bugs with AA.
2- You picked the worst review you could find. 20 other reviews don't say that, and newer reviews with newer drivers show the HD 2900XT doing far, far better against the 8800GTS.
That kind of propaganda you are pushing doesn't make any sense.
Everyone says the drivers are buggy and there is a lot of room for improvement, yet you stand here and say the opposite. The bad part is that you don't even have the card; you are only looking at some reviews and some claims, most of them just FUD.
If the people who have the card, and many reviews, point out that drivers can improve things a lot, why the hell do you say the opposite when you don't even have the card to test? :slapass:
The good part is that in one week, when Catalyst 7.5 gets its official release and reviews get updated, performance will improve where it should be improved (in those games with bugs).
A more recent review:
http://www.matbe.com/images/biblio/d...0000057014.png
http://www.matbe.com/images/biblio/d...0000057013.png
http://www.matbe.com/images/biblio/a...0000057006.png
other:
http://www.hardware.info/nl-NL/produ...F00/benchmark/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
http://www.hardware.info/nl-NL/produ...F00/benchmark/
The driver on the CD has all the HDMI audio stuff (which is the PCI device and the unknown one), which the 8.37 driver you download doesn't - that's just the basic driver. So you'd be better off uninstalling your existing drivers and using the CD ones first, or borrowing someone else's CD. I did the same.
That would probably be the VIVO and the audio.
Hey DilTech, do you have one of these cards? If not, don't comment. I have one, and I have been playing a LOT of different games on it... they all seem to perform better than they did on X1950 Crossfire. Don't try to tell me that a 320MB GTS is better than X1950 Crossfire, I know better firsthand!
Can someone upload the driver CD that came with the card for me? My driver CD is NOT readable; such bad luck. Thanks a bunch.
Wouldn't you think that if there really were a 2950XT/XTX, there would be some kind of confirmation about them?? I mean, the first messages about the R600 (2900XT) started in February 2006....
Please install the original driver disc that comes with the video card and see if it will auto-update. If not, do a search on the disc. Also, if by any chance you have a P5W DH Deluxe, the unknown device may be resolved by inserting the original CD from that motherboard. It will auto-start and the "unknown device" will disappear. Again, that's only if you have a P5W DH Deluxe and it's a motherboard-related device.
That's the WDM driver.
regards
Thank you all, the problem is fixed using the driver CD. Somehow it ran, so I copied it to my USB drive for future use if needed.
If AMD/ATi can get the leakage issues sorted out, then R600 could be the monster it was supposed to be.
Stephane at Matbe is a known, trusted guy, not some unknown. He is fully credible, more so than most reviewers out there who have corporate affiliations and display favoritism. He might even be around to comment, but he has done a very good job of reviewing the card, far better than a lot of the nV fanboys or hit-chasing sites did (ahem, foolish Genmay *cough* I mean [H]).
There IS a driver release coming that is "said" to sort many performance lags out.
Thanks for posting it v_rr. :D
They both use the same architecture, so whatever software improvements they make for R600 will most likely benefit R650 as well, and that's a pretty good incentive to develop better drivers, right? ;)
It's right there under a long silver heatsink... look properly next time :rolleyes:
I hope R650 is more than just a shrink. I believe it's more in the vein of R520 -> R580, with architectural fixes. And especially the AA engine needs a big fix in R600.
The PWM is misplaced then. Kinda odd for boards that should be identical, yes? Also, you forgot to comment on the conflicting fan blades.
The PWM is misplaced then; at best it could be cooling the seven transistors in front of the PWM. Also, you forgot to comment on the conflicting fan blades.
Then add that the image quality of the coolers and the cards looks different, and the power connector for the fans isn't connected to the card either.
http://img475.imageshack.us/img475/3...ay0704lte3.jpg
I could be wrong, but it sure is suspicious.
I highly doubt it will be anything that major; mainly because AMD can't afford to waste time doing this. It will be shrunk so that the clocks can be ramped greatly without power/heat issues. AMD will compete with 8900 using brute force because it's easier ;)
No, it's most probably not misplaced either. Read this.
Quote:
The PWM is misplaced then. Kinda odd for boards that should be identical, yes?
That's because they don't conflict. Draw some circles on the pic in Paint, if you can be bothered, to confirm you're wrong.
Quote:
Also, you forgot to comment on the conflicting fan blades.
Anybody got Aquamark running on the HD2900? I think it needs an updated direcpll.dll, like the X1900 cards did. It does the test fine, but right at the end it black-screens before the score is shown.
Some very nice performance for the 2900XT in Lost Planet, edging past the 8800GTS 640MB by about 3.5-4fps. Nice. :)
http://www.firingsquad.com/hardware/...ance/page6.asp
These cards are unified, and nothing like the old cards. I am not saying it is impossible, I am saying that it is not something that should be assumed.
Those FiringSquad drivers are the same ones TechReport uses in their review. Those drivers seem to be quite nice for some games. The HD2900XT beats the 8800 GTS 640 MB OC by 0.3 FPS on average in Call of Juarez. http://www.techreport.com/reviews/20.../index.x?pg=14
The GTS has higher minimums, however, which will help smooth out the gameplay when the FPS drops. I would rather have 3fps more in the minimums than 0.3fps in the averages any day.
If it truly is a 20% overclock bringing the core to 890, that proves this thing scales well as long as the heat is removed. I am saying this on the assumption that the new, non-reference cooler does better than the stock one, allowing such a higher core clock. I have not heard of much above 850 or so for overclocks on the stock cooler; some can't even do that and fall short in the 84x range.
The problem with that other cooler is that the hot air is not expelled from the system like it is with the stock one. Never again will I have 90°C video card air staying in my case; the damn Accelero X2 ruined that for me.
Yeah, but that 3 fps minimum difference is caused by the hard drive deciding it needs to recalibrate its heads, so the texture wasn't loaded in time, causing a sudden drop in FPS.
They used a "median low", whatever that may mean. I imagine it is the median of all numbers that are below the average FPS; but what if NVIDIA has 49% very low numbers and 51% at 15 FPS, whereas ATI has 51% at 11.7 FPS and 49% at 25 FPS? Then the median lows come out about the same, but ATI is clearly the better choice.
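As a toy illustration of why a single "median low" number can hide very different experiences (made-up samples, not the review's data, and just one possible reading of the metric):

```python
import statistics

# Invented per-second FPS traces for two hypothetical cards
card_a = [10, 12, 12, 50, 50, 50]   # dips low, but spends half the run at 50 FPS
card_b = [12, 12, 12, 20, 20, 20]   # never dips as hard, but also never gets fast

def median_low_metric(samples):
    """One possible reading of a 'median low': the median of all samples
    that fall below the run's average FPS."""
    avg = statistics.mean(samples)
    return statistics.median([s for s in samples if s < avg])

for name, fps in (("card A", card_a), ("card B", card_b)):
    print(name, "median low:", median_low_metric(fps),
          "average:", round(statistics.mean(fps), 1))
# Both report a median low of 12, yet card A averages ~31 FPS vs. card B's 16.
```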
Did I already mention that most reviews out there suck? I like the HardOCP manner of displaying the FPS over an interval of time. However, they need to provide more screenshots (a screenshot every 10 seconds would be ideal), since they change the settings for each video card. They provide screenshots to show the difference, but the difference changes depending on the scene. Taking an automated screenshot at least every 10 seconds is time-feasible. But shoutouts to HardOCP for their testing methodology.
NWN2, details maxed, outdoor town, 1680x1050, 16FPS avg
http://www.custompc.co.uk/custompc/n...rformance.html
http://img292.imageshack.us/img292/1...93d76eozz4.jpg
Quote:
AMD explains Radeon HD 2900XT's poor AA performance 1:16PM, Monday 14th May 2007
The R600 is finally here, and in keeping with its mysteriously long gestation, at least in its first incarnation as the HD 2900XT, AMD's new GPU still poses a lot of questions. One of the things we noticed during our in-depth testing of the card is that compared to its principal rival, the Nvidia GeForce 8800 GTS 640MB, the HD 2900XT performs poorly in many games when anti-aliasing is enabled.
In F.E.A.R., at 1,600 x 1,200, with AA and AF disabled, the HD 2900XT easily outstripped the 640MB 8800 GTS, delivering a minimum that was 23fps higher than the latter's. However, with 4x AA, the HD 2900XT's minimum framerate dived from 82fps to 21fps, while the 640MB 8800 GTS produced a minimum of 30fps. Adding 4x AA results in a 74% drop for the Radeon, compared to only a 49% drop for the GeForce.
The Radeon's framerates suffer disproportionately with anisotropic filtering, too. Again testing in F.E.A.R. at 1,600 x 1,200, we saw the HD 2900XT's minimum FPS drop by 10 per cent with 16x anisotropic enabled, compared to 3 per cent for the GTS, although the HD 2900XT still had a faster average. It was a slightly different result at 2,560 x 1,600, as the HD 2900XT's massive bandwidth gave it a boost, although adding 16x AF still had more impact than it did on the 640MB GTS.
As most gamers will want AA and AF enabled in games, the HD 2900XT's poor performance with these processing options enabled is a serious problem for the card and ATi. We asked ATi to comment on this surprising result and the company revealed that the HD 2000-series architecture has been optimised for what it calls 'shader-based AA'. Some games, including S.T.A.L.K.E.R., already use shader-based AA, although in our tests the 640MB 8800 GTS proved to be faster than the HD 2900XT.
We asked Richard Huddy, Worldwide Developer Relations Manager of AMD's Graphics Products Group, to go into more detail about why the Radeon HD 2000-series architecture has been optimised for shader-based AA rather than traditional multi-sample AA. He told us that 'with the most recent generations of games we've seen an emphasis on shader complexity (mostly more maths) with less of the overall processing time spent on the final part of the rendering process which is "the AA resolve". The resolve still needs to happen, but it's becoming a smaller and smaller part of the overall load. Add to that the fact that HDR rendering requires a non-linear AA resolve and you can see that the old fashioned linear AA resolve hardware is becoming less and less significant.' Huddy also explained that traditional AA 'doesn't work correctly [in games with] HDR because pixel brightness is non-linear in HDR rendering.'
While many reviews of the HD 2900XT have made unflattering comparisons between it and Nvidia's GeForce 8800-series, Huddy was upbeat about AMD's new chip. 'Even at high resolutions, geometry aliasing is a growing problem that can only really be addressed by shader-based anti-aliasing. You'll see that there is a trend of reducing importance for the standard linear AA resolve operation, and growing importance for custom resolves and shader-based AA. For all these reasons we've focused our hardware efforts on shader horsepower rather than the older fixed-function operations. That's why we have so much more pure floating point horsepower in the HD 2900XT GPU than NVIDIA has in its 8800 cards... There's more value in a future-proof design such as ours because it focuses on problems of increasing importance, rather than on problems of diminishing importance."
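As a sanity check on the percentages quoted above, here is the arithmetic spelled out (the GTS baseline of 59fps is inferred from the article's "a minimum that was 23fps higher"):

```python
# Re-deriving the F.E.A.R. minimum-FPS drops quoted in the article above
hd2900_no_aa, hd2900_4xaa = 82, 21
gts_no_aa = hd2900_no_aa - 23   # the article says the GTS minimum was 23fps lower
gts_4xaa = 30

def drop(before, after):
    """Percentage of the minimum framerate lost when enabling 4x AA."""
    return 100 * (before - after) / before

print(f"HD 2900XT: {drop(hd2900_no_aa, hd2900_4xaa):.0f}% drop")  # ~74%
print(f"8800 GTS : {drop(gts_no_aa, gts_4xaa):.0f}% drop")        # ~49%
```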
So does this mean the AA issue can be fixed by drivers?
I think maybe yes, since it's partly done in software.
What issue? The poor performance in current games compared to the G80? He never said it could be fixed (or that it couldn't be fixed), instead he said, forget about old games, let's focus on AA for future games. They asked him the wrong question tbh.
I am fairly sure they will optimize their AA-shader based solution further and try to program their programmable MSAA to do the resolve in the back-end, to see if it is faster with older games.
He seemed to imply that doing the MSAA resolve in the back-end can only be done linearly, whilst HDR requires a non-linear resolve to work correctly. If that is true, then NVIDIA either doesn't do the MSAA resolve in the back-end, found a way around it, or has different image quality with HDR AA (the lighting will be different). This would be very difficult to verify, since you need the exact same settings for both video cards you are testing and a good dose of HDR light.
Basically I am quite confused :)
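To put the confusion in numbers, here is a toy sketch of why the resolve order matters with HDR (my own example with a simple Reinhard-style curve, not anything AMD or NVIDIA has published): averaging the raw HDR subsamples and then tonemapping gives a different edge colour than tonemapping each subsample first.

```python
def tonemap(x):
    """Simple Reinhard-style operator: maps HDR radiance into [0, 1)."""
    return x / (1.0 + x)

# Two MSAA subsamples on a silhouette edge: dim background vs. bright HDR highlight
subsamples = [0.1, 10.0]

# "Linear" fixed-function style resolve: average the raw HDR values, tonemap after
linear_resolve = tonemap(sum(subsamples) / len(subsamples))

# Shader-style resolve: tonemap each subsample first, then average
shader_resolve = sum(tonemap(s) for s in subsamples) / len(subsamples)

print(f"average-then-tonemap: {linear_resolve:.3f}")  # ~0.835 -> edge stays nearly blown out
print(f"tonemap-then-average: {shader_resolve:.3f}")  # ~0.500 -> visibly smoother gradient
```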
Your native TFT resolution is 1680x1050 I guess. I see the drivers haven't improved for that game. I was hoping for more speed, but okay. NWN2 was tested on http://vr-zone.com/?i=4946&s=13
If I understood correctly, ATI improved their new shader-based AA and doesn't care much about classic AA - am I right? (English is not my first language ^^)
If I understood well, then if ATI forces the driver to do shader AA, AA will perform well?
Those drivers could kill the GTX :D
I find it really funny how it is mentioned that the card was optimized for shader-based AA. As far as I understood, this was a desperate attempt from ATI to include AA on the card without having to do yet another silicon respin due to a problem with the original AA resolve method. They did not want another delay, so they opted for this instead. I still feel this was a really bad move and one of the main reasons why the card suffers performance-wise when AA and AF are enabled. Until I see a driver update that corrects this, I am going to have to keep feeling this way, unfortunately.
If shader-based AA really is the thing for future games, then great, but what about the 10,000 other games on the market that don't agree with that method performance-wise? I would hate to spend $400+ to play maybe 2-3 games coming out this year that may take advantage of this new AA method. To me, it is all about playing the older games the best they have ever been played, while also being able to play future games decently. I am very anxiously awaiting the end of next week, when new officially released drivers are supposed to hit with what are supposed to be major performance increases, and hopefully not at the expense of IQ, which is another nasty rumor I keep hearing. My trigger finger is staying off the buy button till I get some more solid answers.
Mmm, now that you mention CPU limiting: it's almost like developers are spending more time programming in a threaded model than optimizing. The latter is still far more advantageous performance-wise.
From recent news I hear one problem is exaggerated and the other underrated. The AF problem is exaggerated; the difference between G80/G84 and the 2900 is supposedly very hard to spot in games. The AA problem is far worse when in motion, BUT it only occurs when the wide and narrow tent filters are used; there is no problem when you're using pure MSAA (2x, 4x, 8x).
Quote:
I find it really funny how it is mentioned that the card was optimized for shader-based AA. [...]
You understood wrong...
Problem with Shader AA is that you're taking shader power to do something that could be done by dedicated hardware.
So, now ATi (I refuse to say AMD had anything to do with this part) is going to attempt to offload physics onto the shaders AND AA onto the shaders? That's just not going to work.
Also, ATi doesn't have to force it in the driver; it already does it. That's why performance is so low with AA.
AA works here, although there ARE some niggling issues, as it does not seem to be applied to all textures in some applications.
Will the 65nm process permit a higher shader clock? If AMD raises the shader clock, performance will scale much more than on the GeForce, right?
Maybe AMD will come up with a 1GHz core and 1GHz for the 320 shaders on the R650. One of the problems with the R600 is the low shader clock (roughly half the clock of NVIDIA's 128 shaders).
I don't know much about shaders...
Check out these scores! What do you guys think? They're from a member of OCN.
Quote:
http://i148.photobucket.com/albums/s...87/stock06.jpg
8800 GTS score *all stock* 3dMark 06 (CPU: Intel E6600)
http://rigshowcase.com/img/457I4N29/10733.jpg
2900 XT score *all stock* 3dMark 06 (CPU: Intel E6400)
http://rigshowcase.com/img/457I4N29/10736.jpg
2900 XT F.E.A.R. benchmark: 4x AA, max resolution, max everything, 8x anisotropic, no soft shadows.
Good link, but no answers:
Basically no one is able to figure out why ATI is using the shaders to perform the AA resolve, whilst it is clearly not the way to go performance-wise.
Quote:
So it's possible the ALU is broken, or some other logic in there is borked. Or they simply just want to use CFAA as the default resolve path regardless, even if the hardware does work, and I'm just completely wrong.
Hypothetically it could be a major oversight on ATI's part: believing the shaders were so powerful they could handle it; after all, shader AA has a huge flexibility advantage and is in the DX10.1 spec. But in practice it was too slow, and they had to redo the hardware. They redid the hardware, but the software side wasn't implemented in the drivers yet (I actually wonder what has been implemented properly in the drivers; it can't be much, seeing how bad it was at launch). Instead, ATI chooses to show off the shader-based modes, with their flexible options (2x, 4x, 6x, 8x, 12x, 16x) in various filter modes, and proclaim it's the fuuutuuuree!
32x AF pleaaaase :p
Well, at least now ya know the source of "free 4xAA" for DX9 from DX10 cards.
Shaders that would normally be idle during rendering can be used to apply this AA method; however, the "thread arbiter" must be programmed for it, and in most cases this must be done on a per-app basis until 3Dc kicks in (three consecutive loadings of the same map will "optimize" the workflow as the driver "recompiles" the workload to what is needed).
This makes a lot of sense when applied to DX9 rendering formats, as it's far easier for the driver programmer to add in the AA for free and make use of the VLIW format of the GPU. However, this can come at a penalty and cause rendering errors if it's not done right. The other option is for the arbiter to pack many instructions at once, and this can cause timing issues far more easily than applying AA to a texture, if a subsequent texture/pixel needs info from data already "in flight" through the GPU.
Problems of a programmable architecture that will only get better over time, IMHO.
I don't think this will get solved without resorting to per-app settings, because of the indoor/outdoor problem so well demonstrated by Oblivion. Indoors the GPU/CPU are generally overkill-fast, whilst outdoors they struggle (well, just the GPU, thanks to the grass). There is no way to guarantee that some part of the shader array is free all the time, unless the arbiter tries to keep some shaders free for AA to use, but then it would no longer be free. Per-app settings suck, so I wouldn't even go there without a big prodding stick :stick:
<turns brain on>
Oh wait, you mean POTENTIALLY FREE 4xAA. Yes, thanks to the clever prioritizing design of the ring bus, coupled with its overkill in bandwidth and its overkill in shader power, the 2900 can use slack time to perform AA without much of a performance penalty.
But there can be points in a game where it is no longer free (truly free cannot be achieved; guaranteed, seemingly free AA can only be achieved with hardware specifically meant for, and only capable of, doing AA, which does it in the space of a nanosecond). This stresses the need to expand the way we test video cards in games (we must ensure that the parts of the game we bench are representative).
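To make the "potentially free" idea concrete, here is a toy per-frame budget model (all numbers are invented for illustration, nothing measured on an R600): the resolve only stays free while the shader array has slack left after the regular shading work.

```python
# Toy model: per-frame shader budget vs. shading load plus shader-based AA resolve.
SHADER_BUDGET = 100.0   # arbitrary units of shader work the GPU can do per frame
AA_RESOLVE_COST = 15.0  # extra shader work needed to resolve the MSAA samples

scenes = {
    "indoor corridor":   60.0,  # light shading load -> plenty of slack
    "outdoor with grass": 95.0, # heavy shading load -> almost no slack
}

for name, shading_load in scenes.items():
    slack = SHADER_BUDGET - shading_load
    if shading_load + AA_RESOLVE_COST <= SHADER_BUDGET:
        print(f"{name}: AA fits in the {slack:.0f} units of slack -> effectively free")
    else:
        # The resolve now competes with shading work, so the frame takes longer
        overrun = (shading_load + AA_RESOLVE_COST) / SHADER_BUDGET
        print(f"{name}: no slack left, frame takes ~{overrun:.2f}x longer with AA on")
```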
I want at least 128xAF & 64xAA for Quake 3!
http://www.sharkyextreme.com/hardwar...ng/figure1.jpg
http://www.sharkyextreme.com/hardwar...ng/figure7.jpg
I got 20536 marks X2900XT default & E6600 @ 3600mhz
I got 17633 marks BFG 8800GTS OC (550/1600) & E6600 @ 3600mhz
--3DMARK 2005 -- X2900XT 512mb DEFAULT (cat 8.37)-- E6600 @ 3600Mhz ( 20536 Marks)
http://img264.imageshack.us/img264/1695/2k5x2900ck3.jpg
3Dmark2005 @ BFG 8800GTS OC (158.19) @ E6600 3600mhz 17633 Marks
http://img406.imageshack.us/img406/3861/105hx2.jpg
Yes, you are right, in a way. However, due to the superscalar nature of the GPU, it pretty much applies to DX9 as well, or if it is NOT applied, we get a performance hit. Oh, wait... we've got that now... :ROTF:
LoL Noobie... I think you had it all straight right from the get-go, but wanted confirmation. :rofl:
The "potentially free" (you got the right term, thanks) AA is possible due to the VLIW nature of the GPU. As it stands now, shader programs aren't quite wide enough to use the VLIW slots to the max, so AA gets tossed into the mix in order to use more of the GPU. However, when we think about DX10, we must be aware of geometry operations happening as well, and a lot of rendering is going to be dependent on the geometry shading being complete before the rest can continue. How does that relate? Well... in the scheduling, of course!
DX10 features shadow AA, and it would be far easier, IMHO, to get aliasing-free shadows, given HDR, if the edges of the textures were anti-aliased BEFORE HDR is applied. Using the traditional method, performance falters a lot more, you would think, using non-AA'd edges for HDR than using shader AA.
More personal tests, Mascaras... I'm fed up with fanboys and want real numbers from XS members.
Hi there, nice thread, very useful. Sorry to interrupt, but I just received my new 2900XT and I cannot run 3DMark, and I cannot use RivaTuner or ATITool. Is it faulty? Please, some help, guys? Many thanks, Danni
For 3DMark you need to add -nosysteminfo to the shortcut, and RivaTuner and ATITool do not work as of yet... use the AMD GPU tool that was recently released. (http://www.techpowerup.com/downloads...Tool_v0.7.html)
If anyone wants to know, the fix for all the 3DMarks is to delete direcpll.dll from the System32 folder and everything should work.
dnldnt: if you right-click the 3DMark06 shortcut and click Properties, then at the end of Target (which is "C:\Program Files (x86)\Futuremark\3DMark06\3DMark06.exe" on my Vista x64) add a space followed by -nosysteminfo ("C:\Program Files (x86)\Futuremark\3DMark06\3DMark06.exe" -nosysteminfo).
Hi, thanks for all your help guys, it's working now. Standard clocks, first test in 3DMark06: 10239, and going up soon. Many thanks and good luck to all, Danni. BTW, I read in a review that they used RivaTuner for overclocking; is that version not out yet?
try and find a beta version? Look on nzone, guru3d or ngohq...
I don't think we can say that ATI cards lack the capability of doing AA without passing it off to the shaders. That's what the current drivers are doing now, and it may be that ATI is attempting to cover new ground by going down that path. That does not eliminate the possibility of the hardware being fully capable of doing AA without depending on the shaders... let's keep that in mind.
how are the vista drivers? about the same as the XP drivers?
Haha, HD2900... so much drama. I bet none of us thought it would be such a roller-coaster ride!
http://www.hkepc.com/bbs/hwdb.php?ti...HIS&rid=790821
New driver did the trick?!
Quote:
To capture the higher-end market, the Radeon HD 2900XT is priced cheaper than expected, with official pricing at $399, putting it up against the GeForce 8800GTS.
According to this test, the Radeon HD 2900XT's performance is better than the GeForce 8800GTS's, leading by roughly 5~30% in the test results, and in some cases even approaching the level of the 8800GTX.
But AMD is also frank that the drivers for the R600 graphics core still need continued optimization, so there is still room for performance to improve further in the future.
It won in pretty much every test, but isn't that the one with the bad quality?
Does anyone know if the HD2900XT works on the ASRock 775-DUAL-VSTA mobo (which is limited to PCI-e 4x)? The 8800 did not work at first, because NVIDIA screwed up the drivers. This was fixed later on.
The second card on a 965 chipset works at x4, so my guess would be yes, but I don't know for sure.