Have you tried running G80 drivers on G84? It won't work.
Anybody got AquaMark running on the HD 2900? I think it needs an updated direcpll.dll, like the X1900 cards did. It does the test fine, but right at the end it black-screens before the score is shown.
Some very nice performance from the 2900XT in Lost Planet, edging past the 8800 GTS 640MB by about 3.5-4fps. Nice. :)
http://www.firingsquad.com/hardware/...ance/page6.asp
These cards are unified, and nothing like the old cards. I am not saying it is impossible, I am saying that it is not something that should be assumed.
Those FiringSquad drivers are the same ones TechReport uses in their review. Those drivers seem to be quite nice for some games. The HD 2900XT beats the 8800 GTS 640MB OC by 0.3 FPS average in Call Of Juarez. http://www.techreport.com/reviews/20.../index.x?pg=14
The GTS has higher minimums, however, which will help smooth out the gameplay when the fps drops. I would rather have 3fps more in minimums than 0.3fps more in averages any day.
If it truly is a 20% overclock bringing the core to 890MHz (the stock 742MHz core × 1.2 ≈ 890MHz), that proves this thing scales well as long as the heat is removed. I am saying this on the assumption that the new, non-reference cooler does better than the stock one to allow such a high core clock. I have not heard of much above 850MHz or so for overclocks on the stock cooler; some cards can't even do that and fall short in the 84x range.
The problem with that other cooler is that the hot air is not expelled from the case like it is with the stock one. Never again will I have 90°C video card air staying in my case. The damn Accelero X2 ruined that for me.
Yeah, but that 3fps difference in minimums is caused by the hard drive deciding it needs to recalibrate its heads, so a texture isn't loaded in time, causing a sudden drop in FPS.
They used a "median low". Whatever that may mean. I imagine it is the median of all numbers that are below the average FPS; but what if NVIDIA's frame rates are 49% very low numbers and 51% at 15 FPS, whereas ATI's are 51% at 11.7 FPS and 49% at 25 FPS? Then the single reported number can look better for NVIDIA, while ATI is clearly the card you would want.
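To make that concrete, here is a quick Python sketch. The distributions are entirely made up to illustrate the point (including the 5 FPS stand-in for "very low"); only the 15, 11.7 and 25 FPS figures come from my example above.
Code:
from statistics import mean, median

# hypothetical per-second FPS samples over a 100-second run
nvidia = [5.0] * 49 + [15.0] * 51   # 49% very low, 51% at 15 FPS
ati = [11.7] * 51 + [25.0] * 49     # 51% at 11.7 FPS, 49% at 25 FPS

for name, fps in (("NVIDIA", nvidia), ("ATI", ati)):
    lows = [f for f in fps if f < mean(fps)]  # frames below the average
    print(name, "median:", median(fps), "median low:", median(lows))

# NVIDIA median: 15.0  median low: 5.0
# ATI    median: 11.7  median low: 11.7
Whichever definition of "median low" they used, a single number hides the fact that the hypothetical ATI card never drops anywhere near as low as the NVIDIA one here.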
Did I already mention that most reviews out there suck? I like the HardOCP manner of displaying the FPS over an interval of time. However, they need to provide more screenshots (a screenshot after every 10 seconds would be ideal), since they change the settings for each video card. They provide screenshots to show the difference, but the difference changes depending on the scene. Taking an automated screenshot at least every 10 seconds is time-feasible. But shout-outs to HardOCP for their testing methodology.
NWN2, details maxed, outdoor town, 1680x1050, 16FPS avg
http://www.custompc.co.uk/custompc/n...rformance.html
http://img292.imageshack.us/img292/1...93d76eozz4.jpg
Quote:
AMD explains Radeon HD 2900XT's poor AA performance 1:16PM, Monday 14th May 2007
The R600 is finally here and, in keeping with its mysteriously long gestation, AMD's new GPU still poses a lot of questions, at least in its first incarnation as the HD 2900XT. One of the things we noticed during our in-depth testing of the card is that, compared to its principal rival, the Nvidia GeForce 8800 GTS 640MB, the HD 2900XT performs poorly in many games when anti-aliasing is enabled.
In F.E.A.R., at 1,600 x 1,200, with AA and AF disabled, the HD 2900XT easily outstripped the 640MB 8800 GTS, delivering a minimum that was 23fps higher than the latter's. However, with 4x AA, the HD 2900XT's minimum framerate dived from 82fps to 21fps, while the 640MB 8800 GTS produced a minimum of 30fps. Adding 4x AA results in a 74% drop for the Radeon, compared to only a 49% drop for the GeForce.
The Radeon's framerates suffer disproportionately with anisotropic filtering, too. Again testing in F.E.A.R. at 1,600 x 1,200, we saw the HD 2900XT's minimum FPS drop by 10 per cent with 16x anisotropic enabled, compared to 3 per cent for the GTS, although the HD 2900XT still had a faster average. It was a slightly different result at 2,560 x 1,600, as the HD 2900XT's massive bandwidth gave it a boost, although adding 16x AF still had more impact than it did on the 640MB GTS.
As most gamers will want AA and AF enabled in games, the HD 2900XT's poor performance with these processing options enabled is a serious problem for the card and ATi. We asked ATi to comment on this surprising result and the company revealed that the HD 2000-series architecture has been optimised for what it calls 'shader-based AA'. Some games, including S.T.A.L.K.E.R., already use shader-based AA, although in our tests the 640MB 8800 GTS proved to be faster than the HD 2900XT.
We asked Richard Huddy, Worldwide Developer Relations Manager of AMD's Graphics Products Group, to go into more detail about why the Radeon HD 2000-series architecture has been optimised for shader-based AA rather than traditional multi-sample AA. He told us that 'with the most recent generations of games we've seen an emphasis on shader complexity (mostly more maths) with less of the overall processing time spent on the final part of the rendering process which is "the AA resolve". The resolve still needs to happen, but it's becoming a smaller and smaller part of the overall load. Add to that the fact that HDR rendering requires a non-linear AA resolve and you can see that the old fashioned linear AA resolve hardware is becoming less and less significant.' Huddy also explained that traditional AA 'doesn't work correctly [in games with] HDR because pixel brightness is non-linear in HDR rendering.'
While many reviews of the HD 2900XT have made unflattering comparisons between it and Nvidia's GeForce 8800-series, Huddy was upbeat about AMD's new chip. 'Even at high resolutions, geometry aliasing is a growing problem that can only really be addressed by shader-based anti-aliasing. You'll see that there is a trend of reducing importance for the standard linear AA resolve operation, and growing importance for custom resolves and shader-based AA. For all these reasons we've focused our hardware efforts on shader horsepower rather than the older fixed-function operations. That's why we have so much more pure floating point horsepower in the HD 2900XT GPU than NVIDIA has in its 8800 cards... There's more value in a future-proof design such as ours because it focuses on problems of increasing importance, rather than on problems of diminishing importance.'
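For anyone who wants to check where those percentages come from, here is a quick Python sanity check using only the figures quoted in the article (the GTS's no-AA minimum is inferred from the "23fps higher" remark):
Code:
def pct_drop(before, after):
    return 100.0 * (before - after) / before

xt_no_aa, xt_4xaa = 82.0, 21.0  # HD 2900XT minimum FPS, F.E.A.R. 1600x1200
gts_no_aa = xt_no_aa - 23.0     # "a minimum that was 23fps higher"
gts_4xaa = 30.0                 # 640MB 8800 GTS minimum with 4x AA

print("HD 2900XT drop: %.0f%%" % pct_drop(xt_no_aa, xt_4xaa))   # 74%
print("8800 GTS drop:  %.0f%%" % pct_drop(gts_no_aa, gts_4xaa))  # 49%
The numbers are internally consistent: 82 - 23 = 59fps for the GTS without AA, and losing 49% of that lands right on the quoted 30fps.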
So does this mean the AA issue can be fixed by drivers?
I think maybe yes, since it is partly software.
What issue? The poor performance in current games compared to the G80? He never said it could be fixed (or that it couldn't be fixed); instead he said: forget about old games, let's focus on AA for future games. They asked him the wrong question, tbh.
I am fairly sure they will optimize their shader-based AA solution further and try to program their programmable MSAA hardware to do the resolve in the back-end, to see if it is faster with older games.
He seemed to imply that doing the MSAA resolve in the back-end can only be done linearly, whilst HDR requires a non-linear resolve to work correctly. If that is true, then NVIDIA either doesn't do the MSAA resolve in the back-end, found a way around it, or has different image quality with HDR AA (the lighting will be different). This would be very difficult to verify, since you need the exact same settings on both video cards you are testing and a good dose of HDR light.
Basically I am quite confused :)
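Maybe a toy example helps with the non-linear part. Assume a simple Reinhard-style tone mapper, x / (1 + x) (my assumption, purely for illustration), and an edge pixel with one very bright HDR sample and one dark one:
Code:
def tonemap(x):            # simple Reinhard-style operator (assumed)
    return x / (1.0 + x)

samples = [16.0, 0.1]      # two MSAA samples on an HDR edge pixel

# "old fashioned" linear resolve: average first, tone-map afterwards
linear = tonemap(sum(samples) / len(samples))

# shader-based resolve: tone-map each sample, then average
shader = sum(tonemap(s) for s in samples) / len(samples)

print("linear resolve then tonemap: %.3f" % linear)  # 0.890
print("tonemap then resolve:        %.3f" % shader)  # 0.516
Because the tone mapper is non-linear, the two orders disagree badly; the linear resolve leaves the edge pixel far too bright. If that really is what Huddy means, the question becomes what NVIDIA does differently.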
Your native TFT resolution is 1680x1050 I guess. I see the drivers haven't improved for that game. I was hoping for more speed, but okay. NWN2 was tested on http://vr-zone.com/?i=4946&s=13
If I understood correctly, ATI improved their new shader-based AA and doesn't care about classic AA. Am I right? (English is not my first language ^^)
If I understood well, then if ATI forces the driver to do shader AA, AA will perform well?
These drivers could kill the GTX :D
I find it really funny that they mention the card was optimized for shader-based AA. As far as I understood, this was a desperate attempt from ATI to include AA on the card without having to do yet another silicon respin due to a problem with the original AA resolve method. They did not want another delay, so they opted for this instead. I still feel this was a really bad move and one of the main reasons why the card suffers performance-wise when AA and AF are enabled, and until I see a driver update that proves otherwise, I am unfortunately going to keep feeling this way. If shader-based AA really is the thing for future games, then great, but what about the 10,000 other games on the market that don't agree with that method performance-wise? I would hate to spend $400+ to play maybe 2-3 games that will be out this year that may take advantage of this new AA method. To me, it is all about playing the older games the best they have ever been played, while also being able to play future games decently. I am very anxiously awaiting the end of next week, when new officially released drivers are supposed to hit with what are supposed to be major performance increases, and hopefully not at the expense of IQ, which is another nasty rumor I keep hearing. My trigger finger is staying off the buy button till I get some more solid answers.
Mmm, now that you mention CPU limiting: it's almost as if developers are spending more time programming in a threaded model than optimizing. The latter is still far more advantageous performance-wise.
From recent news, I hear the problems are both exaggerated and underrated. The AF problem is exaggerated: the difference between G80/G84 and the 2900 is supposedly very hard to spot in games. The AA problem is far worse in motion, BUT it only occurs when the wide and narrow tent filters are used; there is no problem when you're using pure MSAA (2x, 4x, 8x).
Quote:
I find it really funny that they mention the card was optimized for shader-based AA. As far as I understood, this was a desperate attempt from ATI to include AA on the card without having to do yet another silicon respin. [...]
You understood wrong...
The problem with shader AA is that you're using shader power to do something that could be done by dedicated hardware.
So now ATi (I refuse to say AMD had anything to do with this part) is going to attempt to offload physics onto the shaders AND AA onto the shaders? That's just not going to work.
Also, ATi doesn't have to force the driver to do it; it already does. That's why performance is so low with AA.
AA works here, although there ARE some niggling issues, as it does not seem to be applied to all textures in some applications.
Will the 65nm process permit a higher shader clock? If AMD raises the shader clock, performance will scale much more than on the GeForce, right?
Maybe AMD will come up with a 1GHz core and 1GHz for the 320 shaders on the R650. One of the problems with the R600 is the lower shader clock (roughly half the clock of NVIDIA's 128 shaders).
I don't know much about shaders...
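Rough napkin math, counting a MADD as 2 FLOPs per shader per clock and ignoring NVIDIA's co-issued MUL (my simplification, so treat this as a scaling illustration, not a benchmark):
Code:
def gflops(shaders, clock_mhz, flops_per_clock=2):
    # theoretical throughput: shaders * clock * FLOPs-per-clock
    return shaders * clock_mhz * flops_per_clock / 1000.0

print("HD 2900XT (320 @ 742MHz):  %.0f GFLOPS" % gflops(320, 742))   # 475
print("8800 GTX (128 @ 1350MHz):  %.0f GFLOPS" % gflops(128, 1350))  # 346
print("R650 (320 @ 1GHz, rumour): %.0f GFLOPS" % gflops(320, 1000))  # 640
So yes, the R600's ALUs run at roughly half NVIDIA's shader clock and compensate with width; if the R650 really hit 1GHz on 320 shaders, the raw throughput gap would get a lot bigger.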
Check out these scores! What do you guys think? It's from a member of OCN.
Quote:
http://i148.photobucket.com/albums/s...87/stock06.jpg
8800 GTS score *all stock*, 3DMark06 (CPU: Intel E6600)
http://rigshowcase.com/img/457I4N29/10733.jpg
2900 XT score *all stock*, 3DMark06 (CPU: Intel E6400)
http://rigshowcase.com/img/457I4N29/10736.jpg
2900 XT F.E.A.R. benchmark: 4x AA, max resolution, everything maxed, 8x anisotropic, no soft shadows.