It's been fixed, and the performance difference is only around 1 fps between the bugged driver and the fixed one. That's a noticeable performance improvement? :rofl: :ROTF: :rofl:
I knew you would be the first one to say that, you are so predictable :p:
Admit it or not, those drivers are a cheat. And now the new ones are not. Should I have to praise NVIDIA for not cheating? NVIDIA's IQ in both 2D and 3D is $hit compared to ATI's, and performance is better on NVIDIA cards. I admit that, why don't you? It's a well-proven fact. Not by reviews (well, there too lol), but with my own eyes.
:shakes:
NVBOYZ: It's not a cheat, it's an optimization.
ATIBOYZ: Sure, just in time for reviews to be done on your cards, there's a driver bug that gives you a 7% performance boost.
Agree or disagree, you have to realise one thing: if you don't tell NV to f*0ff with such "bugs", they will keep trying to pull new ones. Don't accept anything substandard; demand quality for your money. Beta this, beta that, why can't it be a quick fix that is also stable and non-buggy?
I, on the other hand, don't really care that NV cheated; I know what I'm getting as my next card. Let me tell you, it ain't going to be from the beta-spamming camp.
You can disable the frame rate overlay on the screenshots from the "Screenshots" tab. Have you contacted Nvidia to alert them of the problem? If not, take some screenshots and send them to Nvidia. A new beta may be on the way because of this. However, giving them the benefit of the doubt, they may not know about it yet... But you get no argument from me, it is what it is...
If you load these drivers and look around the mine area and the alien ship, you can see a lot of it. I need to take more screenshots. I can't really talk to Nvidia via customer support since my account has been locked for an unknown reason. I request a password reset and get no email from them :(
Why doesn't ATI cheat like this?
It's not a cheat, it's a bug, and ATi has had very similar bugs, even in this generation... Lost Planet and CoJ's DX10 patch (anyone remember when it first released and screenshots showed ATi using some form of filtering that lowered IQ?) come to mind.
What most people don't get is that when you fix a bug, it can cause issues elsewhere. Say your driver has a problem with a few specific textures glitching at the beginning of a game, so you fix that... Now, all of a sudden, that fix causes a few clouds not to cast reflections. Now everyone says you're cheating for performance...
This is why NVidia releases so many beta drivers: so they receive feedback on any problems faster, and as such can fix them and put out a new beta to see if anyone finds any new problems with it.
You're still calling it a cheat... now THAT'S predictable.
Even the author of the review laughed at the people using this as a way to flame nvidia as a whole. I take it you don't have any understanding of how things like this work, and as such I'll have to give you the benefit of the doubt. See the above, and you'll understand what I'm talking about.
Also, 2D IQ isn't any better on ATi right now... not sure where you get that from. Neither is 3D. Again, not sure where you get these ideas from, but to bypass the swear filter (which is an offense on our forums...) and call it what you do, when everyone knows that IQ is subjective, makes me laugh... I've said this a million times: NVidia technically has better IQ (on paper, there's no one who can deny that), while which you prefer is all preference. If you don't believe me on the technical part, Beyond3D wrote a very, VERY good article outlining it all...
I have yet to see that one, but I'll keep an eye out. Unfortunately, that's the nature of fixing bugs: what fixes one thing can break another.
You are like Shintai sometimes. Your own logic is THE logic. Sometimes damn right, sometimes damn wrong. But you always think you're right because you read all those paper launches and reviews. And again, you're wrong here. I'll tell you why. But I know you'll come back with some subjective BS. It seems that you are the one whose only source is things written on paper.
Have you tried video overlay with ATI cards? Have you tried it in both XP and Vista? Have you seen the differences? Have you ever used YV12 color space output plus overlay with an ATI card, X1000 series or newer? Have you tried a small-resolution video with overlay plus YV12 on an ATI card? Have you seen the contrast and color quality differences between the two companies in all those scenarios? If after trying all these things you say that NVIDIA's 2D IQ is better, then you need some kind of telescopic glasses. It has nothing to do with being subjective. I have tested it on a bunch of ATI and NVIDIA cards, including the 6600GT, 8800GTX and GTS, and the ATI X300, X800, X1900 and HD2900. And again, I call NVIDIA's IQ pure :banana::banana::banana::banana: (you like it spelled like that now? :rolleyes: ) So go away and bring your "technically" BS to other people. Really, I don't like hidden fanboys like you, talking about something you have no idea about while trying to convince people that you do.
Oh, and I'm not talking about UVD or PureVideo here, that's another story.
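For anyone who hasn't run into YV12 before: it's a planar YUV 4:2:0 format, one full-resolution luma (Y) plane followed by V and U chroma planes at half resolution in each direction. Since the driver has to upscale that quarter-size chroma back to full resolution for overlay display, that's exactly where the color/contrast handling between vendors can diverge. A quick sketch of the layout math (illustrative Python, not tied to any particular driver):

```python
def yv12_frame_size(width, height):
    """Bytes per frame for a planar YV12 (YUV 4:2:0) surface."""
    # YV12 = full-res Y plane + V plane + U plane, where each chroma
    # plane is half the width and half the height of the luma plane,
    # i.e. one quarter of its area.
    if width % 2 or height % 2:
        raise ValueError("YV12 requires even width and height")
    luma = width * height
    chroma = (width // 2) * (height // 2)
    return luma + 2 * chroma  # works out to width * height * 1.5

# A 720x480 DVD-resolution frame:
print(yv12_frame_size(720, 480))  # 518400 bytes, i.e. 720*480*1.5
```

So a "small resolution video" case like the one described above means very little chroma data to start with, which makes the quality of the driver's chroma upsampling all the more visible.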
On the 3D side I have to agree with you: the gamma and contrast settings are both subjective. And again, in my honest opinion, it's way better on ATI cards. Not including NVIDIA cheats here ;)
I'm the first one interested; I want NVIDIA to fix their crap quality so I could give some love to an 8800 Ultra. But unfortunately, they haven't fixed it yet. Bad for me, and bad for NVIDIA.
Last ATi card I played with for an extensive amount of time was an x1900xt, if you really must know. Before that was my old 9800pro(which is still alive and kicking, might I add). I'm not counting cards I toyed with for a small amount of time in this.
Video overlay, eh? Never bothered playing with those settings, but seeing as how that has nothing to do with the topic, and the argument is dragging us further and further from it, I'll save us all a whole lot of time and merely give you the benefit of the doubt... For the record, though, I don't consider video image quality part of 2D... I classify things as 2D (windows/fonts/web browsing and the like), 3D (obvious), and movie/video watching (which I don't really do on my PC anyway, as my rig is for work and gaming). As such, if ATi has some miraculous difference in video, it's likely I never noticed it because I had no reason to use said features...
Do you have any links to back all this up? I'm not doubting you, I'm just curious and would like to read on it myself. Not ATi white papers, but from a well respected site like beyond3d or the like.
Well, 2D is in fact all the things that are not 3D, IMO. By your definition of 2D, yes, I agree, all the cards look the same to my eyes.
That's the problem: people like you don't use the PC to watch movies or TV, and if they do, they don't care when the quality is garbage. And then they claim that video quality from both companies is the same. Of course, using WMP or some crappy program like that, it's true. But when you start to use some "special" (or advanced, if you want) features, unknown to most average PC users, the differences start to become visible. To prove it, the only thing I can do is set up my camera and take some photos of the monitor, as these enhancements don't show up in a capture of the playing video. Even in a photograph the IQ difference is noticeable. Sadly, I don't have any NVIDIA cards now, and I have sold my HD2900XT. Stuck with the X800 for now.
ATI made some serious enhancements from the Xxxx series to the X1xxx series; the difference between the X1xxx and HDxxx series is very small. NVIDIA, since the 6600GT days, has done nothing. I have spent some time trying to discover whether there were any changes in drivers or quality, but I really didn't find any.
I don't trust many of the reviews I read, so I can barely remember them. I think TheInq or Fudzilla posted some stuff regarding this, something like ATI vs NVIDIA in HD playback. Yes, I know, they are FUD sites, but the photos they have are realistic and illustrate perfectly the things I'm trying to say, like bad black levels, bad contrast, poor colors, etc.