Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people shared their opinions in an objective manner, the world would be a friendlier place.
It's not like NVIDIA hasn't been guilty of doing similar shady things in the past. What's the problem?
It's a reference renderer, which defines exactly what the scene should look like.
Software rasterization by its very nature uses the same optimizations as hardware rasterization does, depending on the method used, of course. The only difference between the two is that the CPU does exactly what the ROPs and TMUs do in hardwired logic, and at the same time the CPU emulates the shaders.
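A rough sketch of what I mean, purely illustrative (the function names are made up, this isn't taken from any real driver or reference rasterizer): the inner loop of a software rasterizer does in plain code what the TMUs and ROPs do in silicon, and the "shader" is just an ordinary function called per pixel.

Code:
# Hypothetical, minimal software-rasterizer inner loop (illustrative names only).

def sample_bilinear(texture, u, v):
    """Roughly what a TMU does in hardware: fetch the four nearest texels and filter them."""
    h, w = len(texture), len(texture[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bot = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def shade_pixel(texture, u, v, light):
    """The 'pixel shader', emulated as an ordinary CPU function."""
    return sample_bilinear(texture, u, v) * light

def raster_span(framebuffer, y, x_start, x_end, texture, uv_start, uv_end, light):
    """Roughly what the ROPs do: write the shaded result into the framebuffer."""
    for i, x in enumerate(range(x_start, x_end)):
        t = i / max(x_end - x_start - 1, 1)   # interpolate texture coords across the span
        u = uv_start[0] + (uv_end[0] - uv_start[0]) * t
        v = uv_start[1] + (uv_end[1] - uv_start[1]) * t
        framebuffer[y][x] = shade_pixel(texture, u, v, light)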
So you think that after they went through the process of developing, testing and implementing their own version in ForceWare, and of course formally asking reviewers to kindly test with it on, nVIDIA will disable it (losing roughly 10% of free performance in selected titles) for end users, who have no visible indication of the status of this optimization either in the ForceWare CP or in the games themselves? And we should believe this because they said so?
Actually an article would be nice, testing the last 2 or 3 drivers from nVIDIA with the switching utility to see what's what. Any reviewers up for it?
If I'm proven wrong, shame on me. On the other hand if they are pulling a "Hey look!" pointing in the other direction while doing the same, then shame on them.
Call me what you want, but I'm skeptical... the situation kind of reminds me of the good ol' 3DMark03 times.
"We are going to hell, so bring your sunblock..."
I was a bit surprised when I replaced my Radeon HD5870 with a GTX 580. Surprised at the difference in image quality. I went from a 7900 GTX (I think) --> Radeon 9800 --> 4870 --> 4870x2 --> 5870 and then a GTX 580.
I switched to a GTX 580 due to the "lack" of general performance increase in the 6000 series and also due to the debate about image quality.
So when I first fired up the GTX 580 to play some COD Black Ops I was really surprised at what I saw. I didn't expect any major difference going in, but boy was I surprised. The image is just so much sharper and the colors so much fuller. Maybe there is a better technical explanation, but I'm just describing what I see on the screen. Thinking that it could just be COD Black Ops I tried another game... Left 4 Dead 2. I play L4D2 a lot, and since it's not a TWIMTBP title (and Steam complains about my graphics adapter being unknown) I didn't expect much difference to show. But boy was I wrong AGAIN!!! The same thing goes... a much, much sharper picture, more vibrant colors, etc., and I'm talking about a game I have logged hundreds and hundreds of hours playing on high-end Radeon hardware.

No matter what game I try it's the same story: much better image quality and more vibrant colors. You could argue that it's just a matter of adjusting settings in the control panel, and it could be, but since I just got the GTX 580 I haven't changed any settings in the control panel and it's at its defaults.
I'm a bit surprised about this and also a bit disappointed, because since the GeForce 7xxx days and the GeForce FX + 3DMark 2003 days ATI has always been a guarantee of image quality for me. I hope that AMD/ATI will find their way onto the right path again and not play these games of cat and mouse where they do AF correctly in an AF tester (when it's detected) but differently in games, etc. Picture quality is more important to me than FPS, and I hope AMD will realize that too. So what if they lose 10% performance by rendering things the right way? It's a lot better than the bad press they get over this.
AMD Ryzen 9 5900X
ASRock Radeon RX 7900 XTX Phantom Gaming OC
Asus ROG Strix B550-F Gaming Motherboard
Corsair RM1000x SHIFT PSU
32 GB DDR4 @3800 MHz CL16 (4 x 8 GB)
1 x WD Black SN850 1 TB
1 x Samsung 960 250 GB
2 x Samsung 860 1 TB
1 x Seagate 16 TB HDD
Dell G3223Q 4K UHD Monitor
Running Windows 11 Pro x64 Version 23H2 build 22631.2506
Smartphone : Samsung Galaxy S22 Ultra
That's just absolute nonsense and based on a placebo effect at best. There is no difference in colours and sharpness whatsoever.
About the AF issue: Hardly anyone would notice a difference in a blind test that does not pick particular scenes. However I do agree that AMD has to improve the AF quality.
You are incorrect.
The default settings in the respective control panels for NVIDIA and AMD cause a difference in the overall color saturation, sharpness and gamma. AMD's settings tend to be on the cooler side of the spectrum while NVIDIA's are slightly warmer.
You wouldn't see this if you went from one AMD product to another or one NVIDIA card to another. However, it is quite evident when going from one company to another.
From my experience based on switching out cards on a regular basis, the statement you quoted is bang on.
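A rough illustration of the point, with made-up numbers (these are NOT the actual default profiles of either vendor, just two hypothetical gamma/saturation settings): the exact same rendered pixel comes out slightly brighter and a bit more saturated under one profile than the other.

Code:
# Illustrative only: two hypothetical default profiles, not real driver values.

def apply_gamma(value, gamma):
    # Encode a linear 0..1 channel value for display; a higher gamma lifts the midtones.
    return value ** (1.0 / gamma)

def apply_saturation(r, g, b, sat):
    # Scale chroma around the luma; sat = 1.0 leaves the color untouched.
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(luma + (c - luma) * sat for c in (r, g, b))

pixel = (0.30, 0.45, 0.25)  # the same rendered value from either GPU

profiles = {"profile A": {"gamma": 2.2, "sat": 1.00},
            "profile B": {"gamma": 2.3, "sat": 1.05}}

for name, p in profiles.items():
    r, g, b = apply_saturation(*pixel, p["sat"])
    out = tuple(round(apply_gamma(c, p["gamma"]), 3) for c in (r, g, b))
    print(name, out)   # slightly different output for identical input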
Gigabyte Z77X-UD5H
G.Skill Ripjaws X 16 GB @ 2133 MHz
Thermalright Ultra-120 eXtreme
i7 2600K @ 4.4 GHz
Sapphire 7970 OC @ 1.2 GHz
Mushkin Chronos Deluxe 128 GB
Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
32 GB Patriot Viper Steel 3733 CL14 (1.51v)
RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
Tons of NVMe & SATA SSDs
LG 27GL850 + Asus MG279Q
Meshify C white
As mentioned in the other topic:
If you think there is some truth to this, XS has a section under Off-Topic -> Tech Talk... but don't trash the news section with politics intended for bashing each other by trash-talking products from an adversary (as in, the competition).
So please tell me, what lack of performance increase are you referring to? You mean the update of the 57xx series, which is now the 68xx series... To me it looks like you just didn't read any decent review and didn't search enough threads here to know that you should wait a bit longer to purchase a card, unless you are a real NV fanboy; certainly with a 580... within two weeks it will be $150 cheaper.
[two smilies]
Do these two smilies also look green and blue on your new NV 580, or do they have much brighter colors now...?
Now to the point: yes, you could see a difference, but that has more to do with profile settings, drivers and cards than anything else, just like monitors have a big influence...
This discussion is about anisotropic filtering quality. So it is just ridiculous to state in this topic that IHV X delivers "much sharper" images with "colors so much fuller" than IHV Y.
Yeah, there are some differences in the default (uncalibrated) video signal profiles. SO WHAT?!
Argh... I really should stop wasting my time here!
Compared to my old HD5870 I don't think the 6870 was that big a deal, and yes, I did read reviews, lots of them. You can mock me all you want, but saying I'm an NV fanboy after all the ATI cards I have owned is just plain stupid on your part. And why should I wait? If I have the money to burn, why shouldn't I buy a new graphics card whenever I want to?
You can turn and twist it all you want; there is a difference, and in some cases it's HUGE, and judging by your comments you didn't read what I wrote.
Fact of the matter is I didn't write what I did to taunt or mock anyone, but simply to report what I observed going from a long line of Radeon cards to an NVIDIA card.
Last edited by Toysoldier; 11-23-2010 at 01:46 PM.
That's absolute nonsense. There is a difference in the default settings on an NV card vs an ATI card. You should try using a Radeon card for some weeks and then switch to an NV card, and you would see for yourself. I'm not saying the ATI card can't be made to look like the nVidia card, but it doesn't by default, far from it. And when ATI starts to detect certain AF test programs and then uses a better AF method than the driver normally does, that's when things start to get out of control.
All right. Then let us determine this in a controlled test.
Which screenshot has been taken by a RV870 and which by a GF110?
http://img94.imageshack.us/img94/6471/dirt2a.jpg
http://img220.imageshack.us/img220/48/dirt2b.jpg
http://img819.imageshack.us/img819/8263/metaf.jpg
http://img254.imageshack.us/img254/4315/metb.jpg
http://img38.imageshack.us/img38/2775/mwabz.jpg
http://img202.imageshack.us/img202/7807/mwby.jpg
http://img841.imageshack.us/img841/7946/vana.jpg
http://img534.imageshack.us/img534/1826/vanbi.jpg
Oh and please don't forget to show us where exactly "The image is just so much sharper and the colors so much fuller." based on these comparisons!
PS: Note that the image quality settings used here are the Radeon 5870 defaults and the GeForce 580 defaults, matching what you stated in your comparison.
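And if anyone would rather not eyeball it: a per-pixel diff of two screenshots shows immediately whether, and where, anything actually differs. A quick sketch using Pillow (the filenames are placeholders, not the links above; both shots must be the same resolution):

Code:
# Quick per-pixel screenshot comparison (needs Pillow: pip install pillow).
# Filenames are placeholders; both images must have the same resolution.
from PIL import Image, ImageChops

a = Image.open("shot_a.png").convert("RGB")
b = Image.open("shot_b.png").convert("RGB")

diff = ImageChops.difference(a, b)   # absolute per-channel difference
bbox = diff.getbbox()                # None if the images are pixel-identical
if bbox is None:
    print("Images are pixel-identical.")
else:
    extrema = diff.getextrema()      # (min, max) per channel
    print("Differing region:", bbox)
    print("Max per-channel difference:", [hi for _, hi in extrema])
    diff.save("diff.png")            # brighter pixels = bigger difference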
Last edited by Katzenschleuder; 11-23-2010 at 03:44 PM.
First Dirt 2 shot is the 870, second the GTX 460; comparison not possible because the 460 lacks tessellation.
Second game - I don't see a difference.
MW - no clue (gamma difference?); can't compare IQ due to different brightness settings.
Vantage: both screenshots show banding in certain areas...
Core i7 2600K | HD 6950 | 8 GB RipJawsX | 2x 128 GB Samsung SSD 830 RAID 0 | Asus Sabertooth P67
Seasonic X-560 | Corsair 650D | 2x WD Red 3 TB RAID 1 | WD Green 3 TB | Asus Xonar Essence STX
Core i3 2100 | HD 7770 | 8 GB RipJawsX | 128 GB Samsung SSD 830 | ASRock Z77 Pro4-M
be quiet! E9 400W | Fractal Design Arc Mini | 3x Hitachi 7K1000.C | Asus Xonar DX
Dell Latitude E6410 | Core i7 620M | 8 GB DDR3 | WXGA+ screen | Nvidia Quadro NVS3100
256 GB Samsung PB22-J | Intel Wireless 6300 | Sierra AirCard MC8781 | WD Scorpio Blue 1 TB
Harman Kardon HK1200 | Vienna Acoustics Brandnew | AKG K240 Monitor 600 Ohm | Sony CDP 228ESD
My comment was about you pointing out that there is no added value going from a 5870 to a 6870 ---> of course not, that is the whole point; the 6870 targets a new price range in the market, it is not intended to replace the 58xx series. You still don't get it... And if you already own a 5870, I can't think of any reason to spend another $500+ to buy a new card without waiting 2-3 more weeks; that is what I call consumerism.
One complaint about ATI I've had for a while is that games are substantially darker for the same in-game Source gamma setting. It made some dark, gritty HL2 mods hard to play.
E7200 @ 3.4 GHz; HD 7870 GHz Edition 2 GB
Intel's Atom is a terrible chip.