Intel i7 2600K 5GHz Watercooled. 2x Asus DirectCU II TOP GTX670 SLI @1250/7000/Watercooled. Asus Maximus IV Extreme. PCI Express X-Fi Titanium Fatal1ty Champion Series.
8GB Corsair 2000MHz RAM. 4x OCZ Vertex3 120GB SSD. 3x Samsung F1 1TB. All in a Lian Li Tyr PC-X2000 Chassis. Logitech diNovo Edge keyboard
MX Revolution mouse and Z-5500 Digital 5.1 speakers. Corsair HX-1200W PSU. Samsung 244T 24" + 3x Philips 24in nVidia Surround
Gigabyte Z77X-UD5H
G-Skill Ripjaws X 16GB - 2133MHz
Thermalright Ultra-120 eXtreme
i7 2600k @ 4.4GHz
Sapphire 7970 OC 1.2GHz
Mushkin Chronos Deluxe 128GB
this is not about defending; this is about realising that this campaign is more related to the success of Barts and a desperate attempt by nvidia to start a mud-slinging campaign, because their products in this price range are inferior right now and they aren't happy with their Christmas sales...
AMD doesn't take anything away; they give you more options than you had before, and TBH I'm happy to take the performance advantage in BFBC2 because I fail to see a difference in this game...
Trackmania is another story; in this game I'm happy about the NEW HQ setting, which completely eliminates banding (BTW: banding is present on the 8800GTS in my other PC...)
if there is one thing you can criticise them for, it's that they don't set HQ by default, but I'm sure that most users would take the extra performance over the IQ because most of them can't even see a difference between mid and high settings (which is sad but true)
Core i7 2600k|HD 6950|8GB RipJawsX|2x 128GB Samsung SSD 830 Raid0|Asus Sabertooth P67
Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX
Core i3 2100|HD 7770|8GB RipJawsX|128GB Samsung SSD 830|Asrock Z77 Pro4-M
Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX
Dell Latitude E6410|Core i7 620m|8GB DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
256GB Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB
Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD
so the setting exists, and you just have to turn it on. sounds like such a big deal that we needed a post from nvidia with an exclamation point, after we already had a huge thread about it (and not the first one)
2500k @ 4900MHz - Asus Maximus IV Gene Z - Swiftech Apogee LP
GTX 680 @ +170 (1267MHz) / +300 (3305MHz) - EK 680 FC EN/Acetal
Swiftech MCR320 Drive @ 1300rpm - 3x GT 1850s @ 1150rpm
XS Build Log for: My Latest Custom Case
If I had a dollar for every "company x/y uses lower IQ to get higher performance" claim in the last ~10 yrs...
...I'd have founded Company Z
The point is that ATI's default quality is lower than Nvidia's default, and once you adjust ATI's quality setting to match Nvidia's you incur an FPS penalty - according to that article, about 10%. And after all that PR marketing from ATI about superior quality compared to the prior 5xxx cards plus good performance, that suddenly becomes a blurred truth.
That's a simple fact. No one is arguing about whether you can set high quality, or about what nvidia or ati did in the past; it's about now, about misleading benchmarks and the info people get when deciding what to buy.
.:. Obsidian 750D .:. i7 5960X .:. EVGA Titan .:. G.SKILL Ripjaws DDR4 32GB .:. CORSAIR HX850i .:. Asus X99-DELUXE .:. Crucial M4 SSD 512GB .:.
that's only partially an issue. most reviews test at least 4-5 different IQ settings and what you see is what you get - at the highest IQ settings with AA the results are as accurate as they can be.
now if you want to complain about 1280x1024, default-IQ, no-AA gaming, then I can't relate since I don't play at those settings anyway.
Gigabyte Z77X-UD5H
G-Skill Ripjaws X 16GB - 2133MHz
Thermalright Ultra-120 eXtreme
i7 2600k @ 4.4GHz
Sapphire 7970 OC 1.2GHz
Mushkin Chronos Deluxe 128GB
The problem is review sites. If they don't do a like-for-like review based on image quality, then we're always stuck in this situation.
I think [H]ardOCP comes closest with reviews that take settings into consideration, but a review site that lowers/raises settings to give equal image quality is needed (are there any?).
As for the AMD and Nvidia fanbois in this thread: it's starting to get annoying, as none of you are giving any evidence to back up your arguments. AMD is under no obligation to set "Ultra High IQ" as default, and Nvidia isn't either.
That is correct. I was convinced of that when they made a public statement to the effect of "we wanted to bring all products in line with the lower quality in order to be consistent". That took some balls but as evidenced in this thread they knew they could get away with it.
LOL @ AMD Droids defending AMD's shoddy IQ
"B-b-but no one can notice it!!"
Intel Core i7 6900K
Noctua NH-D15
Asus X99A II
32 GB G.Skill TridentZ @ 3400 CL15 CR1
NVidia Titan Xp
Creative Sound BlasterX AE-5
Sennheiser HD-598
Samsung 960 Pro 1TB
Western Digital Raptor 600GB
Asus 12x Blu-Ray Burner
Sony Optiarc 24x DVD Burner with NEC chipset
Antec HCP-1200W Power Supply
Viewsonic XG2703-GS
Thermaltake Level 10 GT Snow Edition
Logitech G502 gaming mouse w/Razer Exact Mat
Logitech G910 mechanical gaming keyboard
Windows 8 x64 Pro
Did you guys read the blog post?
Hahahaha, that's so funny. They were "taught" some hard lessons, more like it:
"For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimization gone bad, and vowed to never again perform any optimizations that could compromise image quality."
2500k @ 4900MHz - Asus Maximus IV Gene Z - Swiftech Apogee LP
GTX 680 @ +170 (1267MHz) / +300 (3305MHz) - EK 680 FC EN/Acetal
Swiftech MCR320 Drive @ 1300rpm - 3x GT 1850s @ 1150rpm
XS Build Log for: My Latest Custom Case
that's why they include more than FPS numbers in their reviews. there is more to look for in a card than just performance or IQ.
sure there is no obligation, but is it right to lower image quality to gain performance? we would just end up in a race of degrading IQ.
AMD is under no obligation to set "Ultra High IQ" as default, and Nvidia isn't either.
insightful post.
You meant AMD fanboys, right?
People can finger-point all they want. The video comparisons on the sites linked are pretty much universal in their conclusions.
So ... the card shows banding in an AF testing program with an artificial texture that nvidia has, ironically, decried the use of because it's not real?
Can we get some real games tested, please? And let's try not to only focus on Far Cry or Trackmania.
E7200 @ 3.4 ; 7870 GHz 2 GB
Intel's atom is a terrible chip.
As long as this is not a problem during normal gameplay it's mostly irrelevant. Show me a comparison video of a recent game that shows this in a noticeable way and we can call it a real problem.
"No, you'll warrant no villain's exposition from me."