5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
32 GB Patriot Viper Steel 3733 CL14 (1.51v)
RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
Tons of NVMe & SATA SSDs
LG 27GL850 + Asus MG279Q
Meshify C white
Can't we just agree that it's wrong to cheat, and that when they do, we protest?
This time it was AMD; next time it might be nVidia doing it.
They just shouldn't start lowering quality in a race over who is fastest.
It only affects us and gaming in a bad way.
Intel i7 2600K 5GHz Watercooled. 2x Asus DirectCU II TOP GTX670 SLI @1250/7000/Watercooled. Asus Maximus IV Extreme. PCI Express X-Fi Titanium Fatal1ty Champion Series.
8GB Corsair 2000MHz RAM. 4x OCZ Vertex3 120GB SSD. 3x Samsung F1 1TB. All in a Lian Li Tyr PC-X2000 chassis. Logitech diNovo Edge keyboard
MX Revolution mouse and Z-5500 Digital 5.1 speakers. Corsair HX-1200W PSU. Samsung 244T 24" + 3x Philips 24" in nVidia Surround
You can keep telling this over and over again, yet I enjoy higher IQ on my 6850 than was possible on the 5850, which has been proven over and over again.
Originally Posted by E30M3
It's also proven that the difference between HQ (better than the 5xxx series and my old 8800) and normal quality (worse in older games, the same in new games) is around 5%, which still puts the 68xx series ahead of the GTX 460 in performance and price/performance.
I don't see how AMD lost this round: they hold a 3-4x higher market share than nVidia in the DX11 market, they dethroned NV in overall market share, and their higher-priced products make up a significantly higher proportion of their revenue than nVidia's...
Sure, nVidia is still pretty competitive right now, but unless you want to buy a card above $400 (GTX 580), AMD is the way to go on output options, video playback quality, performance, price and power consumption (that is, from $50 to $300), if you don't have any brand preference at all...
This might change in the next round, and it was certainly the other way around in the 8800 days, but to claim that AMD lost this round is ignorance (just as ignorant as some AMD fanboys who keep claiming AMD didn't lose against the i7; they hold on at certain price points and in certain workloads, but overall they lost, just like nVidia...)
Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX
Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX
Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB
Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD
In before nVidia people cite the ~70% market share left over from the 79xx and 88xx series.
E7200 @ 3.4 ; 7870 GHz 2 GB
Intel's atom is a terrible chip.
And this forum proves once again that an objective opinion is hard to come by.
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
I've been saying it for a while... DirectX should include a default render mode that has to pass a series of tests, so it does NOT get tweaked/optimized...
That way people can choose between this mode and the optimized settings ATI and nVidia offer, which boost performance by SLIGHTLY reducing image quality...
If I'm on a mainstream GPU I'll appreciate the latter, but if I'm playing an older game I want max image quality and no optimizations at all...
And how can you sell $500+ video cards to people if they don't render games the way they were supposed to look, but blur things to boost performance by a few percentage points?
That's just plain stupid...
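A rough sketch of how such a "reference mode" could be policed: render the same frame through the reference path and through the optimized path, then gate on a per-pixel error metric such as PSNR. Everything below (the synthetic frames, the 40 dB threshold) is made up purely for illustration; real IQ testing would compare actual driver output, the way the review sites do with screenshots.

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit RGB frames, in dB."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * np.log10(255.0 ** 2 / mse)

# Stand-in frames: a "reference" render and an "optimized" render with a
# small filtering error injected (both arrays are invented for this demo).
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
noise = rng.integers(-2, 3, size=ref.shape)
opt = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)

print(f"PSNR vs reference: {psnr(ref, opt):.1f} dB")
# A hypothetical pass/fail gate: require some minimum fidelity
# before an "optimization" is allowed to ship.
assert psnr(ref, opt) > 40.0
```

The threshold is the interesting policy question: set it high and you forbid exactly the kind of barely-visible filtering tweaks this thread is about; set it low and everything passes.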
BTW, I don't get those videos... what's on the right side of the video?
The flickering on the left side of the 6000 videos is definitely worse than in the 5000 videos.
BUT, in the nVidia videos there is a slight flicker on both the left and the right... the left side looks better than the 6000 but not really better than the 5000.
Last edited by saaya; 11-21-2010 at 04:28 AM.
Not in min FPS. In most reviews I've seen, the 5970 drops way below the 580. And you know that's what makes or breaks a game. I don't care if I can get 10 more FPS at the high end; if the FPS fluctuates like crazy, the game is much less playable. When I play a first-person shooter I want stable FPS and definitely don't want it dropping into single digits. I won't post any slides, just check Anand's review. And by the way, most CrossFire setups suffer from this. Either you agree with me that min FPS matters more than max FPS for good, smooth gaming, or you're a fanboy. That's why I'm not so fond of dual-GPU cards and would never get a card like the 5970 for gaming. I'd definitely pick up a 5870 instead.
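For illustration only, here is a quick sketch (with invented frame-time numbers) of the point being argued: a card can post the higher average FPS and still feel worse, because the minimum is set by the slowest frames, not the average. This is the classic dual-GPU microstutter pattern.

```python
# Hypothetical frame times in milliseconds. The "stutter" card renders most
# frames faster but hitches badly every so often (numbers are made up).
steady = [16.7] * 95 + [20.0] * 5      # e.g. a single-GPU card, small spikes
stutter = [9.0] * 95 + [100.0] * 5     # e.g. a dual-GPU card with microstutter

def fps_stats(frametimes_ms):
    """Average FPS over the run, plus the FPS of the worst single frame."""
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    min_fps = 1000.0 / max(frametimes_ms)
    return avg_fps, min_fps

for name, ft in (("steady", steady), ("stutter", stutter)):
    avg, worst = fps_stats(ft)
    print(f"{name}: avg {avg:.0f} fps, min {worst:.0f} fps")
```

With these invented numbers the stuttering card "wins" on average FPS while dipping to single digits on its worst frames, which is exactly why min FPS (or frame-time percentiles) tells you more about smoothness than the average does.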
Abit IC7 P4 2.8a @4.21 | P4 3.4e @4.9 | Gainward 6800GT GS @486/1386
Asus P4P800 SE Dothan 730-PM @ 2900 | EVGA 6800 Ultra GS @521/1376
e8400@4.3G & 8800GTS G92 800/1932/1132 as gaming rig 24/7
Custom self build chillbox with watercooling @-28c 24/7 | chilled wc " cpu -18c idle/-3c load
3DMark 2005 Score Dothan & 6800U
3DMark 2005 Score p4 & 6800GT
The 5970 has a higher minimum (and avg) than the 580 in AvP (DX11), BFBC2 (DX11) and Just Cause 2 (DX10).
The 580 wins in Dirt 2 (DX11), Lost Planet (DX11) and SC2 (DX9).
In Metro (DX11) they're about even in minimum FPS...
This is from Hardware Canucks (one of the few review sites I trust).
So it really depends on which games you play.
Edit: Also looked at Anand's 580 review. They only show minimum framerates in 2 games, one of which is Crysis Warhead, which looks like it needs more than 1GB of RAM (the normal 5970 has only 1GB per GPU) at those settings at 2560x1600, so of course the 580 will do much better there since it has 1.5GB (the 5970 was better than the 580 @ 1920x1200).
Edit2: Lab501, which is also an excellent site, has both avg and minimum framerates.
Their 580 vs 5970:
5970 wins in SC2, CoD4, Just Cause 2, AvP, Mafia 2, Medal of Honor, CoD: Black Ops.
580 wins in Far Cry 2, Crysis Warhead, Dirt 2, Hot Pursuit, Metro 2033, Hawx 2 (though same minimum), Warhammer 40,000, Battleforge, BFBC2, Darksiders.
The 5970 was 1 fps ahead in Resident Evil 5, but so close I call that even. Same minimum in Hawx too, but the 580 is slightly ahead in avg.
The 580 is quite a bit ahead in avg in Lost Planet 2, but the 5970 still had the higher minimum.
The 580 is slightly ahead in min in Batman: Arkham Asylum, but slightly behind in avg.
You can call anyone who doesn't agree with you a fanboy, but the black-and-white picture you are painting makes you look way more fanboyish than others. Both cards have their strengths and weaknesses. Looking at just the two reviews I checked now, I'd say the 580 draws the longest straw, unless you only play certain games where the 5970 shines...
(And just for the record, I wouldn't buy either of them.)
Last edited by eXa; 11-21-2010 at 10:31 PM.
X2 555 @ B55 @ 4050 1.4v, NB @ 2700 1.35v Fuzion V1
Gigabyte 890gpa-ud3h v2.1
HD6950 2GB swiftech MCW60 @ 1000mhz, 1.168v 1515mhz memory
Corsair Vengeance 2x4GB 1866 cas 9 @ 1800 8.9.8.27.41 1T 110ns 1.605v
C300 64GB, 2X Seagate barracuda green LP 2TB, Essence STX, Zalman ZM750-HP
DDC 3.2/petras, PA120.3 ek-res400, Stackers STC-01,
Dell U2412m, G110, G9x, Razer Scarab
It does that with the new tweaked driver that supposedly gives a 10% boost? That probably has nothing to do with it, eh. When you look at a review where they used full IQ, the picture changes somewhat. The 6800s look pointless, as in most cases the 5800s take the cake. Even in tessellation benchmarks, which the 6800s were supposedly redesigned for, they are far from shining.
And it's not about a d** measuring contest. You can't compare dual GPU to single GPU anyway. And nVidia has the fastest single-GPU card on the market today.
99% would never purchase a 5970 or the upcoming 595. Most will probably go CrossFire or SLI first.
Oh, and I hear the 5970 is not such a smooth gaming card anyway; it has its own share of problems. That was my point. SLI is not perfect either, so I still outright dismiss anything like dual GPU, CrossFire or SLI as a gaming platform. For benchmarking, sure, but for gaming give me a single-GPU card under 230W at full load.
Last edited by railer; 11-22-2010 at 02:02 AM.
They both use 10.10 and 262.99 (so does Anand).
I don't understand your point, shouldn't you use the latest and best drivers?
You are not making any sense...
Don't let the 68XX naming confuse you, they are not really meant to replace the 58XX. But they still do better than the 58XX in tessellation...
If it were, I would only have shown you the numbers from the card I own.
But I don't own either of them. So moot point...
(And by doing so I would only look biased and like a huge fanboy.)
Now THIS is true fanboy talk.
Of course you can compare them, they are both graphics cards, right? They also happen to cost roughly the same, both are power hogs, and they are the best card each maker has atm...
You may of course do all that if you want, no one is forcing you to do anything.
Way to blow things out of proportion; it took a month to find the differences between the two parts, and not a single person who switched to AMD from NV, or from an older AMD card, noticed a big difference...
Particle's First Rule of Online Technical Discussion:
As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.
Rule 1A:
Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.
Rule 2:
When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.
Rule 2A:
When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.
Rule 3:
When it comes to computer news, 70% of Internet rumors are outright fabricated, 20% are inaccurate enough to simply be discarded, and about 10% are based in reality. Grains of salt--become familiar with them.
Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!
Random Tip o' the Whatever
You just can't win. If your product offers feature A instead of B, people will moan how A is stupid and it didn't offer B. If your product offers B instead of A, they'll likewise complain and rant about how anyone's retarded cousin could figure out A is what the market wants.
Other sites with quality tests:
http://www.3dcenter.org/artikel/amds...ilterqualitaet
http://ht4u.net/reviews/2010/amd_rad...ded/index9.php
Last edited by Gilgamesh; 11-22-2010 at 07:01 AM. Reason: edit repost
Funny how nVIDIA makes a scene about this, considering they have used FP16 demotion too since the Rel.260 ForceWare drivers, and you can't turn it off in the control panel... though you can turn it off with a utility that is not available to the public.
Got to love marketing BS.
"We are going to hell, so bring your sunblock..."
Nice try...
http://www.geeks3d.com/20100916/fp16...e-says-nvidia/
If you wish to test with Need for Speed: Shift or Dawn of War 2, we have enabled support for FP16 demotion similar to AMD in R260 drivers for these games. By default, FP16 demotion is off, but it can be toggled on/off with the AMDDemotionHack_OFF.exe and AMDDemotionHack_ON.exe files which can be found on the Press FTP.
It is on, and has been on for a couple of releases, for at least the games mentioned in the article you linked, as far as I am aware.
What really baffles me is why they are doing this kind of work at all, since both AMD and nVIDIA hardware are capable of full-speed FP16 texture filtering. Though I don't know why I am "surprised" that they are at it again, after missing shaders and objects in games like Crysis and Far Cry 2...
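For context on why vendors bother: FP16 demotion swaps a 64-bit FP16 RGBA render target for a 32-bit HDR format such as R11G11B10 float, so the win is memory bandwidth, not filtering rate. A back-of-the-envelope sketch with assumed numbers (resolution, frame rate, and one write per pixel per pass are all illustrative):

```python
# Rough render-target write bandwidth at 1920x1200, 60 fps, one full-screen
# write per frame. FP16 RGBA is 8 bytes/pixel; R11G11B10 float ("demoted")
# packs the same three HDR color channels into 4 bytes/pixel.
WIDTH, HEIGHT, FPS = 1920, 1200, 60

def bandwidth_gb_s(bytes_per_pixel: int, writes_per_frame: int = 1) -> float:
    """Bytes written to the render target per second, in GB/s."""
    return WIDTH * HEIGHT * bytes_per_pixel * writes_per_frame * FPS / 1e9

fp16 = bandwidth_gb_s(8)
demoted = bandwidth_gb_s(4)
print(f"FP16 RGBA:  {fp16:.2f} GB/s")
print(f"R11G11B10F: {demoted:.2f} GB/s")  # half the traffic per pass
```

Real HDR pipelines touch the target many times per frame (blending, post-processing reads), so the absolute numbers are much larger in practice, but the ratio is the point: halving bytes per pixel halves that traffic, at the cost of dropping alpha and some precision.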
"We are going to hell, so bring your sunblock..."
So you obviously have proof, if you're so adamant about this, even though NV admitted putting it in and said it's off by default and can't be turned on through the CP, right?