So does this mean that wizzard was smart to use 9.12? Can't imagine the crapstorm that would follow if it turns out that way.
Intel Core i7 920 @ 3.8GHz 1.28V (Core Contact Freezer)
Asus X58 P6T
6GB OCZ Gold DDR3-1600MHz 8-8-8-24
XFX HD5870
WD 1TB Black HD
Corsair 850TX
Cooler Master HAF 922
Honestly, and given that I have no brand affiliation here and am just as prepared to believe in dirty tricks from either side...
...
I don't see it. Those piccies look identical.
Rig specs
CPU: i7 5960X Mobo: Asus X99 Deluxe RAM: 4x4GB G.Skill DDR4-2400 CAS-15 VGA: 2x eVGA GTX680 Superclock PSU: Corsair AX1200
Foundational Falsehoods of Creationism
Just another of Nvidia's underhand practices.
i7 950 @ 4.05GHz HeatKiller 3.0
EVGA E762 EK WB | 12GB OCZ3X1600LV6GK
Razer Tarantula |Razer Imperator | SB X-Fi PCIe
480GTX Tri SLi EK WBs | HAF X | Corsair AX1200
____________________________________________
Loop1: Double_MCP655(EK Dual Top) - MoRa3Pro_4x180 - HK3.0 - EKFB_E762
Loop2: Koolance_MCP655(EK Top) - HWLabsSR1_360 - EK_FC480GTX(3x)
Particle's First Rule of Online Technical Discussion:
As the length of a thread about any computer-related subject approaches infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also increase exponentially.
Rule 1A:
Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.
Rule 2:
When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.
Rule 2A:
When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.
Rule 3:
When it comes to computer news, 70% of Internet rumors are outright fabricated, 20% are inaccurate enough to simply be discarded, and about 10% are based in reality. Grains of salt--become familiar with them.
Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!
Random Tip o' the Whatever
You just can't win. If your product offers feature A instead of B, people will moan that A is stupid and that it didn't offer B. If your product offers B instead of A, they'll likewise complain and rant about how anyone's retarded cousin could figure out that A is what the market wants.
The ATI screenshot actually looks better to me too. The nVidia ones look a bit "washed out", whereas the ATI ones show more detail. Anyway, it could just be a placebo, as I currently use an ATI card. Then again, it's very hard to spot any differences when focusing on a _very_ small part of the entire screen. If it means more performance, I'm happy with that, as I wouldn't notice the difference in quality anyway.
Does that mean benching Crysis with Nvidia should be done at lower settings to compare? If choosing 16x anisotropic on ATi does not give you 16x anisotropic...
Main: i7 920 D0 @ 4GHz on Rampage II Gene - H2O - 6GB XMS3 1680MHz C9 - GTX 580
Sossaman: Dual Yonah @ 2.0GHz
Server: 1055T 6-core @ 3.6GHz - air cooled - 16GB KVR1333 - 8 x 1TB Caviar Black
HTPC: i5 760 @ 4GHz on Maximus III Gene - H2O - 8GB KVR1333 - GTX275 - 80GB X25-M G2 + 4 x 2.5" Caviar Black 500GB @ 7200RPM
Crunching for our future
I don't like cheating in benchmarks from any company; if they are going to optimize like this, they need to be up-front about it. Seeing how little the visual quality was affected, they should have advertised this as a feature. Now they are stuck with fodder for Nvidia fanboys to make snide comments over, even if they don't understand the changes.
Who on earth benches Crysis? Or better, who on earth plays Crysis or CryEngine 2 games?
www.teampclab.pl
MOA 2009 Poland #2, AMD Black Ops 2010, MOA 2011 Poland #1, MOA 2011 EMEA #12
Test bench: empty
I don't see a difference in the pics....?
But still, if it is supposed to do one thing and is doing another, then shame on them... but I don't see a difference worth mentioning, really.
The Cardboard Master Crunch with us, the XS WCG team
Intel Core i7 2600k @ 4.5GHz, 16GB DDR3-1600, Radeon 7950 @ 1000/1250, Win 10 Pro x64
saaya
I do not know whether you are extremely brave or rather silly for posting this thread. However, I also hope that ATi fix this bug. It is not uncommon for nVidia or ATi to have driver bugs from time to time; I recall nVidia recently had an AF issue in their drivers, and no doubt this is also a driver bug.
IMHO the days of cheating (from both sides) are over, as the sheer fallout from the GeForce FX driver cheats... sorry, optimisations... was a lesson to learn from. (I recall ATi also having a similar misdemeanour with Quake, the "Quack" thing.)
Fingers crossed we see no deliberate driver cheats EVER again, because when companies start cheating, and those cheats degrade image quality, the only loser is the consumer... nobody wins.
John
Just use the last pic as a guide. Basically, if you look at the 5870 pic, you will see three distinct zones that follow the pattern in that last pic. You can almost make out a line where they switch between trilinear and bilinear.
The NV-based cards don't have these zones; they texture the same way throughout.
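For anyone wondering what could produce such zones, here is a minimal Python sketch of the generic "brilinear"-style shortcut the post describes: pure bilinear (a single mip level) over most of the LOD range, with trilinear blending only in a narrow band around each mip transition. The band width and function names are made up for illustration; this is not any vendor's actual driver logic.

```python
# Illustrative sketch of "brilinear"-style filtering zones.
# NOT AMD's or Nvidia's actual driver code; a simplified LOD model.

def trilinear_weights(lod):
    """Full trilinear: always blend the two nearest mip levels."""
    lower = int(lod)
    frac = lod - lower
    return {lower: 1.0 - frac, lower + 1: frac}

def brilinear_weights(lod, band=0.15):
    """'Brilinear': plain bilinear (one mip level) for most LODs;
    blend only inside a narrow band around each mip transition."""
    lower = int(lod)
    frac = lod - lower
    if frac < band:           # close to the lower mip: pure bilinear
        return {lower: 1.0}
    if frac > 1.0 - band:     # close to the upper mip: pure bilinear
        return {lower + 1: 1.0}
    # Remap the middle of the range onto a steeper blend.
    t = (frac - band) / (1.0 - 2.0 * band)
    return {lower: 1.0 - t, lower + 1: t}

# Walking away from the camera, LOD rises smoothly; with brilinear
# most of the floor is drawn from a single mip level, so the edges of
# the "pure level N" zones show up as the lines described above.
for lod in [0.05, 0.3, 0.5, 0.7, 0.95]:
    print(f"LOD {lod:.2f}  trilinear={trilinear_weights(lod)}  "
          f"brilinear={brilinear_weights(lod)}")
```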
Core i7 920 @ 4.66GHz (H2O)
6GB OCZ Platinum
4870X2 + 4890 in TriFire
2 x 640GB WD Blacks
750GB Seagate
Exactly; it's essentially the same effect as dithering, so there is less banding and gradients are smoother.
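To make the dithering analogy concrete, here is a tiny, purely illustrative Python sketch (nothing vendor-specific): quantizing a smooth ramp to a few levels produces visible bands, while adding noise before quantizing trades those bands for fine grain.

```python
import random

LEVELS = 8  # coarse quantization so the banding is obvious

def quantize(x):
    """Snap a 0..1 value to one of LEVELS discrete steps (causes banding)."""
    return round(x * (LEVELS - 1)) / (LEVELS - 1)

def quantize_dithered(x):
    """Add +/- half a step of noise before quantizing (bands become grain)."""
    noise = (random.random() - 0.5) / (LEVELS - 1)
    return quantize(min(1.0, max(0.0, x + noise)))

ramp = [i / 63 for i in range(64)]  # a smooth grayscale gradient
print("banded:  ", [f"{quantize(v):.2f}" for v in ramp[:10]])
print("dithered:", [f"{quantize_dithered(v):.2f}" for v in ramp[:10]])
```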
What if Nvidia allowed a lower quality mode and could increase performance by "x" % ?
The point is the tests should be done fairly.
If the tables were turned and Nvidia skimped out on rendering, do you think the images would suddenly look different and this would suddenly become important? LOL
Go to the source website; there you can switch between the images directly on top of each other, and the difference is very evident.
There is also an image of 16x AF quality, and strangely, the 480 looks better than the 5870 in that as well, despite the "angle independent" AF on the HD5000 cards...
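If you'd rather not trust your eyes at all, a quick way to check such claims objectively is to diff the two captures per pixel. A hedged Python sketch using Pillow follows; the file names are placeholders, not the actual files from the source site, and the two images must be the same resolution.

```python
# Per-pixel diff of two same-size screenshots. Placeholder file names;
# substitute the actual captures you want to compare.
from PIL import Image, ImageChops

a = Image.open("ati_5870.png").convert("RGB")
b = Image.open("nv_gtx480.png").convert("RGB")

diff = ImageChops.difference(a, b)          # |a - b| per channel
print("differing region:", diff.getbbox())  # None means pixel-identical
print("per-channel (min, max) deltas:", diff.getextrema())
diff.save("diff.png")                       # bright pixels = where the renders diverge
```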
i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit
For all the people saying there is no difference so it's not an issue: if Nvidia did this, everyone would go on and on about cheating...
Bilinear filtering does 4 texel lookups per pixel (a 2x2 footprint from one mip level); trilinear does 8 (a 2x2 footprint from each of two mip levels) and then linearly interpolates between the two results. Anisotropic filtering uses the view angle to vary the sampling range of the texels: each anisotropic level enlarges the sampling range (non-uniformly), so the number of texels sampled increases, and trilinear filtering is then applied with the new ranges. This keeps textures looking smooth even when moving through a scene, heavily anti-aliases textures, and reduces pixel swimming, but it comes at a heavy cost.
For example, in a standard 1680x1050 scene the texel lookup counts are as follows (see the sketch after this list):
bilinear: ~7.06 million texel lookups
trilinear: ~14.11 million texel lookups
anisotropic: potentially more than 4x the texels of trilinear, depending on the AF level
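Here is the promised sketch (Python, purely back-of-the-envelope) reproducing the numbers above. The per-pixel costs assume the textbook model; real hardware caches texels aggressively, so treat these as upper bounds on the work done.

```python
# Back-of-the-envelope texel-lookup counts for a 1680x1050 frame.
WIDTH, HEIGHT = 1680, 1050
pixels = WIDTH * HEIGHT

lookups_per_pixel = {
    "bilinear": 4,                  # 2x2 texels from one mip level
    "trilinear": 8,                 # 2x2 from each of two mip levels
    "16x AF (worst case)": 16 * 8,  # up to 16 trilinear probes per pixel
}

for mode, per_pixel in lookups_per_pixel.items():
    total = pixels * per_pixel
    print(f"{mode:>20}: {total / 1e6:6.2f} million texel lookups")
```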
In a still image there is no problem; the major filtering artifacts are movement-based, such as moiré patterns, pixel swimming, and mip-level pops, so static images will never really show the full picture when it comes to filtering. I can't believe ATI would do this!
I'd love it if someone used the 10.3s and the previous versions to see if there is a difference; that nice 10% increase across the board that the new drivers bring might be this "bug"!
Yep, I can easily see the difference. If this was done on purpose then shame, shame on them.
The difference is VERY obvious in motion, but you can still see it in the screenshot. The textures will vibrate and you will see the lines in the textures as the camera moves, and it's very distracting.
DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis
Well, that's the problem: people take a screenshot of stuff and go "see?! see?!" when most of this stuff is only really noticeable in motion. In cases such as this you have to really look at the screenshot quite a bit to begin to see anything.
The Cardboard Master Crunch with us, the XS WCG team
Intel Core i7 2600k @ 4.5GHz, 16GB DDR3-1600, Radeon 7950 @ 1000/1250, Win 10 Pro x64
Good ol' XS, where every bug is a cheat.
Is there any actual proof this results in some sort of performance boost? What incentive would AMD have to do something like this, when they've got so much extra texturing performance that they're pushing for angle-independent AF?
Good job jumping to conclusions and sensationalizing, especially saaya.
Lots of people, obviously?
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
Pretty amazed by some of the reactions here.
If it was Nvidia lowering IQ slightly to get more FPS, then we would have 9 people complaining about Nvidia's dirty tactics and 1 guy saying he barely sees a difference and prefers Nvidia's IQ.
Which would then result in the 9 accusing the 1 of being an Nvidia fanboy.
@Cybercat
Like Saaya says.
It's very convenient that this bug appears in a game used by many reviewers just at the time when Nvidia releases their new high-end card for reviews.
Even if this bug gets fixed in the next driver version, the reviews using the bugged drivers will still be out there.
So in a few months, when someone who doesn't know about all of this wants to buy a new high-end card and googles some reviews, he will mostly find these old ones using the bugged drivers.
Then again, we still need to know how much extra performance this bug gives.
But even if most people can't see the difference, in a review all settings should be the same for a fair result.
Time flies like an arrow. Fruit flies like a banana.
Groucho Marx
i know my grammar sux so stop hitting me