I think you need to re-read what's been said, boxleiternerb.
1080p at medium-high settings looks to be the sweet spot regardless of CPU. The real winner for those of us gaming on a budget is the 6300, IMHO.
How about a rope benchmark:
http://maldotex.blogspot.co.uk/2013/...rst-level.html
I agree with bhavv for the most part. I'm still on my 4890, and resolution is the last thing I decrease; I'd rather reduce other details to reach playable framerates. Tbh, I don't even run MSAA anymore, seeing how superior MLAA is. I pretty much use MLAA from RadeonPro (or, if I get a newer card, I'd use MLAA straight from the control panel. It's one of the factors stopping me from going Nvidia on my next card. FXAA just can't beat MLAA).
Someone else argued that NONE of the settings at higher res give playable framerates. Fine, reduce the settings, but don't reduce the resolution (bhavv also mentioned this). You'll see why some reviews set the resolution to 1080p and the settings to medium. That's a fair comparison: the game still looks nice. Just don't give us resolutions like 1024x768 that nobody plays at.
As for playability, 30 fps seems fine to me, as it always has. I admit 60 fps gives a great experience (and 120 fps even better, if you can even maintain such a high minimum fps), but 30 fps is still playable. I'd rather sacrifice fps for graphical fidelity; I end up raging about how terrible the graphics look if I ever lower my settings just to gain more fps for, say, multiplayer. 30 fps minimum is my hard limit regardless.
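To put those framerate targets in perspective, each one is just a per-frame time budget (the standard 1000/fps conversion). A quick Python sketch:

```python
# Convert target framerates to per-frame time budgets in milliseconds.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

So holding a 30 fps minimum means every frame, worst case, must finish in about 33 ms; 120 fps leaves barely 8 ms.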
So the best benchmark, IMO, is at playable settings, which means a 30 fps minimum (NOT at crappy resolutions), and then have the CPUs have a go at each other. From the looks of it, none of the benchmarks seem to offer that, but I'd still take the safe pick, which is Intel (a pity, as I have always been an AMD fan; I have a Phenom II just because of it). I should check out hardwarecanucks; I remember them testing games at resolutions people actually play at, and at playable settings.
Finally, what really makes 1024x768 a pointless test is that you can always use synthetic/real-world application benchmarks for CPU testing. Gaming tests are supposed to test your gaming experience, and 1024x768 is not a pleasant gaming experience.
I'd thank bhavv for all his posts but since I don't like to spam everyone's screens, here's a thank for bhavv on this post. Thanks bhavv, excellent posts.
HardOCP disagrees with you; it says FXAA is much faster and offers similar image quality:
http://www.hardocp.com/article/2011/...g_technology/4
Which is not great news for me, as I'll have 7970 CF in my main rig as soon as a longer CF bridge arrives.
But I got Crysis 3 with the Never Settle Reloaded bundle, so I'll see how it does on a 990X/7970 today. Downloaded it last night.
http://i1097.photobucket.com/albums/...ps80632953.jpg
I bought the second 7970 for the Crysis3/Bioshock bundle, have two 7970s and two 680s now.
Zalbard, thanks for the SMAA tip, looks like I may have issues with it in Windows 8, but I'll try it! :)
First of all, have you never heard of MLAA 2.0? That review is old. All AMD cards now use MLAA 2.0, which has absolutely no performance impact and fixes the blurring problem.
EDIT: All right, I did some googling. Here are some MLAA 2.0 results: http://www.anandtech.com/show/5625/a...hern-islands/3
Too bad; I'm more interested in 0x MSAA + MLAA 2.0, but this will do. A 2% performance drop, apparently.
Secondly, I still have RadeonPro for SMAA in case MLAA does suck. Nvidia doesn't have SMAA unless it's injected with SweetFX. The problem with SweetFX is that it uses an injection method, which means it often doesn't work with online games.
Thirdly, SMAA is MLAA, if I read correctly back then. MLAA = Morphological AA; SMAA = Subpixel Morphological AA.
EDIT: SMAA is based on MLAA, with subpixel optimizations. So right now I have no clue what MLAA 2.0 is, although I recall reading some slides suggesting the optimizations they did were essentially SMAA. I may be wrong regardless.
MLAA 2.0 is basically equivalent to FXAA. They solved performance and blurring problems that plagued 1.0.
SMAA is much better than either of these.
A little comparison
About SMAA
Not equivalent. FXAA has blurring; MLAA does not. SMAA is MLAA improved in both visuals and performance. The website you linked also has presentation slides on how exactly they improved performance. MLAA 2.0 is also an improvement on MLAA, but exactly how they managed to improve it is not disclosed. I'm pretty sure MLAA 2.0 is SMAA, but take my statement with a pinch of salt, as I am unable to find the information anymore (dig up AMD's old slides to find out; my google-fu is not strong enough). Also, rather than just spouting a blatantly wrong statement like "FXAA is MLAA", here's some hard data: http://forums.guru3d.com/showthread.php?t=360336 - an image quality comparison. No FXAA there, though, but seeing how similar MLAA and SMAA are should tell you how superior they are to FXAA.
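For anyone wondering what "morphological" actually means here: the MLAA/SMAA family starts by scanning the finished frame for high-contrast luma edges, then reconstructs the edge shapes and blends along them. Here's a toy Python sketch of just that first edge-detection pass. To be clear, this is only an illustration of the general idea, not AMD's driver code or the SMAA authors' shaders, and the threshold value is made up:

```python
# Toy luma edge detection: the first pass of MLAA/SMAA-style post-process AA.
# Operates on a tiny grayscale "image" (rows of 0..1 luma values).
def detect_edges(img, threshold=0.1):
    h, w = len(img), len(img[0])
    edges = []  # (y, x, side) entries where a contrast edge was found
    for y in range(h):
        for x in range(w):
            if x > 0 and abs(img[y][x] - img[y][x - 1]) > threshold:
                edges.append((y, x, "left"))
            if y > 0 and abs(img[y][x] - img[y - 1][x]) > threshold:
                edges.append((y, x, "top"))
    return edges

# A hard diagonal "staircase" edge: exactly what these techniques smooth out.
img = [
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 1.0, 1.0, 1.0],
    [1.0, 1.0, 1.0, 1.0],
]
print(detect_edges(img))
```

The later passes (classifying the edge patterns and blending pixels along the reconstructed silhouette) are where MLAA, SMAA, and FXAA really differ, which is why their output quality differs even though they all start from something like this.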
Did a comparison between no AA, 4x MSAA, and SMAA (ultra_preset) a few days ago.
http://kirkedam.mine.nu/kwk/Bilder%2...20AA%20BF3.jpg
http://kirkedam.mine.nu/kwk/Bilder%2...204x%20BF3.jpg
http://kirkedam.mine.nu/kwk/Bilder%2...ltra%20BF3.jpg
My conclusion was that SMAA does indeed blur the image, but it gives very nice AA at a very low framerate cost, so the positives outweigh the negatives.
Yep, it was a good one.
http://www.tomshardware.com/reviews/...rk,3148-2.html
The pics use the in-game FXAA setting, however. I've tried numerous other methods to actually make FXAA look better (like FXAA + sharpening). I keep going back to SMAA, and I would use MLAA 2.0 if I had a 5xxx-or-newer card.
@Kallenator: It's hard to tell whether that's blurring or AA if you look at the second and third supports after the bridge, to the left.
I understand they want to show the difference between CPUs, so they benchmarked at 1280x720. But people who want to play Crysis 3 and have the goods to play it won't run 1280x720; they'll run 1080p+. The test does show one big difference, between i7+SMT and FX; the rest is not that dramatic. I could post the slides which show practically the same fps from the same test (one scoring 98 and the other 81, one 103 and the other 91, etc.), but everyone can click and see for themselves.
Here, have two screenshots. These are mine, not some random site on the internet. One is with 8xAA and the other is your beloved MLAA 2.0, everything else is the same. Open them without resizing of course.
http://thumbnails107.imagebam.com/24...6241161769.jpg http://thumbnails108.imagebam.com/24...e241161777.jpg
I know it's gonna be difficult, but try to figure out which is which. Then tell me MLAA 2.0 doesn't blur everything out. OK, call it distortion if you want.
It doesn't matter how good or bad an AA technique is when I can't even get in-game because I can't see sh|t in the main menu. Haha, look at that Fraps number, *shame* ;)
It scores so much better in that one scenario (vs the FX). Maybe it's something specific to the game engine that makes good use of resources when SMT is on, in that specific level. Remember this is at 720p; the difference at higher resolutions won't be that dramatic in that one level, but the i7 with SMT would probably still be ahead.
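The reason the gap shrinks at higher resolutions can be sketched with a toy model: the delivered framerate is roughly capped by whichever of the CPU and GPU is slower, and raising the resolution lowers only the GPU-limited side. All the numbers below are made up for illustration, not benchmark results:

```python
# Toy model: delivered fps is roughly min(CPU-limited fps, GPU-limited fps).
def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_a, cpu_b = 103, 91  # hypothetical CPU-limited framerates (720p-style gap)
for res, gpu_fps in [("720p", 200), ("1080p", 95), ("1440p", 60)]:
    a = delivered_fps(cpu_a, gpu_fps)
    b = delivered_fps(cpu_b, gpu_fps)
    print(f"{res}: CPU A {a} fps, CPU B {b} fps")
```

At the 720p-style numbers the 12 fps CPU gap is fully visible; once the GPU limit drops below both CPU limits, the two CPUs deliver identical framerates.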
Optimized grass for MT... nothing else... oh boy, it's maximum tessellation all over again... gg crytek :rolleyes:
I wonder what they've even done to the grass to make it so CPU dependent in the first place...
It's mentioned in the article: in every scene where a massive amount of grass is present, an i7 without HT offers the same performance as an FX; everywhere else the i7 is measurably better. And of course, enabling HT massively hampers the i7 in all scenarios, for whatever reason.
So wait, if I understood you correctly, HT enabled helps with scenarios where grass is present and hampers everything else?