FX-8350(1249PGT) @ 4.7ghz 1.452v, Swiftech H220x
Asus Crosshair Formula 5 Am3+ bios v1703
G.skill Trident X (2x4gb) ~1200mhz @ 10-12-12-31-46-2T @ 1.66v
MSI 7950 TwinFrozr *1100/1500* Cat.14.9
OCZ ZX 850w psu
Lian-Li Lancool K62
Samsung 830 128g
2 x 1TB Samsung SpinpointF3, 2T Samsung
Win7 Home 64bit
My Rig
http://www.anandtech.com/show/6862/f...marking-part-1
Looks like nVidia is taking the lead on new benchmarking tools, probably because right now they have the upper hand when it comes to stuttering.
Main PC
i7 3770k
Asus P8Z77-Deluxe
4x4 GB Gskill Sniper
Sandisk Extreme 240 GB
Gigabyte GTX 670
Coolermaster ATCS 840
MCP35X - Apogee Drive II - MCR320
HTPC
i7 920
Gigabyte EX58 UD5
Sapphire 5670
3x2 GB OCZ Platinum @ 7-7-7-20
Corsair HX-650
Silverstone LC10
Intel X25-M G2
As far as single GPUs go, AMD's 7xxx series seems to be far more consistent with frame latency than past cards. Older Tech Report reviews seem to make that case, anyway.
Microstutter has always been an issue. I'm super glad that AMD is finally attempting to do something about it.
I'm not sure I follow. From what I understand, it directly records the output from the video card in order to look at the amount of time between frames. It seems to me that the point is to ignore everything that comes before and focus only on the frame times that the end user sees. I can't speak to the analysis tools, but for characterizing the stutter a user actually experiences, what would be a better approach?
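The idea in that post can be sketched in a few lines. This is a hypothetical illustration, not the actual capture tooling: given the timestamps at which frames reach the display, derive the frame times and flag the spikes a user would perceive as stutter. The 2x-median threshold and the sample data are assumptions for the example.

```python
# Hypothetical sketch: derive frame times from per-frame display
# timestamps (what a capture device would see) and flag stutter spikes.
# The 2x-running-median threshold is an illustrative choice, not FCAT's.

def frame_times(timestamps_ms):
    """Intervals between consecutive displayed frames."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def stutter_spikes(times_ms, factor=2.0):
    """Flag frames whose time exceeds `factor` times the running median."""
    spikes = []
    for i, t in enumerate(times_ms):
        window = sorted(times_ms[max(0, i - 10):i + 1])
        median = window[len(window) // 2]
        if t > factor * median:
            spikes.append((i, t))
    return spikes

# A steady ~16.7 ms cadence with one long hitch at the end:
stamps = [i * 16.7 for i in range(10)] + [10 * 16.7 + 50]
print(stutter_spikes(frame_times(stamps)))  # the hitch shows up as a spike
```

The point of working from display-side timestamps rather than render calls is exactly what the post describes: only the frames the user actually sees enter the analysis.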
Cliffs:
We realize there is a problem; it will eventually go away, like our mouse bug.
DX9 has been around for 10 years, but we still can't produce drivers that get our hardware to work properly.
Many DX10 titles are on the way! Stay tuned for more crappy driver releases.
Give us your money, because we've reassured you stupid consumers.
i3 2100, MSI H61M-E33. 8GB G.Skill Ripjaws.
MSI GTX 460 Twin Frozr II. 1TB Caviar Blue.
Corsair HX 620, CM 690, Win 7 Ultimate 64bit.
What does the carmack interview have to do with the topic?
NVIDIA parts/drivers have worked out ways around the problems Carmack mentions with drivers and DX and AMD is starting to do the same.
Intel 990x/Corsair H80 /Asus Rampage III
Coolermaster HAF932 case
Patriot 3 X 2GB
EVGA GTX Titan SC
Dell 3008
Read the thread. He talked about the latency issue, and about over-spec hardware struggling compared to consoles. It is really noticeable on console-based games like Sega Rally Revo; that thing skips without CrossFire. Desktop computing is not mission critical and never will be; that is, the average user isn't interested in making one task run at the lowest possible latency, but in having many tasks done in a fashion that works without noticeable perceived latency.
From you: "Video is a couple of years old; they know but haven't fixed it yet and are just now making a push." Got to disagree there. How many years has this been going on, and only now do they come clean, when they have the problem largely fixed for single cards and a fix four months out for CrossFire?
The main thing he said is that drivers are his concern; I think it relates to the thread.
Exactly, it ignores the rest of the issue entirely. You can fix your drivers all you want, but if the problem lies outside of the render calls (HT, core parking) there's not much you can do, and relying only on this method may produce flawed results.
An example of a possible flawed result is the runt frames. Given the numbers that have been shown, they don't all seem to line up with what we've been told they are (all of the time).
All along the watchtower the watchmen watch the eternal return.
Issues like that would probably affect both SLI and CrossFire in the same way. Yet CrossFire has some serious issues that SLI doesn't have, so it's the CrossFire implementation that is to blame, considering that SLI scales quite well.
There are always, everywhere, people complaining about certain graphical glitches, slowdowns, stuttering, whatever. What you see is what matters; end users usually don't really care about the inner workings as long as it works properly. This technique of focusing on the graphical output is what will help the most to properly see and identify all those issues, since with it you don't depend on a person's perception to detect them (not everyone notices), and you get a more scientific, exact method. Once you know that there is an issue somewhere, you can then use even more tools to identify what happens at the moment the glitch/stutter occurs, to find the actual cause (hardware, OS, drivers, whatever), and try to fix it.
I'm very optimistic that with this method it will be possible to do before-and-after comparisons, so you can see if a specific option (like VSync), a new driver version, a software patch, or anything else has a positive or negative impact. That way actual improvements will be trackable.
Last edited by zir_blazer; 03-28-2013 at 10:57 PM.
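The before-and-after comparison described in that post can be made concrete with a small sketch. This is purely illustrative: the frame-time data is made up, and summarizing by average plus 99th percentile is one common convention in frame-time reviews, not a method the thread prescribes.

```python
# Illustrative before-and-after comparison of two frame-time captures
# (e.g. old driver vs. new driver). Data and metric choice are assumed.

def percentile(values, pct):
    """Nearest-rank percentile of a list of frame times (ms)."""
    ordered = sorted(values)
    k = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[k]

def summarize(frame_times_ms):
    """Average plus 99th-percentile frame time for one capture."""
    avg = sum(frame_times_ms) / len(frame_times_ms)
    return {"avg_ms": avg, "p99_ms": percentile(frame_times_ms, 99)}

before = [16.7] * 95 + [45.0] * 5   # mostly fast, with occasional long frames
after  = [17.0] * 100               # slightly slower average, but consistent
print(summarize(before))  # high p99: the long frames show up as stutter
print(summarize(after))   # low p99: smoother despite the lower average FPS
```

The design point here is why percentile metrics matter at all: a plain FPS average would rate `before` as the faster configuration, while the 99th percentile exposes it as the one with visible stutter.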
They can, as seen in some HT vs. non-HT results; however, "can" is not "always".
Why is this method the most correct? Why can't another method which is not using visual perception work as well or better, given that it may actually also include this method as well as others? Why is one method which performs only one test better than multiple methods which test multiple issues?

Quote: "There is always, everywhere, people complaining about certain graphical glitches, slowdowns, stuttering, whatever. What you see is what matters, end users usually don't really care about the inner workings of it for as long as it works properly. This technique to focus on the graphic output is what will help the most to properly see and identify all those issues, as with it you don't depend on a person's perception to detect them (not everyone notices), and get into a more scientific, exact method."
You've basically jumped off an assumption cliff with your line of reasoning.
So you can't use other methods in your previous statement, but then you suddenly can in the next? Hmm, seems you've cliff-jumped before if you're going to bring a parachute.

Quote: "Once you know that there is an issue somewhere, then you could use even more tools to properly identify what happens at the moment that the glitch/stutter occurs, to know what is the actual cause (hardware, OS, drivers, whatever), and try to fix it."
Quote: "I'm very optimistic that with this method, it will be possible to do before-and-after comparisons so you can see if a specific option (like VSync), a new driver version, a software patch, or anything else, has a positive or negative impact. So it will be trackable if there are actual improvements."
Good article, very interesting.
Thumbs up to AMD for admitting this issue (finally).
Main rig 1: Corsair Carbide 400R 4x120mm Papst 4412GL - 1x120mm Noctua NF-12P -!- PC Power&Cooling Silencer MK III 750W Semi-Passive PSU -!- Gigabyte Z97X-UD5H -!- Intel i7 4790K -!- Swiftech H220 pull 2x Papst 4412 F/2GP -!- 4x4gb Crucial Ballistix Tactical 1866Mhz CAS9 1.5V (D9PFJ) -!- 1Tb Samsung 840 EVO SSD -!- AMD RX 480 to come -!- Windows 10 pro x64 -!- Samsung S27A850D 27" + Samsung 2443BW 24" -!- Sennheiser HD590 -!- Logitech G19 -!- Microsoft Sidewinder Mouse -!- Fragpedal -!- Eaton Ellipse MAX 1500 UPS .
I don't know if "visual perception" is actually the correct name for it, but I'm referring to the same thing responsible for not everyone noticing motion the same way, or being sensitive to the same degree. Some people are more sensitive and notice stuttering or tearing where you wouldn't notice anything, in the same way that, during the CRT era, some people couldn't notice the difference between playing a game @ 75 Hz or @ 100 Hz, assuming you had enough hardware to draw all the frames flawlessly - the frames are there, but not everyone notices them. It's basically what AMD said in the first Anandtech article:
Quote: "In our discussion with AMD, AMD brought up a very simple but very important point: while we can objectively measure instances of stuttering with the right tools, we cannot objectively measure the impact of stuttering on the user. We can make suggestions for what's acceptable and set common-sense guidelines for how much of a variance is too much – similar to how 60fps is the commonly accepted threshold for smooth gameplay – but nothing short of a double-blind trial will tell us whether any given instance of stuttering is noticeable to any given individual."

As you're recording the output of the GPU, which is what would be displayed on the monitor, you can analyse frame by frame to check for inconsistency. You don't rely anymore on someone's visual perception/sensitivity to motion, as you have proper tools to check for it. Maybe a guy doesn't notice the tearing due to a runt frame like this one (from here), but that frame WAS on screen for a few milliseconds.
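The runt-frame classification mentioned above is simple enough to sketch. In a capture-based (FCAT-style) analysis, each rendered frame owns a contiguous band of scanlines in the captured output; a frame that occupies only a sliver of the screen is a "runt". Everything below is illustrative: the per-frame scanline counts are invented, and the 20-scanline cutoff is an assumption, not an official threshold from the tools the thread discusses.

```python
# Hedged sketch of runt-frame classification. Assumes the scanline
# height of each captured frame has already been extracted from the
# recorded output. The 20-scanline cutoff is an assumed example value.

RUNT_THRESHOLD_SCANLINES = 20

def classify_frames(scanlines_per_frame, threshold=RUNT_THRESHOLD_SCANLINES):
    """Split captured frames into full frames and runts by height."""
    runts = [h for h in scanlines_per_frame if h < threshold]
    full = [h for h in scanlines_per_frame if h >= threshold]
    return full, runts

# 1080p output where every other frame only ever covers a few scanlines,
# the pattern associated with badly paced alternate-frame rendering:
captured = [1076, 4, 1072, 8, 1080]
full, runts = classify_frames(captured)
print(len(full), len(runts))  # 3 full frames, 2 runts
```

This also illustrates the earlier objection in the thread: a counter that treats all five entries as frames reports a higher frame rate than what the user perceptibly saw.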
If you have any doubts, you can re-read all the Tech Report, PCPer and Anandtech articles about this matter and why they're using this new method.
You have both AMD's and nVidia's reasons why the other current method to detect frame delay, FRAPS, isn't accurate enough, and why reviewers have gone through this and this to set up this new method of analyzing the direct output.
If you don't like it, I'm sure there are many people besides myself that would die to know what method you have in mind that is better or more correct.
Should I give you my parachute?
I very much agree with this statement. I'm sure there can be, and probably will be, other tools in the future that give even deeper insight into the rendering pipeline and the effect each part of the system has. However, for the near term I think this is a very large step forward from relying solely on FRAPS as a benchmarking tool, and it helps explain a lot of what people were complaining about but unable to show with FRAPS results.
My bad, I missed your point. It's a good explanation of why stuttering exists in PC gaming and why companies need to work around it. I thought you were trying to illustrate that latency and stutter "have" to exist in PC gaming, when NVIDIA has apparently figured out a way around it and AMD is well into the process.
I see this issue as a good thing for PC gaming in general. PCs have always had the advantage in resolution and image enhancement; if all graphics solutions deliver smooth animation, it can only be good for the industry.
If anyone is interested, over at Guru3D, Hilbert has been working on a splitter and frame-capture device/system (taking the signal right before it goes to the screen). He bought some serious hardware for it but is running into the problem of finding a storage system that can sustain the high write speeds without dropping frames.
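The storage problem is easy to put rough numbers on. A back-of-the-envelope calculation for uncompressed capture follows; the 2560x1440 @ 60 Hz, 24-bit figures are assumed for illustration, not Hilbert's actual setup.

```python
# Back-of-the-envelope sustained write bandwidth for uncompressed frame
# capture. Resolution, color depth, and refresh rate are example values.

def capture_bandwidth_mb_s(width, height, bits_per_pixel, refresh_hz):
    """Sustained write rate in MB/s for uncompressed capture."""
    bytes_per_frame = width * height * bits_per_pixel // 8
    return bytes_per_frame * refresh_hz / 1e6

rate = capture_bandwidth_mb_s(2560, 1440, 24, 60)
print(round(rate), "MB/s")  # roughly 660 MB/s of sustained writes
```

Numbers in that range outstrip a single 2013-era SATA SSD, which is why a capture rig ends up needing striped drives or similarly serious storage.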
--Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))