It would be nice to be able to see graphs of the following two things for games:
- Latency between control manipulation and its effect on a frame
- Deviation in time between frames in a game
The first would relate to responsiveness.
The second would relate to smoothness. If on an Intel system you get an average of 40 fps with a frame-time deviation of 10 ms, while on the AMD system you get an average of 35 fps with a deviation of 5 ms, I wonder what the perceived difference in smoothness would be. I suspect people would mistake the lower framerate for the higher one, simply due to the increased uniformity of motion. Functionally, I would equate deviation in a video's frame timing to noise in a still photograph: more noise makes people think a picture has less detail than a softer, less noisy version, even when the reverse is true.
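
If someone wanted to prototype the smoothness half of this, here's a rough Python sketch (my own illustration, not taken from any actual benchmark tool). It takes a log of frame timestamps and reports average fps alongside frame-time deviation; the simulate() helper is a made-up stand-in that fabricates timestamps with Gaussian jitter just to recreate the hypothetical 40 fps / 10 ms vs 35 fps / 5 ms comparison above.

```python
import random
import statistics

def frame_time_stats(timestamps):
    """Given frame timestamps in seconds, return
    (average fps, frame-time standard deviation in ms)."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg_fps = 1.0 / statistics.mean(intervals)
    deviation_ms = statistics.stdev(intervals) * 1000.0
    return avg_fps, deviation_ms

def simulate(mean_fps, jitter_ms, frames=1000, seed=0):
    """Fabricate synthetic frame timestamps with Gaussian jitter
    around a fixed mean frame time (purely illustrative; a real
    tool would log timestamps from the game or capture hardware)."""
    rng = random.Random(seed)
    mean_dt = 1.0 / mean_fps
    t, out = 0.0, [0.0]
    for _ in range(frames):
        # Clamp to 1 ms so a large jitter sample never goes negative.
        t += max(0.001, rng.gauss(mean_dt, jitter_ms / 1000.0))
        out.append(t)
    return out

for name, fps, jitter in [("Intel (hypothetical)", 40, 10),
                          ("AMD (hypothetical)", 35, 5)]:
    avg, dev = frame_time_stats(simulate(fps, jitter))
    print(f"{name}: {avg:.1f} fps average, {dev:.1f} ms frame-time deviation")
```

Gaussian jitter is just an assumption for the sketch; real frame-time distributions tend to have occasional large spikes, which is exactly why plotting the deviation (or the whole distribution) would be more informative than a single fps average.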