
Thread: [PCPer] Frame Rating Part 3: First Results

  1. #26
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    My 4870x2 always gamed fine, unless I pushed it to a point where it was VRAM, compute, or game engine limited. So far my SLI 670's have actually given me more trouble than that card ever did with respect to performance, in either windowed mode or at 1200p.

    All along the watchtower the watchmen watch the eternal return.

  2. #27
    Visitor
    Join Date
    May 2008
    Posts
    676
    Quote Originally Posted by SKYMTL View Post
    You are confusing onscreen refresh rate and response times with processed frame times.

    For example, the move from 60Hz to 120Hz isn't about eliminating flickering. Do you see less flicker on a 120Hz screen? Of course not.
    This depends greatly on the viewing source. Sit in front of a 50-60Hz CRT and you'll notice flicker immediately. If not, you certainly will in direct comparison to an LCD, or when you increase the CRT's refresh rate to 75Hz, for instance.

    Rather it is about being able to drive a display ABOVE the 60 FPS mark with v-sync on. That leads to less input latency.
    First and foremost, quicker refresh rates all but eliminate the ghosting and input latency that occurs on lower-frequency LCD displays.
    A refresh rate higher than 60Hz isn't essential for minimizing ghosting on LCD screens. I use EIZO Colorgraphic monitors for video work, etc., and the ghosting on them is basically non-existent. No consumer-grade 120Hz or 144Hz TN panel comes even close to them in that respect. However, due to the sample-and-hold nature of LCD screens, there's a significant difference in motion perception. This is directly related to the human visual system; the display itself plays a comparatively minor role in the perceived motion blur.

    Fast paced gaming without VSync on a high refresh rate LCD also greatly helps to reduce tearing and/or the perception of it, which benefits tracking targets. It also reduces the hindrance of bright muzzle flash. The latter can be very annoying when it tears on a 60Hz screen, to the extent of almost temporarily impairing vision.


    With that in mind, the difference between 90Hz and 60Hz will be noticeable on the INPUT side rather than on the purely VISUAL side (other than the ghosting of course).

    Finally, they aren't claims. Perceptual latency detection above and below 48 FPS has a ton of science backing it up, much of which was sponsored by major production studios.
    Here's a simple test:
    1) Ensure you have an LCD monitor with PWM-controlled brightness.
    2) Verify that at 100% brightness the backlight stays on continuously.
    3) Wave your hand with fingers spread wide in front of your screen at 100% brightness.
    4) Repeat the same test at ~50% brightness.

    In step 4 the PWM-controlled backlight typically flickers at around 180Hz. You should clearly see a strobe effect when observing your hand movement.
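
    As a rough sanity check, here's the arithmetic behind that strobe effect (the 180Hz figure is the one assumed above; the hand speed is a made-up illustrative value):

    Code:
    # Toy arithmetic for the PWM strobe test above.
    # Assumptions: backlight PWM at ~180 Hz (the figure cited above),
    # hand sweeping across the screen at ~1 m/s (hypothetical).

    pwm_hz = 180.0            # assumed PWM frequency at ~50% brightness
    hand_speed_mm_s = 1000.0  # hand speed in mm/s (illustrative guess)

    # Each PWM "on" pulse freezes one image of the hand, so consecutive
    # ghost images are separated by the distance traveled per pulse.
    print(f"ghost images roughly every {hand_speed_mm_s / pwm_hz:.1f} mm")  # ~5.6 mm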
    Last edited by cx-ray; 02-26-2013 at 01:54 AM.

  3. #28
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    @box

    100 FPS isn't going to do anything over 60 FPS on a normal 60Hz screen. If FPS is maintained at over 50 on a 60Hz screen, I don't believe that you or anyone else is going to notice any stutter.

    The frame-time spiking with CrossFire in these tests, however, is a separate issue from FPS; you may very well notice stutter because of that. I'm also not sure how frame limiting works, or whether it can be forced to 60 without dipping straight down to 30 the way normal v-sync does.
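
    As an aside, a minimal sketch of why classic double-buffered v-sync drops straight from 60 to 30 (a simplified model; triple buffering, adaptive v-sync, and driver frame limiters all behave differently):

    Code:
    import math

    def vsync_fps(frame_time_ms: float, refresh_hz: float = 60.0) -> float:
        """Effective frame rate under classic double-buffered v-sync.

        A finished frame must wait for the next refresh tick, so delivery
        is quantized to refresh/1, refresh/2, refresh/3, ...
        """
        refresh_period_ms = 1000.0 / refresh_hz
        return refresh_hz / math.ceil(frame_time_ms / refresh_period_ms)

    print(vsync_fps(16.0))  # 60.0 -> GPU keeps up, full 60 fps
    print(vsync_fps(17.0))  # 30.0 -> just misses a refresh, drops to 30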

    I've usually found that people complaining about SLI / Xfire and microstutter haven't even used the setup they complain about themselves, and when people do notice lag / stutter, in most cases it is caused by FPS spikes rapidly dropping down to a low number. This is easily measurable in some benchmarks which record the minimum FPS. Most likely, if you are seeing sudden lag / stutter on any GPU setup, the FPS is dropping far below 30 when you see it.

    You can't judge an SLI / Xfire setup either way until you've tried it. I've been through SLI Geforce 6800s, GTX 460s, and 560 Tis, and Xfire 3850s, 4850s, 4870s, and 5770s, and none of them were worse to me than a single-card setup. I noticed the exact same lag and smoothness at the same FPS points, and I have perfect vision.
    Last edited by Mungri; 02-26-2013 at 01:54 AM.

  4. #29
    Xtreme Enthusiast
    Join Date
    Apr 2006
    Posts
    939
    Looks like nvidia are just delaying one card's frame output by 10ms, whilst AMD are spitting out frames almost in unison. I'm not entirely sure how a monitor would react to that. Would it even show two frames rendered that close together? Would a buffered frame be shown in the gap between frame outputs?
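
    For illustration only, a toy model of that kind of frame metering (NVIDIA's actual driver logic is proprietary; the ~10ms delay and near-unison AMD timestamps are just the observations above):

    Code:
    # Toy model of AFR frame metering (illustrative only). Two GPUs finish
    # frames almost in unison; the driver delays every other frame toward
    # the midpoint of the gap so presentation is evenly paced.

    raw_times_ms = [0.0, 1.0, 33.0, 34.0, 66.0, 67.0]  # pairs nearly in unison

    def meter(times):
        metered = []
        for i, t in enumerate(times):
            if i % 2 == 1 and i + 1 < len(times):
                midpoint = (times[i - 1] + times[i + 1]) / 2
                metered.append(max(t, midpoint))
            else:
                metered.append(t)  # first of a pair, or no later frame known
        return metered

    print(meter(raw_times_ms))  # [0.0, 16.5, 33.0, 49.5, 66.0, 67.0]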

  5. #30
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by Iconyu View Post
    Looks like nvidia are just delaying one card's frame output by 10ms, whilst AMD are spitting out frames almost in unison. I'm not entirely sure how a monitor would react to that. Would it even show two frames rendered that close together? Would a buffered frame be shown in the gap between frame outputs?
    Most monitors have so much input latency, I don't think we could really see it...
    Last edited by Lanek; 02-26-2013 at 02:50 AM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  6. #31
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by STEvil View Post
    My 4870x2 always gamed fine, unless I pushed it to a point where it was VRAM, compute, or game engine limited. So far my SLI 670's have actually given me more trouble than that card ever did with respect to performance, in either windowed mode or at 1200p.
    The 4870x2 was a nice card in theory, but it came with a lot of cons, and along with the 4890 it turned me off PC gaming cards when I tried to trifire them.

    I thought the card would provide good value, as it was cheaper than the pair of single cards and also had the larger memory configuration.

    But the 4870x2 was as loud and hot as they get. The noise in particular was a high-pitched fan noise with coil whine, and it also didn't provide enough speed to max out Crysis. This experience caused a chain reaction that turned me off PC gaming on desktop cards in general.

    Basically, because of the noise and performance I got water blocks and a 4890, and I thought that would solve everything. The trifire experience was awful after I installed the 4890. I couldn't even start my games, as the games' menus were all messed up. E.g., I would have a partial black screen with the menu shrunk onto the main screen, but I couldn't click the buttons in the menu to even start the game. I waited forever for AMD to fix this, but they never did, and with the 4890 the noise was even more outrageous. So basically I was stuck playing a single 4870x2 that was too hot and loud to tolerate (I don't wear headphones). And the water blocks never got installed, because I wanted to do it all at once when setting up the loop, but after the awful trifire experience I never got around to it. So: two water blocks and a 4890, completely useless, and a generally annoying 4870x2.

    After spending a thousand dollars and having that negative experience, I stuck with consoles, just because of all the hassle. It turned me off AMD cards, and I decided that if I ever got a desktop card again it would be an Nvidia card, because of the unacceptable wait time for the fix.

    The only thing that worked as it should have was 3DMark.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  7. #32
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by bhavv View Post
    @box

    100 FPS isn't going to do anything over 60 FPS on a normal 60Hz screen. If FPS is maintained at over 50 on a 60Hz screen, I don't believe that you or anyone else is going to notice any stutter.

    ...
    I referred to input lag, not stutter. 100 fps will have lower input lag than 60 fps.

    I'm curious what fps you were getting with your SLI and CF setups. If they are too low, microstutter will be worse. I cannot imagine gaming at 30-50 fps with SLI/CF; that is microstutter hell in most games. In some titles even 60-80 fps isn't smooth; for example, I have experienced this in Serious Sam 3. Or look at PCGH's Ares Quadfire video: a whopping 70 fps in 3DMark11 and yet terrible stutter.

  8. #33
    Xtreme Enthusiast
    Join Date
    Oct 2012
    Posts
    687
    Quote Originally Posted by boxleitnerb View Post
    I referred to input lag, not stutter. 100 fps will have lower input lag than 60 fps.

    ...
    Could you explain how that would work? I can't see any reason why a higher number of frames generated by the GPU would lower the input lag of a display when the synchronization remains at 60Hz. It's interesting to me because I'm looking at monitors to buy at the moment, but I'm not seeing any high-resolution, high-quality displays with low input lag.
    My understanding is that input lag is introduced when the display analyzes and does some work on the frames it gets from the graphics card before displaying them. To be honest, I don't even see why input lag would be lower on a 120Hz display at 120 fps. It's all about the monitor's electronics and display technology.
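
    (For reference, the usual back-of-the-envelope model behind the claim, as a simplification rather than a statement about any specific monitor's electronics: with v-sync off, the panel still scans out at 60Hz, but the frame it grabs at each scanout is fresher when the GPU renders faster.)

    Code:
    # Simplified model of render-side latency with v-sync OFF on a fixed
    # 60 Hz panel. Panel processing delay is held constant; only the "age"
    # of the newest completed frame at scanout time varies with fps.

    def avg_frame_age_ms(fps: float) -> float:
        # On average the most recent completed frame is half a frame old.
        return (1000.0 / fps) / 2

    for fps in (60, 100, 200):
        print(f"{fps:>3} fps: newest frame ~{avg_frame_age_ms(fps):.1f} ms old at scanout")
    # 60 fps: ~8.3 ms, 100 fps: ~5.0 ms, 200 fps: ~2.5 ms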
    Intel 5960X@4.2Ghz[Prime stable]@4.5 [XTU stable] 1.24v NB@3.6ghz Asrock X99 Extreme 3 4x8GB Corsair Vengeance@3200 16-17-17
    Sapphire nitro+ VEGA 56 Samsung SSD 850 256GB Crucial MX100 512GB HDD:WD10TB WD:8TB Seagate8TB

  9. #34
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by boxleitnerb View Post
    First, you absolutely cannot compare gameplay with just displayed sequences. 48 fps is way too low for many games; think input lag. In some game engines, 100+ fps is required for truly direct control. Secondly, it is unimportant what the "majority" feels as long as there are still people who have higher standards.
    So did you actually experience SLI/CF first-hand or did you not? Theory has its place, but real life can be something different altogether. This smells like sugarcoating the problems that CF has. I'm always highly suspicious when some people - especially reviewers, who should be objective and very careful with such stuff - tell me what I should or shouldn't be able to feel.
    You are correct. But we are not talking about input lag here. We're talking about motion fluidity. Basically, you can have completely fluid onscreen sequences but still experience input lag.

    Under no circumstance should AMD's issues here be sugar coated and yes, they do have MAJOR issues. However, my intent was to inject some understanding of perspective versus the results being discussed here. While some sequences look quite bad, I maintain the differences will be so minor in actual gameplay, they'll be all but ignored by gamers (I'm talking sub-20ms frame times here by the way). Others which tend to spaz between two higher frame time points will of course be extremely noticeable.
    Last edited by SKYMTL; 02-26-2013 at 06:00 AM.

  10. #35
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Posts
    872
    Quote Originally Posted by SKYMTL View Post
    You are correct. But we are not talking about input lag here. We're talking about motion fluidity. Basically, you can have completely fluid onscreen sequences but still experience input lag.

    Under no circumstance should AMD's issues here be sugar coated and yes, they do have MAJOR issues. However, my intent was to inject some understanding of perspective versus the results being discussed here. While some sequences look quite bad, I maintain the differences will be so minor in actual gameplay, they'll be all but ignored by gamers (I'm talking sub-20ms frame times here by the way). Others which tend to spaz between two higher frame time points will of course be extremely noticeable.
    The whole issue, and the fantastic Never Settle Reloaded bundle, intrigued me enough to order another HD7970 DD edition this morning.

    I want to see for myself what all the reviews and posts are about. I've got a GTX680 SLi rig powered by a now-stock 990X, and now I'll have a 7970 CF rig powered by a stock 2500K. Both are equipped with Dell 25x16 monitors, so the rigs should be close enough that I don't have to worry about differences too much, and if I see a big difference with SLi or CF being better, I'll try swapping cards.
    Intel 990x/Corsair H80 /Asus Rampage III
    Coolermaster HAF932 case
    Patriot 3 X 2GB
    EVGA GTX Titan SC
    Dell 3008

  11. #36
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by SKYMTL View Post
    You are confusing onscreen refresh rate and response times with processed frame times.

    For example, the move from 60Hz to 120Hz isn't about eliminating flickering. Do you see less flicker on a 120Hz screen? Of course not. Rather it is about being able to drive a display ABOVE the 60 FPS mark with v-sync on. That leads to less input latency.
    No, the move to a higher refresh rate is for a smoother image in motion. Go check the display section at any popular forum and that'll be the general consensus. The difference in input lag between a 60 and 120hz display is not noticeable. In fact there are several 120hz displays with higher input lag than most 60hz displays.


    In addition, people don't buy 120Hz screens to eliminate microstutter. Indeed, a 120Hz screen won't do anything to eliminate microstutter if it is already being displayed due to above-20ms frame times.
    I didn't claim that it did. I did claim that you can see a difference in smoothness past 20ms.


    Finally, they aren't claims. Perceptual latency detection above and below 48 FPS has a ton of science backing it up, much of which was sponsored by major production studios.
    Yeah, whatever you say. There are a lot of people who would disagree with that. You can skew results to come out any way that you like.

    Quote Originally Posted by SKYMTL View Post
    I maintain the differences will be so minor in actual gameplay
    I disagree with your opinion.

    I'll tell you right now, if it weren't for the ability to set up a framerate cap I would have sold my 7950s long ago.

    Quote Originally Posted by STEvil View Post
    My 4870x2 always gamed fine, unless I pushed it to a point where it was VRAM, compute, or game engine limited. So far my SLI 670's have actually given me more trouble than that card ever did with respect to performance, in either windowed mode or at 1200p.
    I know that in Crysis at 1920x1200 my 4870x2 didn't feel any smoother than my GTX280, despite the supposedly higher framerates. In general that's what I noticed with demanding games. That, combined with poor performance in a few games like darkplaces and my modded Morrowind install, and off it went.
    Last edited by BababooeyHTJ; 02-26-2013 at 03:16 PM.

  12. #37
    Registered User
    Join Date
    Jan 2007
    Location
    Serbia
    Posts
    36
    You are all forgetting about the capabilities of the panel itself. Sure, you may have 6ms on IPS or 2ms on TN, but that is gray-to-gray. Black-to-black is more like 30ms, so BtB cannot achieve even 50 fps, and at 120Hz (120 fps) BtB will be lagging behind GtG anyway. My point is: LCD panels are not capable of true 120Hz/fps anyway. Don't get too excited.

    edit: example: 3D crosstalk. Putting a 500Hz refresh rate on a panel that is capable of displaying (with all color transitions and all pixels switching fully on and off) some 80-90 fps will not yield 500 fps capability, nor will it grant you a better experience than the same panel used in a 120Hz monitor/TV. But hey, it would sell, and placebo would rule the forums.
    600Hz plasma anyone?
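
    The arithmetic behind that point, as a quick sketch (the 30ms black-to-black figure is the estimate above, not a measured panel spec):

    Code:
    # How many *complete* dark-bright-dark transitions a pixel can make per
    # second for a given response time. The 30 ms black-to-black figure is
    # the estimate from the post above, not a measured spec.

    def max_full_transitions_per_sec(response_ms: float) -> float:
        return 1000.0 / response_ms

    print(max_full_transitions_per_sec(30.0))  # ~33/s -> short of even 50 fps
    print(max_full_transitions_per_sec(6.0))   # gray-to-gray spec: ~167/s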
    Last edited by gx-x; 02-26-2013 at 05:24 PM.

  13. #38
    Xtreme Member
    Join Date
    Sep 2009
    Location
    Seattle WA
    Posts
    334
    I have 2 MSI 7970 Lightnings with 3 of the new Asus VG248QE 144Hz 1ms monitors, and I don't notice this microstutter everyone talks about. Maybe I'm just not sensitive to it... which might be a good thing.

  14. #39
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by BababooeyHTJ View Post
    No, the move to a higher refresh rate is for a smoother image in motion. Go check the display section at any popular forum and that'll be the general consensus. The difference in input lag between a 60 and 120hz display is not noticeable. In fact there are several 120hz displays with higher input lag than most 60hz displays.

    I didn't claim that it did. I did claim that you can see a difference in smoothness past 20ms.

    Yeah, whatever you say. There are a lot of people who would disagree with that. You can skew results to come out any way that you like it.

    I disagree with your opinion.

    I'll tell you right now, if it weren't for the ability to set up a framerate cap I would have sold my 7950s long ago.
    I understand your opinions but I do highly recommend you read the science behind it before leaning towards a possible placebo effect.

  15. #40
    Visitor
    Join Date
    May 2008
    Posts
    676
    Science to back up BababooeyHTJ's motion claims:

    http://scien.stanford.edu/pages/labs...Verslegers.htm

  16. #41
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by SKYMTL View Post
    I understand your opinions but I do highly recommend you read the science behind it before leaning towards a possible placebo effect.
    You didn't link anything to prove your point on the subject. Just because you don't notice a difference doesn't mean that many other people don't.

    I know what I've seen and I don't know many gamers with a fair amount of experience with a 120hz display that don't notice smoother motion at a higher refresh rate.

    There are a lot of people who don't notice the 64Hz bug in Oblivion, Fallout 3, and New Vegas, despite it being there on every 60Hz display. Now that I think about it, I've never seen you mention it either.

  17. #42
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by cx-ray View Post
    Science to back up BababooeyHTJ's motion claims:

    http://scien.stanford.edu/pages/labs...Verslegers.htm
    That's motion blur. Not stuttering or (in HDTV terms) judder.

    Remember, when V-Sync is turned OFF, frame times and frame rates run asynchronously to refresh rate.

    But now, we're straying pretty far afield since the refresh rate discussion only comes into play when comparing noticeable flickering (at sub-48Hz to 50Hz intervals) with my statements previously about frame intervals of 20.83ms.
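
    For reference, the 48 FPS threshold and the 20.83ms interval are the same number written two ways; the conversion is simply frame time (ms) = 1000 / fps:

    Code:
    def frame_time_ms(fps: float) -> float:
        return 1000.0 / fps

    print(frame_time_ms(48))   # 20.83 ms, the interval cited above
    print(frame_time_ms(60))   # 16.67 ms
    print(frame_time_ms(120))  # 8.33 ms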

  18. #43
    Xtreme Mentor
    Join Date
    May 2008
    Location
    cleveland ohio
    Posts
    2,879
    I've only used CrossFire myself; I've never noticed any kind of stuttering.

    Aren't they using AFR (alternate frame rendering) to render now?

    I recall, way back in the 3000 series, there was an option to change the rendering type.
    HAVE NO FEAR!
    "AMD fallen angel"
    Quote Originally Posted by Gamekiller View Post
    You didn't get the memo? 1 hour 'Fugger time' is equal to 12 hours of regular time.

  19. #44
    Xtreme Cruncher
    Join Date
    Nov 2005
    Location
    Rhode Island
    Posts
    2,740
    Quote Originally Posted by demonkevy666 View Post
    I've only used CrossFire myself; I've never noticed any kind of stuttering.

    Aren't they using AFR (alternate frame rendering) to render now?

    I recall, way back in the 3000 series, there was an option to change the rendering type.
    They used to offer tiled rendering and half-screen rendering, but I believe there were often driver issues and image quality problems when combining the two images, especially in tiled mode.
    Fold for XS!
    You know you want to

  20. #45
    Xtreme Mentor
    Join Date
    May 2008
    Location
    cleveland ohio
    Posts
    2,879
    Quote Originally Posted by [XC] Lead Head View Post
    They used to offer tiled rendering and half-screen rendering, but I believe there were often driver issues and image quality problems when combining the two images, especially in tiled mode.
    it's an OLD can of worms. :|

    http://techreport.com/review/14284/c...e-x-explored/2
    HAVE NO FEAR!
    "AMD fallen angel"
    Quote Originally Posted by Gamekiller View Post
    You didn't get the memo? 1 hour 'Fugger time' is equal to 12 hours of regular time.

  21. #46
    Xtreme Addict
    Join Date
    Mar 2010
    Posts
    1,079
    This test makes absolutely no sense at all. You don't want to know the average value of 99% of the frames; it's the 1% of slow frames that destroys the experience.
    What I want to know is which card has the fewest slow frames, and which one renders them fastest.
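
    For what it's worth, percentile-style metrics are one way of pulling exactly that worst 1% out of a frame-time log; a minimal sketch of the computation, with made-up data:

    Code:
    # Pull worst-case numbers out of a frame-time log (ms); made-up data.
    frame_times_ms = [16.7] * 95 + [18.0, 21.0, 35.0, 40.0, 55.0]

    def percentile(data, pct):
        s = sorted(data)
        return s[int(pct / 100.0 * (len(s) - 1))]

    print(percentile(frame_times_ms, 50))  # median: 16.7 ms - looks fine
    print(percentile(frame_times_ms, 99))  # 99th percentile: 40.0 ms
    print(max(frame_times_ms))             # single worst frame: 55.0 ms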

  22. #47
    Visitor
    Join Date
    May 2008
    Posts
    676
    Quote Originally Posted by SKYMTL View Post
    That's motion blur. Not stuttering or (in HDTV terms) judder.

    Remember, when V-Sync is turned OFF, frame times and frame rates run asynchronously to refresh rate.

    But now, we're straying pretty far afield since the refresh rate discussion only comes into play when comparing noticeable flickering (at sub-48Hz to 50Hz intervals) with my statements previously about frame intervals of 20.83ms.
    The examples in my two posts of this thread were to illustrate that we can observe differences in motion far beyond the 48 FPS that you state. It may not be apparent when just viewing a single source. In direct comparison however, differences are quite easily detected. You don't have to be a picky or sensitive person for that.

  23. #48
    Xtreme Enthusiast
    Join Date
    Aug 2005
    Posts
    519
    You guys are mixing two things up. Input sampling frequency doesn't have to be the same as the display or frame-drawing frequency. I think Doom 3 was the first to start that trend - it sampled input at 60 Hz even when the fps was lower.

    Other games have those two connected - Quake 3 for example. If you had a fast enough PC back in the day, and a good enough monitor, you could easily sense the difference in input when 'overclocking' the PS/2 sampling rate.
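
    That decoupling is the classic fixed-timestep pattern: input/simulation ticks at a fixed rate (60 Hz in the Doom 3 example) while rendering runs as fast as it can. A minimal sketch of the idea, not actual engine code from either game:

    Code:
    import time

    TICK_HZ = 60.0          # fixed input/simulation rate
    TICK_DT = 1.0 / TICK_HZ

    def sample_input_and_simulate(dt):
        pass  # read input devices, advance game state by dt (placeholder)

    def render():
        pass  # draw the latest game state (placeholder)

    def game_loop(run_seconds=0.1):
        accumulator = 0.0
        previous = time.perf_counter()
        deadline = previous + run_seconds
        while time.perf_counter() < deadline:
            now = time.perf_counter()
            accumulator += now - previous
            previous = now
            # Input/simulation advances in fixed 1/60 s steps regardless
            # of how fast or slow frames are actually being drawn.
            while accumulator >= TICK_DT:
                sample_input_and_simulate(TICK_DT)
                accumulator -= TICK_DT
            render()  # runs at whatever rate the GPU allows

    game_loop()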
    2x Dual E5 2670, 32 GB, Transcend SSD 256 GB, 2xSeagate Constellation ES 2TB, 1KW PSU
    HP Envy 17" - i7 2630 QM, HD6850, 8 GB.
    i7 3770, GF 650, 8 GB, Transcend SSD 256 GB, 6x3 TB. 850W PSU

  24. #49
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Posts
    872
    I got my 7970 CF rig running last night: Intel 990X CPU, 3x2GB Patriot RAM, old-school 7200RPM 1TB drive, Dell 3008 25x16 display.

    I played some Crysis 3 with everything on very high, 4X MSAA, 16X AF. It was pretty sluggish, but with CF off it was MUCH worse. So I think I've established a baseline: at settings one 7970 can't handle, adding a second one doesn't "play the same due to short rendered frames", in Crysis 3 anyway. I'll see if I can find settings where one 7970 can mostly handle the game, and whether turning on CF improves or detracts from it in any way, as perhaps the impact of the measured short frames is more subtle (like 60 fps feeling the same as 40 fps, so you're not really getting any benefit from the second card).

    Then I switched the settings to everything on high but shadows on low, water on medium, and 2X SMAA, 16X AF, and used CF a good while (which I thought might be similar to what I was doing with my 680 SLi: FXAA + 16X AF). Gameplay was very smooth and enjoyable; I didn't notice any game-altering microstutter.

    I realize these results aren't scientific, only observed, but I'm going to leave the 7970s in my main rig for a while and play a variety of games and see if I notice any hitching or fluctuations in fluidity with the CF.
    Last edited by Rollo; 03-05-2013 at 04:30 AM.
    Intel 990x/Corsair H80 /Asus Rampage III
    Coolermaster HAF932 case
    Patriot 3 X 2GB
    EVGA GTX Titan SC
    Dell 3008

  25. #50
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Yew Nork City
    Posts
    121
    IMO the sad thing is that Crysis 3's IQ barely changes when you lower certain graphics options, and you get much better performance.
    Quote Originally Posted by G0ldBr1ck View Post
    The original spirit of overclocking was to buy cheaper hardware and tweak it to perform as well as higher-end, more expensive hardware. Phenom 2 fits perfectly for this task.
    So many people seem to have forgotten this.

