
Thread: Frame Rating: AMD plans driver release to address frame pacing for July 31st

  1. #1
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875

    Frame Rating: AMD plans driver release to address frame pacing for July 31st

    Since that April release AMD has been very quiet about its driver changes and actually has refused to send me updated prototypes over the spring. Either they have it figured out or they are worried they haven't - but it looks like we'll find out at the end of next month and I feel pretty confident that the team will be able to address the issues we brought to light.

    For those of you that might have missed the discussion, our series of Frame Rating stories will tell you all about the issues with frame pacing and stutter in regards to AMD's CrossFire multi-GPU technology.

    AMD gave the media a prototype driver in April to test with the Radeon HD 7990, a card that depends on CrossFire to work correctly, and the improvements were pretty drastic.

    So what can we expect on July 31st? A driver that will give users the option to disable or enable the frame pacing technology they are developing - though I am still of the mindset that disabling is never advantageous. More to come in the next 30 days!
    http://www.pcper.com/news/Graphics-C...cing-July-31st

    https://twitter.com/AMDRadeon/status/347803712930070529
    Last edited by Final8ty; 06-20-2013 at 01:51 PM.

  2. #2
    Xtreme Member
    Join Date
    Aug 2008
    Posts
    202
    Next time let them release the hardware *after* the software is properly working.

    How many generations will this go back? Knowing ATI, ZERO. Spent $1000 on a top of the line Radeon 69xx setup? You're out of luck. Get with the new generation of cards! I can't believe you're still using 1 year old tech.

  3. #3
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Posts
    872
    Quote Originally Posted by mockingbird View Post
    Next time let them release the hardware *after* the software is properly working.

    How many generations will this go back? Knowing ATI, ZERO. Spent $1000 on a top of the line Radeon 69xx setup? You're out of luck. Get with the new generation of cards! I can't believe you're still using 1 year old tech.
    As a guy who ended up selling one of my 7970s after seeing firsthand what CFx was like, I'm not exactly a CFx advocate.

    I would say that until the FCAT tool came out, AMD likely didn't know the nature of the problem. (and that happened this year)

    Their turnaround time on this driver is pretty fast IMO. You're right that the question now is how many generations it works on, and how well it works across games.
    Intel 990x/Corsair H80 /Asus Rampage III
    Coolermaster HAF932 case
    Patriot 3 X 2GB
    EVGA GTX Titan SC
    Dell 3008

  4. #4
    Xtreme Member
    Join Date
    Jan 2011
    Location
    New Zealand
    Posts
    441
    I'm just hoping these drivers work. At least that way crossfire will be a viable gaming option.

  5. #5
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    Let's see. For the last year I've liked what AMD is doing, or at least trying to do, but I won't believe it until I see it and try it myself.


    When i'm being paid i always do my job through.

  6. #6
    Xtreme Member
    Join Date
    Jan 2011
    Location
    New Zealand
    Posts
    441
    Have a little faith. I'm sure they wouldn't make this statement if they couldn't deliver.

  7. #7
    Xtreme Member
    Join Date
    Mar 2005
    Posts
    421
    Quote Originally Posted by mockingbird View Post
    Next time let them release the hardware *after* the software is properly working.
    If they did this, there would have been no such thing as CF or even SLI until the GTX 680.

    I'm using 7970 CF and it does work, but like with SLI back when I was running GTX 480s, you need to limit the fps for it to be smooth; if it drops below that fps limit you really feel it.
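
    For anyone curious what an external fps cap actually does under the hood, here's a rough sketch in plain Python (purely illustrative - this is not how RadeonPro or fpslimiter are really implemented, and render_frame is just a stand-in for the game's per-frame work). The idea is simply to pad every frame out to a fixed frame time, so fast frames get held back and the cadence stays even; once the game can't keep up with the cap there's nothing left to pad, which is exactly why drops below the limit feel so rough.

        # Minimal frame limiter sketch: pad each frame out to a fixed frame time.
        import time

        TARGET_FPS = 60
        FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms per frame

        def run_capped(render_frame, num_frames=1000):
            next_deadline = time.perf_counter()
            for _ in range(num_frames):
                render_frame()                       # the game's work for this frame
                next_deadline += FRAME_TIME
                sleep_for = next_deadline - time.perf_counter()
                if sleep_for > 0:
                    time.sleep(sleep_for)            # fast frame: hold it back for an even cadence
                else:
                    # fell below the cap: no padding left, every hitch comes straight through
                    next_deadline = time.perf_counter()
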
    TJ08-EW 6700k@4.7 1.375v - Z170-GENE - 2x8g 3866 16-16-16 - 1070@ 2088\9500MHz -Samsung 830 64G, Sandisk Ultra II 960G, WD Green 3tb - Seasonic XP1050 - Dell U2713 - Pioneer Todoroki 5.1 Apogee Drive II - EK VGA-HF Supreme - Phobia 200mm Rad - Silverstone AP181 Project Darkling
    3770k vs 6700k RAM Scaling, HT vs RAM, Arma III CPU vs RAM, Thief CPU vs RAM

  8. #8
    Xtreme Member
    Join Date
    Aug 2009
    Location
    Belgium
    Posts
    163
    They will be giving the option to enable and disable it. Why? Does enabling it come with any negatives?
    Asus Z87 Deluxe, 4770K,Noctua NH-D14, Crucial 16 GB DDR3-1600, Geforce Titan, ASUS DRW-24B3ST, Crucial M500 960GB, Crucial M4 256GB, 3 X Seagate 4TB, Lamptron FC5 V2 Fancontroller, Noctua Casefans, Antec P183 Black, Asus Essence STX, Corsair AX860i, Corsair SP2500 speakers, Logitech Illuminated Keyboard, Win7 Home Pro 64 bit + Win 8.1 Home 64 bit Dual boot, ASUS VG278H

  9. #9
    Xtreme Enthusiast
    Join Date
    Apr 2008
    Posts
    898
    Benchmarking. If you're going for max FPS, you don't care one bit about runt frames, you want as many of those frames per second as the cards can pump out. Allowing people to disable it takes away the FPS hit you get in 3D benchmarks. It's sort of like the Tessellation slider - you can completely disable Tessellation for higher scores. You wouldn't want to disable either of these things for gaming, but for all out benchmarking they are both useful.
    [XC] gomeler - Public note: If you PM me to tell me that I am disrespectful at least have space in your PM box so I can tell you I don't care.

    [XC] gomeler - I come to the news section to ban people, not read complaints.

    I heart gomeler!

  10. #10
    Xtreme Enthusiast
    Join Date
    Mar 2005
    Location
    Buenos Aires, Argentina
    Posts
    644
    Quote Originally Posted by hokiealumnus View Post
    Benchmarking. If you're going for max FPS, you don't care one bit about runt frames, you want as many of those frames per second as the cards can pump out. Allowing people to disable it takes away the FPS hit you get in 3D benchmarks. It's sort of like the Tessellation slider - you can completely disable Tessellation for higher scores. You wouldn't want to disable either of these things for gaming, but for all out benchmarking they are both useful.
    At this point, you should really ask yourself what the point of benchmarking "max FPS" is if potentially useless frames are inflating the score and making it bogus. It's the same as the controversy many years ago when either nVidia or ATI (I don't remember which) cheated on some benchmarks via image quality: one GPU posted higher scores because it was doing less work than the other one. This is along the same lines.

  11. #11
    Xtreme Enthusiast
    Join Date
    Apr 2008
    Posts
    898
    I'm not here to debate that, just answering the question. If you do wish to debate, feel free to go to HWBot, where it is perfectly legal, and air your concerns.
    [XC] gomeler - Public note: If you PM me to tell me that I am disrespectful at least have space in your PM box so I can tell you I don't care.

    [XC] gomeler - I come to the news section to ban people, not read complaints.

    I heart gomeler!

  12. #12
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Posts
    872
    Quote Originally Posted by hokiealumnus View Post
    I'm not here to debate that, just answering the question. If you do wish to debate, feel free to go to HWBot, where it is perfectly legal, and air your concerns.
    There's nothing to debate. The runts are rendered too briefly to be seen, making the benchmarks irrelevant and the perceived frame rate half the reported.

    My "guess" is pacing won't make a big difference as the frames are being rendered, just not displayed long enough. Will likely knock a few fps off for the driver overhead and/or interval timing, but I don't expect any "Holy cow is CFx slow now!" scenarios.
    Intel 990x/Corsair H80 /Asus Rampage III
    Coolermaster HAF932 case
    Patriot 3 X 2GB
    EVGA GTX Titan SC
    Dell 3008

  13. #13
    Registered User
    Join Date
    Nov 2008
    Posts
    16
    Quote Originally Posted by nossy23 View Post
    They will be giving the option to enable and disable it. Why? Does enabling it come with any negatives?
    It adds lag. http://www.anandtech.com/show/6857/a...oadmap-fraps/6

    Quote Originally Posted by AMD slide
    • Alternating pattern of "fast" and "slow" frames
    • Results from each GPU sending images to the display as quickly as possible to minimize latency, rather than maintaining a regular cadence
      ...
    • Can be resolved by delaying display of completed frames as required, but this has the side effect of increasing frame latency
    • AMD's position is that users should be able to choose their preferred behavior
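
    To make the "delaying display of completed frames" bullet concrete, here's a toy sketch in plain Python (purely illustrative, nothing to do with AMD's actual driver code). It assumes 2-GPU AFR and takes the pacing target as half the interval between frames from the same GPU: frames that finish early get held back so presents land on an even cadence, and that hold is exactly the extra frame latency the slide warns about.

        # Toy frame pacer for 2-GPU AFR (illustrative only).
        def pace_presents(completion_times):
            """completion_times: timestamps (s) when each AFR frame finished rendering.
            Returns the timestamps at which a pacing driver would actually flip them."""
            presents = []
            for i, done in enumerate(completion_times):
                if i < 2:
                    presents.append(done)                   # not enough history to pace yet
                    continue
                cadence = (done - completion_times[i - 2]) / 2   # same-GPU interval, split in two
                earliest_even = presents[-1] + cadence
                presents.append(max(done, earliest_even))   # the max() is the added delay/latency
            return presents

        # Example: two GPUs finishing frames in a fast/slow pattern (times in seconds)
        raw = [0.000, 0.003, 0.033, 0.036, 0.066, 0.069, 0.099]
        print([round(t, 4) for t in pace_presents(raw)])
        # unpaced gaps alternate ~3 ms / ~30 ms; paced gaps settle at an even ~16.5 ms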

  14. #14
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by mockingbird View Post
    Next time let them release the hardware *after* the software is properly working.

    How many generations will this go back? Knowing ATI, ZERO. Spent $1000 on a top of the line Radeon 69xx setup? You're out of luck. Get with the new generation of cards! I can't believe you're still using 1 year old tech.
    Actually, the solution they are going with will fix this issue for at least 2-3 generations, if not all VLIW4/5 architectures.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  15. #15
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by Rollo View Post
    There's nothing to debate. The runts are rendered too briefly to be seen, making the benchmarks irrelevant and the perceived frame rate half the reported.

    My "guess" is pacing won't make a big difference as the frames are being rendered, just not displayed long enough. Will likely knock a few fps off for the driver overhead and/or interval timing, but I don't expect any "Holy cow is CFx slow now!" scenarios.
    I don't know what exactly causes the issue. It's some odd timing issue, and it's not just runt frames alone. You can still see microstutter with vsync, and runt frames don't explain that. I'm just going by what I've seen and what people have been reporting for years now. There is a reason people use a hard cap like what's offered in RadeonPro or fpslimiter from a few years back. I think runt frames are a symptom and not a cause.

    I'm interested to see how this fix works out.

  16. #16
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Posts
    872
    From the preliminary results, it looks like it will make CFx like SLi in frame delivery, which is a big improvement. Will give CFx owners the experience they paid for without dicking around with things like fps caps in Radeon Pro.

    And I imagine you're right in that it won't solve all AFR issues, but I bet the difference will make CFx a much more usable solution. You HAVE to have multi GPU for some settings/solutions. This should put AMD back in the game. (for multi GPU anyway)
    Intel 990x/Corsair H80 /Asus Rampage III
    Coolermaster HAF932 case
    Patriot 3 X 2GB
    EVGA GTX Titan SC
    Dell 3008

  17. #17
    Registered User
    Join Date
    Apr 2010
    Posts
    57
    Quote Originally Posted by zir_blazer View Post
    At this point, you should really ask yourself what the point of benchmarking "max FPS" is if potentially useless frames are inflating the score and making it bogus. It's the same as the controversy many years ago when either nVidia or ATI (I don't remember which) cheated on some benchmarks via image quality: one GPU posted higher scores because it was doing less work than the other one. This is along the same lines.
    Both did this, BUT as an nVidia owner from that era I can tell you nV's was noticeable, where ATI's optimizations were not. The worst of the old days was the FX line, where they crippled the IQ to try and make up for the fact that the FX line was slower than the previous GF4 Ti cards in almost every way that mattered.

    I say this having owned what at the time was the best card they had, the 5800 Ultra... also known as the dustbuster from hell (the thing was as loud as an '80s dust buster).

    It was actually what drove me to agree to try a 9600 256MB card a buddy had been trying to force-lend me for a couple of months. I was just shocked when I did: it was faster in every game I tried, and it totally destroyed the 5800 Ultra in the DX9 betas I was in. The 5800 Ultra was barely playable at 800x600, while the 9600 could do 1600x1200 at higher fps.

    The GF6 line vs. X800, though: I had both, and I preferred the X800 XT PE to the 6800 Ultra.

    But yeah, both companies also used to run profiles for major benchmark apps/games that cheated; since it was known what would be rendered for the whole benchmark pass, they could make better use of prerendering. There was a lot of talk about it back in the day.

    Nowadays they both use IQ optimizations that lower the technical quality without lowering the visible quality, but you do have to watch it sometimes with both when it comes to beta drivers and new games. I have seen both put out drivers with what I would consider noticeable IQ issues (if you really pay attention you can sometimes see minor dithering or the like). I haven't seen it in a release driver though; I think they both know that if they did, more people would notice.
    AMD FX-8350@4.6(280/2520/2520)| 32gb gskill ares ddr3 1866|
    2*sapphire dual-x 7870ghz|16896.8 GB hdd space|
    2*hp ZR24w+Gateway FPD1976W(video playback)|Asus Xonar DX|
    z5500 5.1 speakers|Windows 7 x64|CoolerMaster Cosmos Medusa
    PcPower and Cooling TurboCool 1200watt

  18. #18
    Registered User
    Join Date
    Apr 2010
    Posts
    57
    Quote Originally Posted by Rollo View Post
    From the preliminary results, it looks like it will make CFx like SLi in frame delivery, which is a big improvement. Will give CFx owners the experience they paid for without dicking around with things like fps caps in Radeon Pro.

    And I imagine you're right in that it won't solve all AFR issues, but I bet the difference will make CFx a much more usable solution. You HAVE to have multi GPU for some settings/solutions. This should put AMD back in the game. (for multi GPU anyway)
    again, if you use something like radonpro, you can eliminate microstutter 99% of the time even without the driver fix.

    Another weird note: tri-fire, from what I have read, is unaffected by the latency issues for whatever reason... no clue why, and honestly I'm not gonna grab a 3rd 7870 to find out (I like my RAID card!!!)
    AMD FX-8350@4.6(280/2520/2520)| 32gb gskill ares ddr3 1866|
    2*sapphire dual-x 7870ghz|16896.8 GB hdd space|
    2*hp ZR24w+Gateway FPD1976W(video playback)|Asus Xonar DX|
    z5500 5.1 speakers|Windows 7 x64|CoolerMaster Cosmos Medusa
    PcPower and Cooling TurboCool 1200watt

  19. #19
    Xtreme Member
    Join Date
    Aug 2006
    Posts
    394
    AMD said that the first driver fix would address the 7990 only, and then the next driver fix would address actual dual-card CrossFire configs. So, is the July 31st driver addressing the 7990 only, or all CrossFire configs?

    I'm building up my rig to support two water-cooled 7970s. I was going to do one 7990, but since you can't disable one GPU in cases where a game has negative results with CrossFire, I don't want to get stuck.
    Last edited by DefStar; 06-22-2013 at 08:43 AM.
    Custom case laser cut from a 3/16" thick sheet of brushed Aluminum 8"x80" & cold formed into a box then anodized black with 1/2" Poly-carbonate side panels..[.fully modular, all aluminum mounting brackets, HD bays, and mobo tray are removable...down to the bare box
    --Asus Maximus V Gene--
    --Intel 3770k @4.2 GHz De-lided and I soldered an Arctic Twin Turbo to the Intel.
    --MSI R7970 3GB @1150, 1500 cooled with an Arctic Accelero Xtreme--
    --G.SKILL Ripjaws @2400 MHz --
    --SeaSonic X-1050 Gold--
    --128 GB Sandisk UltraPlus is was only $59 new! Seagate 1TB HD--
    --Samsung S23A750D 120Hz monitor--
    --Razer Tarantula-- keyboard, yes it is like 8 years old!
    --Corsair M60 mouse--
    --Klipsh Promedia 2.1-- I rock stereo speakers the way they were meant to be rocked
    -- 100% Fun ...

    Does it ever shock anyone else when your hear someone use Darwin's "survival of the fittest" to justify genocide?

  20. #20
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Posts
    872
    Quote Originally Posted by AzureSky View Post
    again, if you use something like radonpro, you can eliminate microstutter 99% of the time even without the driver fix.

    another weird note is, trifire from what I have read is unaffected by the latancy issues for whatever reason... no clue why, and, honestly im not gonna grab a 3rd 7870 to find out(i like my raid card!!!)
    Radon Pro not only ends micro stutter, it keeps your basement safe from dangerous gases!

    http://radon-pro.com/

    I don't know why your tri-fire would be exempt from latency issues; I'd want a citation for that. There was an article some time ago saying tri-fire had less microstutter, but they didn't have FCAT back then to really see what was going on.
    Intel 990x/Corsair H80 /Asus Rampage III
    Coolermaster HAF932 case
    Patriot 3 X 2GB
    EVGA GTX Titan SC
    Dell 3008

  21. #21
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    single gpu is good single gpu is life


    When i'm being paid i always do my job through.

  22. #22
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by Rollo View Post
    There was an article sometime ago that trifire had less microstutter, but they didn't have fcat back then to really see what was going on.
    Yeah, I've seen a couple of articles claim that. There was a Rage3D article from 5 years ago that claimed it, and Tom's did a couple of years back.

    Personally, I didn't find that to be the case, and knowing what we know now, if anything it probably only makes microstutter worse. It sure as hell didn't make it better.

    Quote Originally Posted by AzureSky View Post
    again, if you use something like radonpro, you can eliminate microstutter 99% of the time even without the driver fix.
    Yeah, if you cap the framerate and can maintain a framerate above that cap. That's tough on a 120Hz display, especially on my 120Hz Catleap (2560x1440). I would imagine it would also be very tough in Eyefinity.
    Last edited by BababooeyHTJ; 06-22-2013 at 11:53 AM.

  23. #23
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Posts
    872
    I think back then the thought was "more fps equals less variance in fps, so less microstutter." That probably has some truth to it, as there are fewer dips in framerate, but it wouldn't affect the runt frame situation AFAIK.

    I do know that I've had three Quad SLi rigs and the mouse felt really vague, and I don't remember my three way SLi being a ton better at that.
    Intel 990x/Corsair H80 /Asus Rampage III
    Coolermaster HAF932 case
    Patriot 3 X 2GB
    EVGA GTX Titan SC
    Dell 3008

  24. #24
    Visitor
    Join Date
    May 2008
    Posts
    676
    There's no 2 or 3 way SLI lag that I can feel compared to a single card. I've been running two since the GTX 280 days and 3 since the GTX 680.

    I don't have the equipment to verify the below, but it at least explains things a little:
    http://www.youtube.com/watch?feature...zvn0gA#t=3001s

  25. #25
    Xtreme Member
    Join Date
    Jan 2011
    Location
    New Zealand
    Posts
    441
    Quote Originally Posted by cx-ray View Post
    There's no 2 or 3 way SLI lag that I can feel compared to a single card. I've been running two since the GTX 280 days and 3 since the GTX 680.

    I don't have the equipent to verify the below, but it at least explains things a little:
    http://www.youtube.com/watch?feature...zvn0gA#t=3001s
    That is quite an interesting video; it explains things in simple terms that anyone can understand. It certainly shows that the new AMD driver makes a huge improvement when it comes to smoothness. It's not perfect, but the difference is substantial.

    Thanks for putting that link up.
