
Thread: ATi 4870X2(R700) PCB Debut

  1. #426
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    we're not on the same page here. i meant video as in film. PAL is 25 fps, NTSC is 29.97 fps.
    most crappo cards are hd vid capable.

4850 is a great card but really not enough of a step up from an 8800GT... suppose i could sell my GT...


    regarding the continuing saga of microstutter.

i refer to the youtube video with tri sli/crysis - that may not be what you call microstutter, but there sure is what i would call "lag" or "skipping" in the video, which isn't too flash, but the dude in the video seemed all agog at his fps, so good for him.

if vsync solves your stutter problems then great.. i'm really surprised that no one else thought of trying the v-sync option.

...all i got right now is bottlenecks and not enough shaders, i think.
    Last edited by adamsleath; 07-09-2008 at 03:56 PM.
    i7 3610QM 1.2-3.2GHz

  2. #427
    Xtreme Member
    Join Date
    Nov 2005
    Location
    Boise, ID
    Posts
    353
    Quote Originally Posted by JAG87 View Post
    yes my friend, vsync fixes micro stuttering or, as I like to call it, jitter. frame capping (@ your refresh rate) also solves it, but it is not by any means a viable solution, as you get massive image tearing when your image is running at 60 fps out of sync with your refresh rate (basically with vsync off)

    micro stuttering is most evident between 60 and 120 fps (for a 60hz display), where a frame rate of 80 feels more like 40, and a frame rate of 100 feels more like 50. This doesn't happen in all games though (as somebody already mentioned), but holy hell is it ever blatant in the Source engine and in GRID. I don't have any problems in Crysis or COD4 or BF2.

    vsync totally fixes the problem in both GRID and Source, but while it is perfectly acceptable in the former, I cannot use vsync in the latter because I don't enjoy getting owned in DOD/TF2.


    hope this clears things up.
    Really good to know, thanks!

    I have been running SLI for about 3 years now and I never had a huge issue with microstuttering. Possibly I often didn't notice, and possibly because I often use vsync.

    I am not entirely sure what it looks like, to be honest... but I do have experience messing with different SLI profiles through the use of nHancer, usually when I create a new profile for a game that the drivers don't support yet. You can tell when you get the SLI value wrong... and I think that effect is what people are calling microstuttering. Sometimes you get the profile close... you see that boost in fps, but the game is rendering out of sync.
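
    To put rough numbers on the "a frame rate of 80 feels more like 40" effect quoted above, here is a toy calculation (made-up alternating frame times, not measurements from anyone's setup in this thread):

    Code:
    # Illustration of AFR microstutter: two GPUs emitting frames in uneven
    # pairs. The average frame time looks great, but every other gap is long.
    frame_times_ms = [2.0, 23.0] * 40  # hypothetical short/long alternation

    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    print(f"average: {avg_ms:.1f} ms/frame -> {1000 / avg_ms:.0f} fps on paper")

    # Perceived smoothness tracks the long gap in each pair, not the average.
    worst_ms = max(frame_times_ms)
    print(f"longest gap: {worst_ms:.1f} ms -> feels closer to {1000 / worst_ms:.0f} fps")

    The average works out to 12.5 ms (80 fps), while the recurring 23 ms gap is what a ~43 fps game would show.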
    Last edited by Hu1kamania; 07-09-2008 at 04:07 PM.

  3. #428
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Aarhus, Denmark
    Posts
    314
    This is just a quick test done with 3 different runs of Assassin's Creed DX10.

    Even the single HD4870 shows varying frame times. I see the same single-card behavior in 3DMark06.

    The bottom axis (x) is the frame number and the left axis (y) is the frame time.

    So V-sync does make a difference.

    So far I have only found Call of Juarez, Assassin's Creed and 3DMark06 to show microstuttering; however, I can only see it when using Fraps and not when actually playing the games or running 3DMark06.

    UT3 runs without any microstuttering even though it has been claimed to have serious issues with it.
    Attached: assassins_stutter.PNG (frame-time chart)
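
    For anyone who wants to reproduce a chart like this, here is a minimal sketch of turning a Fraps frame-time log into per-frame times (it assumes Fraps' "frametimes" CSV layout of frame number plus cumulative milliseconds; the file name is hypothetical):

    Code:
    import csv

    # Read Fraps' per-frame log: column 0 is the frame number, column 1 the
    # cumulative timestamp in milliseconds (layout assumed, header skipped).
    with open("frametimes.csv", newline="") as f:
        rows = list(csv.reader(f))
    stamps = [float(r[1]) for r in rows[1:]]

    # Per-frame time = difference between consecutive timestamps.
    frame_times = [b - a for a, b in zip(stamps, stamps[1:])]

    # Crude stutter indicator: average swing between consecutive frame times.
    jitter = [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]
    print(f"mean frame time: {sum(frame_times) / len(frame_times):.2f} ms")
    print(f"mean frame-to-frame swing: {sum(jitter) / len(jitter):.2f} ms")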
    AMD Ryzen 9 5900X
    ASRock Radeon RX 7900 XTX Phantom Gaming OC
    Asus ROG Strix B550-F Gaming Motherboard
    Corsair RM1000x SHIFT PSU
    32 GB DDR4 @3800 MHz CL16 (4 x 8 GB)

    1x WD Black SN850 1 TB
    1 x Samsung 960 250 GB
    2 x Samsung 860 1 TB
    1x Seagate 16 TB HDD

    Dell G3223Q 4K UHD Monitor
    Running Windows 11 Pro x64 Version 23H2 build 22631.2506

    Smartphone : Samsung Galaxy S22 Ultra

  4. #429
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    http://www.fudzilla.com/index.php?op...8348&Itemid=34
    recent fud
    The top of the line, called Radeon HD 4870 X2 2GB, will have 2GB of GDDR5; the runner-up is the Radeon HD 4850 X2 with 2GB of GDDR3.

    There will certainly be a price difference between the GDDR5 and GDDR3 versions of the card, and it looks like the GDDR3-based Radeon 4850 X2 comes a month later.
    i7 3610QM 1.2-3.2GHz

  5. #430
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Alberta, Canada
    Posts
    1,264
    Interesting. Wonder how much less the GDDR3 version will be, and if it will be at the same core clock? They do seem nearly positive about the 2GB part, so that sounds good to me.
    Feedanator 7.0
    CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
    LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i

  6. #431
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by Xcel View Post
    thanks for the insight annihilat0r! I sure hope the problem can be solved, the youtube crysis video with tri-sli gtx280s is pretty horrible.
    How did I miss that link...?

  7. #432
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Toysoldier: I encountered no synch/microstuttering issues with Assassin's Creed benchmarks, both with 8800GT SLI and 9800GX2.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  8. #433
    Xtreme Member
    Join Date
    Jul 2006
    Location
    Ontario, Canada
    Posts
    251
    Took the liberty of recording frame times with Fraps in GRID, and highlighting the problem for those that don't have 20/20 vision. This is done in the exact same race, benched 10 seconds, took the first 2 seconds. Check the attached Excel files.


    Gameplay with vsync off is absolutely unacceptable. I can't even tell you how much it stutters; it doesn't even feel like reduced fps, it feels like the game is running off the fricken page file. I find it humorous how the Guru3D review of GTX 280 SLI touts that you get 77 fps. Funny how it runs like complete "chiet" until you turn on vsync. How are reviewers still ignoring this issue?


    These are the frame rates, in case anyone is wondering how it performs.

    2008-07-09 20:41:45 - GRID - vsync off
    Avg: 62.617 - Min: 44 - Max: 81

    2008-07-09 20:45:59 - GRID - vsync on
    Avg: 58.714 - Min: 46 - Max: 63
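
    As a side note, Avg/Min/Max summaries like these come from per-second fps samples, and a toy comparison (made-up samples, not the logs attached here) shows why the vsync-on run can read smoother despite the lower average:

    Code:
    # Made-up per-second fps samples for two hypothetical runs. The spread of
    # the samples, not the average, is what separates the two experiences.
    runs = {
        "vsync off": [44, 52, 81, 60, 75, 48, 79, 55, 70, 62],
        "vsync on":  [58, 60, 46, 63, 61, 59, 62, 57, 60, 61],
    }
    for name, samples in runs.items():
        avg = sum(samples) / len(samples)
        print(f"{name}: Avg {avg:.1f} - Min {min(samples)} - Max {max(samples)}"
              f" - spread {max(samples) - min(samples)}")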
    Attached: Excel frame-time logs

    Intel Core i7 3770K | Asus Maximus V Gene | 16GB DDR3-1600 | Asus GTX 670 directCU TOP | Intel 320 120GB | WD Caviar Black 1TB | Enermax Revolution 1050W | Silverstone TJ08-E | Dell UltraSharp U2711

  9. #434
    Xtreme Addict
    Join Date
    May 2008
    Posts
    1,192
    So it seems the key is to turn Vsync on? That would make sense actually.

    And I have watched that Crysis video. Maybe my eyes are slow? Because I don't see anything. I have a DLP TV, and some people say they see rainbows if they move their head or scan across the TV and such, and I don't see that either.
    Quote Originally Posted by alacheesu View Post
    If you were consistently able to put two pieces of lego together when you were a kid, you should have no trouble replacing the pump top.

  10. #435
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    Last edited by adamsleath; 07-09-2008 at 06:20 PM.
    i7 3610QM 1.2-3.2GHz

  11. #436
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    @JAG: That's EXACTLY what I'm talking about. I can't believe no major review sites are mentioning this very obvious problem.

    Also, VSYNC might eliminate microstutter, but it demolishes the playing experience.

    The problem with having to turn VSYNC on in SLI setups is: you can't enable triple buffering because SLI can't do it. So you're stuck with horrible input lag due to VSYNC.
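
    A back-of-the-envelope estimate of the input-lag cost being described (60 Hz assumed; real pipelines add driver queuing on top of this, so treat the numbers as illustrative):

    Code:
    refresh_hz = 60
    frame_ms = 1000 / refresh_hz  # 16.7 ms per refresh

    # With vsync and only two buffers, a finished frame can wait a full
    # refresh for the flip while the GPU stalls, so worst-case added
    # display latency approaches two refresh intervals.
    print(f"double buffering, worst case: ~{2 * frame_ms:.0f} ms added lag")

    # Triple buffering gives the GPU a third buffer to keep rendering into,
    # trimming the wait back toward a single refresh.
    print(f"triple buffering, worst case: ~{frame_ms:.0f} ms added lag")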
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  12. #437
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Aarhus, Denmark
    Posts
    314
    Quote Originally Posted by JAG87 View Post
    Took the liberty of recording frame times with Fraps in GRID, and highlighting the problem for those that don't have 20/20 vision. This is done in the exact same race, benched 10 seconds, took the first 2 seconds. Check the attached Excel files.


    Gameplay with vsync off is absolutely unacceptable. I can't even tell you how much it stutters; it doesn't even feel like reduced fps, it feels like the game is running off the fricken page file. I find it humorous how the Guru3D review of GTX 280 SLI touts that you get 77 fps. Funny how it runs like complete "chiet" until you turn on vsync. How are reviewers still ignoring this issue?


    These are the frame rates, in case anyone is wondering how it performs.

    2008-07-09 20:41:45 - GRID - vsync off
    Avg: 62.617 - Min: 44 - Max: 81

    2008-07-09 20:45:59 - GRID - vsync on
    Avg: 58.714 - Min: 46 - Max: 63
    I would say that GRID with V-sync on is stutter-free, and with V-sync off... well, almost there.
    When playing the game, I can't tell the difference between V-sync on and off... it feels/looks the same.
    Attached: grid_stutter.PNG (frame-time chart)
    AMD Ryzen 9 5900X
    ASRock Radeon RX 7900 XTX Phantom Gaming OC
    Asus ROG Strix B550-F Gaming Motherboard
    Corsair RM1000x SHIFT PSU
    32 GB DDR4 @3800 MHz CL16 (4 x 8 GB)

    1x WD Black SN850 1 TB
    1 x Samsung 960 250 GB
    2 x Samsung 860 1 TB
    1x Seagate 16 TB HDD

    Dell G3223Q 4K UHD Monitor
    Running Windows 11 Pro x64 Version 23H2 build 22631.2506

    Smartphone : Samsung Galaxy S22 Ultra

  13. #438
    Registered User
    Join Date
    Nov 2004
    Posts
    19
    I hope ATI makes the X2's GPUs work on the same frame, eliminating microstutter and lowering average frame latency. It seems like that's what they will do.

    What we really need a solution for is getting frame rates over your monitor's refresh rate with no tearing. UT/Q3 get over 1000 fps and UT2003/4 get over 400... but monitors only support 75? Seems like a waste.

    What I don't understand is why neither ATI nor nVidia has tried this simple solution: use video output blending.

    It's like this: 5 buffers = double refresh rate (150 fps), 7 is triple (225 fps), 9 is quad, and so on...

    5 buffers works like this: render to buffer A. On completion, tell the video output. The video output sends buffer A to the monitor... the monitor takes 13 ms to display it. Meanwhile, we're rendering to buffer B. Say that only takes 1 ms; we move over to buffer C and start rendering there. At the end of rendering each buffer, we tell the video output which buffers are finished. We render buffer C in 1 ms, render buffer D in 10 ms, and start rendering the next frame over buffer B. Because buffer B is taking 5 ms to render, the video output comes along and sees that buffers C and D are finished. Instead of just sending one of them to the monitor, IT BLENDS them together, creating a blur effect like you would see from any camera or any movie in the cinema to create the sense of motion. C and D are going to be unusable for the next 13 ms while the graphics card has buffers B, E, and A to render to.

    I think this would be great for fast-paced shooting games where smoothness, low latency, and more than 75 fps are critical.
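
    For what it's worth, here is a rough sketch of the scheduling 0g1 describes, as I read it (all names invented; pseudocode for the idea, not anything a real driver exposes):

    Code:
    from collections import deque

    free = deque("ABCDE")   # five buffers, as in the example above
    ready = deque()         # frames finished since the last refresh

    def frame_done():
        # The game finishes rendering into the next free buffer and hands
        # it to the video output (assumes a free buffer is available,
        # which is the point of having five).
        ready.append(free.popleft())

    def refresh_tick():
        # Runs once per monitor refresh (~13 ms at 75 Hz). If several
        # frames completed during the interval, blend them into one
        # motion-blurred image instead of showing just one.
        if ready:
            frames = [ready.popleft() for _ in range(len(ready))]
            print("scanout <-", " blended with ".join(frames))
            free.extend(frames)  # reusable after the blend is scanned out

    # Walk through the post's example: one frame per refresh shows alone,
    # two frames inside one refresh get blended together.
    frame_done()                 # A finishes
    refresh_tick()               # refresh 1: scanout <- A
    frame_done(); frame_done()   # B and C finish during the next refresh
    refresh_tick()               # refresh 2: scanout <- B blended with C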
    E8500 @ 4.3Ghz 1.4V on Water
    Gigabyte DQ6, 2GB OCZ DDR3 1333
    6*640GB WD RAID0, 9800GTX
    Antec1200, Corsair TX750

  14. #439
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Aarhus, Denmark
    Posts
    314
    Quote Originally Posted by 0g1 View Post
    I hope ATI makes the x2's GPUs work on the same frame, eliminating microstutter and lowering average frame latency. It seems like thats what they will do.
    Do you know this, or are you speculating?
    AMD Ryzen 9 5900X
    ASRock Radeon RX 7900 XTX Phantom Gaming OC
    Asus ROG Strix B550-F Gaming Motherboard
    Corsair RM1000x SHIFT PSU
    32 GB DDR4 @3800 MHz CL16 (4 x 8 GB)

    1x WD Black SN850 1 TB
    1 x Samsung 960 250 GB
    2 x Samsung 860 1 TB
    1x Seagate 16 TB HDD

    Dell G3223Q 4K UHD Monitor
    Running Windows 11 Pro x64 Version 23H2 build 22631.2506

    Smartphone : Samsung Galaxy S22 Ultra

  15. #440
    Xtreme Addict
    Join Date
    May 2008
    Posts
    1,192
    Quote Originally Posted by 0g1
    I hope....
    I am just guessing.....

    So is it just SLI that is unable to do triple buffering, or ATi also?

    And 0g1, it's got to be more complicated than that. It has to be.
    Last edited by Aberration; 07-09-2008 at 09:13 PM.
    Quote Originally Posted by alacheesu View Post
    If you were consistently able to put two pieces of lego together when you were a kid, you should have no trouble replacing the pump top.

  16. #441
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    The problem with having to turn VSYNC on in SLI setups is: you can't enable triple buffering because SLI can't do it. So you're stuck with horrible input lag due to VSYNC.
    how does this manifest on screen? i mean what does it look like?
    which game? you got a specific example?

    input lag is an lcd response issue i thought... what exactly are you describing as "input lag"?

    triple buffering needs vram.
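
    The VRAM cost is easy to ballpark (my numbers, assuming a 32-bit colour buffer at the 30" resolution mentioned later in the thread, no AA):

    Code:
    width, height = 2560, 1600   # hypothetical high-end resolution
    bytes_per_pixel = 4          # 32-bit colour, no multisampling

    one_buffer_mb = width * height * bytes_per_pixel / 2**20
    print(f"one buffer at {width}x{height}: {one_buffer_mb:.1f} MB")
    print(f"extra cost of triple over double buffering: {one_buffer_mb:.1f} MB")

    So roughly 16 MB per extra buffer at that resolution, before any multisampling multiplies it.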
    Last edited by adamsleath; 07-09-2008 at 09:21 PM.
    i7 3610QM 1.2-3.2GHz

  17. #442
    Registered User
    Join Date
    Nov 2004
    Posts
    19
    Quote Originally Posted by 0g1
    I hope ATI makes the x2's GPUs work on the same frame
    I'm guessing that because of the CNET article where the ATI guy says the GPUs talk to each other before the end of the rendering process (through a proprietary link). And somehow it scales better than Crossfire. So I can't think of anything else it would be.
    E8500 @ 4.3Ghz 1.4V on Water
    Gigabyte DQ6, 2GB OCZ DDR3 1333
    6*640GB WD RAID0, 9800GTX
    Antec1200, Corsair TX750

  18. #443
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by 0g1 View Post
    I'm guessing that because of the cnet article where the ATI guy says the GPU's talk to each other before the end of the rendering process (through a proprietary link). And somehow it scales better than Crossfire. So I cant think of anything else it would be.
    Working on the same frame would give a super huge performance penalty. Basically it would be like scanlines: it worked fine when it was just "pixel filling", but add shaders etc. and you are in the garbage, because both cards would have to do double the shading processing.
    Crunching for Comrades and the Common good of the People.

  19. #444
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by annihilat0r View Post
    @JAG: That's EXACTLY what I'm talking about. I can't believe no major review sites are mentioning this very obvious problem.

    Also, VSYNC might eliminate microstutter, but it demolishes the playing experience.

    The problem with having to turn VSYNC on in SLI setups is: you can't enable triple buffering because SLI can't do it. So you're stuck with horrible input lag due to VSYNC.
    don't forget that vsync can also force you onto low fps. If you don't reach a steady 60+ fps (when the refresh rate of your monitor is 60 Hz), vsync drops you to 30 fps... and if it dips below 30 fps, have fun with 20, then 15... no go for Crysis.
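
    A minimal sketch of the stepping being described, assuming strict double-buffered vsync (the display only flips on a refresh boundary, so the effective rate is the refresh rate divided by a whole number of refreshes per frame):

    Code:
    import math

    refresh_hz = 60

    def vsynced_fps(render_fps):
        # A frame taking longer than one refresh waits for the next boundary.
        refreshes_per_frame = math.ceil(refresh_hz / render_fps)
        return refresh_hz / refreshes_per_frame

    for fps in (80, 59, 35, 29, 22, 14):
        print(f"renders at {fps} fps -> displays at {vsynced_fps(fps):.0f} fps")

    In practice, driver render-ahead and fluctuating frame times smear these steps out, which squares with Seraphiel's reply below.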

  20. #445
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    having vsync off does seem better / smoother ingame with my single card :|
    i7 3610QM 1.2-3.2GHz

  21. #446
    Xtreme Member
    Join Date
    May 2007
    Posts
    341
    Quote Originally Posted by Hornet331 View Post
    don't forget that vsync can also force you onto low fps. If you don't reach a steady 60+ fps (when the refresh rate of your monitor is 60 Hz), vsync drops you to 30 fps... and if it dips below 30 fps, have fun with 20, then 15... no go for Crysis.
    That isn't true. Even with no triple buffering, it has been a long time since I have experienced that phenomenon.

    With all the games I play, V-Sync produces frame rates in between 30 and 60 fps, for example. The performance difference between VS on and off is also not very noticeable, whereas tearing certainly is.

    Can't live without my VS.

  22. #447
    Xtreme Mentor
    Join Date
    Feb 2004
    Location
    The Netherlands
    Posts
    2,984
    Quote Originally Posted by adamsleath View Post
    most crappo cards are hd vid capable.
    ugh, still not on the same page. how can a deinterlaced (motion-blurred) video running at 25-30 fps demonstrate microstuttering? i reckon it can only demonstrate megastuttering, if you know what i mean.

    Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
    32 GB Patriot Viper Steel 3733 CL14 (1.51v)
    RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
    Tons of NVMe & SATA SSDs
    LG 27GL850 + Asus MG279Q
    Meshify C white

  23. #448
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Seraphiel View Post
    That isn't true. Even with no triple buffering, it has been a long time since I have experienced that phenomenon.

    With all the games I play, V-Sync produces frame rates in between 30 and 60 fps, for example. The performance difference between VS on and off is also not very noticeable, whereas tearing certainly is.

    Can't live without my VS.
    get a faster lcd. :p

    tearing shouldn't be a problem with any of the newer lcds.

  24. #449
    Xtreme Addict
    Join Date
    Dec 2006
    Location
    Vegas ,NV
    Posts
    1,636
    at this point i think crysis is a lost/abandoned cause. The game is just poorly coded from the ground up; either that, or just light years ahead of current hardware. I was hoping for a nice boost with the 4870, but performance still isn't great. From looking at benchmarks it seems it favours nV GPUs. Hopefully Warhead will bring many more optimizations.
    ~

  25. #450
    Xtreme Member
    Join Date
    May 2007
    Posts
    341
    Quote Originally Posted by Hornet331 View Post
    get a faster lcd. :p

    tearing shouldn't be a problem with any of the newer lcds.
    I haven't seen any >60 Hz 2560 x 1600 displays, and mine is just a little over a year old.

