View Poll Results: Where should this thread go?

Voters: 44. You may not vote on this poll
  • This goes to the nVidia section, as only nVidia users can read vram usage

    5 11.36%
  • This goes to the ATI section, as only ATI users justify their large vram

    3 6.82%
  • This shall stay in the news section for now

    26 59.09%
  • Delete this thread, as we don't need such misleading/irrelevant/troll information

    10 22.73%

Thread: List of known/suspected games to eat more than 1GB video memory at 1920x1200

  1. #1
    Xtreme Enthusiast
    Join Date
    Nov 2009
    Location
    Bloomfield Evergreen
    Posts
    607

    List of known/suspected games to eat more than 1GB video memory at 1920x1200

    This is a list of games that I have confirmed, or suspect, to use more than 1GB of video memory at 1920x1200 in the worst case (not on average), with graphics settings set reasonably high, e.g. 4xAA. Please read the notes below the list before raising doubts. If you only care about case (a) from the notes, you can ignore this article.

    I have recently upgraded from 2 x 5870 1GB CrossFireX to 2 x 6950 2GB CrossFireX. Unfortunately, AMD's driver doesn't report video memory usage for DX11 under Windows 7, so I'll also have to rely on screenshots from nVidia users. (For a fair comparison between the 5870 1GB and 5870 2GB, click here.)

    Method for picking a suspect game: if I notice an obvious improvement in lag spikes / min fps after the upgrade, I add the game to the list. It works the same way as system memory: virtual video memory is a lot slower than dedicated video memory, just as virtual memory is a lot slower than physical memory.

    Method for confirming a game: if I see a screenshot proving that the game, running at 1920x1200 or lower, can consume 1GB of video memory or more, I mark it as confirmed. All confirmed games are marked in red. I cannot guarantee a working link to the proof screenshot when you have doubts, because image hosting fails over time; you'll have to decide for yourself whether to trust me.
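    As a sanity check on these numbers: the render targets themselves account for surprisingly little of a 1GB budget; it's textures and geometry streaming that push past it. A rough back-of-the-envelope sketch (the 32-bit color plus 32-bit depth/stencil formats are my assumption, not something measured from any of these games):

    ```python
    # Rough MSAA render-target cost at a given resolution.
    # Assumes 4 bytes of color + 4 bytes of depth/stencil per sample.
    def msaa_target_mib(width, height, samples, bytes_per_sample=8):
        return width * height * samples * bytes_per_sample / 2**20

    round(msaa_target_mib(1920, 1200, 4))  # ~70 MiB for a single 4xAA target
    ```

    So even with several such buffers in flight, the framebuffers alone stay far below 1GB; the worst cases in the list below come from texture and geometry data.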

    2007-11: Crysis / Warhead: No more lag while quickly rotating my camera. Confirmed to exceed 1GB video memory easily.
    2008-12: GTA IV: Famous for requirement of over 1GB video memory to max out graphics settings. Confirmed to exceed 1GB video memory easily.
    2009-06: ARMA II: Reported to hit 1.5GB at 1080p. Waiting for confirmation from nVidia users.
    2009-09: Unigine Heaven Benchmark: Terrible lag spikes on 1GB cards. Waiting for confirmation from nVidia users.
    2009-11: Call of Duty 6: Modern Warfare 2: Did not notice lag with the 5870 1GB, but confirmed to exceed 1GB video memory by screenshots from nVidia users.
    2009-12: Colin McRae DiRT 2: Confirmed to exceed 1GB video memory by screenshots from nVidia users.
    2010-02: Battlefield: Bad Company 2: Did not notice lag with the 5870 1GB, but confirmed to exceed 1GB video memory by screenshots from nVidia users.
    2010-02: Napoleon: Total War: The 5870 1GB would lag like hell during picture-in-picture scenes, such as troops taking over buildings. Waiting for confirmation from nVidia users.
    2010-02: STALKER Call of Pripyat with complete mod: Confirmed to hit 1.2GB according to screenshots from nVidia users.
    2010-02: Aliens vs. Predator: Reported to hit 1GB on a GTX 460 at 1080p, however no lag spike noticed. Waiting for confirmation from nVidia users.
    2010-03: Metro 2033: MSAA 4X (not AAA): Confirmed to exceed 1GB video memory easily. Even 470/570 gets killed easily.
    2010-07: Starcraft 2: A Mothership cloaking *MANY* Carriers leads to 20 fps on the 5870 1GB but over 30 fps on the 6950 2GB. Confirmed to exceed 1GB video memory by screenshots from nVidia users.
    2010-09: Civilization V: Confirmed to use up all 1.5GB of GTX 480 by screenshots from nVidia users.
    2010-10: Lost Planet 2: Confirmed to hit 1.2GB by screenshots from nVidia users.
    2010-12: World of Warcraft: Cataclysm: Running two instances (logging in two characters) concurrently in DX11 mode would definitely eat more than 1GB video memory, and 5870 1GB struggles at 2-3 fps, while 6950 2GB has no problem at 30 fps. Each DX11 instance is confirmed to approach 1GB video memory usage by screenshots from nVidia users.
    2011-02: Bulletstorm: Confirmed to hit 1GB at 1080p according to screenshots from nVidia users.
    2011-03: Dragon Age 2: No more lag/unbearable min fps with 6950 2GB. Confirmed to exceed 1GB video memory by screenshots from nVidia users.
    2011-03: Total War: Shogun 2: AA is not yet officially supported, but fps increased from 30 to 40 after my upgrade. Waiting for confirmation from nVidia users.
    2011-03: Homefront: Reported to exceed 1GB video memory. Waiting for confirmation from nVidia users.
    2011-03: Assassin's Creed Brotherhood: Confirmed to use 975MB at 2560x1440 4xAA. Waiting for confirmation from nVidia users for the worst case at 1200p.
    2011-03: Crysis 2 DX9: Confirmed to use up all 1.5GB of GTX 480 by screenshots from nVidia users.
    2011-04: Shift 2 Unleashed: Confirmed to hit 1.3GB by screenshots from nVidia users.
    TBA: Crysis 2 DX11: Still no ETA. Both nVidia and the wiki have quietly removed the DX11 feature description.
    TBA: Digital Combat Simulator: A-10C: Reported to use all 1.5GB of GTX 480. Waiting for confirmation from nVidia users.
    TBA: The Witcher 2


    I will keep this list updated. However, if it's inappropriate here, please remove it.

    Notes: Less experienced users are recommended to read about virtual memory and get a basic understanding of paging/swapping. Graphics cards basically work in a similar way - virtual video memory is a lot slower than dedicated video memory, just as virtual memory is a lot slower than physical memory. There are two categories of "out of video memory":

    Case (a): Actual video memory usage is far above the capacity of the dedicated video memory on the graphics card. In this case, all the contents within the sight of the camera (in front of the player) "fight" for a place in dedicated video memory, so swapping happens all the time, causing an unplayable average fps well below 30.

    Case (b): Actual video memory usage is slightly above the capacity of the dedicated video memory on the graphics card. In this case, only the contents within the sight of the camera (in front of the player) are loaded into dedicated video memory, while the contents outside the camera's sight (behind the player) are pushed into virtual video memory. High fps is still achieved as long as the player doesn't rotate the camera; but when the player rotates the camera quickly, swapping happens between dedicated and virtual video memory, bottlenecked by the latency and bandwidth of the PCIe link. The result is temporarily low fps (aka lag, choppiness, lag spikes), which often makes up the "min fps" of a benchmark session.

    If you only care about case (a), please ignore this article. If you are an enthusiast who cares about both case (a) and case (b), this is the list for you.
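    Case (b) can be put into rough numbers. The PCIe transfer rate and the amount swapped below are illustrative assumptions, not measurements from any particular card or game:

    ```python
    # Toy model of a case (b) lag spike: one quick camera turn forces textures
    # that were pushed out to virtual video memory back across the PCIe bus.
    PCIE_GBPS  = 6.0     # assumed effective PCIe transfer rate, GB/s
    BASE_FRAME = 1 / 60  # frame time when everything is resident, in seconds

    def frame_fps(swapped_mb):
        """Instantaneous fps for a frame that must first stream swapped_mb in."""
        swap_time = (swapped_mb / 1024) / PCIE_GBPS
        return 1 / (BASE_FRAME + swap_time)

    round(frame_fps(0))    # 60: nothing swapped, steady fps
    round(frame_fps(200))  # ~20: a 200 MB swap turns this frame into a lag spike
    ```

    A handful of such frames in a row is exactly the choppy "min fps" dip described above, even though the average over a whole benchmark run barely moves.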
    Last edited by sniper_sung; 04-28-2011 at 09:06 AM.

  2. #2
    Xtreme Enthusiast
    Join Date
    Nov 2009
    Location
    Bloomfield Evergreen
    Posts
    607
    Sample of the screenshot proof used:

    Screenshots of Metro 2033 at 1920x1080, 4x MSAA. A GTX 580 1.5GB is going to get horrible min fps at 1920x1200.

  3. #3
    Xtreme Member
    Join Date
    Mar 2005
    Posts
    421
    Nice idea. I have doubts about the accuracy of the memory usage reported by Afterburner, though, as it has seemed a bit off to me in the past, at least with some games - but it has been updated a few times since then.

    edit:
    For example, I just ran Quake 4 at 1920x1200, ultra detail, no AA, with GTX 480 SLI. At these settings the game recommends a 512MB+ card, but Afterburner reports it using ~1150MB. If that's so, how did the 512MB cards we used to play this with cope? Then I enable vsync plus 16xQ / 8xSS AA and it jumps up to 1400MB+ on the first level. Windows 7 itself is using ~120-130MB.

    Last I checked, you need to go back to DX9 on WinXP to get a fairly accurate memory usage reading.

    Let's see how something older like Unreal goes:
    Minimum CPU Speed: 166 MHz
    Minimum RAM Required: 16 MB
    Total install: 680MB
    Uses 310MB at 1920x1200 with no AA, and 282MB with AA.
    Last edited by dasa; 04-20-2011 at 12:40 AM.

  4. #4
    Xtreme Addict Chrono Detector's Avatar
    Join Date
    May 2009
    Posts
    1,142
    I ran the original Crysis with my GTX 580 at max settings, 1920x1080, 16xAA, 16xAF, and it used over 1GB of VRAM. I'm kinda concerned about the small VRAM size on the GTX 580. Lost Planet 2 also uses more than 1GB at some points, at max settings at 1920x1080.
    AMD Threadripper 12 core 1920x CPU OC at 4Ghz | ASUS ROG Zenith Extreme X399 motherboard | 32GB G.Skill Trident RGB 3200Mhz DDR4 RAM | Gigabyte 11GB GTX 1080 Ti Aorus Xtreme GPU | SilverStone Strider Platinum 1000W Power Supply | Crucial 1050GB MX300 SSD | 4TB Western Digital HDD | 60" Samsung JU7000 4K UHD TV at 3840x2160

  5. #5
    Xtreme Mentor
    Join Date
    Nov 2005
    Location
    Devon
    Posts
    3,437
    Great idea for this list!
    When I get back from holiday I will add my suspects here.

    I'm also surprised by Digital Combat Simulator: A-10C! I've been planning to get this game at some point, and it would be fantastic if they're using all available VRAM for better geometry and texture detail!
    GTA IV is a classic! I waited to properly finish this game until 2GB cards arrived at more affordable high-end price points (compared to super-high-end / custom boards). Now that I own an HD 6970 I've started playing it again at max settings, getting better frames than before on the HD 5870 with only high shadows and visibility set to 38.
    RiG1: Ryzen 7 1700 @4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SDD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W

    RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingstone HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU

    SmartPhone Samsung Galaxy S7 EDGE
    XBONE paired with 55'' Samsung LED 3D TV

  6. #6
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Just because a game uses more than 1GB of VRAM doesn't mean it is going to run at over 30 min fps on a 2GB card with the same GPU, or see any performance improvement, even in games that use more than 1GB of VRAM.

    Every review of the 1GB 6950 shows it performing just as badly in minimum fps scores as the 2GB card in any example where you would expect VRAM usage to exceed 1GB.

    At most, in any such case, the minimum fps was brought up from something like 18 fps on a 1GB card to 22 fps on a 2GB card.

    Even with the extra 1GB of VRAM, the 2GB 6950 would still not maintain a minimum fps higher than 30, indicating that the lack of VRAM on the 1GB cards is not the factor limiting performance - the factor is that the GPU itself is too slow.

    I've seen plenty of users running triple-monitor Eyefinity rigs on CrossFire 1GB 5870s, and never complaining about low fps problems.

    In the case of the OP's examples, improvements were not seen because the graphics cards were upgraded from 1GB to 2GB cards; improvements were seen because the GPUs were upgraded from 5870s to 6950s.

    Try again with a fairer comparison of a 1GB 6950 vs a 2GB 6950, and show that the 2GB card manages to maintain a 30 fps minimum in games where the 1GB card is falling below 25 - only then can you fairly conclude that 2GB of VRAM brings minimum frame rates in such games up to 30.

    Note in this chart the minimum fps of both the 1GB and 2GB 6950s in Crysis Warhead, for the second result:



    Having said that, yes, I would still have bought 2GB versions of my cards if they had been available - for the simple reason that, like a lot of enthusiasts, I am simply convinced, without requiring any logical reason, that more VRAM = better, even though there is no evidence proving this is actually true. However, I do not believe that having 2GB of VRAM would provide any noticeable improvement in any game, even one using over 1GB of VRAM and having to offload the excess into shared memory.
    Last edited by Mungri; 04-20-2011 at 12:56 AM.

  7. #7
    Xtreme Enthusiast
    Join Date
    Apr 2010
    Posts
    514
    When you switch between 4xAA / 8xAA, you must close the game and run it again in order to get an accurate result.

  8. #8
    Xtreme Enthusiast
    Join Date
    Apr 2010
    Posts
    514
    Colin McRae DiRT 2: Reported to use over 1.2GB at 1080p. Waiting for confirmation from nVidia users
    1.2gb lol





    http://img534.imageshack.us/img534/6...0420120700.jpg

    http://img215.imageshack.us/img215/3...0420120803.jpg

  9. #9
    Xtreme Enthusiast
    Join Date
    Nov 2009
    Location
    Bloomfield Evergreen
    Posts
    607
    Quote Originally Posted by bhavv View Post
    Just because a game uses more than 1GB of VRAM doesn't mean it is going to run at over 30 min fps on a 2GB card with the same GPU, or see any performance improvement, even in games that use more than 1GB of VRAM.

    Every review of the 1GB 6950 shows it performing just as badly in minimum fps scores as the 2GB card in any example where you would expect VRAM usage to exceed 1GB.

    At most, in any such case, the minimum fps was brought up from something like 18 fps on a 1GB card to 22 fps on a 2GB card.

    Even with the extra 1GB of VRAM, the 2GB 6950 would still not maintain a minimum fps higher than 30, indicating that the lack of VRAM on the 1GB cards is not the factor limiting performance - the factor is that the GPU itself is too slow.

    I've seen plenty of users running triple-monitor Eyefinity rigs on CrossFire 1GB 5870s, and never complaining about low fps problems.

    In the case of the OP's examples, improvements were not seen because the graphics cards were upgraded from 1GB to 2GB cards; improvements were seen because the GPUs were upgraded from 5870s to 6950s.

    Try again with a fairer comparison of a 1GB 6950 vs a 2GB 6950, and show that the 2GB card manages to maintain a 30 fps minimum in games where the 1GB card is falling below 25 - only then can you fairly conclude that 2GB of VRAM brings minimum frame rates in such games up to 30.

    Note in this chart the minimum fps of both the 1GB and 2GB 6950s in Crysis Warhead, for the second result:

    Having said that, yes, I would still have bought 2GB versions of my cards if they had been available - for the simple reason that, like a lot of enthusiasts, I am simply convinced, without requiring any logical reason, that more VRAM = better, even though there is no evidence proving this is actually true. However, I do not believe that having 2GB of VRAM would provide any noticeable improvement in any game, even one using over 1GB of VRAM and having to offload the excess into shared memory.
    Do you really trust the numbers from Anandtech? It would be funny, then, to see how 580 SLI is beaten by 6950 CF in certain games:



    As I have said, you would never know the worst-case scenario of each game simply from benchmarks and reviews.

    1) The time interval used to calculate min fps may vary a lot between benchmarks. For instance, if you render 0 fps for 0.1 seconds and 60 fps for the next 0.1 seconds, the min fps is either 0 (measured over 0.1-second intervals) or 30 (measured over a single 0.2-second interval).

    2) The benchmarks do not necessarily cover the most stressful scenes of each game. It is not always like how your 2 x 560 Ti 1GB was beaten by my 2 x 6950 2GB by only 27% in Metro2033benchmark.exe - if you actually play the game through (hint: Chapter 4, "Child", which uses 1.5GB of VRAM at 1080p), the difference in average fps may be huge.

    That's why I say review numbers alone are not sufficient justification - otherwise my list would be meaningless.
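    Point 1) about the measurement interval can be sketched concretely (a toy calculation, not the method any actual benchmark uses):

    ```python
    # Min fps depends on the window length used to count frames.
    def min_fps(timestamps, duration, window):
        """Minimum frames-per-second over fixed windows of the given length."""
        n_windows = round(duration / window)
        counts = [0] * n_windows
        for ts in timestamps:
            i = min(int(ts / window), n_windows - 1)
            counts[i] += 1
        return min(counts) / window

    # A 0.1 s stall followed by 0.1 s at 60 fps (6 frames):
    stalled = [0.1 + i / 60 for i in range(6)]
    min_fps(stalled, 0.2, 0.1)  # -> 0.0 with a 0.1 s window
    min_fps(stalled, 0.2, 0.2)  # -> ~30 with a 0.2 s window
    ```

    The same frame sequence reports either a total stall or a perfectly playable minimum, depending only on the window the reviewer picked.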
    Last edited by sniper_sung; 04-20-2011 at 01:46 AM.

  10. #10
    NooB MOD
    Join Date
    Jan 2006
    Location
    South Africa
    Posts
    5,799
    Please keep this thread in the News section, it's handy.


    Quote Originally Posted by Jowy Atreides View Post
    Intel is about to get athlon'd
    Athlon64 3700+ KACAE 0605APAW @ 3455MHz 314x11 1.92v/Vapochill || Core 2 Duo E8500 Q807 @ 6060MHz 638x9.5 1.95v LN2 @ -120'c || Athlon64 FX-55 CABCE 0516WPMW @ 3916MHz 261x15 1.802v/LN2 @ -40c || DFI LP UT CFX3200-DR || DFI LP UT NF4 SLI-DR || DFI LP UT NF4 Ultra D || Sapphire X1950XT || 2x256MB Kingston HyperX BH-5 @ 290MHz 2-2-2-5 3.94v || 2x256MB G.Skill TCCD @ 350MHz 3-4-4-8 3.1v || 2x256MB Kingston HyperX BH-5 @ 294MHz 2-2-2-5 3.94v

  11. #11
    Xtreme Enthusiast
    Join Date
    Mar 2010
    Location
    Istanbul
    Posts
    606
    Civilization V, even without AA.
    Half-Life 2 with Cinematic Mod 10 and SGSSAA on.


  12. #12
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    I think some games just cache as many textures as possible in video memory, so they show up as using 1GB+ but don't necessarily need that much.
    "No, you'll warrant no villain's exposition from me."

  13. #13
    Xtreme Enthusiast
    Join Date
    Nov 2009
    Location
    Bloomfield Evergreen
    Posts
    607
    Quote Originally Posted by Pantsu View Post
    I think some of the games just cache as much textures as possible in the video memory, so they show up as using 1GB+ but don't necessarily need that much.
    I agree with you. That's why we didn't observe lag spikes in some of the games listed above, even though they're confirmed by screenshots. I think only the games that are both reported to cause lag spikes on 1GB cards and confirmed by screenshots actually need more than 1GB of VRAM.
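    The distinction between "allocated" and "actually needed" can be sketched with a toy LRU model of dedicated video memory (the capacity and texture sizes are made up for illustration):

    ```python
    from collections import OrderedDict

    class VramCache:
        """Toy LRU model of dedicated VRAM; evictions stand in for PCIe swaps."""
        def __init__(self, capacity_mb):
            self.capacity = capacity_mb
            self.resident = OrderedDict()  # texture id -> size in MB
            self.evictions = 0

        def touch(self, tex, size_mb):
            if tex in self.resident:
                self.resident.move_to_end(tex)  # already resident: no traffic
                return
            while self.resident and sum(self.resident.values()) + size_mb > self.capacity:
                self.resident.popitem(last=False)  # evict least recently used
                self.evictions += 1
            self.resident[tex] = size_mb

    # A level touches 12 textures of 100 MB each: 1.2 GB "used" in total...
    cache = VramCache(1000)
    for tex in range(12):
        cache.touch(tex, 100)
    warmup = cache.evictions  # a couple of evictions while streaming the level in

    # ...but if each frame only re-touches a 700 MB working set,
    # the 1 GB card never swaps again during gameplay:
    for frame in range(100):
        for tex in range(5, 12):
            cache.touch(tex, 100)
    assert cache.evictions == warmup
    ```

    A game that pre-caches aggressively looks like the first loop (big reported usage, no stutter); a game whose per-frame working set itself exceeds the card is the one that actually lags.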

  14. #14
    Xtreme Addict
    Join Date
    Nov 2003
    Location
    NYC
    Posts
    1,592
    The Final Fantasy XIV benchmark picks up around a 20-30% improvement going from a 5870 1GB to a 2GB model with no other changes.

  15. #15
    Xtreme Member
    Join Date
    Apr 2005
    Location
    Sweden
    Posts
    324
    Bad Company 2 and Shift 2 eat up more than 1GB of VRAM for me, at 1920x1200 maxed out with 4x AA on a GTX 470.
    E6600"L630A446"? @3600@1.?v cooled by Tunic Tower sitting on Abit AB9 Quad GT played on ASUS 8800GTX opperated by a lazy slacker!

  16. #16
    Registered User
    Join Date
    Dec 2010
    Location
    belgium
    Posts
    37
    Civ V eats up 1280MB on both my GPUs with everything maxed.
    BFBC2 uses around 800MB on each GPU.
    I7 3770K
    Asus Z77 Sabertooth
    G.Skill ripjaws X 1600Mhz 8Gb
    Asus GTX670 DC2
    Noctua NH-D14
    Crucial M4 128Gb
    500Gb caviar black
    HAF922|HX850

  17. #17
    Xtreme Enthusiast
    Join Date
    Apr 2010
    Posts
    514
    Quote Originally Posted by Pillo-kun View Post
    bad company 2 and shift 2 eats up more than 1GB of vram for me, 1920x1200 at max wiht 4x aa and a gtx470
    This did not happen for me.

    AA4




    http://img854.imageshack.us/img854/2118/bfbc2aa4.jpg

    32CSAA




    http://img580.imageshack.us/img580/6981/bfbc232xaa.jpg

  18. #18
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Quote Originally Posted by sniper_sung View Post
    Do you truly believe in the numbers from anandtech? It would be funny to see how 580 SLI is beaten by 6950 CF in certain games then:
    Different GPUs again - and the frame rate is not unplayable on the 1.5GB cards even at those figures (the average seems to be over 30 fps). Also, I trust Anandtech more than your copy-pasted OP from another forum:

    http://forums.overclockers.co.uk/sho...php?t=18255161

    Some games perform better on nVidia hardware, others perform better on ATI.

    I believe what I see with my own 1GB GTX 560s, though: in a lot of cases reported to use more than 1GB of VRAM, I see no lag or slowdown.

    Compare 1GB and 2GB cards with the EXACT SAME GPU and clock speeds if you want an accurate comparison, and show the 2GB cards providing playable 30+ fps gameplay in instances where the 1GB card cannot. I really wonder why every such comparison of 1GB vs 2GB 6950s hardly shows any difference, if 2GB is meant to improve performance.

    Quote Originally Posted by Trolle View Post
    civ V eats up 1280mb on both my gpu's with everything maxed
    That's good to hear, then - this is a game I play fully maxed out with 8x AA, and it works completely fine with fps constantly over 40.
    Last edited by Mungri; 04-20-2011 at 07:13 AM.

  19. #19
    Xtreme Mentor
    Join Date
    May 2008
    Location
    cleveland ohio
    Posts
    2,879
    Try Malaysia...
    with max AA and AF.
    HAVE NO FEAR!
    "AMD fallen angel"
    Quote Originally Posted by Gamekiller View Post
    You didn't get the memo? 1 hour 'Fugger time' is equal to 12 hours of regular time.

  20. #20
    Xtreme Enthusiast
    Join Date
    Apr 2010
    Posts
    514
    Quote Originally Posted by demonkevy666 View Post
    try Malaysia....
    with Max AA and AF.
    Max AA in game: 8xQ CSAA.









    2010-10: Lost Planet 2: Reported to hit 1.2GB during benchmark. Waiting for confirmation from nVidia users.






    32CSAA






  21. #21
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Nice thread! There are differences in how ATI and nVidia manage their buffers, but it shouldn't be too much...

    Oh, and snipe - you could have saved yourself a lot of work by making a list of games that DON'T require more than 1GB at 1080p.

  22. #22
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601


    S.T.A.L.K.E.R : COP
    Atmosfear 2.1
    Natures Pack + Textures Pack
    All maxed in game settings
    GTX 470 SOC

  23. #23
    Xtreme Enthusiast
    Join Date
    Nov 2009
    Location
    Bloomfield Evergreen
    Posts
    607
    Quote Originally Posted by cold2010 View Post
    max aa in game , 8XQCSAA
    That says it all: even at 1080p it approaches 1GB, which means 1GB is not reassuring at 1200p.

    It's not just once that I've heard from others that DiRT 2 can eat more than 1GB, so check your settings.

  24. #24
    Xtreme Enthusiast
    Join Date
    Nov 2009
    Location
    Bloomfield Evergreen
    Posts
    607
    Quote Originally Posted by bhavv View Post
    Different GPUs again - and the frame rate is not unplayable on the 1.5GB cards even at those figures (the average seems to be over 30 fps). Also, I trust Anandtech more than your copy-pasted OP from another forum:

    http://forums.overclockers.co.uk/sho...php?t=18255161

    Some games perform better on nVidia hardware, others perform better on ATI.

    I believe what I see with my own 1GB GTX 560s, though: in a lot of cases reported to use more than 1GB of VRAM, I see no lag or slowdown.

    Compare 1GB and 2GB cards with the EXACT SAME GPU and clock speeds if you want an accurate comparison, and show the 2GB cards providing playable 30+ fps gameplay in instances where the 1GB card cannot. I really wonder why every such comparison of 1GB vs 2GB 6950s hardly shows any difference, if 2GB is meant to improve performance.



    That's good to hear, then - this is a game I play fully maxed out with 8x AA, and it works completely fine with fps constantly over 40.
    You still don't understand that I'm discussing the worst-case scenario of each game here. Do you understand the difference between the worst case and the average case? Do you think Anandtech could include all the worst cases for each game (even including the so-called "min fps")? As with the OP you just quoted, how many review sites would tell you that World of Warcraft runs at merely 10+ fps in the worst case on GTX 580 SLI, even though that's not due to VRAM usage?

    You still don't understand that Metro 2033 is an nVidia title. Even the average fps is affected by VRAM limitations - GTX 580 SLI is beaten by 6950 CF at 1600p (Anandtech's numbers), how funny! A heavily OC'd 560 SLI is beaten by stock 6950 CF at 1200p (your numbers), how funny! Now tell me, doesn't nVidia have the advantage in Metro 2033 at lower resolutions?

    I have seen many claims that a 2GB 5870 can offer a better play experience than a 1GB 5870. Sometimes it's possible to show a difference even in average fps, as suggested in #14 here; and sometimes the min-fps difference can't be measured because the sampling interval is too coarse, while people can still see the difference, like this.

  25. #25
    Xtreme Enthusiast
    Join Date
    Apr 2010
    Posts
    514
    That tells it: even at 1080p it approaches 1GB, which means 1GB is not reassuring for 1200p.
    LOL - this is with 8xQ CSAA, not 8x MSAA.
    It's not just once I heard from others that Dirt 2 can eat more than 1GB, so check your settings
    I don't need to; I've retested three times.

    I remembered something important

    Quote Originally Posted by sniper_sung View Post
    sample proof of screenshot used:

    Screenshots for metro 2033, 1920x1080 msaa 4x. A gtx 580 1.5gb is going to get horrible min fps at 1920x1200.

    This is not 1920x1080, so be sure of the source.
    Last edited by cold2010; 04-21-2011 at 03:04 AM.

