I can't speak for any other games as I have not tracked VRAM use.
For Skyrim, with not much more than the HD pack and the 1.5 texture packs, I easily go over 1515MB of VRAM use even after running the texture optimizer. This is not in Eyefinity/Surround; it's on Ultra settings at 1920x1200 with 8xAF, 0xAA, and a lighting add-on.
I even tried Eyefinity myself and could not get it to run favorably on a 5970 with 4GB total VRAM (2x2GB). On Ultra it would run for about 1-2 minutes and then turn into a slideshow, I'm guessing due to VRAM starvation going across 3x 1920x1200 monitors, so I'm not sure even 3GB would be sufficient.
Skyrim is probably the single most important game for those of us into RPGs and games in this setting, and many of us were waiting on NVIDIA to release something competitive. Since the 680 does not have nearly enough VRAM for us, it ended up being a disappointment.
MSI GeForce GTX 680 2GB GDDR5 492 €
28nm GPU with PCI Express 3.0
1st NVIDIA 28nm GPU.
Supports NVIDIA GPU Boost Technology.
Supports PCI Express 3.0.
Afterburner Overclocking Utility
Supports GPU/memory clock offset and power limit control.
Supports in-game video recording.
Supports wireless control from Android/iOS handheld devices.
Supports built-in DX11 effect test.
3D Vision Surround
Supports 3 displays in full stereoscopic 3D with a single card.
Supports up to 4 displays with a single card.
All Solid Capacitors
10-year ultra-long lifetime (under full load).
Lower temperature and higher efficiency.
Solid aluminum core with no risk of bursting.
Product Specification:
Product Name N680GTX-PM2D2GD5
Model V282
GPU NVIDIA GeForce GTX 680
Codename GK104
CUDA Core 1536 Units
Core Base Clock 1006 MHz
Core Boost Clock 1058 MHz
Memory Clock 3004 MHz
Memory Size 2048MB GDDR5
Memory Bus 256 bits
Output DisplayPort / HDMI / DL-DVI-I / DL-DVI-D
TDP 195 W
Card Dimension 270*111.15*38.75 mm
Form Factor ATX
Technology Support
DirectX 11
OpenGL 4.2
PCI Express 3.0
CUDA Y
SLI Y
PhysX Y
PureVideo HD Y
HDCP Y
Accessory
Driver CD Y
Manual Y
Installation Guide Y
6-pin Power Cable Y, 2
DVI to VGA Dongle Y
Warranty: 2 years.
http://www.pccomponentes.com/msi_gef...2gb_gddr5.html
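A quick sanity check on the spec sheet above (my arithmetic, not MSI's): the quoted 3004 MHz memory clock is the GDDR5 command rate, and the effective data rate is double that, which is how the GTX 680's memory bandwidth figure falls out of the 256-bit bus.

```python
# Derive memory bandwidth from the spec-sheet numbers above.
MEM_CLOCK_MHZ = 3004    # "Memory Clock" from the spec table
BUS_WIDTH_BITS = 256    # "Memory Bus" from the spec table

effective_mtps = MEM_CLOCK_MHZ * 2               # GDDR5: 6008 MT/s effective
bandwidth_gbps = effective_mtps * 1e6 * BUS_WIDTH_BITS / 8 / 1e9
print(f"Memory bandwidth: {bandwidth_gbps:.1f} GB/s")   # ~192.3 GB/s
```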
GTX 680 OC, or an SLI one?
http://www.facebook.com/photo.php?fb...type=3&theater
Thanks to my man kaktus1907
http://forum.beyond3d.com/showpost.p...postcount=3214
Last edited by Olivon; 03-21-2012 at 07:43 AM.
Man from Atlantis(B3D, DH, S|A, 3DC, OCN), MfA(G3D, CH), kaktus1907(XS,TPU,AT) and zennino
SIS 6326 > Ti 4200 > 9800XT > 9800GT > GTX 460
Celeron 366 > Celeron 1700 > Athlon XP 2500+ > E6300 > Q9650
Alice Madness Returns | Assassin's Creed: Brotherhood | Assassin's Creed: Revelations | Batman Arkham City | Battlefield 3 | Bulletstorm | Call of Duty: Modern Warfare 3 | Crysis 2 | Darkness II | Darksiders | Dead Island | Dead Space | Dead Space 2 | Deus Ex: Human Revolution | Dragon Age Origins | Dragon Age 2 | F.3.A.R. | F1 2011 | Half Life 2 | Hard Reset | Kane & Lynch 2 | L.A. Noire | LEGO: Pirates of the Caribbean | LEGO: Star Wars III: The Clone Wars | LOTR: War in the North | Mass Effect | Mass Effect 2 | Mass Effect 3 | Mini Ninjas | NFS Hot Pursuit | RAGE | Renegade Ops | Skyrim | The Witcher 2 | Tomb Raider: Underworld | Transformers: WFC | Trine 2
Wow, indeed.
Some people toss in absurd texture packs with super-high-resolution foliage (whose resolution you never actually see) and the like, which jacks memory usage up very inefficiently. It's doable, but those mods gain you practically nothing over the official high-res pack plus a couple of select general packs and a modded ini.
+ some crazy gridsize levels ^^
Apologies for thinking that the console-optimized textures are, for the most part, complete garbage and not worthy of PC gaming. Yes, they are user-generated and probably not as optimized as they would have been from the developer, but we do not have much else at our disposal currently.
When I first downloaded Skyrim on Steam, I thought the 5GB game file was a joke; even Fallout NV has 30GB of textures.
I then got the high-res pack and play with max textures at 1080p on a 1GB card and still have no problems, but I also think the textures look HORRIBLE. At least let my old GPU choke to death while looking good doing so; instead they just gave us a mediocre update that's supposed to run well on weak hardware and gives nothing to the high-end people.
2500k @ 4900mhz - Asus Maximus IV Gene-Z - Swiftech Apogee LP
GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acteal
Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
XS Build Log for: My Latest Custom Case
2GB is plenty even for 2560x1600 with AA and reasonable Skyrim mods. Yes, you can slam into the wall if you add the tiny-but-bloated mods that make negligible visual gains, but if you leave those out and optimize properly, you're well within limits and it looks 99% as good. Other than that game, I can't think of one that runs into the wall (BF3 doesn't *NEED* more; it will simply consume it if available).
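For a rough feel of what 2560x1600 with AA actually costs in render targets alone, here is a back-of-envelope sketch. Assumptions are mine: 32-bit color, 32-bit depth/stencil, MSAA buffers stored per-sample, one single-sample resolve target, and no driver padding or extra G-buffer passes.

```python
def render_target_bytes(width, height, msaa=1, bpp_color=4, bpp_depth=4):
    """Rough MSAA render-target footprint: per-sample color + depth/stencil
    buffers, plus one single-sample resolve target."""
    samples = width * height * msaa
    return samples * (bpp_color + bpp_depth) + width * height * bpp_color

mib = render_target_bytes(2560, 1600, msaa=4) / 2**20
print(f"~{mib:.0f} MiB of render targets at 2560x1600 4xMSAA")  # ~141 MiB
```

Even under these generous assumptions the framebuffers are a small slice of 2GB; the bulk of VRAM pressure comes from textures, which is why mod choice dominates.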
Radeon HD 7990 (April)
4096 (1D) shader units, 256 TMUs, 64 ROPs, 2x 384-bit DDR interface, 850/2500 MHz, estimated gaming power draw ~350 W, estimated performance ~500%
GeForce GTX 690 (May)
presumably 3072 (1D) shader units, 256 TMUs, 64 ROPs, 2x 256-bit DDR interface, estimated gaming power draw ~330 W, estimated performance 500-530%
GK110 (August)
presumably ~2500 (1D) shader units, 512-bit DDR interface, estimated gaming power draw 250-300 W, estimated performance ~495%
This monster... GK110 only 5% below a dual-GPU card of the same generation? OMG!
http://www.3dcenter.org/news/vermutl...90-aufgetaucht
And how do you know said unofficial texture packs have been properly QA'd before release, ensuring they are properly optimized? People say that many 3rd-party texture/graphics mod packs eat up a ton of memory, but has anyone stopped to think that some (or MOST) of them may be poorly optimized, resulting in an unnecessarily inflated memory footprint?
A few examples of improvements from patches implementing proper in-game texture efficiency have been seen in a number of titles:
- AvP: Large (10%+) increase after the 2nd patch improved texture efficiency
- Wargames: EU Conflict: latest two patches have focused upon texture performance and the result has been ~20% performance increase in my tests.
- Shogun 2: Patch effectively improved across the board performance with a special focus being put upon fixing a memory leak in the texture caching system
I could go on and on. Basically, picking a 3rd-party mod as your benchmark makes for a VERY poor comparison, particularly considering that most of the time an architecture's rendering limits will be reached long before memory comes into effect.
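The "poorly optimized pack" point above can be quantified with some simple texture-footprint arithmetic. Assumptions are mine: uncompressed RGBA8 at 4 bytes/texel, DXT5 at 1 byte/texel, DXT1 at 0.5 bytes/texel, and a full mip chain adding roughly one third on top of the base level.

```python
def texture_mib(size, bytes_per_texel, mipmapped=True):
    """Approximate VRAM footprint of a square texture in MiB."""
    base = size * size * bytes_per_texel
    return base * (4 / 3 if mipmapped else 1) / 2**20

# A single 2048x2048 texture shipped uncompressed vs. block-compressed:
for fmt, bpt in [("RGBA8 uncompressed", 4), ("DXT5", 1), ("DXT1", 0.5)]:
    print(f"2048x2048 {fmt}: {texture_mib(2048, bpt):.1f} MiB")
```

An uncompressed 2048x2048 texture with mips is roughly 21 MiB versus about 5 MiB as DXT5, so a pack that skips compression can inflate the footprint around 4x for near-identical on-screen results.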
While many games will allocate more, they do not *need* more to run at full performance; BF3 and Crysis 2 especially are well known to do this. Skyrim can be pushed past 2GB with excessive modding that offers little visual gain over a more modestly modded install. Others can be pushed there with extreme settings where extra VRAM would make no difference anyway because of GPU performance. Notice the framerates in the few shots you do have that are legitimately over 2GB, and their settings from the filenames, and that's with an overclocked 7970. Unplayable at those numbers, let alone the minimums...
As was said, GPU speed becomes an issue long before VRAM ever does. Some extreme settings can push past what 2GB can run properly, but that's more for show than actual use.
Yeah, there is a reason (though it's not exactly known in any specific case, of course). My guess, as a coder, would be partly caching to smooth over load stutters (pre-caching of assets), partly less frequent garbage collection (to limit performance hits during gameplay) when excess VRAM is available beyond what is needed, and partly extra VRAM being put to actual use. However, this is speculation on my part as to why many titles seem to exhibit this behavior currently.
Unfortunately, there really hasn't been a ton of testing done on brand-new cards regarding this. WSGF had a 1GB vs 2GB article for the GTX 460, and I've seen some 1.5GB vs 3GB GTX 580 tests with negligible differences, including BF3 at 2560x1600, but where the line actually sits is pretty unclear. At this time though, barring Surround setups (and even including some without applying tons of AA, or even with it in non-bleeding-edge titles), it *appears* (and I emphasize that keyword) to me that 2GB is enough.
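The speculation above about caching and deferred cleanup can be sketched as a toy texture cache that only evicts when it would actually exceed its budget, rather than eagerly. This is purely illustrative of why reported VRAM "usage" tends to grow to fill whatever headroom exists without the game *needing* that much; the class and its behavior are my invention, not any engine's real allocator.

```python
from collections import OrderedDict

class LazyTextureCache:
    """Toy LRU cache that keeps assets resident until VRAM is tight."""
    def __init__(self, hard_limit_mib):
        self.hard_limit = hard_limit_mib
        self.used = 0
        self.cache = OrderedDict()   # name -> size in MiB, LRU order

    def request(self, name, size_mib):
        if name in self.cache:           # hit: just refresh LRU position
            self.cache.move_to_end(name)
            return
        # Evict lazily: only when the new asset would exceed the limit.
        while self.used + size_mib > self.hard_limit and self.cache:
            _, freed = self.cache.popitem(last=False)
            self.used -= freed
        self.cache[name] = size_mib
        self.used += size_mib

cache = LazyTextureCache(hard_limit_mib=2048)
for i in range(300):
    cache.request(f"tex{i}", 16)   # 300 x 16 MiB of requests, > 2 GiB total
print(cache.used)                  # sits right at the 2048 MiB budget
```

On a card with more VRAM you'd simply raise the limit and see higher steady-state "usage" with identical behavior below the old limit, which matches the observation that cards report more usage when more is available.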
Also, presumably you want the card to last for a while, so when will 2GB be breached? In 18 months' time?