http://www.techpowerup.com/163638/Ne...s-Surface.html

New GK104 SKU Details Surface
We know from a late-March article that NVIDIA is working on two new SKUs based on its GK104 silicon, for launch in May. With the Kepler architecture, particularly the design of the new-generation streaming multiprocessors (SMX), NVIDIA substantially increased CUDA core density. Each SMX holds 192 CUDA cores, and as with the previous-generation Fermi architecture, the SMX count is the only lever NVIDIA can pull to control CUDA core count in new GPUs. The GeForce GTX 680's little brother will therefore have 7 of its 8 SMX units enabled, ending up with a CUDA core count of 1344. This leaves more easily configured parameters, such as clock speeds, for NVIDIA to tune the perfect SKU to capture a price point. NVIDIA is targeting the sub-$399 market, while somehow maintaining competitiveness with the Radeon HD 7950.
Specifications of the new SKU follow.
GeForce GTX 670 Ti, by the numbers
4 Graphics Processing Clusters (GPCs), 7 Streaming Multiprocessors (SMX)
1344 CUDA cores
112 Texture Memory Units (TMUs), 32 Raster Operation Processors (ROPs)
256-bit wide GDDR5 memory interface
Around 900 MHz base core clock, boost clock and feature availability not known
Around 1000 MHz (5.00 GHz GDDR5 effective) memory clock, around 160 GB/s memory bandwidth
Estimated price US $349-399
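For anyone checking the numbers, the "around 160 GB/s" figure follows directly from the listed bus width and effective memory clock. A quick sketch (the function name and units are mine, not from the report):

```python
# Sketch: deriving the rumored memory bandwidth from the specs above.
# Assumes the listed 256-bit bus and 5.00 GHz effective GDDR5 clock.

def gddr5_bandwidth_gbps(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8  # a 256-bit bus moves 32 bytes per transfer
    return bytes_per_transfer * effective_clock_ghz  # bytes * GHz = GB/s

print(gddr5_bandwidth_gbps(256, 5.00))  # 160.0, matching the "around 160 GB/s" figure
```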
The new report reinforces the May launch time-frame.
Fractal Arc Midi
Asus P8Z77V-PRO
Intel Core i7-3770K
Corsair Hydro H80
8GB G.Skill Ripjaws-X 2133 MHz
eVGA GTX670 SC 2GB
Corsair AX-850W Gold
GPU: 4-Way SLI GTX Titan's (1202 MHz Core / 3724 MHz Mem) with EK water blocks and back-plates
CPU: 3960X - 5.2 GHz with Koolance 380i water block
MB: ASUS Rampage IV Extreme with EK full board water block
RAM: 16 GB 2400 MHz Team Group with Bitspower water blocks
DISPLAY: 3x 120Hz Portrait Perfect Motion Clarity 2D Lightboost Surround
SOUND: Asus Xonar Essence -One- USB DAC/AMP
PSU: EVGA SuperNOVA NEX1500
SSD: Raid 0 - Samsung 840 Pro's
BUILD THREAD: http://hardforum.com/showthread.php?t=1751610
my palm just went through my forehead lol
Microstuttering happens because of the minuscule differences in frame times between the two GPUs; it has nothing to do with overall FPS. This occurs with any two-card setup (including AMD's); the reason it improves with the addition of an extra card is that the frame offset becomes smaller when a third GPU fills in the potential gaps.
First, FPS do play a role: with high fps you are usually CPU-bottlenecked, and then the offset slowly vanishes, because the CPU dictates the frequency of the displayed frames and both GPUs have to wait in an even pattern. With low fps, there is usually a GPU bottleneck, and the "deeper" this bottleneck is, the worse the microstuttering.
Second, has it ever been proven that without a CPU bottleneck additional GPUs lessen microstuttering? By proven I mean frametime measurements for at least 5 games, no CPU bottleneck, data for both CF and SLI and complete disclosure of GPU and CPU load and graphic settings.
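The CPU-vs-GPU-bottleneck argument above can be sketched as a toy model. The frame-time numbers below are made-up illustrations of the two cases, not measurements from any review:

```python
# Toy model of the cadences described above: under a CPU bottleneck the CPU
# paces frame delivery evenly; under a deep GPU bottleneck AFR frame times
# alternate between short and long. All numbers are illustrative, not measured.

def frametime_stats(frametimes_ms):
    """Return (average frame time, worst deviation from that average) in ms."""
    avg = sum(frametimes_ms) / len(frametimes_ms)
    max_dev = max(abs(t - avg) for t in frametimes_ms)
    return avg, max_dev

cpu_bound = [16.7] * 8                         # CPU dictates an even cadence
gpu_bound = [10, 90, 10, 90, 10, 90, 10, 90]   # deep GPU bottleneck: AFR alternation

for name, times in (("cpu_bound", cpu_bound), ("gpu_bound", gpu_bound)):
    avg, dev = frametime_stats(times)
    print(f"{name}: avg {avg:.1f} ms, worst deviation {dev:.1f} ms")
```

The even cadence has zero deviation; the alternating one deviates by 40 ms around a 50 ms average, which is the jitter being argued about.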
Last edited by boxleitnerb; 04-05-2012 at 02:24 PM.
1. You might be correct; I haven't ever considered going two-card, so I haven't done my research. My point was entirely that you will encounter microstuttering regardless of whether you have ridiculously high FPS.
2. You can look it up for yourself, but I'm pretty positive I've seen reviews confirming that a three-card setup has statistically significantly less microstuttering.
I remember that review. The third card only added a few more percent of performance, but fixed the microstuttering; it was a 6870X2 + 6870.
Basically, I think the idea is to reduce the GPU load to around 80% rather than 100%, so that the GPU lagging behind can use up that extra 20% to keep things on track.
However, I think it would have been a better test to measure the effective framerate with different card counts, because even with microstutter you don't really feel the stutter; you just think the game is running at a much lower framerate than it really is.
2500K @ 4900 MHz - Asus Maximus IV Gene-Z - Swiftech Apogee LP
GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acteal
Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
XS Build Log for: My Latest Custom Case
Rage3D came to the same conclusion in their article on microstutter with quad-fire using 4870X2s.
\Project\ Triple Surround Fury
Case: Mountain Mods Ascension (modded)
CPU: i7 920 @ 4GHz + EK Supreme HF (plate #1)
GPU: GTX 670 3-Way SLI + XSPC Razor GTX670 water blocks
Mobo: ASUS Rampage III Extreme + EK FB R3E water block
RAM: 3x 2GB Mushkin Enhanced Ridgeback DDR3 @ 6-8-6-24 1T
SSD: Crucial M4 256GB, 0309 firmware
PSU: 2x Corsair HX1000s on separate circuits
LCD: 3x ASUS VW266H 26" Nvidia Surround @ 6030 x 1200
OS: Windows 7 64-bit Home Premium
Games: AoE II: HD, BF4, MKKE, MW2 via FourDeltaOne (Domination all day!)
1. Yes, mathematically the jitter is still there; however, it decreases as you move towards a CPU bottleneck. For example, if you assume a constant CPU time of, let's say, 10 ms, you could get a cadence of 10ms-90ms-10ms-90ms (90 ms being the time it takes one GPU to render its assigned frame), which is pretty noticeable, as the deviation from the average frametime is high. At higher fps, so maybe 10ms-15ms-10ms-15ms, it becomes much more even and harder to notice as classical stuttering. It can feel smooth (so no classical stuttering), but the perceived fps are lower than the displayed fps.
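The cadence numbers in point 1 can be worked through in a few lines. "Felt" fps here is my own simple heuristic (the worst single gap between frames), not an established metric:

```python
# Working through the 10-90 ms and 10-15 ms cadences from the post.
# "Felt" fps is a rough heuristic: the eye notices the longest gap between
# frames, so felt fps ~ 1000 / worst frame time. Illustrative only.

def displayed_fps(frametimes_ms):
    """Frames delivered per second of wall time, as an fps counter would report."""
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def felt_fps(frametimes_ms):
    """Rough perceived fps, dominated by the single worst frame gap."""
    return 1000 / max(frametimes_ms)

low_fps_cadence = [10, 90] * 4    # the 10ms-90ms example from the post
high_fps_cadence = [10, 15] * 4   # the 10ms-15ms example from the post

print(displayed_fps(low_fps_cadence), felt_fps(low_fps_cadence))    # 20 fps shown, ~11 felt
print(displayed_fps(high_fps_cadence), felt_fps(high_fps_cadence))  # 80 fps shown, ~67 felt
```

The 10-90 cadence "displays" 20 fps but feels closer to 11, while the 10-15 cadence only loses a little, which matches the post's claim that the effect fades as frame times even out.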
2. THG did a very poor test here. It is full of inconsistencies, and diagrams and information are missing. In the scene shown for the frametime graph, there was clearly either a bad CF profile (thus an artificial CPU limit) or a true CPU limit. Either way, the analysis was far from conclusive, and no other proof was shown.
There is microstutter with single card setups as well...
I have yet to encounter it. This is a whole different category, as AFR stuttering potentially concerns every game while the one you mention probably only occurs in select scenarios.
Not everyone cares about eyefinity or surround. I'll stick with my 120hz 3d ready display.
Not this again.
The definition of microstutter is apparently very loose for some people. I never heard that term used before discussions of the AFR-related microstutter that we have all heard about and/or seen first hand. It's impossible to see that phenomenon on a single GPU.
Microstutter actually DOES occur on all GPUs, be it a single card or more. The definition of microstutter simply stems from typical rendering: "uneven frame times within a second", which means that the frames in "frames per second" are not being rendered at an even spacing. A single GPU does not render every frame of every second at exactly the same rate, as some frames are more demanding than others. However, it typically becomes more visible with 2+ GPUs, as mixing together even more uneven frame times exacerbates the effect. The reality is that even with dual-card setups it is basically unnoticeable to most people; some notice it if they look for it because they had read about it (a handful), and a minute number actually notice it for real (non-placebo) while playing. Most people confuse things like load hitching for microstutter.
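One common way to put a number on "uneven frame times within a second" is the coefficient of variation of the frame times. The sample data and the single-vs-AFR labels below are illustrative assumptions, not measurements:

```python
# Quantifying "uneven frame times within a second" as described above:
# standard deviation of frame times relative to their mean (coefficient of
# variation). 0 means perfectly even pacing. Sample frame times are made up.
import statistics

def microstutter_index(frametimes_ms):
    """Std-dev of frame times divided by their mean; higher = more uneven pacing."""
    return statistics.pstdev(frametimes_ms) / statistics.mean(frametimes_ms)

single_gpu = [16, 17, 16, 18, 16, 17]   # small natural variation on one card
afr_pair   = [8, 25, 9, 24, 8, 26]      # alternating short/long AFR pattern

print(f"single GPU: {microstutter_index(single_gpu):.2f}")
print(f"AFR pair:   {microstutter_index(afr_pair):.2f}")
```

Both sequences average the same fps, but the alternating pattern scores roughly ten times higher, which is the "exacerbated" unevenness the post describes.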
I'll stick with my S-IPS 2560x1600 30" LCD myself... call me when consoles are doing that UrbanSmooth!
In all seriousness, even 1080p is better than what consoles render virtually any title at; only a tiny handful of them even render at 720p, and most of the good ones are sub-720p and upscaled. 2560x1600 is more than double the pixels of 1080p, so... yeah. Also, multiple cruddy TN displays are not what most people with the cash to afford this hardware go for.
Last edited by GoldenTiger; 04-06-2012 at 01:40 PM.
Most people don't see the blatant stutter in Fallout 3, Oblivion, and New Vegas that appears on any 60hz display. I can see it clear as day and didn't need anyone to tell me about it.
You post on [H], do you remember those HP LP2465 refurbs that were popping up in the hot deals section for a while? The input lag was horrific on that display, yet most people didn't notice it.
When I played Crysis with a GTX 280 and then moved to a 4870X2, and despite the higher actual framerate the game felt no smoother, that was not difficult to notice. I noticed crap like that in a few games and ended up selling the card in large part because of it. I didn't even know that was microstutter at the time.
I've seen microstutter with 6950s in a couple of games. It really stood out in Metro in a lot of spots. I saw it in Stalker: Clear Sky, clear as day. The choppy-feeling framerate, despite decent fps numbers, is very annoying at times and can be seen at 40 or so fps. I've never seen noticeable uneven frametimes resulting in choppy performance at any framerate that I would consider playable on a single GPU. I've never seen any research that shows this to be a problem in real-world usage with a single GPU. I'm sorry, but unlike with SLI and CrossFire, this isn't a well-documented problem with a single GPU.
I'm not saying that microstutter will stop me from using SLI or CrossFire in the future, since I've found both solutions to work really well most of the time, but I wouldn't buy two mid-range cards in an attempt to get the performance of a high-end card. Microstutter was a massive problem with 4870 CrossFire, but it does appear to be less of an issue these days. That's not to say that it's non-existent.
I really want one of those Dell 30" full resolution LEDs; if they weren't so god damn expensive I'd have upgraded my 24" LCD years ago.
I wonder if that is more due to poor game coding than anything else. Fallout 3 in particular seemed to have a lot of glitches when I was playing it; if for no other reason, the game was massive and they probably just couldn't get around to patching everything in a reasonable amount of time. I suppose by that logic we could just say that better drivers would entirely eliminate microstuttering, but realistically, whenever we're dealing with devices in parallel, it'll be near impossible to get them perfectly aligned on the microsecond scale short of having optical connections to remove all latency.
It is a bug with a quick workaround. From what I gather, it has something to do with the game natively running at 64 Hz, which ends up causing an odd stutter. Divinity 2 had this issue as well before the expansion and the DKS patch were released. The reason that I bring it up is that it's just a blatant stutter that no one ever had to tell me about, and it seems like most people don't notice it. I was one of the few people looking for a way to cap the framerate when New Vegas was released, since that plus an ini edit will fix the issue.
This is what microshutter looks like. This is on a single card setup, it just so happens that in general such shuttering is more likely to occur in sli/crossfire because of the way AFR works.
correct terminology is microstutter btw, not microshutter.
All along the watchtower the watchmen watch the eternal return.
Some of us refuse to deal with bezel-lines. ONLY way I would do triple is with 3 projectors so that picture is seamless. Even then, the times I've tried out triple monitors I've just found myself thinking "what's the point?". Triple monitor would just cause you to really hate your eyes because those bezels ruin the entire experience (trust me, I've seen it).
1 huge monitor > 3 monitors any day of the week.