http://www.guru3d.com/article/geforc...80-sli-review/
http://www.hardocp.com/article/2012/...eo_card_review
http://www.guru3d.com/article/geforc...ay-sli-review/
:up:
http://www.techpowerup.com/163638/Ne...s-Surface.html
Quote:
New GK104 SKU Details Surface
We know from a late-March article that NVIDIA is working on two new SKUs based on its GK104 silicon, slated for launch in May. With the Kepler architecture, particularly the design of the new-generation Streaming Multiprocessors (SMX), NVIDIA substantially increased CUDA core density. Each SMX holds 192 CUDA cores, and as with the previous-generation Fermi architecture, the SMX count is the only lever NVIDIA has to control the CUDA core count of new GPUs. The GeForce GTX 680's little brother will therefore have 7 of its 8 SMX units enabled, ending up with 1344 CUDA cores. That leaves easier-to-configure parameters, such as clock speeds, for NVIDIA to tune the SKU to a price point. NVIDIA is targeting the sub-$399 market while maintaining competitiveness with the Radeon HD 7950.
Specifications of the new SKU follow.
GeForce GTX 670 Ti, by the numbers
4 Graphics Processing Cores (GPCs), 7 Streaming Multiprocessors (SMX)
1344 CUDA cores
112 Texture Memory Units (TMUs), 32 Raster Operation Processors (ROPs)
256-bit wide GDDR5 memory interface
Around 900 MHz base core clock, boost clock and feature availability not known
Around 1000 MHz (5.00 GHz GDDR5 effective) memory clock, around 160 GB/s memory bandwidth
Estimated price US $349-399
The new report reinforces the May launch time-frame.
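For anyone who wants to sanity-check the rumored figures above, a minimal sketch of the arithmetic; all input numbers come from the list, and only the unit conversions are added here.
Code:
# Back-of-the-envelope check of the rumored GTX 670 Ti figures quoted above.

SMX_ENABLED = 7                 # 7 of 8 SMX units enabled
CORES_PER_SMX = 192             # Kepler: 192 CUDA cores per SMX
print(SMX_ENABLED * CORES_PER_SMX)          # 1344 CUDA cores

EFFECTIVE_RATE_GTPS = 5.00      # 5.00 GHz effective GDDR5 data rate
BUS_WIDTH_BITS = 256            # 256-bit memory interface
bytes_per_transfer = BUS_WIDTH_BITS / 8     # 32 bytes moved per transfer
print(EFFECTIVE_RATE_GTPS * bytes_per_transfer)   # 160.0 GB/s memory bandwidth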
palm just went through my forehead lol
Microstuttering happens because of the minuscule differences in frame render times between the two GPUs; it has nothing to do with overall FPS. This occurs with any 2-card setup (including AMD); the reason it improves with the addition of an extra card is that the frame offset becomes smaller when you add a third card to fill in potential gaps.
First, FPS do play a role: at high fps you are usually CPU-bottlenecked, and the offset gradually vanishes because the CPU dictates the frequency of the displayed frames and both GPUs have to wait in an even pattern. At low fps there is usually a GPU bottleneck, and the "deeper" that bottleneck is, the worse the microstuttering.
Second, has it ever been proven that, without a CPU bottleneck, additional GPUs lessen microstuttering? By proven I mean frametime measurements for at least 5 games, no CPU bottleneck, data for both CF and SLI, and full disclosure of GPU and CPU load and graphics settings.
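Purely as an illustration of the kind of measurement being asked for here, a rough sketch; the Fraps-style cumulative-timestamp log format, the file names, and the metric itself are assumptions, not anything from the thread.
Code:
import csv

def frame_deltas(path):
    # Per-frame render times (ms) from a log of one cumulative timestamp per frame.
    with open(path, newline="") as f:
        stamps = [float(row[1]) for row in csv.reader(f)
                  if row and row[0].strip().isdigit()]
    return [b - a for a, b in zip(stamps, stamps[1:])]

def stutter_index(deltas):
    # Worst consecutive-frame jump relative to the average frame time:
    # near 0 means even pacing, larger means more visible jitter.
    avg = sum(deltas) / len(deltas)
    return max(abs(b - a) for a, b in zip(deltas, deltas[1:])) / avg

for log in ("single_gpu.csv", "sli.csv", "crossfire.csv"):   # hypothetical logs
    print(log, round(stutter_index(frame_deltas(log)), 2))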
1. You might be correct; I've never considered going 2-card, so I haven't done the research. My point was simply that you will encounter microstuttering regardless of whether you have ridiculously high FPS.
2. You can look it up yourself, but I'm fairly positive I've seen reviews confirming that a 3-card setup has statistically significantly less microstuttering.
I remember that review. The third card only added a few more percent of performance, but it fixed the microstuttering; it was a 6870X2 + 6870.
Basically I think the idea is to reduce the GPU load to around 80% rather than 100%, so that the card lagging behind can use that extra 20% to keep things on track (see the sketch after this post).
However, I think it would have been a better test to measure the effective framerate with different card counts, because even with microstutter you don't really feel the stutter, you just think the game is running at a much lower framerate than it really is.
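For what it's worth, a toy sketch of the frame-capping idea mentioned above: presenting no faster than a fixed budget leaves headroom that an occasional slow frame can eat into. render_frame and the 60 fps cap are placeholders, not anything from the review being discussed.
Code:
import time

def run_capped(render_frame, cap_fps=60, frames=1000):
    budget = 1.0 / cap_fps                        # per-frame time budget (s)
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                            # stand-in for the real work
        spare = budget - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)                     # headroom absorbs a late frame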
Rage3D came to the same conclusion in their article on microstutter with quad-CrossFire 4870X2s.
Sounds about right. Seems the consensus is that there's no reason to buy a third or fourth card for FPS gains; however, the gameplay experience will be improved by the reduced microstuttering.
1. Yes, mathematically the jitter is still there, but it decreases as you move towards a CPU bottleneck. For example, assuming a constant CPU time of, let's say, 10 ms, you could get a cadence of 10ms-90ms-10ms-90ms (90 ms being the time one GPU takes to render its assigned frame), which is quite noticeable because the deviation from the average frametime is high. At higher fps, say 10ms-15ms-10ms-15ms, it becomes much more even and harder to notice as classical stuttering. It can feel smooth (so no classical stuttering), but the perceived fps are lower than the displayed fps (both cadences are worked through in the sketch below).
2. THG did a very poor test there. It is full of inconsistencies, and diagrams and information are missing. In the scene shown for the frametime graph there was clearly either a bad CF profile (and thus an artificial CPU limit) or a true CPU limit. Either way, the analysis was far from conclusive and no other proof was shown.
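To put numbers on point 1, a quick sketch; the "perceived" fps figure is simply the pessimistic view that each pair of frames is paced by its slower frame, which is one common way of quantifying AFR microstutter, not an exact model.
Code:
def fps_figures(cadence_ms):
    avg = sum(cadence_ms) / len(cadence_ms)
    displayed = 1000.0 / avg                  # what an fps counter reports
    perceived = 1000.0 / max(cadence_ms)      # paced by the slowest frame
    return round(displayed), round(perceived)

print(fps_figures([10, 90]))   # (20, 11)  -> large gap, clearly noticeable
print(fps_figures([10, 15]))   # (80, 67)  -> small gap, feels almost smooth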
There is microstutter with single card setups as well...
I have yet to encounter it. This is a whole different category, as AFR stuttering potentially concerns every game while the one you mention probably only occurs in select scenarios.
Not everyone cares about eyefinity or surround. I'll stick with my 120hz 3d ready display.
Not this again. :rolleyes:
The definition of microstutter is apparently very loose for some people. I had never heard that term used before the discussions of the AFR-related microstutter that we have all heard about and/or seen first hand. It's impossible to see that phenomenon on a single GPU.
Microstutter actually DOES occur on all GPUs, be it a single card or more. The definition simply comes from how rendering works: "uneven frame times within a second", meaning the frames in "frames per second" are not rendered at even spacing. A single GPU does not render every frame of a given second at exactly the same rate, since some frames are more demanding than others. However, the effect is typically described as becoming more visible with 2+ GPUs, because mixing their timings together exacerbates the unevenness. The reality is that even with dual-card setups it is basically unnoticeable to most people; some notice it if they look for it because they have read about it (a handful), and a minute number actually notice it for real (non-placebo) while playing. Most people confuse things like load hitching with microstutter :rolleyes:.
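One way to quantify "uneven frame times within a second" as described above: group frame times into 1-second buckets and measure how far they deviate from perfectly even spacing. The bucketing scheme and the example frame times below are assumptions for illustration only.
Code:
from statistics import mean, pstdev

def per_second_unevenness(frame_times_ms):
    # Coefficient of variation per 1-second bucket:
    # 0.0 = perfectly even pacing, larger = more visible microstutter.
    buckets, t = {}, 0.0
    for ft in frame_times_ms:
        buckets.setdefault(int(t // 1000), []).append(ft)
        t += ft
    return [round(pstdev(b) / mean(b), 3) for b in buckets.values() if len(b) > 1]

print(per_second_unevenness([16, 17, 16, 18, 16, 17] * 10))  # single GPU: ~0.05
print(per_second_unevenness([10, 30, 10, 30, 10, 30] * 10))  # AFR pair: ~0.5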
I'll stick with my S-IPS 2560x1600 30" LCD myself... call me when consoles are doing that UrbanSmooth! :rofl:
In all seriousness, even 1080p is better than what consoles render virtually any title at; only a tiny handful of them even render at 720p, and most of the good ones are sub-720p and upscaled. 2560x1600 is nearly double the pixels of 1080p, so... yeah. Also, multiple cruddy TN displays are not what most people with the cash for this hardware go for ;).
Most people don't see the blatant stutter in Fallout 3, Oblivion, and New Vegas that appears on any 60hz display. I can see it clear as day and didn't need anyone to tell me about it.
You post on [H], do you remember those HP LP2465 refurbs that were popping up in the hot deals section for a while? The input lag was horrific on that display, yet most people didn't notice it.
When I played Crysis on a GTX 280 and then moved to a 4870X2, and despite the higher reported framerate the game felt no smoother, that was not difficult to notice. I noticed crap like that in a few games and ended up selling the card in large part because of it. I didn't even know it was microstutter at the time.
I've seen microstutter with 6950s in a couple of games. It really stood out in Metro in a lot of spots, and I saw it in Stalker: Clear Sky clear as day. The choppy feeling, despite the reported framerate, is very annoying at times and can show up at 40 or so fps. I've never seen noticeably uneven frametimes causing choppy performance at any framerate I would consider playable on a single GPU, and I've never seen any research showing this to be a problem in real-world usage with a single GPU. I'm sorry, but unlike with SLI and CrossFire, this isn't a well-documented problem on a single GPU.
I'm not saying that microstutter will stop me from using SLI or CrossFire in the future, since I've found both solutions to work really well most of the time, but I wouldn't buy two mid-range cards in an attempt to get the performance of one high-end card. Microstutter was a massive problem with 4870 CrossFire, but it does appear to be less of an issue these days. That's not to say it's non-existent.
I really want one of those Dell 30" full resolution LEDs; if they weren't so god damn expensive I'd have upgraded my 24" LCD years ago.
I wonder if that is due more to poor game coding than anything else. Fallout 3 in particular seemed to have a lot of glitches when I was playing it; if for no other reason, the game was massive and they probably just couldn't get around to patching everything in a reasonable amount of time. I suppose by that logic we could say that better drivers would entirely eliminate microstuttering, but realistically, whenever we're dealing with devices running in parallel it will be nearly impossible to get them perfectly aligned on the microsecond scale, short of having optical connections to remove all latency.
It is a bug with a quick workaround. From what I gather it has something to do with the game natively running at 64 Hz, which ends up causing an odd stutter. Divinity 2 had this issue as well before the expansion and the DKS patch were released. The reason I bring it up is that it is a blatant stutter that no one ever had to point out to me, yet it seems like most people don't notice it. I was one of the few people looking for a way to cap the framerate when New Vegas was released, since that plus an ini edit will fix the issue.
This is what microshutter looks like. This is on a single card setup, it just so happens that in general such shuttering is more likely to occur in sli/crossfire because of the way AFR works.
correct terminology is microstutter btw, not microshutter.
Some of us refuse to deal with bezel lines. The ONLY way I would do triple is with 3 projectors, so that the picture is seamless. Even then, the times I've tried triple monitors I've just found myself thinking "what's the point?". Triple monitor setups just make you hate your eyes, because those bezels ruin the entire experience (trust me, I've seen it).
1 huge monitor > 3 monitors any day of the week.