-=The Gamer=-
MSI Z68A-GD65 (G3) | i5 2500k @ 4.5Ghz | 1.3875V | 28C Idle / 65C Load (LinX)
8Gig G.Skill Ripjaw PC3-12800 9-9-9-24 @ 1600Mhz w/ 1.5V | TR Ultra eXtreme 120 w/ 2 Fans
Sapphire 7950 VaporX 1150/1500 w/ 1.2V/1.5V | 32C Idle / 64C Load | 2x 128Gig Crucial M4 SSD's
BitFenix Shinobi Window Case | SilverStone DA750 | Dell 2405FPW 24" Screen
-=The Server=-
Synology DS1511+ | Dual Core 1.8Ghz CPU | 30C Idle / 38C Load
3 Gig PC2-6400 | 3x Samsung F4 2TB Raid5 | 2x Samsung F4 2TB
Heat
Yes, that's exactly what I tried to explain. Look, I was saying that the minimum and maximum are all over the place, but the average is displayed smoothly. Sorry, I forgot to mention I'm using Crysis' built-in framerate counter, which shows min, average and max all at the same time! Console command: r_displayinfo 1
That's how I first noticed it, just before Crysis was released and BEFORE this microstuttering issue was widely observed. I was looking at that built-in counter, or FRAPS, and realized I'm not getting anywhere near what they're saying, but possibly what that minimum fps counter shows. At least it feels just as low as it shows. When I use CF in Crysis I get about 10fps min, 40 avg, and even over 100fps max, and all the counters (min, avg and max) are all over the place (the average is somewhat steady, though, as it should be). But it is surely not 40fps. It feels like 10fps at times (looking at that smoking red flare at the beach, for example, and moving the mouse of course).
The weird thing is that in Oblivion, when using AA+AF and texture packs, the built-in fps counter sometimes shows as much as 55fps in heavily forested areas but it feels like 15. This could be the same thing as in your video: it's jumpy, with sudden stops. But when the fps goes over 60 and stays there, it's gone, instantly! I had to give up on 4xAA+16xAF just because of that. 2x edge-detect + 4xAF avoids it fairly well.
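If someone wants to put numbers on that feeling instead of eyeballing the counter, here's a minimal sketch, assuming a FRAPS-style frametimes log (one cumulative timestamp in milliseconds per line; the file name is just an example). It shows how the average can look healthy while the slowest frames match what the game actually feels like:

# Sketch only, not a real tool: reads a FRAPS-style frametimes log
# ("frametimes.csv" is a made-up name), one cumulative timestamp in ms per line.
def stutter_report(path="frametimes.csv"):
    with open(path) as f:
        stamps = [float(line) for line in f if line.strip()]
    deltas = [b - a for a, b in zip(stamps, stamps[1:])]  # per-frame gaps in ms
    avg_fps = 1000.0 * len(deltas) / (stamps[-1] - stamps[0])
    min_fps = 1000.0 / max(deltas)  # the single slowest frame is what you feel
    max_fps = 1000.0 / min(deltas)
    print("avg %.1f fps, min %.1f fps, max %.1f fps" % (avg_fps, min_fps, max_fps))

stutter_report()

A log with a 40fps average but 100ms spikes every few frames will report a min around 10fps, which is exactly the mismatch described above.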
Why did GRID suddenly become the microstuttering benchmark tool?
I'm wondering the same. I haven't really noticed it in GRID; actually, that's the most stutter-free game I have right now. That might be because the fps is pretty much always over 60 now when CrossFire is actually working. Has anyone else noticed this, and can explain why microstuttering is harder (or impossible) to notice when the fps goes over 60? Is it because even when frames are rendered completely out of time (i.e., both cards deliver a frame almost at the same time), there are still 30fps left, which can be considered smooth?
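To put rough numbers on that intuition, here's a sketch of worst-case AFR frame pacing (the figures are made up for illustration), where the two cards deliver their frames almost back to back and the long gap is what you perceive:

# Worst-case alternate-frame-rendering pacing: frame gaps alternate short/long.
# pairing=0 means perfectly even spacing; pairing=1 means both frames arrive together.
def effective_fps(counter_fps, pairing=0.9):
    pair_ms = 2 * 1000.0 / counter_fps        # time covered by one frame pair
    long_gap = (pair_ms / 2) * (1 + pairing)  # the gap you actually perceive
    return 1000.0 / long_gap

for fps in (30, 60, 120):
    print("counter says %d fps -> feels like ~%.0f fps" % (fps, effective_fps(fps)))

With these made-up numbers, a 60fps counter reading feels like roughly 32fps, while 120fps still leaves about 63fps even in the worst case, which would explain why everything over 60 feels smooth.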
Because GRID's large textures are memory-intensive, as are a lot of newer titles (as most DX10 titles will be), and it highlights one of the biggest causes of "stutter": using memory links for inter-GPU communication. It also highlights issues with running physics simulations on the GPU in the same respect.
In the end, if RV770 is not prone to these issues, it highlights the benefit of the new sideport vs. other technologies, which would be a design win for AMD. How GPUs communicate with each other in rendering tasks is essential for the future, as silicon process problems and limits come into play. Although many are still denying it, multi-core processing is coming whether we like it or not, and if your cores are fantastic but their communication sucks, your cores suck too.
FPS seems smooth over 60 because any frames over 60 get discarded anyway (not really, but almost), so this is what is ideal for LCDs. But as the "refresh rate" of LCDs increases, performance within those margins becomes more important as well: 120Hz LCDs mean 60fps is no longer enough for that "smooth" experience.
There are times, however, when even this is not enough fps because of other issues already mentioned in this thread. At that point drivers can fix most problems, but this requires work for each individual app, since the driver must schedule the workload properly.
Understanding the importance of workload balancing also makes me aware of how unimportant this Hydra tech really is, as GPUs already do this anyway. The only benefit would be driving different display devices, but even then issues come up.
This is why this is so important now, yet was put on the back burner in the past: we are at a critical stage in this regard, and LCD tech cannot make the jump if the display devices cannot keep up.
First answer from AMD:
The 2nd generation PCI Express bridge design of course means that the PLX chip has been upgraded to support the PCI Express 2.0 standard.
"Next generation GPU interconnect for improved scaling" means more bandwidth for data transfers, both between the cards and to and from system memory.
The bandwidth between the two GPUs has been bumped from 6 GB/s to 20 GB/s (according to the HardOCP preview).
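For reference, my own back-of-the-envelope PCIe math (not from AMD, so take it as speculation): PCIe 2.0 doubles the per-lane rate from 250 MB/s to 500 MB/s per direction, so a plain x16 link tops out at 16 GB/s bidirectional. If the 20 GB/s figure is right, the extra would have to come from the new interconnect on top of the bridge.

# Back-of-the-envelope PCIe link bandwidth (effective rate after 8b/10b encoding).
GEN_RATE_MB_PER_LANE = {1: 250, 2: 500}  # MB/s per lane, per direction

def link_gb_s(gen, lanes=16):
    per_direction = GEN_RATE_MB_PER_LANE[gen] * lanes / 1000.0
    return per_direction, per_direction * 2  # (one way, bidirectional)

for gen in (1, 2):
    one_way, both = link_gb_s(gen)
    print("PCIe %d.0 x16: %.0f GB/s per direction, %.0f GB/s bidirectional" % (gen, one_way, both))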
Waiting for more information!
Favourite game: 3DMark
Work: Muropaketti.com - Finnish hardware site
Views and opinions about IT industry: Twitter: sampsa_kurri
Could it be threading issues in GRID?
As I understand it, there are a lot of reasons why a game will "stutter" or lag.
One scenario could be threading problems.
Bottlenecks slow down performance. A bottleneck is created when something is overloaded with too much work and cannot keep up with the task.
Here are some thoughts about C2Q and threading.
A C2Q is two C2Ds without internal communication; they communicate through the FSB. Intel's design also tries to use the FSB as little as possible, because it has latency issues (it is where the computer is at its slowest). For a C2D or C2Q to be fast, it is vital that the cache is used well, which means the hit rate needs to be very high.
Take the C2Q, which contains two C2Ds (I'll call them A and B here). One thread is placed on a core in C2D-A and another thread on a core in C2D-B. If the thread on C2D-A is then moved to the other core on C2D-B, it has to re-fetch all the data that was stored in C2D-A's L2 cache, and all of that data has to travel over the FSB, which isn't that fast (high latency). Once the thread has moved, the cache hit rate drops until the data has been re-fetched. That FSB traffic also has to share the bus with I/O and graphics, which may slow things down further; maybe the switching in the Northbridge takes extra time too. So for a fraction(?) of a second the C2Q is slowed down, until the cache has been refilled and the FSB gets back up to speed because there is no longer a queue of data waiting to be sent or retrieved.
If this is the case, then microstuttering (or FSB-stuttering) could be a problem related to games that scale to a lot of cores and use a fair amount of memory.
All processors are more or less sensitive to threads being switched to other cores (it depends, of course, on how much memory they are using). I think Vista is NUMA-aware, and Phenom supports it. NUMA is a technique that adds some intelligence about where threads are placed in order to optimize memory latency. I know some say NUMA brings no advantage, but the performance gain is very hard to measure, because systems without NUMA aren't often hurt by issues like the one described here. Still, this microstuttering (or I/O problem, in this case) could be something that NUMA solves. I think Nehalem is going to have NUMA, and I don't think Intel put it there for fun.
Also, neither Nehalem nor Phenom is as sensitive to un-optimized threading as C2D and C2Q.
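If someone wants to actually test this, here's a minimal sketch using the third-party psutil module (the process name is just an example): pin a running game to two cores on the same C2D die, so its threads can never migrate across the FSB. If the stutter eases, that's a point for the theory.

# Sketch only: pin a game to two cores that share one die/L2 so threads
# can't migrate across the FSB. Uses the third-party psutil module.
# Core numbering varies by board and OS; check which pair shares an L2 cache.
import psutil

def pin_to_one_die(exe_name, cores=(0, 1)):
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == exe_name.lower():
            proc.cpu_affinity(list(cores))  # same effect as Task Manager's "Set Affinity"
            print("pinned %s (pid %d) to cores %s" % (exe_name, proc.pid, cores))

pin_to_one_die("GRID.exe")  # example process name

You can do the same thing by hand with Task Manager's "Set Affinity", no script needed.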
So, Sampsa, are you ready to start building another test rig around a Phenom?
I think we need some proof that it's a threading problem before all CrossFire users switch to Intel-free rigs.
And more efficient drivers.
A lot of the problems I am seeing stem from the different testing operating systems. Some use XP, some Vista, some XP 64, and others Vista 64: four completely different sets of drivers, with different kernels and different everything. Do I need to list the differences?
And one of the problems I am noticing is that people are posting without reading...
PS. At XS, I think it should become mandatory when posting a review link to list the operating system used in the title. Otherwise that review cannot be compared to any other with any degree of confidence.
My watercooling experience
Water
Scythe Gentle Typhoons 120mm 1850RPM
Thermochill PA120.3 Radiator
Enzotech Sapphire Rev.A CPU Block
Laing DDC 3.2
XSPC Dual Pump Reservoir
Primochill Pro LRT Red 1/2"
Bitspower fittings + water temp sensor
Rig
E8400 | 4GB HyperX PC8500 | Corsair HX620W | ATI HD4870 512MB
I see what I see, and you see what you see. I can't make you see what I see, but I can tell you what I see is not what you see. Truth is, we see what we want to see, and what we want to see is what those around us see. And what we don't see is... well, conspiracies.
Got any proof of this, or is it just speculation because you prefer AMD?
Anyone out there with a Phenom and a 3870X2?
And what does this have to do with the topic? It seems like it came completely out of left field. Is this the whole "slower but smoother" concept?
And also at http://www.hardforum.com/showthread.php?p=1032761583
For such an untested theory, it sure is being plastered everywhere.
Though his comments were a tad off topic, there is no reason to go down the aggressive route.
Until the card is officially released and tested, all we are doing is speculating and there are many forms of speculation, none of which require your approval before they can be posted.
If you don't like a post, respond to it constructively or don't respond to it at all.
I, for one, think the CrossFire scaling issues in Crysis are a separate issue, seeing as the 4870X2 in CrossFire scales pretty well in some games but not in others, meaning it's probably an application-based issue.
Perkam
It's frustrating. AMD are like the proverbial willy (pee pee) teasers: they tempt us with this newfound 2-teraflop powerhouse, show us just how much power it brings to a lot of current games, yet make us wait... for nearly a month.
We know so little about the R700, yet also so much...
Roll on the next set of (p)reviews with more mature drivers and board revisions!
John
Stop looking at the walls, look out the window
Any other games besides GRID?
Crysis, CoD4, UT3 maybe?
Reading VR-Zone's review, they say microstuttering still exists.
http://www.vr-zone.com/articles/Grap...0/5935-14.html
"ATI's drivers still needs some polishing for CrossFire setups. During the course of our testing, we noticed strange and random artifacts appearing in our Unreal Tournament 3 benchmark, as well as micro-stutters which occur very randomly as well in other softwares."
As much as speculation doesn't require my approval, my posting whether I approve or not doesn't need yours.
It's one thing to speculate. It's another when it's obviously biased, and then cut-and-pasted into every microstutter thread he can find. The problem with it is that a hypothesis is usually based on limited observation and then tested further. This one has so far shown absolutely no shred of truth to it at all.
For one, it doesn't explain why there isn't microstutter on single-GPU solutions. It doesn't explain microstutter on C2D. And it doesn't explain why there is so far no microstutter on the 4870: not only the X2, but plain CF isn't showing it so far either. Sure, there isn't enough testing yet, but more has been done on just the X2 than on his 'theory'.