You were in uni and I was in Grade 6 in 1999!
oh hang on, 1999...oh..no, I was working by then; no time for games, too busy; I finished uni in '95.
AdvancedMicroStutterDevices
Today announced the new Phenom processor. Hector said this pushes new boundaries, providing more Advanced Microstutter per clock than the competing Intel Core2 Quad cores.
Now that ATi (AMD's Graphics arm) also produce Advanced Microstutter Ready cards we have the perfect solution for gamers wanting their Microstutter fix.
In other news, nVidia's plan to counter Microstutter with the GX2 and 795S (S representing Stutter) looks quite powerful.
All eyes turn to Intel for their lack of response (rumours suggest they are perfecting microstutter in Ray Tracing).
Joking aside, it would be amazing if nVidia and ATi offered an option in the drivers to reduce stutter at a compromise of 10% (at most) in performance; it would certainly make Crossfire/SLi worthwhile if you could have the fluidity and consistency of a single GPU on a multi-GPU setup.
Ahh adamsleath
Glide kicked serious buttocks back in the day. I remember the original Unreal and UT on a Voodoo2 and Voodoo3. Wow.
Oh and Warzone 2100 as well as Need For Speed !
John
ehh, you'd be surprised at how many people here think that Phenoms offer smoother performance compared to Conroe (even though it is slower)
:rofl:Quote:
795S (S representing Stutter)
but geez, I was playing UT with an onboard ATI vid card on a P2...at a LAN party (i.e. a room full of geeks)
and it was fun, as well as Team Fortress of course :D back around then.
and I didn't even know what FPS was, or lag for that matter.
...and then everyone became a Counter-Strike junkie...but UT and TF were more fun; CS, waiting for the friggin' respawn, no thanks.
Well, 2 years of planning, developing and designing this chip with an R700 card in mind would have to count for something, is my guess.
Something else that's odd is that the R700 card is shorter than the R680 card, even though it probably consumes a bit more power. This could be done by doing away with the bridge chip or putting a smaller one there. If there is a bridge chip on there, then it sure as hell is not the same bridge chip as the PLX chip on R680.
The last few pages are epic lol, either really funny or really pitiful :D
It's very disappointing to see people exaggerating that microstuttering thing. Everyone read one small article and for some reason all started noticing it...even those who never used multi-GPU solutions :rolleyes:
I noticed the microstutter on my 9800GX2, and in most cases I could play through it, but it would get worse if I ran a 3D app in a window. Good thing nVidia released the GTX 280 so I could step up.
I don't mind others being concerned about it; it might mean I could get a cheaper 4870X2 if people still think it's not good enough for them :p:. I don't have a multi-GPU set-up, and I would be better off getting a faster single card than an extra GPU atm (using a 7800GT :shrug:). If I ever had the money and the need for more graphical horsepower, then I might decide on getting a CFX set-up, as I couldn't care less about that microstuttering.
I could see how that argument could hold water. Just before I explain my theory, I would like to warn you that I am an Intel fanboy and will never buy AMD...ever!
OK, now in some games which are GPU limited but have a few scenes which are CPU limited, the Core2 will have very variable frame rates, yet the Phenom would be consistently low.
I am sure a lot of people would agree that if your frame rate fluctuated from 30fps to 60fps you WOULD notice the difference (you get moments of fluidity and then moments of semi-fluid frames). This causes some annoyance, in that even a constant 30FPS with no fluctuation, OR frame rates closer to the average (i.e. the 30 to 40FPS range), would feel smoother in this circumstance.
Therefore in SOME games a Phenom may be smoother than a Core2, BUT as soon as you upgrade your GPU (to remove the GPU limitation) the Core2 will be far superior, as it would handle the CPU-limited areas better than a Phenom, which lacks Venom.
adamsleath
Back in the day I had a Pentium II 400MHz with an i740 AGP, coupled with 2 Monster 3D Voodoo2s (12MB). That was a good PC for Unreal and UT. I can tell you there was no microstutter back then! I then upgraded to a Pentium III 600MHz with a Voodoo3, and then on to a 1GHz Pentium III with a Voodoo5.
Ahh those were the days.....
I am now deciding whether to get 2x 1GB 4870 cards, a Zotac GTX 280 AMP!, or hold on for this semi-magical alien-technology R700 dual-GPU ATi solution.
It's a tough one. I know that I would be happy with either of the first two, perhaps even more so with the GTX 280 if it did not cost so much money (I mean, seriously, the high end always used to be around the £300 mark!!)
In the UK, 2 4870 cards will cost the same as a Zotac GTX 280 AMP!, going by prices that were leaked for an hour (£210 inc VAT each = £420 for two).
John
JohnZS, stay on topic or stay out :up:
Now where are the die shots of the R700 :stick:
Perkam
Sry gom, but I see it on a daily basis. Crysis, GRID (although much better after the hotfix), Vegas 2, and Mass Effect all stutter when "strafing".
But when I drop the CPU to 3GHz, it's not so bad. GRID was really annoying...nothing like going down a straight @ 300kph and all of a sudden it seems like 500kph...
Anyway, 3 months ago almost every app had some issues. Today, 3DMark06 and Vantage still show it, as do the above-mentioned apps. Vantage, in the space test, when the camera goes down the side of the ship and then into the meteor field, will report frames at 40-70FPS, but it looks like 12FPS. When it drops down to 30FPS or lower, it smooths out.
Now, I do not believe all of these are stutter, per se, but there is a definite issue. And the issue is less with less CPU speed. :shrug:
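A toy Python sketch (made-up frame times, purely illustrative, not measurements from any of the apps above) of why a counter can report 40-70FPS while the game looks far worse: with AFR-style uneven frame delivery, the average hides the long gaps the eye keys on.

```python
# Hypothetical frame-time trace in milliseconds. AFR setups often emit
# frames in uneven pairs -- short gap, long gap -- so the FPS counter
# averages out high while perceived smoothness tracks the longest gaps.
frame_times_ms = [5, 28, 5, 28, 5, 28, 5, 28]  # illustrative, not measured

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000 / avg_ms                  # what the counter reports (~61)
floor_fps = 1000 / max(frame_times_ms)   # what the eye keys on (~36)

print(f"reported: {avg_fps:.0f} FPS, perceived floor: {floor_fps:.0f} FPS")
```

Same average throughput, very different feel: evenly spaced 16.5ms frames would look like a solid 60FPS, while this see-saw pattern feels closer to its worst-case gap.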
Oops, sry Perkam, but you posted after I had already started my post.
Well, I've seen a factory-overclocked 9800GX2 for £289 + £20 delivery
http://www.abtron.de/shop/catalog/pr...ipk96brgstuhu5
And GTX 260 for £240
http://www2.computeruniverse.net/products/e90268519.asp
GTX 260 would be my choice, overclocking to GTX 280 performance is possible.
That's like claiming SLI should be ultra-perfect by now after all the years of development and "field testing" it has had.
A shorter or longer card doesn't change anything. And I am sure a newer PLX chip could save some space, if that was the matter. But we are simply talking PCIe 2.0 switching here: zero effect on the microstutter. The issue is the fundamentals of AFR, so until nVidia and AMD drop that you won't see any fix. However, they won't, because AFR gives the most performance/scaling.
And why would they drop AFR? They could also use something akin to vsync to even out the frames.
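The "akin to vsync" idea can be sketched in a few lines of Python. This is a toy model under my own assumptions; `paced_loop` and its numbers are hypothetical, not anything from nVidia's or AMD's actual drivers:

```python
import time

def paced_loop(render_frame, frames, target_interval=1/60.0):
    # Toy frame-pacing sketch (my own illustration, not driver code):
    # hold each finished frame back so "presents" land at even
    # intervals, trading a little latency for a consistent cadence.
    intervals = []
    last = time.perf_counter()
    for _ in range(frames):
        render_frame()  # uneven GPU work per frame
        wait = target_interval - (time.perf_counter() - last)
        if wait > 0:
            time.sleep(wait)  # delay the present to even the cadence
        now = time.perf_counter()
        intervals.append(now - last)
        last = now
    return intervals

# Simulate AFR-style alternating fast/slow frames; the paced intervals
# should all come out near the 16.7ms target instead of see-sawing.
work = iter([0.002, 0.010] * 5)
spacing = paced_loop(lambda: time.sleep(next(work)), 10)
```

The catch, of course, is that frames you delay are frames you could have shown sooner, which is exactly the smoothness-versus-raw-FPS trade-off being argued about in this thread.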
Try comparing AFR and scanline interleaving. It would simply be horrible performance-wise to use the latter today. Not even to mention the endless issues back then with tearing, artifacts, etc.
In short, its always been a painful experience with multi GPUs.
Can you explain why it's AFR at fault and not synchronization of output frames? :confused:
I fail to see how the actual rendering method affects stuttering :shrug: