:eek: LOL That's my Birthday,! :D :welcome:
Oh man, that's great news!
Now I'm gonna regret buying a 4850:(
Yay, 3 days before my birthday! Heh, now I know what my August purchase will be :D
What I've noticed is that anything under 60fps is noticeable in Crysis. The best test is Crysis, where it's evident with any previous CF or SLI configuration. I'd say it will be groundbreaking news if the microstuttering is finally gone, but it's hard to believe before thorough tests.
Quote:
I might suggest you try to find settings for the 4870 X2 that push them into the 30-40fps region; that's where microstuttering is most notable.
EDIT: Another game for testing purposes would be Oblivion with high resolutions and heavy AA+AF use. It's weird when I see even 50fps but it feels like 20fps. Anything over 60fps and it becomes smooth as silk.
It doesn't matter at which res microstuttering is most noticeable, because even if it is not perceptible you can still see it in the frametime benchmarks.
Yeah, I'll buy 15 and distribute 13 of them to the poor.
:rofl::rofl::rofl::rofl::rofl:
*please send one to me* (joking)
At any rate, this card is very promising in the high end market. I know this has been mentioned everywhere but would this force nVidia to drop the price of the GTX 200 series even more than it is? (if the CrossfireX scales very well)
Well dang. I guess I'll just have to manage to finance both a new monitor AND a 4870x2. These things just need to be out! We can only hope for an early release due to a mass card leak, like the 4850... though I doubt it, lol. Me and my childish hopes and dreams, heh.
Though, I would need a confirmation that it works fine on a P35 board, seeing as I don't really want to have to buy a new mobo. If it doesn't, I guess I'll just have to go with a single chip solution. =/
Why do you need a new monitor? 22" is just fine and I don't think resolutions above 1680 are worth the performance hit they cause.
1920x1200 is a nice jump up from 1680x1050 plus you have the added advantage of being able to display full 1080P content. These cards probably won't even dip that much from 16x10 to 19x12 anyways. Looking at CF 4870 results, they do very well at 19x12 and the added VRAM and scaling on the X2 will only make this even better. Can't wait to try this on my 24 :)
Haha, it's mainly due to me wanting a monitor that will scale resolutions properly, and has HDMI. The PS3 looks quite :banana::banana::banana::banana:ty at 720p stretched to 16:10. That, and the lack of scaling on this monitor has driven me insane with older games and their lack of widescreen support. Then add the fact that my dad is coughing up half the price because he wants my old monitor, which makes it a decent deal. 220USD for a 24" would be all I'd be paying, so that works fine for me price-wise. Unfortunately, it almost kills an upgrade path to the 4870X2 for me. I have other things I need to pay for (e.g. college), and I like having a decent buffer in my bank account. :p: If you feel like talking me out of it, it'd be welcome. I'd like to be able to afford an X2. :yepp:
Personally I'm trying to figure out a way I could afford both. We'll see how that pans out though, hahahaha. If I can't though, I'll just go with the original plan of the 4870 1gb.
No one really has an answer for this, but there is no reason you can't sell your current board and upgrade to an affordable P45 based board. That's what I'm doing, and I'll only be running an HD4870. There are hardly any articles out there that compare single card results on PCI-e 1.0/1.1 vs PCI-e 2.0, but I read one where the 9800GX2 started to get choked at 1920X1200+ resolution with AA/AF, but only on certain games.
I see. Actually I'm pissed off at my 226BW as well; I wanna get an Xbox 360 but the scaling issues deterred me from it. Which monitor are you planning to get?
Take a look at the BenQ G2400W. It's only a TN panel, but for a TN panel it has good color and black levels. Considering you're coming from a TN panel, it shouldn't be that big of a deal anyways. The selling points are low input lag, 1:1 pixel mapping, and cheap price.
It's hard to find a good 24" MVA/PVA monitor right now. You could get a good IPS panel, but it's going to cost ya.
:yepp: That's the exact one I was looking at. I've only heard good things about it compared to all of the others around the same price.
Meh, I'll probably end up going with the 4870 1gb as first planned. Might as well wait for a 2nd gen with fixed microstutter, just to make sure ALL of the issues we haven't found yet are ironed out. =P
Either way, it's awesome that they've more or less fixed it. We'll need some more tests to be absolutely sure, but I'm liking the looks of it so far. Too bad it's out of my logical price range. I need a better job!:ROTF:
hyundai w241d is my 24" monitor
it has a lot of connectors and supports 1080p over component :up:
currently my laptop (vga), pc (dvi) x360 (component) and PS3 are connected to this beauty :up:
input lag is tolerable (no difference in css between older tn panel and new va one!)
you really have to take a look into scaling options, this mon has 1:1, aspect ratio and full option on all connectors, some monitors provide this on dvi only :rolleyes:
^ I don't think that will be necessary. For anyone.
Outside of 2560x1600, 2 X2s would be complete overkill 99% of the time. Sure, if you want to use maximum amounts of anti-aliasing at lower resolutions you could still leverage some of this extra horsepower, but it would come at a high price of course. I doubt gains would be that substantial even at 1920x1200. Then again, most people who'd spend $1400 or so on GPUs and a power supply alone can probably afford a 30" LCD.
im already running assassins creed at 1920 x 1440:p:
& grid @ 1600x1200 8XCSAA,
but will this x2 run empire total war with 10000 soldiers @ 1920x1200 8xAA @ good fps? :D
Hoping this means 55nm GTX280 will be out sooner. At least GTX280's will be more realistically priced after 4870x2 release. Around 4 weeks and it should be here. :up:
They can put 4*HD4870X2 in a PC, any board can do that as long as it has 4 PCIe 16x connectors (and they have to change the cooling for watercooling blocks).. But they can't be crossfired because they only have one crossfire connector.
Sampsa, go into the shower, clear that hangover and tell us it has better fps in Crysis VH than regular CF ;)
:confused::confused::confused:
I've just had a post removed in this thread and on this page, relating to an AMD interview in India. Any reason for that? Was there a problem with the site I linked to :confused: (the link to the Indian site was found on VR-Zone today...)
http://www.techtree.com/India/News/E...90984-579.html
glad to see someone that knows what he's doing.
but being the sceptic that i am, when doing the microstuttering fps-log tests, could you run the tests with CPU-bound settings (800x600 low) and GPU-bound settings (crazy high eyecandy)?
i read a post somewhere on these forums with a diagram. it said that, in CPU-bound configs, there can't be stuttering because the CPU (driver) takes longer to execute the commands (that are sent to the GPU) than it takes the GPU to render them, and that in GPU-bound situations the CPU sends commands too quickly for the GPU to handle immediately, creating a delay and resulting in microstuttering.
as these cards are so crazy fast, the CPU becomes the bottleneck very quickly. see my point? so i'd appreciate as many angles on this subject as possible. greatly appreciated!
Great news Sampsa, thanks for taking the time to investigate microstutter. I am eagerly awaiting further results.
http://plaza.fi/s/f/editor/images/ms_test.jpg
Ok, I finished testing with R700, 4870 CF, 3870 X2 and GeForce 9800 GX2 (Single and SLI mode) and recorded frametimes in Crysis and Unreal Tournament 3 demo:
The worst case scenario happened with the 3870 X2 in Crysis, where every other frame was rendered after ~21.5 ms and every other after ~49.5 ms. However, in the UT3 demo the 3870 X2 provided stable ~11.5-12.6 ms rendering times between frames.
R700 seems to be very strong: it rendered every frame after ~21.6-22.1 ms in Crysis and ~11.5-11.9 ms in UT3.
4870 CF also provided equally good results compared to R700 and rendered every frame after ~21.3-21.7 ms in Crysis and ~11.2-12.1 ms in UT3.
The GeForce 9800 GX2 showed a lot more variation in rendering times, with frames rendered after ~21.9-25.1 ms in Crysis and ~7.4-11.5 ms in UT3.
PLEASE NOTE!
These results are measured and analyzed with simple tools, namely Fraps and Excel. My own conclusion is that microstuttering depends on the game engine, and if we leave the 3870 X2 Crysis result out, it looks like NVIDIA's SLI (9800 GX2) is suffering more from microstuttering than AMD's CrossFire.
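For anyone wanting to repeat this kind of Fraps + Excel check, here's a rough Python sketch of the analysis as I understand it (my own code, not Sampsa's actual spreadsheet): turn the cumulative timestamps from a Fraps frametimes log into per-frame rendering times, then average the even- and odd-numbered frames separately to expose the alternating pattern described above. The demo numbers are synthetic, chosen to mimic the 3870 X2 Crysis result.

```python
# Sketch of the frametime analysis described in the post above.
# Input: the cumulative "Time (ms)" column from a Fraps frametimes log.

def frame_deltas(times_ms):
    """Convert cumulative frame timestamps (ms) into per-frame render times."""
    return [b - a for a, b in zip(times_ms, times_ms[1:])]

def even_odd_averages(deltas):
    """Average frametime for even- and odd-indexed frames separately.
    A big gap between the two is the alternating microstutter pattern."""
    even = deltas[0::2]
    odd = deltas[1::2]
    return sum(even) / len(even), sum(odd) / len(odd)

if __name__ == "__main__":
    # Synthetic cumulative timestamps alternating ~21.5 / ~49.5 ms,
    # like the 3870 X2 in Crysis:
    times = [0.0]
    for i in range(20):
        times.append(times[-1] + (21.5 if i % 2 == 0 else 49.5))
    deltas = frame_deltas(times)
    print(even_odd_averages(deltas))  # one average near 21.5, the other near 49.5
```

In Excel the same thing is just a column of differences plus two AVERAGEIF-style formulas; the point is that a plain average FPS number hides this split completely.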
Well, considering they're boasting about their multi-GPU technology and making the drivers much better for it, I would have assumed they'd do this well.
Now to see if the 4870x2 will need crossfire activated in games to run at full power, or if in some games it'll only run at 4870 speed. Which would kinda defeat the purpose.
Those results are comforting, at least as far as those 2 titles are concerned. It appears the issue has been reduced in severity substantially so that is a good sign of things to come. Now if only dual card crossfire and sli behaved similarly...
@Caveman
The card still is crossfire so the game must be scalable and supported for the card to use both GPUs and to show gains so that hasn't changed. There will still be titles where it will fall behind the 280 in light of this.
Well that sucks. I was thinking since it's 2 GPU cores on one card it may be regarded as a single card.
What current games don't utilize crossfire or don't well?
:up: nice results
looks good enough for me
now wait for those in game frame rates...mmmm
I finally have given up gaming on LCDs and have gone back to CRTs for FPS games like COD4. I have a 30" Dell 3007 and a 20" Samsung 2ms LCD. My 22" CRT at 85Hz is so much better in response, with no input lag. I kept thinking it was the graphics setup and microstutter, but in the end the monitor made the bigger difference. I have a new 24" FW900 CRT coming and will run it with my one GTX280 watercooled for the best of both worlds, and hope to have great results. Maybe I'll try the 4870x2 2GB when it hits Newegg! The new LCDs coming out say they have lower ms response; problem is they have increasing input lag. We're getting faster vid cards but slower, more complicated LCDs!
:confused: how do you manage to get a CRT nowadays :confused: they've been removed from the market a while ago in Europe...
Eh lost planet I have no interest in.
Why can't you create your own profiles for ATI cards as well?
It's true it's close to the GTX280 in performance, but if you're gonna pay for a $500 card you're gonna have that feeling that you want the best performance all the time.
And this is only XtremeSystems I'm sure someone can figure something out just like they did with 4870 fan speed/overclocking ;)
micro slutering lol didn't know we had those that small :rofl:
Accurateit.com sells CRTs new and refurbished...
http://www.accurateit.com/home.asp
CRT monitors are not good for your eyes. I am working as a pilot and our Aeromedical examiner told us to avoid them because they cause eye strain and can lead to myopia.
More rumors:
Quote:
Early benchmarks of 4870X2
Here are some early estimates of the benchmarks of the Radeon HD 4870X2.
3D Mark06 19K
3DMark Vantage P12000
Crysis at 1600x1200 50 fps
* QX9770 at 3.2GHz on Intel X48 mainboard
http://my.ocworkbench.com/bbs/showth...threadid=75297
3DMark06 19k is very impressive for a 3.20GHz quad.
9800GX2 / 8800GTX SLI get about 15-16.5k at those clock speeds.
so can any form of eyestrain.
Quote:
CRT monitors are not good for your eyes
one major cause of MYOPIA is constantly using short sight - impairing long sight by constantly using short focus.
some people have stronger eyes genetically and are less prone :D
lol dude.
regarding the microstutter issue from sampsa's testing that's good news so far :up:
19000 3D06? I can't imagine what these will get with a Core 2 at 4GHz+ once you overclock the card.
As far as CRTs go, I agree they tend to still be quicker, but honestly, the slight delay I get with my panel is liveable and I'm used to it. Sure, if I compare, subjectively a CRT will likely seem quicker in my head, but at the same time there's the aforementioned eye strain. I'd get headaches on my CRT at anything below 100Hz. 4:3 is more or less dead anyways :/
Holy... I might just have to sell my 4870 for a 4870X2... I guess there's a reason this is called "Sparta"... MADNESS!
Here's a run with 2 x 4870's with the E8500 @ 4000. X2 shouldn't score any less/more IMO. The video card is not OC'ed by the way and no tweaks were used... this is my daily setup, 1 year old XP.
http://img520.imageshack.us/img520/5...dm06so0.th.jpg
Offtopic: anyone know if you have to do anything special in Vista 64 to get crossfire working properly? I just ran Vantage and got only ~9k / ~26fps for GPU tests and i noticed people usually get around 13-14k at 4000-4200mhz with ~45fps for the GPU tests. Using hotfix drivers... thanks.
http://img530.imageshack.us/img530/5...ancebm3.th.jpg
true i do like widescreen....got a 32" sony crt for that :p:...try putting that on your desktop :lol:
but 120fps is gonna be possible with 4870x2 perhaps in a few more titles than before.
I'm thinking about a tri-CF setup (1x4870X2 + 1x4870), because I already got a 4870.
I really doubt that there won't be any microstutters, especially in high resolutions when 512MB won't be enough.
What do you think about that?
But you only need 2 PCIe slots. X38/X48 offer x16/x16 (2.0) and P45 x8/x8 (2.0).
Were there ever any CRTs that could even do 120Hz above 1600x1200? I'll take higher resolution and slower performance than what a good CRT would do for me any day but thats just me. Could never go back from 1920x1200 :cool:
@Hias
As far as doing CF with a 512MB 4870 and an X2, I think it would be pointless, as you won't have full access to the 1GB on the X2. The memory has to be mirrored in each card's memory bank, and since the 512MB on the 4870 is the lowest memory amount, you're stuck with 512MB and not 1GB. There *might* be some reasoning to this with a 1GB 4870, but even then I hardly see the point, as scaling beyond 2 GPUs tends to suck like I've said.
No way man. By the time Crysis very high at 1920 with AA/AF is playable, Crysis 5 will already be out.
the x2 looks great but im guessing that the dfi x38 users like myself need to hope that these cards work on this board, unlike the 3870x2.
thanks for that.
http://www.xtremesystems.org/forums/...&postcount=527
but still i'm just thinking that faster cards result in less microstuttering, because the CPU becomes the bottleneck.
if we look for things to go wrong and be flawed, we never enjoy the good.
This is the same as people playing fps on an lcd: some never notice input lag and some do.
those that do are measuring differences that few ever get to notice.
Microstuttering isn't an issue.
people's perceptions are.
well, i don't think the cpu will be the bottleneck in crysis at 46fps, but it's not impossible.
Sampsa, can you please try crysis with frametimes somewhere between 33 and 40ms?
nvm
Ok, I found a new game where the 3870 X2 shows very bad microstuttering, and it is Race Driver: GRID.
I was checking my benchmark results for the R700 preview I'm working on and noticed the avg FPS was very bad with the 3870 X2. So I immediately applied my previous testing method, using Fraps to log frametimes, and analyzed the results with Excel:
The 3870 X2 shows similar behavior in Race Driver: GRID to what I noticed in Crysis, but it seems to be even worse. In Crysis the 3870 X2 rendered every other frame after ~21.5 ms and every other after ~49.5 ms. In Race Driver: GRID the 3870 X2 rendered every other frame after ~24.9-27 ms and every other frame after ~40.2-42.4 ms.
I repeated the test with R700 and it again showed very stable rendering times between frames: ~15.3-17.6 ms.
I'll continue and test Race Driver: GRID with 4870 CF and the 9800 GX2. I've also asked AMD about this issue and hope to get some of my questions answered at the beginning of next week.
I have also noticed a lot of people are very skeptical about my tests. I don't claim to be an expert in this case, and all these results are based on my own tests with simple tools and methods. I'm not even 100% sure whether this issue with the 3870 X2 in Crysis and Race Driver: GRID is actually microstuttering, lag or some other problem. I asked AMD about this and hopefully I'll get an answer.
I have to return this sample tomorrow, but imho I've deserved a couple more from AMD ;)
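To put those GRID numbers in perspective, here's a back-of-the-envelope Python sketch. The "felt FPS" formula is just a common rule of thumb I'm using for illustration, not an official metric: average FPS counts every frame, but perceived smoothness tracks the slow frame of each alternating pair, which is why a decent-looking average can still feel choppy.

```python
# Illustration of why alternating frametimes hurt more than the average
# FPS suggests. The frametime numbers come from the 3870 X2 GRID result.

def avg_fps(fast_ms, slow_ms):
    """Average FPS for a frame stream alternating between two frametimes."""
    return 1000.0 / ((fast_ms + slow_ms) / 2.0)

def felt_fps(slow_ms):
    """Pessimistic 'felt' framerate: governed by the slower frame of each pair."""
    return 1000.0 / slow_ms

if __name__ == "__main__":
    fast, slow = 24.9, 40.2  # ms, 3870 X2 in Race Driver: GRID
    print(round(avg_fps(fast, slow), 1))  # ~30.7 FPS on paper
    print(round(felt_fps(slow), 1))       # ~24.9 FPS as it actually feels
```

This is the same effect as the earlier Oblivion comment about 50fps feeling like 20fps: the counter averages away exactly the frames you notice.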
Sampsa, I didn't see what board you were using, but unless you are using a TRUE 16x/16x board the 2nd PCI-E slot is only going to be 8x, some even 4x, and you could, and probably are, getting more issues because of that than simply the cards themselves choking...
would you agree ?
can you confirm that R700 is microstutter-free when running crysis at 25-30fps (33.3-40ms frametime) ?
The real question is whether 4870x2 can scale in situations 4870CF can't
Thanks for taking the time out to address this issue Sampsa, It's greatly appreciated.
Hi
I wrote a little tool which quantifies the degree of microstutter from a given FRAPS benchmark. I don't claim it's foolproof, but it should give a roughly consistent approximation of the degree of MS, one which does not scale with the absolute value of framerate (a non-dimensional index).
You can grab it at: http://www.ubern00b.net/microstutter.rar
The readme explains how to use it and exactly how the index is calculated. But in short, the index is the average percentage variation away from the stabilised local framerate.
An index of less than 8 ish seems typical for single GPU setups. With multi GPU setups we have seen some values as high as 40, although sometimes they can be of similar magnitude to the single GPU results. I think it may be dependent on the rendering mode which is used.
I'm very interested in microstuttering wrt the 4870x2 also, especially given its closer communication between GPUs via the xfire sideport :)
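For those who don't want to grab the archive, here's my guess at the kind of calculation that index description implies (the readme has the exact formula; this sketch simply uses a centred moving average as the "stabilised local" frametime, which is an assumption on my part):

```python
# Sketch of a non-dimensional microstutter index: the mean percentage
# deviation of each frametime from a centred moving average of its
# neighbours. A steady stream scores ~0; an alternating multi-GPU
# pattern scores tens of percent.

def microstutter_index(deltas_ms, window=9):
    """Mean % deviation of each frametime from a centred moving average."""
    half = window // 2
    devs = []
    for i, dt in enumerate(deltas_ms):
        lo = max(0, i - half)
        hi = min(len(deltas_ms), i + half + 1)
        local = sum(deltas_ms[lo:hi]) / (hi - lo)
        devs.append(abs(dt - local) / local * 100.0)
    return sum(devs) / len(devs)

if __name__ == "__main__":
    smooth = [16.7] * 100        # steady ~60fps, single-GPU-like stream
    stutter = [21.5, 49.5] * 50  # alternating pattern like the 3870 X2
    print(microstutter_index(smooth))   # essentially zero
    print(microstutter_index(stutter))  # large index (tens of percent)
```

The nice property is exactly the one claimed for the tool: doubling all frametimes leaves the index unchanged, so a fast and a slow run can be compared directly.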
Normally I would say agree, but Crysis is a strange one...
I think the effectiveness of its motion blur smooths things along, as it seems very playable at any framerate over about 30fps. Normally I wouldn't be able to bring myself to play any shooter which averaged less than 60 and/or dipped below about 40fps regularly.
In Crysis when I enable Vsync it's sooo smooth you wouldn't believe it, alas when FPS drops to 40's the input lag makes things unplayable.
When I disable Vsync micro stuttering is the most noticeable thing ever. It's not really stuttering but the smoothness is totally gone.
Triple buffering + SLI = No go
They're mutually exclusive :P
So you can't use triple buffer in SLI? Well that's pretty much a deal breaker for me if gaming on LCD. Can someone confirm that triple buffer works in Crossfire? I use D3DOverrider to force triple buffer in games that do not support it and there is no way I will go multi-gpu if it can't be used.
sampsa, can you test 2 x 4780x2 ?
if ms fixed @ dual gpu/card.. does it help @ dual x2 card ?
thanks
The great thing about triple buffering is that there is (virtually) no performance hit. Well, you can't go above the monitor refresh rate of course, but apart from that...
Use D3Doverrider (an application that comes with rivatuner) to force triple buffering, rather than going through the control panel.
Not sure about x-fire or SLI. Would be nice if someone could test it.