A single R700 on a P35 board, I think it will still kick some serious a$$es.
Aren't there rumors that CrossFire will become compatible with NVIDIA chipsets?
That would increase the number of potential customers (for multi-GPU solutions),
but I can't see a good reason to buy NVIDIA chipsets anyway :ROTF: (except triple-SLI, of course)
No, PCIe gen2 gives you double the bandwidth of PCIe gen1: 8.0 GB/s (PCIe 16x) in each direction with gen2, while you 'only' had 4.0 GB/s (PCIe 16x) in each direction with gen1. Gen1 is still more than sufficient for a single card, and this will still be the case with the R700 (or the 9800GX2, for that matter). I don't think gen1 will be any hindrance even when doing CF with two 4870s, for example, provided you have two full 16x slots.
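For anyone wondering where those figures come from, here's the arithmetic as a quick Python sketch (the per-lane rates are the published PCIe spec numbers, not something from this thread):
Code:
# PCIe usable bandwidth per direction, from the per-lane rates:
#   gen1: 2.5 GT/s with 8b/10b encoding -> 250 MB/s per lane per direction
#   gen2: 5.0 GT/s with 8b/10b encoding -> 500 MB/s per lane per direction
PER_LANE_MB_S = {1: 250, 2: 500}

def pcie_bandwidth_gb_s(gen, lanes=16):
    """Usable GB/s per direction for a PCIe link of the given generation."""
    return PER_LANE_MB_S[gen] * lanes / 1000.0

print(pcie_bandwidth_gb_s(1))  # 4.0 -> PCIe gen1 x16
print(pcie_bandwidth_gb_s(2))  # 8.0 -> PCIe gen2 x16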
Why does everyone keep saying that NVIDIA won't get a license for QPI? I just won't believe it until it's so final that I can't deny it anymore :p:. No NVIDIA fanboy here though; for some reason I've liked ATI more than NVIDIA, although I can't really think of a good reason why. Maybe I just like red or something.
Just a bit more info here:
http://forums.vr-zone.com/showthread.php?t=296463
Quote:
Radeon HD 4870X2 to be available by end Jul, custom designs by August
We have seen how the 4850 can take on the 9800GTX, and the 4870 the GTX 260. According to news we received, ATi-made Radeon HD 4870 X2 cards, which integrate two RV770XT GPUs (R700), will go on sale by the end of July.
The reference model PCB is named Spartan, model B509. Clock speeds are uncertain at press time. It will come with sixteen 32M x32 GDDR5 chips, 512 bits wide in total, 2GB (256-bit, 1GB x2), and will sell for 499 USD.
Source: OCW
2GB (256-bit, 1GB x2)
Do my eyes see this correctly, 2GB?! omgoooses!!! :p::yepp::up:
Mine too :D
2GB:woot:
If true, that's my card.
Speculation posted by someone on that forum, source without link. :clap:
Yes, that thought did occur to me ;), but hopefully someone can squeeeeeze 2GB onto the card :lol:
Maybe they have triple-sided GDDR5 now? :hehe:
time will tell :shrug:
A 1.1 16x slot will probably be fine even on the 4870X2, as most of the GPU-to-GPU traffic is handled on the PCB and not over the slot, just like on the 3870X2 and GX2.
The only reason a PCIe 2.0 board would be faster (at this time) is chipset differences, not the slot itself (e.g. 780i vs 790i; the 790i tends to be about 5-10% faster with SLI).
And madness? This isn't madness... this is AMD! (sorry, had to :p) If all reference cards ship with 1GB of GDDR5 on black PCBs, I'll die of happiness.
EDIT: Here's the apparent OCW source - http://my.ocworkbench.com/bbs/showth...335#post433335
"Radeon HD 4870X2 to be available by end Jul, custom designs by August"
I wouldn't hold out for the custom-designed ones; apparently, making a decent non-reference version of these dual-die cards is very difficult.
Note the scarcity of non-reference versions of every dual-die card ever made, by anyone.
SPARTAN ... hummmm where did I see that before ?!
AHHHHH here:
GTX280 Review @ Guru3D
http://www.guru3d.com/imageview.php?image=13886
http://www.guru3d.com/imageview.php?image=13892
:lol: :lol: :lol:
I guess the fact that there is a 2GB model pretty much proves that there's no shared memory between the two GPUs.
Or that the sharing is not ideal for most applications and games.
There has never been any real proof of shared memory; the evidence presented lately only suggests better cooperation between the two GPU cores. That could result in lower memory usage, but it isn't shared memory, not for this generation at least; future generations can improve on this.
Obviously, if ATI is really committing to the multi-GPU route from now on, they'll have to improve GPU cooperation, starting now.
They need an MCM type package.
I don't think they desperately need an MCM-type package; the connection between the two GPUs can have low enough latency without them residing that close to each other.
Well, the 1GB reading in GPU-Z can be taken two different ways...
#1 - Shared memory
#2 - 2x1GB, as speculated.
Everyone was surprised at 40 TMUs, surprised again by 800 SPs, and most were surprised by ~950M transistors and ~260mm2.
Why can't AMD/ATi surprise again with the R700?
That's probably because there is this saying that goes something like this:
"If something looks too good to be true then it probably is."
I'm also hoping to be surprised though.
Well said. I hope to be pleasantly surprised as well. 1GB of VRAM per GPU just makes more sense on a card like this, given it's marketed as their high-end part. If that turns out to be the case, the card will probably even take on the GTX 280 at 2560x1600, which would be quite sweet for 30" LCD users (and for current and future games/mods that use 512MB+ of VRAM). Still, the memory cost may push the price up.
Good job, you tried to insult, yet you're wrong. Just because a card is on a single PCB doesn't mean it isn't using CrossFire :rofl: What logic is involved in arriving at your conclusion?
And a lack of pushing CrossFire on AMD's part is more than likely just down to a lack of marketing overall on their part. It's not like this observation is some sleuth-like insight into their entire technology camp.
did he say arse :D
I agree... the cost of 2GB will drive the price up too much for an initial release. I think they really want to come in considerably lower in cost than the GTX 280, but maybe someone will make a 2GB model after it's been out for a bit. It would be nice to have at least 512MB per core, but 768MB would be better... I guess we wait and see. I am really looking forward to this card!!
I wonder when we are going to see the first review of this puppy.
Must admit it is seriously making me think before putting down my cash on either a 2x HD4870 1GB setup or a GTX 280 (the cheaper option). Am hoping that this is cheaper than 2x HD4870 1GB cards AND packs more of a punch.
John
According to Fudzilla, they have received confirmation that it will be 2GB, codenamed Spartan, with availability at the end of August (just in time for me b'day!).
http://www.fudzilla.com/index.php?op...=8279&Itemid=1
and at digitimes: http://www.digitimes.com/mobos/a20080702PD213.html
Makedon and Trojan are code names that have been known for a while, I guess Spartan fits into the code name scheme.
More sites than Fudzilla are claiming 2GB, so it might just be correct.
Of course it's correct. They just forgot to say it's 2x1GB.
So I wonder if we'll get to see someone cooking an egg on them too. :)
I really don't care what they name it, although "THIS IS SPARTA" would be nice... hehe :D
If it performs the rumored 15% better than 2 x HD4870 in CF then I will be getting one.
The CrossFire sideport is believed to add the rumored +15% performance advantage over stock 4870s. Individual 4870s are more likely to clock better though, I'd think.
The latest from Fudzilla today is that the ETA is now the last week of August... didn't they say last week of July a few days ago? Not sure they know anything :D or maybe it was VR-Zone...
http://www.fudzilla.com/index.php?op...=8326&Itemid=1
Fudzilla is about 5% correct on this gen's ATI news.
"As fast as a 8800GT" LOOOOOOOOOOOOOL
Let me be more clear: I have the card, have already tested it, and have compared it to 4870 CF in 3DMark and real games. There won't be a +15% gain in performance, at least not yet; maybe later with more mature drivers.
Which driver are you using? Do you have the GDDR3 or GDDR5 version?
When can some details leak?
I'm getting good at waiting... no I'm not :(
And noise? Same as a 4870?
Sampsa, I BEG OF YOU, can you do a frame-time benchmark in either Lost Planet or Call of Juarez and tell us if there's microstuttering?
I don't need FPS numbers, I just want the sequential frame delay times compared to each other. You don't need to tell me "one delay is 10ms, the second is 50ms and the third is again 10ms" or anything like that. Just tell us if there are significant synchronisation problems.
PLEASE!!!
Strange, my sample is more like 15% faster than 4870 CF. Sampsa, share your driver with me, 'cause I think there is something wrong.
@spoof: That's exactly what I think, and why I asked him about his driver version.
So how hot does that chip in the middle get? Looks like I can drop the two MCW60s I have onto it, save some money, and skip a full block; maybe add a stick-on heatsink to it? That card looks like it's going to be a fireball if it's not on water. :rofl:
So is July 14th the now-rumored NDA date on performance, or what?
Previews on the 15th, I heard. The driver is delaying the launch.
Good move I guess. Undercut GTX 280 sales without even having to launch a card and build up the hype train (and healthy markups for the profit margins ;))
That's my point: if the leaked numbers blow away the GTX 280, no one will bother buying it without a further price drop, since the rumored $499 for the 4870X2 puts it in the same price bracket right now. And if those numbers look awesome, people will hold off on the 280 until the R700 hits the floor.
GDDR5 and driver is sample_vista32-64_R700_8-52rcp2.exe
I've tested with 3DMark06, 3DMark Vantage (Performance & Extreme), Crysis, Assassin's Creed, Call of Duty 4, Bioshock, World in Conflict and Race Driver: GRID. I do my game tests playing the actual game and recording min, avg and max FPS with Fraps.
Sorry, I haven't tested Lost Planet or Call of Juarez, but haven't noticed any problems with the games I tested.
please let us know about microstuttering :worship:
they can't, because it won't be :rofl:
Quote:
at least tell us if the micro-stuttering is fixed
You'll have to pray for anti-micro-stuttering profiles or some strange set of tweaks depending on the game.
Probably.
Micro-st' is under NDA :rofl:
"i can assure consumers that all crossfire related bugs have been ironed out"
:ROTF: i saw this message on one of those sky banners attached to a flock of flying pigs - truely, would i try to mislead ?
I don't; I just don't expect a fix. SLI and CF have been around for ages and they're still flawed.
I'd like multi-GPU to work as much as anyone,
and I don't wanna be plagued by disappointing glitches with an already expensive card.
Hopefully the games I want to play will play nicely with multi-GPU. What happens if they don't? Tough toenails? Buyer beware?
Believe it or not, I think multi-GPU is a great idea, if it works.
A performance boost from existing GPUs, more sales for the producers, everybody wins... if it works well.
If some or most apps run perfectly well with CrossFire and others have problems, then my logic says it's a programming problem tied to specific apps. There's no helpful point in dismissing problems if they exist.
A list of apps that are not CrossFire compatible would be useful; perhaps "not compatible with multi-GPU" labelling :rolleyes:
I'm not sure of any tests aside from using your eyes...
You can Google it; I came up with this:
http://www.hardforum.com/showthread.php?t=1317582
:eh:
Quote:
If I notice microstuttering, can I minimize/eliminate it?
Yes. By running the game at a setting where your graphics cards are able to output more than the monitor's refresh rate (that is, the maximum FPS the monitor is capable of; the pixels on your screen can only change so fast), microstuttering is eliminated completely. Most monitors have a refresh rate of 60 or 70Hz, meaning you would need 70 or 80 FPS to eliminate microstuttering.
So you could just run your game of choice at ~65 FPS constant (or relatively constant, aside from some dips) and then set your display to 60Hz.
Not a problem unless you're dying to have Crysis at 1920 with 4xAA on Very High...
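Whether or not that refresh-rate explanation is the whole story (the next posts dispute it), here's a toy Python sketch of the mechanism the quote describes; all the numbers are made up for illustration:
Code:
# Toy model of the "render above refresh" fix (my own sketch of the quoted
# claim, not from the thread): frames finish in uneven bursts, but with
# vsync the display scans out at most one frame per 16.7 ms refresh. As
# long as a new frame is always ready before each refresh, on-screen
# pacing stays a steady 60 Hz no matter how jittery the render times are.
REFRESH_MS = 1000.0 / 60.0
render_done = [3, 5, 19, 21, 35, 37, 52, 54, 68, 70]  # bursty completions (ms)

last_shown = -1.0
scanout_times = []
for tick in range(1, 6):              # vsync ticks at 16.7, 33.3, 50.0 ms ...
    t = tick * REFRESH_MS
    newest = max((r for r in render_done if r <= t), default=None)
    if newest is not None and newest > last_shown:
        scanout_times.append(round(t, 1))
        last_shown = newest
print(scanout_times)  # [16.7, 33.3, 50.0, 66.7, 83.3] -> evenly paced on screen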
I still have no video card in my rig... T_T
Stuck with a laptop. I'm not sure I can hold out much longer.
LCDs don't refresh like that, though. The refresh rate in hertz is quite irrelevant.
Do CRTs microshutter???
What's a microshutter?
sorry, not 'multi-GPU certified' :lol:
Quote:
Not a problem unless you're dying to have Crysis at 1920 with 4xAA on Very High...
when can we expect benches for these things?
Google "crossfire microstutter" or "sli microstutter", you'll get a plethora of results.
Actually, this pretty much scared me away from xfire just now. Poo!
Basically it's when AFR mode is used and the GPU's put on frames asynchronously.
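To picture what that asynchrony does to frame delivery, here's a toy Python sketch (my own illustration, nothing official): two GPUs each take 30ms per frame, but their schedules sit only 5ms apart, so frames arrive in alternating 5ms/25ms gaps even though the average frame rate looks fine.
Code:
# Toy AFR model: two GPUs alternate frames, each taking 30 ms per frame,
# but GPU 1 starts only 5 ms after GPU 0 (hypothetical numbers).
FRAME_MS = 30.0
OFFSET_MS = 5.0

done = sorted([i * FRAME_MS for i in range(10)] +             # GPU 0 completions
              [i * FRAME_MS + OFFSET_MS for i in range(10)])  # GPU 1 completions
gaps = [b - a for a, b in zip(done, done[1:])]
print(gaps[:6])                          # [5.0, 25.0, 5.0, 25.0, 5.0, 25.0]
print(1000.0 / (sum(gaps) / len(gaps)))  # ~69 average "FPS", yet motion is
                                         # paced by the 25 ms gaps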
Here is a vivid example of micro-stuttering (honestly, that one actually deserves the name "turbo-stuttering"). :rofl:
Wow, that video looks even worse than the explanations of microstutter that I've read; it looks like a poor video card struggling for a few FPS...
Not FPS-related then? :shrug: Oh well, back to the sync problem... probably gets worse the more cards you try to sync.
The FRAPS frame-time log can illustrate it quite well with an Excel graph, but it's pretty much a matter of plain observation.
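If you'd rather not eyeball it in Excel, here's a minimal Python sketch along those lines; I'm assuming the usual FRAPS frametimes CSV layout (a "Frame, Time (ms)" header row, then one cumulative timestamp per frame), and the file name is just a placeholder:
Code:
import csv

def frame_gaps(path):
    """Per-frame delays (ms) from a FRAPS frametimes log."""
    times = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                     # skip the "Frame, Time (ms)" header
        for row in reader:
            times.append(float(row[1]))  # cumulative timestamp of each frame
    return [b - a for a, b in zip(times, times[1:])]

gaps = frame_gaps("crysis frametimes.csv")   # placeholder file name
avg = sum(gaps) / len(gaps)
print("average: %.1f ms (%.0f FPS)" % (avg, 1000.0 / avg))
print("worst 1%%: %.1f ms" % sorted(gaps)[int(len(gaps) * 0.99)])
# A regular short/long/short/long pattern in 'gaps' is the microstutter signature.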
If microstuttering is an AFR problem, why don't they go back to split frame rendering (sorry, I don't know what the technical name for that is)?
SFR = screen tearing, and it can be slower than AFR, although maybe it works now.
Wasn't there a chequerboard method? Then you could have 'super-tearing' :D
Whatever method works best for the particular app, hey?
AFR2 seemed to work best with BF2, M2TW, TF2 and just about any other game I've played... Need for Speed too (with SLI).
When I tried SFR, my FPS were much lower and I got strange artifacts in M2TW.
Newer games are probably all just super and have no problems, 'cept Crysis; any other apps people know of with multi-GPU problems?
I guess they should keep both ways of rendering then. From the video, the tri-SLI GTX 280 stutters pretty badly; it might be less annoying to just deal with the tearing in SFR mode.
I think not.
There are a number of theories about microstuttering, but none have been thoroughly put to the test. There aren't any hard facts to pinpoint the exact problem that's causing it. There is also a great deal of misunderstanding about what microstuttering implies, and I see people labeling pagefiling symptoms as microstuttering.
As it's all still a big gray area, I notice a great deal of people aren't actually talking about the same thing. And to conclude microstuttering from a video is just silly. Normal video framerates are between 24 and 30, remember?
I don't think so, but if it was, how would you fix it? Rewrite the game code to access the pagefile differently? Add more VRAM to the GTX 280? :p:
Quote:
i see people labeling pagefiling-symptoms with microstuttering.
I thought this was the 'explanation':
Quote:
the graphics cards put out frames asynchronously.
OK, when I looked at the video, it sure looked like decent FPS, but yes, I saw the marked stutter or 'sticking', which reminds me of classic bottleneck 'skipping'.
I'd like to see a video of the same tri GTX 280 at slightly lower settings (i.e. higher FPS) and see the result. I suspect you would still see the skipping or stutter due to a syncing problem, but I'm curious to know whether stutter can be eliminated at high FPS, for example.
I guess there are some 4870 CrossFire videos around on YouTube somewhere...
http://youtube.com/watch?v=n8I9goSzo50 - what game is this?
http://www.youtube.com/watch?v=rHfW6J7vHJA - this looked cool
We send people to the moon and can't make smooth gameplay?
:D
Moon is still virgin.
Aww shucks; avoiding Crysis like the plague.
Not really acceptable.
Quote:
normal video framerates are between 24 and 30, remember?
Maybe I should just get a 4850 and stick it with my Opty :p:
and get a copy of Assassin's Creed :D and GRID.
It's not just a Crysis problem; it has been there ever since SLI and CF appeared...
Microstuttering makes all multi-GPU solutions to date, including the 3870X2 and 9800GX2, WORTHLESS.
I did quite a bit of research on this, with both my setup (G92 SLI) and a friend's (9800 GX2); five games benched.
Assassin's Creed and World in Conflict aren't affected by asynchronous frame rendering (microstutter)
Crysis, Lost Planet and Call of Juarez are ABSOLUTELY affected by asynchronous frame rendering.
I have written a few articles about this which caused some massive uproar and got me banned from several forums. Here are some frame-time benchmark results from Call of Juarez.
Frame # | Time rendered (ms) | Frame time (ms) | Momentary FPS
   48   |        753         |       10        |      103
   49   |        769         |       16        |       61
   50   |        784         |       15        |       67
   51   |        790         |        6        |      174
   52   |        814         |       24        |       41
   53   |        832         |       19        |       54
   54   |        838         |        6        |      178
   55   |        859         |       21        |       47
   56   |        877         |       18        |       57
   57   |        881         |        4        |      235
   58   |        906         |       25        |       40
   59   |        921         |       15        |       65
   60   |        928         |        7        |      142
What FRAPS timedemo would show you: 13 frames rendered in 175ms = 75 FPS
However, the real fluidity is quite different from this. You are basically playing at 50-something FPS, with frames rendered at ~150 FPS inserted every third frame. Those third frames with super-high momentary FPS mean NOTHING for your gaming experience. They do not make your game more fluid; they make micro-stutter. In that scene you are playing at 55 FPS at most, while the benchies tell you that you are playing at 75 FPS.
This is why you should NEVER, EVER compare single-GPU scores with AFR'ed (SLI/CF) scores. NEVER. And this is also why ALL of the review sites are massively misleading. "A second 8800GT bumped our score from 40FPS to 65FPS!!" REALLY? But you forget to mention that 65 FPS is not comparable with that 40FPS, don't you? It wouldn't matter even if the scenario was like this, would it:
Frame 1 rendered at T = 0ms
Frame 2 rendered at T = 1ms
Frame 3 rendered at T = 40ms
Frame 4 rendered at T = 41ms
Frame 5 rendered at T = 60ms
Frame 6 rendered at T = 61ms
6 frames rendered in 61ms, so ~100 FPS, huh? Those 20-40ms gaps where you receive no frames from the cards, and which put your real FPS at something like 50 FPS, should apparently be ignored because they don't show up in the benchmark, and all you care about is the benchmarks.
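For what it's worth, here's that example worked through in a few lines of Python (just my own restatement of the arithmetic above):
Code:
# Frame completion timestamps (ms) from the example above.
timestamps = [0, 1, 40, 41, 60, 61]

gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
print(gaps)  # [1, 39, 1, 19, 1] -> frames arrive in pairs, then nothing

# What a FRAPS-style counter reports: frames divided by elapsed time.
print(1000.0 * len(timestamps) / (timestamps[-1] - timestamps[0]))  # ~98 FPS

# A crude "perceived" rate: pacing is set by the longest gaps, so look at
# the worst gap (or a high percentile on longer runs) instead of the mean.
print(1000.0 / max(gaps))  # ~26 FPS -> closer to what the motion feels like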
I'm amazed that, after this microstuttering situation broke out, all the major review sites are still ignoring it and comparing AFR FPS numbers to non-AFR FPS numbers.
DO NOT BUY SLI, DO NOT BUY CROSSFIRE, DO NOT BUY 3870X2, DO NOT BUY 9800GX2, (PROBABLY) DO NOT BUY 4870x2. YOU ARE GAINING ABSOLUTELY NOTHING IN AT LEAST HALF OF ALL GAMES. YOU ARE PAYING TWICE FOR A 1.4X INCREASE IN REAL PERFORMANCE IN HALF OF GAMES. THAT DOES NOT MAKE SENSE.