i bet every tom, :banana::banana::banana::banana: (ffs, richard?:ROTF:) and jane will want one when they go mainstream.
Quote:
Ya, because everyone *needs* a Nehalem based system
They'd better have thought of a better way to do it; they have been working on this for two years, and that has to count for something. Around the time they launched R580, ATI decided they would never do any big chips again. R600 was way too far into development, so they had to release it, and that is also their last big chip. So they basically started working on this multi-GPU concept around the time R580 launched.
@Sr7, it was extremely unlikely that RV770 would have a separate clock domain for the shaders, as this would require a major design change, and their chip is based on RV670 as so many have said. They would have to redesign their shaders to run at much faster clocks than the rest of the architecture, and the shaders would probably get at least twice as big as their current shaders are.
BTW, doesn't power consumption scale relatively linearly with an increase in transistor count, but exponentially with an increase in clocks? Isn't that right?
No, power consumption is much, much harder to predict than that. It depends on which parts are active, the design, electrical properties, etc. Some parts use a lot more power than others and so on (scheduler vs cache, etc.). Just like we have Itaniums at 3x the frequency of GT200, with 2 billion transistors, and yet only 50% of the power consumption. All at 65nm.
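For what it's worth, the usual first-order rule of thumb is that dynamic power goes roughly as capacitance (which tracks the number of switching transistors) times voltage squared times frequency, and since you normally have to raise the voltage to reach higher clocks, power grows much faster than linearly with clock speed. A rough sketch with made-up numbers, just to show the shape of the relationship (none of these figures are real GPU data):
Code:
# Back-of-the-envelope dynamic power: P ~ C * V^2 * f
# C ~ active transistor count, V = supply voltage, f = clock frequency.
# All numbers are illustrative placeholders, not real chip figures.

def dynamic_power(c_rel, volts, freq_ghz, k=10.0):
    """Relative dynamic power; k is an arbitrary scaling constant."""
    return k * c_rel * volts ** 2 * freq_ghz

base        = dynamic_power(1.0, 1.10, 0.75)  # baseline chip
twice_size  = dynamic_power(2.0, 1.10, 0.75)  # 2x transistors, same clock/voltage
twice_clock = dynamic_power(1.0, 1.30, 1.50)  # 2x clock, needs a voltage bump

print(f"baseline:       {base:.1f} (relative W)")
print(f"2x transistors: {twice_size:.1f}  -> about 2x power")
print(f"2x clock:       {twice_clock:.1f}  -> nearly 3x power")
Of course, as said above, leakage and how much of the chip is actually switching matter just as much, which is why real chips (Itanium vs GT200) don't line up neatly with this.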
I don't know whether to buy two 4870 1GB cards in Crossfire to replace my G92 GTS, or hold on to see if this 4870X2 uses something better than Crossfire (and so would not fall victim to the 512MB VRAM limitation or microst st stutter).
John
haha nv monolith fan; still it depends on the games and whether they are cf compatible/friendly.
I would not be surprised if ATi do release a 400- or 480-shader castrated SE version of the RV770, as they have always had those sorts of models in the past (basically the same tech as the big boys but castrated and speed-binned so they can churn them out and sell bucket loads in the 9600GT segment).
I have no idea which of my games are SLi or Crossfire Friendly. I tend to play all sorts of games and love FSAA and AF.
I tend to play:
Crysis (Medium, as High tends to choke my G92 GTS)
Bioshock
GTA San Andreas with the EBS Shader mod (looks quite cool).
Tom Clancy's Advanced Warfighter
Quake 4
All the Half-Life 2 games (I have Episode 2 and many mods)
World in Conflict
Unreal Tournament 3 and 2004
Colin McRae: DiRT (this one struggles on the G92 GTS)
F.E.A.R
Far Cry
Black & White 2
The Settlers: Rise of an Empire
So hopefully they all work nicely with the 4870 1GB Crossfire, OR it is Zotac GTX 280 AMP! time! (and ear plugs)
John
According to ExtremeTech:
Quote:
A Lot Faster But Not A Lot Bigger
Somehow, ATI managed to cram two-thirds more stream processors, more than double the texture units, and improved render back-end units into a chip that's only about 40% bigger than the RV670. And that's all on the same 55nm manufacturing process. You don't typically do that by keeping the same architecture with just a few tweaks here and there, and in ATI's case, they have looked at and re-worked most of the major facets of the chip in order to meet these design goals.
First, there's the memory controller. Gone is the "ring bus" architecture of the 600-series GPUs. In its place is a new distributed controller design, with the memory interface spread out around the edges of the chip, and memory controllers spread throughout the die near the blocks of render back-ends. Comparatively low-volume traffic like PCI Express, display controllers, and inter-chip communication are handled by a centralized hub. Each memory controller block has its own L2 cache block. The net result is, according to ATI, better bandwidth utilization in less die space. Also new is support for GDDR5 memory, which will find its way to market first in the Radeon HD 4870. We previously wrote about the advantages of GDDR5.
This may be our first hint at what is new with the dual-chip "R700" product coming in a few months—these hubs may communicate between two RV770 chips in a fashion that is more efficient than in past multi-GPU boards.
The memory isn't different, except perhaps the amount. The power configuration doesn't matter. A PCIe bridge is needed to expand the single slot.
And yes, in terms of running 2 cards or 1, there is no difference in the way Crossfire works. They both work the exact same way, with the good and the bad things.
Quite sure, and magic doesn't exist. A 3dfx version today would be horrible, I assume. But back then things like 800x600 were badass and quality was a town in Inner Mongolia.
And they state "hubs may communicate" and such. In short, they don't know and are just guessing. And with people that earn money on clicks, guesses usually get affected by it.
Quote:
Quite sure, and magic doesn't exist. A 3dfx version today would be horrible, I assume. But back then things like 800x600 were badass and quality was a town in Inner Mongolia.
Erm, I think you may have misunderstood my post. I agree that a Voodoo5 is rather useless (but an effective paperweight or retro computer art) in today's world.
However, their SLI technique did not cause any microstutter or problems in games for me back in the day, in the year 2000... playing Quake 3, Devil Inside and Half-Life.
John
Well if it's just the same thing as before I'm not interested in the X2. I don't care about what FRAPS says or what 3dmark says, I only care about how the game plays.
Let's just hope that the 4870 is good as a single card because this tactic of skimping on the design and bunching chips together in a microstutteringly fabulous package is not very enticing at all. If they're going to do multi-GPU configurations to compete it better be a good method of combining them.
Microstutter is starting to sound like a feature, imagine this on the box.
- ATI Radeon HD 4870X2 GPU @ 700MHz
- 1024MB GDDR5 Memory @ 3000MHz
- 24x custom filter anti-aliasing (CFAA) and high performance anisotropic filtering and Microstutter™
- PCI Express 2.0 support
- 256-bit GDDR5 memory interface
- ATI CrossFireX Multi-GPU Technology
- Microsoft DirectX 10.1 support
- Shader Model 4.1
- ATI Avivo HD video and display technology
- Unified Video Decoder 2 (UVD) for Blu-ray and HD Video
- Built-in HDMI with 7.1 surround sound support
- On-chip HDCP
:D
Yeah, and these average Joes talking "my microstuttering is better than yours!"
4x Advanced Microstutter engine
:ROTF:
8X with Catalyst AI enabled:rofl:
i have no clue as ive never seen Microstutter....but it gave me a good chuckle.
..and for that matter i never want to see MS (tm:)) either.
Hehe :rofl:
Next up will be Tom's Hardware's VGA Charts where they have a bar for microstutter.
Anandtech's investigation into which Microstutter provides the best strafing jerk effects in games would be an in-depth article.
Followed by X-Bit labs overclocking (getting more stutter out of your microstutter) article.
Oh and last but not least Kyle over at the [H]ardOCP on a rant over how Microstutter is the best innovation in gaming since the Glide API
lol
John
:rotf:
:ROTF:
funny bugger.
I can imagine these stickers on boxes
"Advanced Microstuttering Ready"
http://en.wikipedia.org/wiki/Glide_API
glide; never heard of it; i musta been too busy at university around then
haha i know enuf to know that i know very little :)
You were in uni and I was in Grade 6 in 1999!
oh hang on 1999...oh..no i was working by then; no time for games, too busy; i finished uni in '95.
AdvancedMicroStutterDevices
Today announced the new Phenom processor. Hector said that this pushes new boundaries, providing more Advanced Microstutter per clock than the competing Intel Core 2 Quad cores.
Now that ATi (AMD's graphics arm) also produces Advanced Microstutter Ready cards, we have the perfect solution for gamers wanting their Microstutter fix.
In other news, nVidia's plan to counter Microstutter with the GX2 and 795S (S representing Stutter) seems to look quite powerful.
All eyes turn to Intel for their lack of response (rumours suggest they are perfecting microstutter in Ray Tracing).
Joking aside, it would be amazing if nVidia and ATi produced an option in the drivers to reduce stutter at a compromise of 10% (at most) in performance. It would certainly make Crossfire/SLI worthwhile if you could have the fluidity and consistency of a single GPU on a multi-GPU setup.
Ahh adamsleath
Glide kicked serious buttocks back in the day, I remember the Original Unreal and UT on a Voodoo2 and Voodoo3. wow
Oh and Warzone 2100 as well as Need For Speed !
John
ehh, you'd be surprised at how many people here think that Phenoms offer smoother performance compared to Conroe (even though the framerate is lower)
:rofl:
Quote:
795S (S representing Stutter)
but geez i was playing UT with an onboard ATI vid on a P2...at lanparty (ie room full of geeks)
and it was fun, aswell as team fortress of course :D back around then.
and i didnt even know what an fps was, or lag for that matter.
...and then everyone became a counterstrike junkie.... but ut and tf more fun; cs waiting for friggin respawn no thanks.
Well, two years of planning, developing and designing this chip with an R700 card in mind would have to count for something, is my guess.
Something else that's weird is the fact that the R700 card is shorter than the R680 card, even though it probably consumes a bit more power. This could be done by doing away with the bridge chip or putting a smaller one there. If there is a bridge chip on there, then it sure as hell is not the same bridge chip as the PLX chip on R680.
The last few pages are epic lol, either really funny or really pitiful :D
It's very disappointing to see people exaggerating that microstuttering thing. Everyone read one small article and for some reason all started noticing it... even those who never used multi-GPU solutions :rolleyes:
I noticed the microstutter on my 9800GX2, and in most cases I could play through it, but it would get worse if I ran a 3D app in a window. Good thing nvidia released the GTX 280 so I could step up.
I don't mind others being concerned about it; it might mean that I could get a cheaper 4870X2 if people still think it's not good enough for them :p:. I don't have a multi-GPU set-up and I would be better off getting a faster single card than an extra GPU atm (using a 7800GT :shrug:). If I ever have the money and the need for more graphical horsepower, then I may decide on getting a CFX set-up, as I couldn't care less about that micro stuttering.
I could see how that argument could hold water. Just before I explain my theory, I would like to warn you that I am an Intel fanboy and will never buy AMD... ever!
OK, in some games which are GPU limited but have a few scenes which are CPU limited, the Core 2 will have very variable frame rates, yet the Phenom would be consistently low.
I am sure a lot of people would agree that if your framerate fluctuated from 30fps to 60fps you WOULD notice the difference (you get moments of fluidity and then moments of semi-fluid frames). It would be annoying enough that even a constant 30fps with no fluctuation, OR framerates closer to the average (i.e. the 30 to 40fps range), would feel smoother in this circumstance.
Therefore in SOME games a Phenom may be smoother than a Core 2, BUT as soon as you upgrade your GPU (to remove the GPU limitation) the Core 2 will be far superior, as it would handle the CPU-limited areas better than a Phenom, which lacks venom.
adamsleath
Back in the day I had a Pentium II 400MHz with an i740 AGP coupled with two Monster 3D Voodoo2s (12MB). That was a good PC for Unreal and UT. I can tell you there was no microstutter back then! I then upgraded to a Pentium III 600MHz with a Voodoo3 and then onto a 1GHz Pentium III with a Voodoo5.
Ahh those were the days.....
I am now deciding whether to get two 1GB 4870 cards, a Zotac GTX 280 AMP!, or hold on until this semi-magical alien technology R700 dual-GPU ATi solution.
It's a tough one. I know that I would be happy with either of the first two, perhaps even more so with the GTX 280 if it did not cost so much money (I mean seriously, the high end always used to be around the £300 mark!!)
In the UK two 4870 cards will cost the same as a Zotac GTX 280 AMP!, as the 4870 price was leaked for an hour at £210 inc VAT = £420 for two.
John
JohnZS, stay on topic or stay out :up:
Now where are the die shots of the R700 :stick:
Perkam
Sry gom, but I see it on a daily basis. Crysis, GRID (although much better after the hotfix), Vegas 2, and Mass Effect all stutter when "strafing".
But when I drop the cpu to 3GHz, it's not so bad. GRID was really annoying... nothing like going down a straight @ 300kph and all of a sudden it seems like 500kph...
Anyway, 3 months ago... almost every app had some issues. Today, 3DMark06 and Vantage still show it, as do the above-mentioned apps. Vantage, in the space test, when the camera goes down the side of the ship and then into the meteor field, will report frames at 40-70FPS, but it looks like 12FPS. When it drops down to 30FPS or lower, it smooths out.
Now, I do not believe all of these are stutter, per se, but there is a definite issue. And the issue is less with less cpu speed. :shrug:
Oops, sry Perkam, but you posted after I had already started my post.
Well, I've seen a factory overclocked 9800GX2 for £289 + £20 delivery
http://www.abtron.de/shop/catalog/pr...ipk96brgstuhu5
And GTX 260 for £240
http://www2.computeruniverse.net/products/e90268519.asp
GTX 260 would be my choice, overclocking to GTX 280 performance is possible.
That's like claiming SLI should be ultra perfect now after all the years of development and "field testing". Is it? No.
A shorter or longer card doesn't change anything. And I am sure a newer PLX chip can save some space if that was the issue. But we are simply talking PCIe 2.0 switching here: zero effect on the microstutter. The issue is the fundamentals of AFR. So until nVidia and AMD drop that, you won't see any fix. However, they won't, because AFR gives the most performance/scaling.
And why would they drop AFR? They could also use something akin to vsync to even out the frames.
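For what it's worth, that "something akin to vsync" idea is basically frame pacing: instead of presenting each AFR frame the instant it is finished, the driver would hold early frames back until roughly one average frame interval has passed. A toy sketch of the idea only (this is not how any actual driver is confirmed to work):
Code:
import time

# Toy frame pacer: delay uneven presents so frames go out at a steady
# cadence based on a moving average of recent frame intervals.
class FramePacer:
    def __init__(self, smoothing=0.9):
        self.avg_interval = None   # seconds
        self.last_present = None
        self.smoothing = smoothing

    def present(self, now):
        if self.last_present is not None:
            raw = now - self.last_present
            if self.avg_interval is None:
                self.avg_interval = raw
            else:
                # Exponential moving average of the frame interval.
                self.avg_interval = (self.smoothing * self.avg_interval
                                     + (1.0 - self.smoothing) * raw)
            target = self.last_present + self.avg_interval
            if now < target:
                time.sleep(target - now)  # hold the early frame back
                now = target
        self.last_present = now

pacer = FramePacer()
for gap in (0.008, 0.032, 0.008, 0.032):  # uneven AFR-style gaps, in seconds
    time.sleep(gap)                       # pretend rendering took this long
    pacer.present(time.time())
The catch is the trade-off mentioned earlier in the thread: you add a little latency and shave a few fps of measured throughput in exchange for more even frame delivery.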
@JohnZS: Try comparing AFR and scan-line interleaving. It would simply be horrible performance-wise to use the second today. Not even to mention the endless issues back then with tearing, artifacts, etc.
In short, it's always been a painful experience with multi-GPU.
Can you explain why it's AFR at fault and not synchronization of output frames? :confused:
I fail to see how the actual rendering method affects stuttering :shrug:
Same here :clap:
I've been using a 3870X2 since release and haven't seen this micro-stuttering that all these nvidia fanbois are whining about, and they will never have used a 3870X2 :rofl:
Crysis, COD4, GRID, DiRT, Sega Rally Revo, and the list keeps going, with games where I've never seen this so-called crippling disease that multi-GPUs generate. Probably an nvidia trait, but that means I will never ever experience it :up:
You need to watch it there, Cranky, as you're verging on ignorant fanboyism yourself.
As for the issue at hand, it happens with *all* current multi-GPU setups, the 3870X2 included. It varies in severity, and most people won't notice it unless they make a point of looking out for it, and even then it's usually not a show-stopping issue. It is more prevalent at lower framerates, as has been mentioned, and this is largely why it isn't noticed much in most cases, as multi-GPU configurations usually result in high framerates.
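To put some rough numbers on the low-framerate point: with AFR the two GPUs tend to deliver frames in uneven pairs, and the same percentage of unevenness means a much bigger absolute gap at low fps. A quick illustration (the 70/30 split is just an assumed example, not a measured figure):
Code:
# Assume AFR delivers frames in uneven pairs: one gap is 70% of the
# pair time, the other 30% (an assumed split, purely for illustration).
def afr_gaps(avg_fps, split=0.7):
    pair_ms = 2 * 1000.0 / avg_fps   # time taken by two consecutive frames
    return split * pair_ms, (1.0 - split) * pair_ms

for fps in (120, 60, 30):
    long_gap, short_gap = afr_gaps(fps)
    print(f"{fps:3d} fps avg -> gaps of {long_gap:.1f} ms and {short_gap:.1f} ms")

# 120 fps: ~11.7 ms vs ~5.0 ms -- hard to notice
#  30 fps: ~46.7 ms vs ~20.0 ms -- the long gap feels like ~21 fps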
I've used both 7900GT SLI and now 8800GTS 512s for the last 6 months and never have found it to be all that noticeable. I've also played around with some 3870/X2s for a few weeks and it was the same deal, it's there but I honestly wasn't irritated with it. People are blowing this *WAY* out of proportion and I think there are some severe cases of OCD here.... :p:
Yes, it would be nice if it didn't happen at all, but I personally don't find it to be that big of a deal like many of you seem to think it is. People have used multi-GPU for years now, and it seems that only recently has this come to everyone's attention, so it sounds like more of an excuse not to go multi-GPU than anything.
If the stuttering is *that* bad you could try changing the name of the executable to one from a game in which AFR is not used by the ATI/Nvidia drivers.
Well, when it comes to stutter, I definitely am not an nVidia fanboi; please feel free to check my hwbot to verify :rofl:
Anyway, video of stutter in Crysis:
http://www.youtube.com/watch?v=-5SXeR0torc
Now, if it doesn't exist, please kindly explain what's going on? You seem to know all the answers, Cranky, so what's the story here? I'd love a fix if you have one...
it would be better described as micropausing by that video.
I've had macropausing.. it was driver related.. both on ATi and nVidia.
i didnt notice anything. (in that youtube video)
i think you have to make sure you enable microstutter in catalyst
:|
:D
:ROTF:
Quote:
spouted garbage like this thread has attracted
^-well that's reassuring; what drivers are you running?
Quote:
Agreed. I've been using SLI/CF solutions for the last 9 months and I have yet to notice any stuttering or screen tearing. A bunch of hypochondriacs the lot of you are
Same here
Using 3870X2 since release and havent seen this micro-stuttering that all these nvidia fanbois are whining about, and they will have never used a 3870X2
hypochondriac, hope so, but im really turned off multigpu solutions when i hear about problems.
also dont know whether it has occurred with nvidia sli.
you are suggesting problem may be in specific sli/cf profiles? anyway i see the logic...maybe
Quote:
you could try changing the name of the executable to one from a game in which AFR is not used by the ATI/Nvidia drivers.
Yeah, I have to totally agree. But it's a good example of what to look for. Behavior such as this can all be classified as stutter to me...measured framerate does not accurately depict displayed framerate onscreen.
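That mismatch is also easy to show with numbers: average fps hides the spread of individual frame times. If you log per-frame render times (FRAPS, for example, can record frametimes), a quick script makes the difference obvious. A sketch with hard-coded example values rather than a real log:
Code:
import statistics

def summarize(frametimes_ms):
    """Summarize a list of per-frame render times in milliseconds."""
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    spread = statistics.pstdev(frametimes_ms)
    worst = max(frametimes_ms)
    return avg_fps, spread, worst

smooth  = [20.0] * 10        # steady 50 fps
stutter = [8.0, 32.0] * 5    # same average fps, alternating short/long gaps

for name, run in (("smooth", smooth), ("stutter", stutter)):
    fps, spread, worst = summarize(run)
    print(f"{name:8s}: {fps:.0f} fps average, frame-time stddev {spread:.1f} ms, "
          f"worst frame {worst:.0f} ms")
Both runs report 50 fps, but the second spends half its time at an effective ~31 fps cadence, which is exactly the "reported 40-70FPS but looks like 12FPS" sort of complaint.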
I've yet to get possession of my new place, so I only have Mass Effect, Crysis, and my Steam titles to play ATM... I chose Crysis because the disc was in my drive already, and after reading Cranky's post, I quickly snapped a video. I could make many more vids showing many issues like this, and other strange behavior when running multiple cards. I can throw more vids up in the next couple of weeks... but I gotta move and get unpacked first!
In regards to drivers...all of them do it. I've tried every driver that supported Crossfire since day 1, have been running Crossfire X850's, X1900's, X1950's, HD2900's, HD3870x2's, and now 4850's(until 4870's come, and then 4870x2's).
Cranky, in regards to specs: ASUS Maximus Formula SE, RD600, XBX1 and 2, MSI P35, ASUS P5K, DFI BloodIron, E6600 @ 3.6GHz, E8400 @ 4.25, Q6600 @ 3.6GHz, QX9650 @ 3-4.5GHz (less cpu speed = less stutter). RAM doesn't matter (got all versions of DDR2 - ProMOS, Elpida, Samsung, Hynix, Micron - 1-8GB makes no difference), Enermax Galaxy 1000W DXX, PPC Crossfire 750 (red one). XP Home, XP Pro, Vista 32 and 64, nothing matters. In Crysis, I will say that this problem is less at 16:10/16:9 resolutions in comparison to 4:3 (which the video was shot in, 1280x1024). Whole system is water cooled, nothing tops 40C.
There's no way there's a cpu/gpu limit here... low res for the video, and higher res plays much better. It really does seem driver related, and I'm sure it has to do with either memory management or traffic scheduling, as common resolutions exhibit this behavior far less than uncommon ones... 1680x1050 is nice, 1920x1200 is running into gpu limits, 1440x900 is like butter, but jaggy.
Believe me, I'd love nothing more than for issues like this not to be a reality.
I've spent a lot of money on hardware chasing decent rendering on my monitor... the big bastard is so damn fantastic, and so friken' crappy at the same time... :shakes:
Now what I'm wondering is how much its TDP is. The HD 4870 has 160 watts, so is this 320 watts? How much can you lower that with some tweaks? 20 watts?
And yet you persist, without knowing what it is you whine about. :yepp:
I've used ATi drivers as they are released.
If you're turned off multi-GPUs, then do us a favour and go buy a GTX 280.
So despite what others have stated in this thread, you're unsure whether this described problem affects nvidia? :clap:
Just a way to show that the directional button was pressed all the time, and having full view of the screen. Had it plugged in because I've been playing GRID and Trials2 with it ;)
It looks like you have no hardware probs causing that problem; maybe I'm not noticing due to gaming mainly at 1650 or 1920.
It does appear as if the res being too high is causing the variations in speed, but we know that's not the case. Do the fps drop when this is happening?
I wonder if today's LCD monitors are adding to it?
Does it do it with a CRT too, I s'pose?
But don't beat yourself up over that prob dude, just grab the new X2 when it's released like the rest of us who enjoy the fastest cards, if only for a little while ;)
Well, the hotfix driver fixed GRID a lot; it's still there, but far less perceptible (was using the DiRT .exe name to get Crossfire working in 8.5 and 8.6). So while it is an issue that plagues multi-GPU platforms, it is an issue that I know will be resolved in the future... 3870s are affected by this the most, so I truly think I may not be far off in my assumption of the cause. (I'll just step out here now and say the same happens on my 8800U SLI rig, but nowhere near to the same degree as shown in the video, and not in Crysis... at least not in a way that I notice it as much.)
While this may be the case, a case in point would be the 3870X2... the bridge chip consumes power too, and is not present on single cards. Also, higher data flow to two GPUs over one slot creates a higher draw through the PCI-E slot for chipset communication (double the data means double the power draw... yes, still slightly less, but still double the bus traffic). "R700" will most likely NOT be named 4870X2... unless they are purposefully binning chips for these cards (more than possible, chips have been out since like March).
because problems obviously exist,
Quote:
And yet you persist, without knowing what it is you whine about.
and you've just instigated more whining.
go check out the 4850 CF thread and see if you can actually help with the problems :lol:
your 3870 works great; that's just super; plenty of people around who are experiencing problems for one reason or another.
im curious and interested to know the solutions to the problems - PRIOR to buying into it.
..and i do get a larf out of things that dont work, perhaps because ive been down the sli road and i thought it was garbage.
well, i assume it is or else there would be no point in posting it.
but who and how to solve his problem?
and no, im not really interested in arguing; im more interested in the solutions to multigpu problems.
Quote:
If your turned off multi gpu's, then do us a favour and go buy a gtx280.
So despite what others have stated in this thread your unsure whether this described problem effects nvidia?
and i think gtx280 is too expensive/overpriced.
and yes im thinking about multigpu...but i want to guarantee that i do not run into problems.
or else i would not be viewing this thread.
just trolling for the hell of it.
:rolleyes:
go try and peddle somewhere else; there are plenty of wanna be 2-bit-salesmen around
Quote:
go buy a gtx280
why dont you go and buy a couple of 4850's :rolleyes:
Ok, I might be totally wrong, but that stutter seemed in perfect sync with the gun-swing (weapon-bob) resulting from the character using legs to walk instead of wheels.
Could it then be that this micro-stutter is the result of the game strangely simulating the walking motion? (The in-game view being fixed to the character model and so on...) :confused: I mean, try strafing in real life and you should see some micro stutter as well, right? Probably the same for forward walking, only the human brain smooths it out for ya'. (The same function, though, is not active when you're sitting in front of the PC. ;))
That's progress for you. I remember Wolfenstein 3D "glided" through the nazi chambers really smoothly....:rolleyes:
This arguing about microstutter reminds me of the bull:banana::banana::banana::banana: that always comes up when someone says the human eye can't see over 30fps or 60fps, depending on which idiot is saying it that week. Bottom line is, not all eyes are created equal and not everyone is sensitive to the same things. I know people who use a 60Hz CRT 8 hours a day, 5 days a week and see no flicker, while I can't even look at one for more than a few seconds without going crazy. People use the default mouse settings in Windows their whole lives, while the first thing I do when using a different computer is disable acceleration. At the same time you have people who say that 60fps is the most you will ever need, while I see and feel a difference up to 120ish and maybe even more (I use at least 120Hz for first-person shooters). Just because you don't see or feel the microstuttering (or whatever you want to call it) does NOT mean it doesn't exist and that it doesn't bother others. When we are spending hundreds of dollars on these GPUs, we have every right to want them to work how we see fit.
I thought the same too, but know better, as my 2900s stutter as well, though nowhere near as badly, and 1680x1050 (a higher res than the vid, which was 1280x1024) plays fine without that "stutter". I'm sure many people could pipe up and tell you that this is not the behavior of the game...
@shiznit93 & cadaveca
Sure. No problem. :p:
Just my :2cents:... ;)
cadaveca - have you tried adjusting the PCI-E frequency? Maybe the swap file is playing a part somehow?
The HD 3870 had a TDP of 105W, then the HD 3870X2 had a TDP of 190W, so you get the 20 watts I'm talking about. If you check NVIDIA's 8800GT against the 9800GX2 you see the same range of optimization.
Then for the HD 4870 you have 300 watts or so; there is no way they can lower it by 70 watts to get it to 250 watts, and that is already a lot of heat to dissipate.
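Just to make that arithmetic explicit, using the TDP figures quoted above (carrying the same 20W saving over to the new generation is purely an assumption):
Code:
# TDPs as quoted in the posts above, in watts.
hd3870, hd3870x2, hd4870 = 105, 190, 160

saving_last_gen = 2 * hd3870 - hd3870x2      # 210 - 190 = 20 W
x2_estimate = 2 * hd4870 - saving_last_gen   # 320 - 20  = 300 W
needed_for_250 = 2 * hd4870 - 250            # 70 W of savings required

print(f"3870 -> 3870X2 saved about {saving_last_gen} W over two single cards")
print(f"same saving applied to two 4870s -> roughly {x2_estimate} W")
print(f"hitting 250 W would instead need {needed_for_250} W of savings")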
LoL... Stevil... you got my old dualie out of my BP6... you know I've been doing this a while... So yes, the PCI-E frequency has been changed, with no noticeable difference.
Also note, pls, that as I have mentioned before... that video is with the in-game res set to 1280x1024... but at 1680x1050 it's barely noticeable... so your comment about the swap file makes a wee bit of sense; however, it's the game itself, or the driver interaction, that makes the problem as evident as it is.
I mean really... I don't play @ 1280x1024... the video is only an example of the behavior.
GRID... in corners... when the actual rendering is far less than on a straight (due to draw distance)... it happens on the straights too, but nowhere near as pronounced...
It really makes me think that how the game is being rendered is the issue... framerate will be 100+... it appears like 20FPS... seems like as data flow to the card increases, stutter rears its ugly head... like the data flowing in the card itself is stalling... maybe it's the data on the pci-e bus... I dunno.
Anyway, I'll end this here for now... maybe I'll start a thread on the issue at a later time, but I'll leave off with this:
The problem IS going away as drivers are updated. :fact: Why, or how, I dunno... but it truly seems the OEMs are aware of the issue and are working to rectify it. We can hope that R700's multi-GPU communication implementation will overcome this problem, or be a fix, but until the card comes out, we won't know.
I also have a feeling that the driver for R700 is what will hold the card back. I do not think AMD wants another release fiasco like UVD on their hands, so I remain confident something will happen.
Yeah, data flow was my thought too ;)
maybe EMI is causing corrupted data flow between the cards?
Not my area, but I wouldn't think so. Seems like bad latency between the cores' data, but I doubt drivers can address that.
4870X2 = GDDR3 = less power consumed than 2x 4870.
GDDR5 consumes less power than GDDR3 so why would they step back and use GDDR3? Besides, GDDR5 provides more bandwidth
That would make it more of a 4850X2 though, if they opted for GDDR3, but why would they do that? The 4870X2 will be the top product, so the price of GDDR5 can't really be the issue (and it doesn't use more power and is faster).
I think they will just bin the 4870 chips and run them at lower volts to keep tdp somewhat under control.
I've heard even 4890
The 4870X2 SKU name is just made up by us, since R680 used it, but a different name might be telling as to how much it is actually like the X2.
Sorry for thread jacking.
Does anyone know if there would be 2GB versions of the R700?
Also (and although this might be slightly off topic again, sorry), did anyone see the article where 4x HD4850s in CrossfireX mode were slower than 3, and in some cases slower than 2, HD4850s here?
Perhaps the R700 is already taped out somewhere but is not being released due to the driver issues with 4 GPUs?
John
I've heard late July availability, so it's likely taped out already. Driver patching-up is likely a big part of the issue, as well as production of cores + memory.
Limited GDDR5 quantities won't be too big of a problem if all the indications that the 4870 is already on sale and being launched June 25th are true.