Wait, why is it unfair to compare a one-GPU to a two-GPU solution (à la GTX 280 vs 4870X2)?
What about when dual cores first came out? Were they not compared to (and favoured over) their single-core predecessors?
To be fair though, if the GTX 280 were dropped to, let's say, $399, that's $150 less than the projected 4870X2 2GB price. Yes, the 4870X2 will often be faster, but with that much of a gap in price you may be surprised how many people opt for the 280 instead (those who can't afford a $549 GPU, which in reality is a lot of people). Performance-wise the 4870X2 would be the better buy, but the fact remains you NEED that money to take advantage of the better buy.
Unless AMD drops 4800 prices soon, the 4870 1GB will only be $40-50 less than a 280 at $399 (I'd be surprised by that drop, though). I'd predict a $399 280 would push the 512MB down to $249 and the 1GB to $299. At current prices I wouldn't consider the 4870 512MB competitive if the 280 reached that point.
Considering Nvidia no longer has a more expensive high-end solution like they've been accustomed to (8800 GTX, Ultra, 9800 GX2 etc.), I'd assume the 55nm refresh / X2 counter will surely reclaim that top price point (in profits at least; performance remains to be seen).
I'm curious about the Hynix memory on these instead of the Qimonda on the 4870. I wonder how they OC. I am cautiously optimistic. :smoke:
So going by that, ahh, "logic" (if you can call it that), k|ngp|n's / fugger's etc. 3DMark & Vantage scores don't impress you, or at least shouldn't, because they used multiple cards *and* multiple GPUs on those cards at that!
Tsk tsk tsk, how unimpressive of them to obtain such high scores using so many cards/GPUs!
It's just as unimpressive as using a 4870X2 to obtain more fps in a game! Oh the shame!
Touché on the usefulness; you were more useful checking out Win7, stick to that!
Psssst: 4870X2 > GTX280. EOD.
Can you leave this thread already? I hate to engage in ad hominem, but all you're doing right now is vehemently defending your early [now regrettable] waste of money on your GTX 280.
AMD has already redefined their high end solution as one that uses two chips on one PCB. It takes up two slots, just like the GTX280. No one cares if the GTX280 core is the better one.
At this point in time, for anyone looking to buy a single, two slot video card, the recommendation is the R700.
End of discussion.
This sounds like the old Athlon X2 vs two P4s on an MCM argument.
Doesn't matter how many cores or GPUs are on the PCB. If one beats the other and is the same price, then that is probably the better card.
BTW, is there a 4850x2 coming out?
Yes, scheduled in September.
According to rumours the HD 4850X2 is coming out around mid-September. No one knows the exact date, but around then, I think. Which partners will offer cards, and how many, is unknown.
Yeah, it's not impressive; that's why Nvidia decided to drop the GTX 280 price by $200 (and it's been out for like 2.5 weeks), and will probably drop it more in the future. :rolleyes:
Quote:
Originally Posted by [cTx]Warboy
GTX 280 wins in Single-GPU Solutions. Hands down.
HD 4870x2 wins in Single-Card Solution. But is not impressive by my book.
You know what I think is not impressive? A $650-$700 card (oops, $500 now!) that couldn't outperform a $300 card by a wide margin. What was it again? The HD4870 has 80-90% of GTX 280 performance?
And is NVIDIA never going to release another driver?
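To put that in perspective, here's a quick perf-per-dollar sketch in Python, using the figures floating around this thread (the 80-90% performance figure and the $650/$500/$300 street prices; all ballpark, none of it measured by me):
Code:
# Rough perf-per-dollar using the figures quoted in this thread: HD 4870
# at ~$300 with 80-90% of GTX 280 performance; GTX 280 at $650 launch,
# ~$500 after the cut. Illustrative numbers only.
cards = {
    "GTX 280 (launch)":     (650, 1.00),  # (price in USD, relative perf)
    "GTX 280 (after cut)":  (500, 1.00),
    "HD 4870 (80% figure)": (300, 0.80),
    "HD 4870 (90% figure)": (300, 0.90),
}
for name, (price, perf) in cards.items():
    print(f"{name:22s} {perf / price * 1000:.2f} perf units per $1000")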
Going off the reviews of 4850s in CF, it will. The question is what price they'll put on it!
How about you silence your waste of lip; I don't have to leave the thread. I have as much right as you to be here. I've said it before: I'm not here to justify buying my video card, because it's my personal choice. I haven't brought up anything about what I bought or what I paid for it, so don't make claims against me. I'm also allowed to voice my opinion as long as I stay on topic and follow the forum rules. You don't like it? Too bad.
Probably the wrong thread or forum even to ask this but it is sort of related :)
I'm hoping to upgrade from my 2900 XT in time for CoD5 coming out in November. While the 2900 was a joke for its price, it hasn't let me down in the CoD series of games (what I play predominantly).
So the question is what would be the better choice among the 4870, 4850X2, or the GTX 280.
Each has its own strengths, but I would love the 4850X2 if the price is right. What concerns me is not the microstuttering as such, but whether rendering in CF affects response times in game (multiplayer, remember). I've no doubt all the cards will make a solid 100-125 FPS with a gaming config turning graphics down, so FPS isn't the issue, just responsiveness in CF.
So, I suppose: will CF affect gameplay response times, and if so, would the 4870 or GTX 280 be the better choice come Sept/Oct?
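(For a rough sense of scale on that: AFR, the alternate-frame-rendering mode CrossFire and SLI use, typically buffers about one extra frame, so the added input lag is on the order of one frame time. A back-of-envelope sketch; the one-frame queue depth is an assumption for illustration, not a measured figure:)
Code:
# Back-of-envelope: assume AFR adds roughly one frame of buffering, so
# the extra input latency is about one frame time at a given fps.
for fps in (60, 100, 125):
    extra_ms = 1000.0 / fps
    print(f"{fps:3d} fps -> roughly {extra_ms:4.1f} ms of extra AFR latency")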
Quote:
Super Mega Analysis:
Benchmarks from various sites
http://www.anandtech.com/video/showdoc.aspx?i=3354&p=4
Anandtech's Benchmarks
Age of Conan 2560x1600 4xAA/16xAF
$499 4870 X2 : 44.8
$499 GTX 280 : 22.7
Crysis 1920x1200
$499 4870 X2 : 39.1
$499 GTX 280 : 34.3
Oblivion 2560x1600 4xAA/16xAF
$499 4870 X2 : 50.3
$499 GTX 280 : 36.8
GRID 2560x1600 4xAA
$499 4870 X2 : 84.2
$499 GTX 280 : 40.6
========================================
http://techreport.com/articles.x/15105/3
TechReport's Benchmarks
Half-Life 2: Episode 2 2560x1600 4xAA/16xAF
$499 4870 X2 : 84.6
$499 GTX 280 : 65.5
Quake Wars 2560x1600 4xAA/16xAF
$499 4870 X2 : 100.2
$499 GTX 280 : 74.2
Crysis 1920x1200
$499 4870 X2 : 24.8 average / 17 min
$499 GTX 280 : 20.3 average / 17 min
Race Driver: GRID 1920x1200 4xAA
$499 4870 X2 : 111.9 average / 77.0 min
$499 GTX 280 : 67.6 average / 56.0 min
========================================
http://www.driverheaven.net/reviews....d=588&pageid=1
DriverHeaven's Benchmarks
Call of Duty 4 2560x1600 4xAA/16xAF
$499 4870 X2 : 90 average / 49 min
$499 XFX GTX 280 XXX : 65 average / 33 min
World in Conflict 2560x1600 4xAA/16xAF
$499 4870 X2 : 61 average / 47 min
$499 XFX GTX 280 XXX : 44 average / 38 min
Half-Life 2: Episode 2 2560x1600 8xAA
$499 4870 X2 : 110 average / 30 min
$499 XFX GTX 280 XXX : 69 average / 29 min
Minimum FPS
The R700's minimum fps are very impressive.
http://techreport.com/articles.x/15105/3
Crysis 1920x1200
$499 4870 X2 : 24.8 average / 17 min
$499 GTX 280 : 20.3 average / 17 min
Race Driver: GRID 1920x1200 4xAA
$499 4870 X2 : 111.9 average / 77.0 min
$499 GTX 280 : 67.6 average / 56.0 min
http://www.driverheaven.net/reviews....d=588&pageid=1
Call of Duty 4 2560x1600 4xAA/16xAF
$499 4870 X2 : 90 average / 49 min
$499 XFX GTX 280 XXX : 65 average / 33 min
World in Conflict 2560x1600 4xAA/16xAF
$499 4870 X2 : 61 average / 47 min
$499 XFX GTX 280 XXX : 44 average / 38 min
Half-Life 2: Episode 2 2560x1600 8xAA
$499 4870 X2 : 110 average / 30 min
$499 XFX GTX 280 XXX : 69 average / 29 min
Last-gen multi-GPU vs latest-round multi-GPU
http://techreport.com/articles.x/15105/3
TechReport's Benchmarks
Half-Life 2: Episode 2 2560x1600 4xAA/16xAF
$499 4870 X2 : 84.6
$499 GTX 280 : 65.5
9800GTX SLI : 75.4
9800GX2 : 66.4
Quake Wars 2560x1600 4xAA/16xAF
$499 4870 X2 : 100.2
$499 GTX 280 : 74.2
9800GTX SLI : 82.3
9800GX2 : 69.6
Crysis 1920x1200
$499 4870 X2 : 24.8 average / 17 min
$499 GTX 280 : 20.3 average / 17 min
9800GTX SLI : 28.1 average / 21 min
9800GX2 : 24.7 average / 21 min
Race Driver: GRID 1920x1200 4xAA
$499 4870 X2 : 111.9 average / 77.0 min
$499 GTX 280 : 67.6 average / 56.0 min
9800GTX SLI : 88.2 average / 48 min
9800GX2 : 85.5 average / 72 min
4870X2 vs GTX 280 scaling
Age of Conan : twice as fast as the GTX 280
Crysis : roughly equal
Oblivion : 37% faster
GRID : twice as fast
HL2 Episode 2 : 30% faster
Quake Wars : 35% faster
COD 4 : 40% faster
World in Conflict : 38% faster
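Those percentages can be re-derived straight from the averages quoted above; here's a quick sanity-check script (numbers copied from the benchmark lists; the "roughly equal" Crysis call leans on the tied 17 fps minimums at TechReport, since the averages show a modest gap):
Code:
# Re-derive the scaling list from the averages quoted above
# (Anandtech / TechReport / DriverHeaven figures, copied verbatim).
results = {
    "Age of Conan":      (44.8, 22.7),
    "Crysis (AT)":       (39.1, 34.3),
    "Oblivion":          (50.3, 36.8),
    "GRID":              (84.2, 40.6),
    "HL2 Episode 2":     (84.6, 65.5),
    "Quake Wars":        (100.2, 74.2),
    "Call of Duty 4":    (90.0, 65.0),
    "World in Conflict": (61.0, 44.0),
}
for game, (x2, gtx) in results.items():
    print(f"{game:18s} 4870 X2 ahead by {100.0 * (x2 / gtx - 1.0):5.1f}%")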
Power consumption
Total system power consumption, taken from TechPowerUp:
4870 : 274 W
GTX 280 : 318 W
4870X2 : 436 W
GTX 280 SLI : 564 W
Now, if anyone were buying a power supply for GTX 280 SLI, they really would need a good 750 W+ PSU. A Corsair 750 W, which comes with 4x 8-pin PCIe connectors, is really good at $110.
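A crude way to sanity-check that PSU recommendation from the wall figures above (the 30% margin is my own rule-of-thumb assumption, deliberately conservative since actual DC draw is lower than the wall reading):
Code:
# Size the PSU to the wall draw plus ~30% margin (assumed rule of thumb).
MARGIN = 1.30
for setup, wall_w in {"HD 4870": 274, "GTX 280": 318,
                      "4870 X2": 436, "GTX 280 SLI": 564}.items():
    print(f"{setup:12s} {wall_w} W at the wall -> ~{wall_w * MARGIN:.0f} W PSU class")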
Micro stutter
According to some results the microstuttering problem has been solved, but I need to look at more results before I'll commit to that line.
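For anyone who wants to check for microstutter themselves instead of waiting for more results, a minimal sketch of the usual frame-time approach (FRAPS can export per-frame times; the sample numbers below are hypothetical, not a real capture):
Code:
# With AFR, frame times tend to alternate short/long even when the
# average fps looks fine. The sample data here is made up for illustration.
frametimes_ms = [12.0, 28.0, 11.5, 29.0, 12.5, 27.5, 12.0, 28.5]

avg = sum(frametimes_ms) / len(frametimes_ms)
deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]

print(f"average frame time: {avg:.1f} ms (~{1000 / avg:.0f} fps)")
print(f"mean frame-to-frame swing: {sum(deltas) / len(deltas):.1f} ms")
# A swing that is a large fraction of the average frame time suggests
# microstutter; a swing near zero suggests evenly paced frames.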
Overclocking
DriverHeaven used a GTX 280 clocked at 670 MHz core and it still couldn't beat a 4870X2. As for overclocking the 4870X2 itself, I still have to see more results.
Games with known CF issues
S.T.A.L.K.E.R. : amazing, Crossfire scaling has been fixed in this game: http://www.legitreviews.com/article/745/5/
The Witcher : unknown
World in Conflict : solved, look at the DriverHeaven results
Clive Barker's Jericho : unknown
Can members list more games that have CF issues?
More updates coming as more previews start rolling out.
Source >>> http://forum.beyond3d.com/showpost.p...6&postcount=27
regards
How about you move it outside? Both of you.
Sorry if I overlooked it, but is there a benchmark between the 4870X2 and GTX 280 for Folding or something similar? I mean GPGPU, non-gaming performance?
:welcome:
Yeah, it would be interesting indeed. Isn't it the case that you can't use only one chip on a 3870X2? Would be nice if they fixed that; it would be a folding beast.
Because of the way the Folding client works, the nVidia GPU2 client pushes out way more PPD than a comparable Radeon. My 8800GTX does 4500 PPD at 600 core / 950 mem; a 9800GTX will do about 6000 PPD. A GTX 280 would probably do about 7000 PPD or more, but I haven't looked.
It's a major consideration for folding now that the price has dropped like a proverbial rock.
DH uses 177.39 as well...
I think they used older drivers because they don't re-test all the cards each time a new one comes out.
I can't say this enough though: great job to AMD for this... now I hope they're not having a party right now. They should be on their toes, steadfastly designing and testing RV840 and R800. If they become complacent, they will go back down to nothing...
I have to wonder if the R700 would be limited by a gen-1 PCIe x16 slot... I would hope that's one of the things they test.
Nobody has any thoughts about Hynix versus Qimonda ram? I guess I will have to wait and see. I will probably be getting one of these cards unless nVidia can come up with something more compelling in that timeframe.
Count me in. I'm the second, and a loyal fan of single-GPU solutions for the known reasons. If SLI/X2 cards prove to be no PITA in the future then I'll go that route, but for now I'm not convinced, even if I sometimes have to pay paradoxical price/perf ratios like this round. I still find the GTX 280 a wonderful chip in architectural terms, but the fact that Nvidia could have done much better things with this round if they had die-shrunk it in the first place definitely leaves a bad taste in the mouth.
But in the vast majority of your post you hit right on target... Excellent post for the most part.
Now can we concentrate on the topic plz?
Well... I was one of those crazies who put a 9800 Pro into a 4x AGP slot back in the day. Everyone said I was stupid and that it would be horrendously bottlenecked. Turned out... I got pretty much the same FPS as other 9800 Pro owners.
So I would think not; PCIe 1.1 slots should still provide adequate bandwidth for most games.
Sorry for the anecdotal evidence here, just voicing my opinion.
There has been no evidence of the contrary either. If you have it, link please.
Just like when I told people I was putting a 9800 Pro into a 4x AGP slot and they said "OMG IT WON'T WORK!" "OMG IT WILL BE TOO BOTTLENECKED!" "BLAH BLAH BLAH!"
Put the card in... worked fine, played games fine... finished in benches a little below what people with 8x AGP got, no biggie. But of course, you NEED PCIe 2.0 mach II.1 this time around, oh boy how you need it... don't want to mess around here... Then there was the time I ran 3870 CrossFire on P35; that's right, 4x PCIe 1.1 for the second card. Again, I was insane, it was foolish, unwise, wouldn't work right... yet the cards worked and scaled just fine, finishing a little low in 3DMark06.
Now I'm sure we'll get to hear the old "it's not bandwidth, it's power!" Ah yes, because there won't be any... PCIe power plugs on the card, perhaps room for two 8-pins even... But nah, gonna need that 2.0 :rolleyes:
Depends on the situation. Game, resolution, AA/AF, etc. all come into play with the current gen of cards and PCI-e bandwidth. Tom's did a review where the 9800GX2 did just as well on PCI-e 1.0/1.1 16X, as it did with PCI-e 2.0 16X in 90% of situations. However, some games at higher resolutions saw improvements with the added bandwidth of PCI-e 2.0. The most glaring example was Flight Simulator, where it lost huge performance on PCI-e 1.0/1.1 16X, at 1920X1200 with AA/AF.
I'm getting an HD4870, but I'll also be picking up a P45 based board. I'm not leaving anything to chance. I game at 1920X1200 and I love eye candy.
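For reference, the raw per-direction link bandwidths behind this whole debate, worked out from the spec rates (2.5 GT/s per lane for PCIe 1.x, 5.0 GT/s for 2.0, both with 8b/10b encoding). This is why a full 1.x x16 slot is usually fine, a 2.0 x8 matches it, and a P35's 1.x x4 slot is where CF scaling can actually pinch:
Code:
# Usable bandwidth per lane: 0.25 GB/s (PCIe 1.x) and 0.5 GB/s (PCIe 2.0)
# per direction, after 8b/10b encoding overhead.
per_lane_gbps = {"PCIe 1.x": 0.25, "PCIe 2.0": 0.50}
for gen, per_lane in per_lane_gbps.items():
    for lanes in (4, 8, 16):
        print(f"{gen} x{lanes:<2d}: {per_lane * lanes:4.1f} GB/s per direction")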
The P45 runs the two slots together at 8x and 8x; if you want full speed, get the X48.
http://compare.intel.com/pcc/showcha...&culture=en-US
What slots does the DFI X48 DK have? As in, PCIe 1.0/1.1 or 2.0?
My only concern with this card is the heat, and the warranty...
Since this card is going to run hot, and I like to keep my computer quiet... I'll probably get a waterblock.
I've never RMA'd ATI cards before. Will ATI card manufacturers decline your warranty if they suspect you took off the cooler / put on a waterblock? I mean, does that count as "modified" as they put it on many of their websites? I know that a vmod voids the warranty, but what about applying thermal grease / swapping on a better cooler?
I've put waterblocks on EVGA, BFG, and XFX cards before, and due to a PSU failure (which happened 3+ months after I put on the waterblock) the cards died, and they let me RMA them.
I'm not aware of any ATI board partners that allow aftermarket modification and/or overclocking; you would void your warranty putting anything else on your card. If you're worried about warranty, you're best off waiting to see if a custom 4870X2 comes out that includes a waterblock. I believe EVGA and XFX are the only board partners (on either side) that allow both overclocking and modification; as long as they receive the card back in one piece with no physical damage they still honor their warranties. It would be nice to see some ATI partners do the same.
Ahh, thanks for clearing that up, man.
Well, the stock cooler can't be that hot and loud, I guess, heh.
TechReport said that it's designed to operate at 90+ C.
Yes, but it won't overclock at those temps. If you plan to overclock on the stock cooler, you MUST make the fan spin faster via a BIOS update, and it's loud! I can hear it from my living room when I'm gaming. Hehe, but when I wear headphones I don't mind the noise.
If you want the max AND for it to stay quiet, then don't overclock other than via Overdrive.
Quote:
Originally Posted by GAR
Guru3D readies X2 review for tomorrow.
Source - Guru3d
Wow, 4870s are as low as $254 on the egg... some GTX 280s are selling as low as $420... wow.
Asus 4870
PowerColor 4870
Man, I simply can't pull the trigger... lol, I'm afraid the moment I click Buy the price is gonna drop :p:
I hope the X2 is gonna be in the low $500s, or $500.
Well, they state Review in the article but Preview in the article topic, so who knows.
its a review of the preview. :rofl:
I was considering upgrading from my current card, but have since reconsidered. Even with the R700's impressive numbers, I just can't think of a game (that I would play) that requires this kind of horsepower.
What having super-powerful graphics cards encourages is poorly optimised engines. When you have the kind of muscle of a GTX 280 / 4870 / 4870X2 / 9800GX2, game developers aren't busy thinking "how can we make sure our games look as good, but run twice as fast", because the booming GPU market allows them to be lazy in this regard.
That is to say, we get wonderful-looking games that often run far less well than they could. Especially a game like Crysis, for example, which seemingly does not cater to mid-range computers at all.
And what do we need all this horsepower for? All this raw processing capability? A bigger-is-better philosophy with regard to the relationship between gaming and graphics is, as far as I'm concerned, a cycle that only serves to perpetuate itself, and it ensnares us (the gamers, the consumers) in the process.
Didn't they stop making 256MB cards in the late 90's?
For the price, 4850 CF is pretty good and doesn't lose that much to the R700.
Now I have to wait for 4850X2 :)
Not really. In most of the tests it is equal, and there are just as many tests it loses by 1-3 fps as tests it wins. So I'd call them equal for the most part. There is a chance the performance delta may be negative (it may be slower on average), but we're talking about less than a 3% delta here, for a card priced at $499, while two 4870s run you $599.
They still have to fix scaling issues in Crysis though X_X Beyond that, almost every other game gets pwned by the X2.
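To put that trade in numbers, using the $499 vs $599 prices and the ~3% worst-case deficit quoted above:
Code:
# Price saving vs worst-case performance loss, X2 against two 4870s in CF.
x2_price, cf_price = 499, 599   # prices quoted above
worst_case_deficit = 0.03       # assumed upper bound from the reviews
saving = (cf_price - x2_price) / cf_price
print(f"you save {100 * saving:.0f}% on price "
      f"for at most a {100 * worst_case_deficit:.0f}% performance loss")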
Perkam
On the last page of the review, the reviewer said:
Regarding the highlighted comments: is this reviewer talking out of his/her ass? Pure speculation? 2x more heat?
Quote:
The Verdict
Alright then, time to wrap up this preview. As stated, there were some limitations enforced on this preview, mainly related to the number of benchmarks we can present. In our full review we'll obviously include the entire test suite. But let me start by saying that all the other benchmarks we took were showing very similar performance. So for me it's simple: if this product launches at a 449-499 USD price level, AMD will have an unadulterated hit in the high-end segment.
Well, this was an interesting test for sure. The one thing I stumble into each time I do something Crossfire/SLI/X2 related is asking myself the question: should you really be comparing a single-GPU based product against a dual-GPU based product? Is that even a fair thing to do? The answer, I'm afraid, is two-fold. I strongly feel that I have more confidence in a single-GPU based graphics card, mainly due to the fact that you'll have much fewer worries to think about. See, with dual-GPU gaming you'll always run into slight irritations; the fact that sometimes a game is not yet supported or working at all in Crossfire. There's also other stuff to consider: power consumption is doubled up, but also... your graphics solution will all of a sudden create 2x more heat.
So that's a bit of a culprit there. Looking at it from the other side, if we take a GeForce GTX 280 (as presented in the preview) and compare it directly on bang-for-buck gaming performance, then the single-GPU solution is being annihilated. So this is a bit like comparing oranges and apples; both nice fruits, yet very different. See, I tend to believe in single-GPU based products, but I just can't ignore that what we tested today, the R700, is an extremely powerful graphics solution that will offer much better gaming performance.
What's so hard to understand about a dual GPU producing twice as much heat? His point was that once you go for a dual/multi-GPU option, the heat produced doubles over a single GPU (e.g. 4870 to 4870X2). It's a legitimate concern, I'd say. Not everyone wants to / can deal with the heat these cards put out (e.g. case too small, low airflow, etc.).
Well a lot of that depends on how well the stock cooler dissipates that heat to make temps manageable ;)
Isn't the price $549???
From the Guru3D preview:
Quote:
Once the R700 arrived I spent a complete day with it just to test and see how many problems we'd run into. Fact is, I had close to none.
A lot of games that were not shown in this preview due to an enforced limitation all ran really well.
I've had a lot of critique in the past on Crossfire; valid criticism, as only the popular games editors mostly used had good Crossfire support. With the new 48x0 X2 cards we think the tide has definitely turned.
My experiences with the card have been really good; the performance is flabbergasting.
If priced right, this is the new king in the high-end arena of graphics cards. I'm impressed, really impressed. And that's a hard thing for a company to achieve.
http://www.guru3d.com/article/radeon...-x2-preview/14
seems like good news !
regards
Can't wait!!!! C'mon!!!!
Good news? It's no better, and sometimes slightly worse, than 4870 CrossFire, which is to be expected. The X2 cards really haven't been the same as CrossFire; I've seen them act worse than CrossFire or SLI.
Hilbert doesn't like overclocking much for reviews, nor much at all from what I've seen him do, and as you can see he was only running his E8400 at 3 GHz.
A bottleneck for those video cards, that's for sure.
His scores would have been higher if the CPU was clocked higher.
But for the average overclocker 3 GHz is a nice medium; nothing special, but better than stock.
In terms of "vs GTX 280"... yes, it crushed it, you can't deny that. I mean, were we expecting anything less?
Price is going to be the killer, I think. I think Hilbert was dreaming with a $449-499 price tag. I don't know...
You'll also notice in the review that the stuttering in GRID was not gone, but that was only at the 2560 resolution.
Last thing, and ATI has done this before (there was a big thread about it already): ATI crippled Hilbert's review by only allowing certain programs to be shown...
That's like ATI saying "we know it sucks in UT3 or Crysis, so we're not allowing you to show it."
He also reported his temps as 80+ C IDLE and 90+ C load... that card WILL die if left to run at those temps all the time.
The silicon on the dies will degrade, period...
Again, this is something I think Hilbert could have prevented if he'd set a fan profile, which may not be possible in the current drivers.
Yeah, Guru3D should have used a more powerful processor. The scores seemed pretty good considering, but I think it was at its limit, because at the end of the review the fairly substantial clock increase on the X2 only provided 4 frames, so to me it looked like a bottleneck. He should have ramped the E8400 up to 4.0 GHz. :up: Interestingly, in this test the 4870 beat the GTX 280 on a few of the benches, so it seems the 280 is more sensitive to CPU bottlenecks than the 4870. Although for the average user, that is probably the CPU power they're looking at.
I think you'd have to live in the dark ages to believe the card won't die if left to idle at 80+ C and load at 90+ C.
I never said it would die a horribly quick death, but given enough time running like that, it will die.
Those temps are just ridiculous.
Same for Intel chips listing their max temp at 100 C... I dare anyone to run their chips at that for any extended period of time.
I would say the same; as it has been with Nvidia cards for a while now, they are very CPU dependent. Wouldn't you agree? I haven't run ATI since the 9800 days so I can't say how they react to lower CPU clocks, but I imagine it's the same way.
What I need to see from both companies are cards that don't need a super-clocked CPU to show their true colors.
A CPU-independent video card is possible, I feel, but I'm no GPU engineer, so it's probably just wishful thinking on my part.
I had an overclocked Celeron that ran at 90 C for over 7 years. It is still running to this very day.
Your example isn't very good. Intel recommends a max of 100 C, but even their stock coolers keep it well under that. ATI coolers are designed to run at high temps for sound reasons, though they are perfectly capable of cooling the chips well below the stock targets.
First of all, no Intel chip besides the Pentium M is specified to run at more than 100°C for longer periods of time, and the Pentium M can do it without breaking a sweat.
Also, ask all the X1800/X1900 XT users what temps their cards reached and how long they've been using them. ;) (FYI, the X18xx/X19xx series was/is one of the hottest card lines on a stock cooler :P)
I don't remember anyone complaining that cards were dying like flies.
The original G80 GTX and Ultras could run at 80+ C all day. And the X1800s and X1900s were ridiculous for heat; let's say 100+ C load was common, and they sure as heck didn't die like flies.
And those are quite the claims you're making there, Lestat, but given your recent history of defending Nvidia cards, I was hoping you'd come up with more proof. Anyway, while the G200s certainly benefit from more CPU power, I'd say multi-GPU cards suffer the most from CPU limitations, due to the inherent need for data to be transferred and managed by the CPU.
Also, considering that other (p)reviews have shown the 4870X2 can actually scale better than 4870s in CF at times, it's not a given that it's no better.
And again, just to prove that you don't like to read reviews:
Quote:
Once the R700 arrived I spent a complete day with it just to test and see how many problems we'd run into. Fact is, I had close to none.
A lot of games that were not shown in this preview due to an enforced limitation all ran really well.
I've had a lot of critique in the past on Crossfire; valid criticism, as only the popular games editors mostly used had good Crossfire support. With the new 48x0 X2 cards we think the tide has definitely turned.
My experiences with the card have been really good; the performance is flabbergasting.
If priced right, this is the new king in the high-end arena of graphics cards. I'm impressed, really impressed. And that's a hard thing for a company to achieve.
http://www.guru3d.com/article/radeon...-x2-preview/14
:up::up::up:
Was the overclock that substantial?
I saw a 19 MHz core overclock, or about 2.5 percent. Sure, the memory was overclocked quite a bit more (about 10 percent), but considering the bandwidth GDDR5 already provides, I don't think that will have a huge effect.
We saw a 4-frame increase, which seems in line with the 2.5% overclock.
Not saying it isn't being held back by the processor; it probably is, considering that multi-chip or multi-card solutions need as much CPU power as they can get. But with an overclock of 2.5 percent, we can't expect 10 or 20 percent increases.
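A quick check of that "seems in line" call, assuming a fully GPU-limited game scales roughly linearly with core clock and that the X2's stock core is 750 MHz (my assumption; the preview only quotes the +19 MHz):
Code:
# Does a ~4 fps gain match a 2.5% core overclock? Assume linear scaling.
stock_mhz, oc_mhz = 750, 769            # +19 MHz as quoted
scale = oc_mhz / stock_mhz - 1.0        # ~2.5%
for baseline_fps in (80, 120, 160):
    print(f"{baseline_fps} fps baseline -> expect ~{baseline_fps * scale:.1f} fps gain")
# At a ~160 fps baseline, 4 fps is exactly what a 2.5% core OC predicts;
# a CPU-limited game would show even less.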
http://www.guru3d.com/article/radeon...0-x2-preview/9
Quote:
At that resolution [2560x1600] we did, however, get some stuttering here and there. Something ATI definitely needs to focus on, as it is reported a lot in our forums as well. Other than that, breathtaking stuff...
stu..stu..stu..
So the 4870X2 is to be released when in August? Damn, can't wait.
Should be out around the 12th of August; wish it was sooner!
To clear up the temperature issue a bit: most modern silicon transistors can handle temperatures up to +150 C, with short peaks of up to +200 C. Manufacturers give a safe range well below the practical max as a safety measure. Every chip can have hot spots, so it is better to keep them 20-30 C lower than they can survive. It is true that at higher temps a chip will degrade faster, but the difference between running a chip at 30 C and 100 C isn't big. It gets big when you are within 10% of the maximum temperature.
On a side note, I had a Duron 700 MHz CPU running for 2 weeks with temps over 107 C due to a bad cooler mount! That's 107 C read from the motherboard sensor; Durons didn't have on-die sensors, so the real temperature was even higher!! :D The CPU survived 2 more years, until my friend crushed the core :).
Well, the other thing is overclocking. But then again, even if I were to put a waterblock on it and get temps to 45-50 C at full load, how much more clock speed would that yield over the stock cooler at 90 C (without increasing voltage)?
Also, did most of the review sites use that ATI profile trick to increase the fan speed to 40% or more? According to mascaras it helps the temps a lot: a 4870 fan at 45% = 56 C full load, vs. most review sites reporting 80+ C for the 4870.
http://www.xtremesystems.org/forums/...d.php?t=194052
So that's something to think about.