Maybe because they have the same BIOS bug as the HD3870s, and you have to flash the BIOS:
>> http://www.xtremesystems.org/forums/...d.php?t=174746
regards
Well, TechPowerUp did it somehow. BTW, the person reviewing the card was W1zzard, so he probably played around with the program to get it working; same with GPU-Z. I'd expect ATITool 0.27 to have overclocking support for the X2 when he releases it.
Did you try ATIWinflash to see if you can select each GPU for flashing? (Like with HD3870 CrossFire.) There's a rough command sketch below.
BTW: 878MHz here >> http://www.xtremesystems.org/forums/...53#post2730853
AMDTool should work fine...
regards
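If the Winflash GUI won't let you pick a GPU, the command-line ATIFlash is worth a try. A rough sketch of the usual sequence; the switch names here are from memory, so run atiflash -? on your version to confirm before touching anything:

    atiflash -i              <- list adapters; an X2 should show up as two
    atiflash -s 0 gpu0.rom   <- back up the BIOS of adapter 0 first
    atiflash -s 1 gpu1.rom   <- back up adapter 1 too
    atiflash -p 0 new.rom    <- program adapter 0 with the new BIOS

Always save both original BIOSes before flashing either GPU.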
http://www.youtube.com/watch?v=VeHgoyGRbuI
What about this?
I get the feeling things are about to get extremely controversial with this card...
Look through the discussion about the review and you'll see people attacking them left and right, mainly pointing to the AnandTech review, which, after reading through it, you notice uses demos/flybys/cutscenes to bench. For Call of Duty 4, AnandTech used the cutscene of the guy smoking a cigar for their numbers; that doesn't really test the performance of any of the engine's effects (explosions, gunshots, etc.)... [H] also explain why they do things the way they do: the FX series' benchmark scores compared to what it was actually capable of in games.
Quote:
Originally Posted by HardOCP
Kinda makes me think it's time someone does a "benchmark vs real-world gameplay" review, merely to show if [H] has been full of it the past few years or if they've been right for quite some time. I personally haven't given [H] too much thought before, but they might just be on to something here.
Personally, I'm not taking sides until we see how this pans out. Either way, even if it took two GPUs to do it, at least ATI are finally making their way back into the fight. Hopefully it can compete with its real competition!
Wow, CoolIT has nice cooling systems; they are just BIG. Can't wait to see what Thermalright comes up with. 12 heat pipes, anyone?
It does seem a little strange that the way benchmarks were done was not questioned so much until this card appeared.
The HIS HD3870 X2 can also do 918MHz >> http://www.techpowerup.com/reviews/H...870_X2/23.html
Maybe one of the cores on your HD3870 X2 is not so good, and it looks like the max OC will be the max MHz the weakest core can handle:
Quote:
We managed to max out Overdrive with the highest settings available. We found that when you overclock, both GPUs will overclock at the same frequency. Therefore the GPU with the weakest overclocking potential will be your highest stable overclock.
http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
It's funny to me: ATI keeps making cards only trying to match the 8800 GTX... which sometimes they do... sometimes they don't.
Still makes me glad I kept my GTX.
I am not a fan of CrossFire, because CrossFire depends too much on driver support,
which means inconsistent performance.
Sometimes it works, sometimes it doesn't.
With a single card, it always works, no matter what game you play.
Funny, because even single-card ATI needs optimized drivers too, with their VLIW architecture. I guess no one learned their lesson since the introduction of the HD2900 XT and the promises of "future" drivers that would deliver performance surpassing the 8800 GTX, only to see now that you need two of them to actually get anywhere near that.
Yeah, but if ALL cards are run with the SAME benchmark, it still gives a RELATIVE scale of performance. I don't see how your argument or [H]'s argument holds any water.
Considering games are built on the NVIDIA architecture? Go figure.
It's not my argument, I merely see their point and would like to see it all further tested.
It does hold water though for the following reason...
You see, cut-scenes and fly-bys don't use half of the effects actually used in gameplay. Why does this matter? Because a card can ROCK at some effects and suck at others, and without testing actual gameplay you wouldn't know which card is better overall.
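To put made-up numbers on that (this is just an illustration, not data from any review): two cards can post practically the same average FPS in a canned timedemo while one of them stutters badly whenever the heavy effects fire. A quick Python sketch:

    def fps_stats(frame_times_ms):
        # Return (average FPS, FPS in the worst 1% of frames) from frame times in ms.
        avg_ms = sum(frame_times_ms) / len(frame_times_ms)
        worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
        worst_ms = sum(worst) / len(worst)
        return 1000.0 / avg_ms, 1000.0 / worst_ms

    # Card A: steady ~16.7 ms per frame (a calm fly-by, nothing exploding).
    card_a = [16.7] * 1000

    # Card B: same average, but every 50th frame spikes to 50 ms
    # (think: smoke grenade plus three explosions on screen at once).
    card_b = [16.0] * 1000
    for i in range(0, 1000, 50):
        card_b[i] = 50.0

    for name, data in [("Card A", card_a), ("Card B", card_b)]:
        avg, low = fps_stats(data)
        print(f"{name}: {avg:.1f} avg FPS, {low:.1f} FPS in the worst 1%")

Both cards average roughly 60 FPS, so a fly-by bar chart would call them equal, but Card B drops to 20 FPS exactly when the action picks up. That's the kind of thing an average-only benchmark never shows.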
Driver-wise, ATI should drop legacy support, i.e. only support HD-series cards and up. Most of the space in the drivers they release is taken up by optimizations for all the older cards. If they only supported HD and up, the driver would be smaller and there would be more room for optimizations. You can't optimize an X700 to work with Crysis anyway.
Exactly. A good example of this would be the CS:S stress test; it is nothing like real-world gameplay. The same can be said about UT3 flybys.
Of course a lot of people are going to discredit [H] because of their 'controversial' results, but at least it offers a differing POV on gaming benchmarks.
BTW, interesting to see that EastCoastHandle has linked to every review except [H]'s. I wonder why... ;)
lol, the talk about which hardware games were developed on is trash talk. You buy video cards to, what, play games? Or simply to show off your e-peen with your shiny new 512-bit bus, 640 stream processors, 1.3B-transistor card, only to wait for magical driver updates for it to work correctly in newly released games?
And Crytek should be more than happy they were able to develop on G80; you don't want to ship a game over a year late with only graphics as its strong point, you know. It might be trumped by other games before it's even released.
http://images.anandtech.com/graphs/a...5732/16445.png
Excellent performance! Definitely needed some more horsepower in this game.