Sampsa will probably do a better job at translating, but in the meantime here's an auto-googled English version:
http://translate.google.fr/translate...hl=fr&ie=UTF-8
many words remain in Finnish, but it's better than nothing ;)
Yeah, DMC4 is pretty light comparatively. My 8800GTS 320 runs it with settings maxed at 1680x1050 with 4xAA and 16xAF. Assassin's Creed was pretty light in that sense too, maxed except for AA, which was left at 2x. So personally, I don't feel those are a good indication of performance. We need games that really TAX the graphics cards and give a good view of their performance, not games that a gimped version of two-year-old hardware can max.
warboy
if you are not satisfied with the benches make your own or complain to reviewers directly - not here.
Warboy's weird capitalization makes him sound like a holy book or something. But people don't worship him, unfortunately.
We used Fraps to record the intervals between frames and noticed that, for example, in Crysis the ATI Radeon HD 3870 X2 rendered every other frame after 21.5 ms and every other frame after 49.5 ms (so 21.5 -> 49.5 -> 21.5 -> 49.5, you get the idea). In-game, the microstuttering problem showed up as irritating twitching. In Race Driver: GRID the problem was more serious: every other frame rendered at 24.9 - 27 ms intervals, and every other at 40.2 - 42.4 ms intervals.
The ATI Radeon HD 4870 X2 looked much better, with frames in Crysis rendering fairly steadily at roughly 21.6 - 22.1 ms intervals. In Race Driver: GRID the situation was the same: frames rendered at 15.3 - 17.6 ms intervals. Corresponding results were measured for two ATI Radeon HD 4870 cards in a CrossFireX configuration, so AMD has apparently succeeded in removing, with the HD 4800 series, the microstuttering problem that troubled the Radeon HD 3800 series in certain games.
For comparison, Crysis and Race Driver: GRID were also tested with NVIDIA's GeForce 9800 GX2, which showed clearly more variation in the intervals between frames than the ATI Radeon HD 4870 X2. In Crysis the time between frames varied between 21.9 - 25.1 ms, and in Race Driver: GRID between 16.8 - 21.7 ms.
We didn't notice any microstuttering problems with a single GPU. A single ATI Radeon HD 4870 and the GeForce 9800 GX2 running on one G92 GPU both rendered frames at even intervals without any substantial variation.
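To make the alternating short/long pattern they describe concrete, here's a minimal sketch of how one could quantify it from per-frame intervals. The numbers are taken from the article's examples; the helper name `stutter_ratio` and the idea of comparing even-index vs. odd-index intervals are my own illustration, not anything from the review (real data would come from a Fraps frametimes log).

```python
# Sketch: quantify AFR microstuttering from a list of per-frame intervals (ms).
# Hypothetical helper, not from the article; the sample values mirror the
# article's HD 3870 X2 and HD 4870 X2 Crysis measurements.

def stutter_ratio(intervals_ms):
    """Ratio of the mean even-index interval to the mean odd-index interval.

    Near 1.0 means frames are evenly paced; far from 1.0 indicates the
    alternating short/long pattern typical of alternate-frame-rendering
    (AFR) multi-GPU microstuttering.
    """
    evens = intervals_ms[0::2]
    odds = intervals_ms[1::2]
    return (sum(evens) / len(evens)) / (sum(odds) / len(odds))

# Alternating pattern like the HD 3870 X2 in Crysis: badly uneven pacing.
stuttery = [21.5, 49.5] * 10
# Steady pacing like the HD 4870 X2: ratio close to 1.
smooth = [21.6, 22.1] * 10

print(round(stutter_ratio(stuttery), 2))  # 0.43 -> heavy microstutter
print(round(stutter_ratio(smooth), 2))    # 0.98 -> even pacing
```

Note that both lists above average out to playable frame rates; the point of the measurement is that average FPS hides the uneven pacing, which is exactly why the reviewers looked at per-frame intervals instead.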
Translated with my mighty English skills from the original Finnish text. Here's the source: http://plaza.fi/muropaketti/artikkel...4870-x2-r700,1
So 4870X2 = 4870 CF. No magic :p:
Amazing they can keep the card cooled tho.
http://images.anandtech.com/graphs/r...3626/17191.png
90W more than the 236W TDP GTX 280... and consumption at that speed is, as expected, like 4870 CF. I'm just amazed they can cool it.
My original post was fine; it's not my fault people wanted to call me out. I tried to give ATi credit for doing something that evens out the performance even though it's a 2:1 ratio.
I capitalize on points I want to make.
Fixed :ROTF:
The GTX+ SLI does alright in power consumption given it's 55nm. HOWEVER, and this is a big one, the GTX+ has a great deal fewer transistors than the 2xx series, so even if NV tried a dual GTX260 @ 55nm, I still find it hard to believe they could bring power consumption down to around 400W or lower.
:shocked:
I've just read all the July 14th Previews....
I guess we all knew this card would perform, but was anyone just in awe after actually seeing some of these numbers? Can you really imagine playing Oblivion in such high-res with so much AA?
I just can't, I keep saying it's meaningless, but somehow it's actually what we've all been wanting.
I'm just happy for all of us; by X-Mas this year, mainstream cards will be $100. Deep visuals will be part of every game now that coders will know how to optimize their engines.
Ugh these reviews are conflicting...
Anyone notice that in some reviews 9800GTX+ SLI beats GTX280 SLI... how???
Some seem to say the 4870X2 is better than 4870 CrossFire while others say it's somehow slower.
Also, some show it going strong and faster than all other cards, while the rest seem to show it going slower than single cards or lower-end cards...
Meh, I guess for real conclusive reviews we're gonna have to wait a while.
Different drivers used... it's the same in every review. At least we know AMD isn't specifying what drivers reviewers should use for their reviews.
Thanks m8, I guess the graphs speak for themselves. From what I understand, R700 should perform faster than the 280 as long as there is software support. With ATi putting this out as their flagship card, I will assume they will put loads of effort into making CF work in as many games as they can. It is interesting to see that, through its excess shaders, R700 can push up the AA in some games without losing significant FPS, something that will probably show how R700 will perform in future games. That point, I think, is important: future games. No one in their right mind will want to spend that much money on a card that is only able to play last year's games. Seeing as R700 is DX10.1 and comes with Tessellation on top of free AA in some games, I can't see why anyone would say the 280 is the better performer.
I would also like to remind people that Nvidia went down the multi-GPU path first, so don't blame ATi for beating them at their own game. Also, everyone gave ATi stick for only supporting DX9.0b in previous generations while Nvidia's DX9.0c support was praised. I think some people should stop undermining welcome progress in the graphics market and be happy that the card they originally wanted to buy now costs around $200 less thanks to ATi's strong competition. For all Nvidia enthusiasts this should be a good thing, as most of you could be reaching the point where you can now afford an SLi setup.
So you see, everyone wins. Stop bashing each other like someone has lost something valuable.
Also, some are previewing the 2x512 MB version, others the 2x1024 MB version.
AT fixed their review; it's a 2x1024 MB card also, not 2x512 MB.