"Tom's Games" is pretty good though:rolleyes:. That is the whole reason I have a $2000 machine . . . to play flash games!
There were valid reasons for Tom's Games getting a facelift (tongue in cheek). It just depends on your point of view. Rob and Ben probably wouldn't call it valid.
My personal experience is that the GTX 285 performs on par with, if not slightly faster overall than, an overclocked 4890 in most scenarios. ATI offers slightly better IQ and a cheaper price, but is much noisier than the 285. I don't believe the 4890 can't beat a 260 in Crysis while falling so far behind the 285.
What's the point of turning AA on? So you can get unplayable framerates on both cards?
Crysis at a higher resolution with no AA actually yields better image quality (and, more importantly, playable frame rates) than this test.
In my opinion, unless we are testing a multi-chip solution that can handle the load, Crysis should not be tested with AA, because it's simply an exercise in futility.
This test would be valid if one of the cards yielded playable framerates, but both are equally unplayable.
The only reason for turning on AA that I can imagine is so ATI cards come out looking better than they really are, as xbitlabs is the only website that has the 4890 coming out ahead of a GTX 285 in Crysis Warhead. The only bloody website on the entire net, including AnandTech, which is probably the second most favorable review for the 4890.
I definitely question the results of this tomshardware review; however, I also question xbitlabs' reviews of NV stuff, because I have already found plenty of evidence of bias against them.
My gaming machine has a 4890 and a 4870X2; I don't have any card from the G200 generation.
I wouldn't dare sacrifice the rest of my settings just so I can turn on AA.
Crysis Warhead is still choppy on this setup, and that's with a Core i7 at 4.5 GHz.
Biased? Perhaps... but I'd say whoever is testing EndWar isn't biased, just a dumb masochist!
http://www.tomshardware.com/reviews/...p,2297-14.html
quoted from the very same page:

Quote:
This game really isn’t ideal for benchmarking, since a software limiter caps its frame rate at 30 FPS.

Yeah, exactly... well... hum... errrr... why use it as a benchmark then? They themselves say why this game is absolutely useless as a benchmark, and yet they still put it in the review. What kind of logic does it take to do something like that? :p:
Well, it doesn't indicate bias; it indicates a lack of common sense.
Or a lack of necessary games on the reviewer's part...
LoL, nice try OBR...:rolleyes: :down: :shakes:
Yes, Tom's site is starting to become useless. I see here that many of you have confidence in xbitlabs, but when I used that site for some CPU comparisons, some users in this news topic told me that xbitlabs is not trustworthy.
Anyway, a good site from my country, Romania, is run by MONSTRU and his friends. I think you guys know MONSTRU. :up:
http://lab501.ro/wp-content/uploads/...-2560-copy.png
http://lab501.ro/placi-video/nvidia-...-radeon-4890/6
It's an interesting review: single card, CF/SLI, tri-SLI vs. 3x 4890, and stock vs. OC GTX 275/4890. :up:
There are a lot of variables when doing reviews. Unfortunately, to this day I haven't seen anyone do it 100% right. IMHO what you need is:
- Exact driver settings for both setups. This is so important because it (1) lets people reproduce the results and (2) indicates what the game would actually look like.
- Screenshots. I'm not talking about every setting, but when comparing ATI vs. Nvidia you've got performance and IQ. I don't want someone's "it looks better on [...]". Take some uncompressed screenshots and let me decide what looks better. I want to see what I'm getting when going for one over the other. The reviewer can then also compare side by side for our benefit (for those who care).
- System specs, obviously. I mean, really, we have no idea what these reviewers are doing, but with system specs at least someone can reproduce the results (or try to).
With any one of those pieces missing the review is incomplete.
4xAF, lmao. Isn't that stupid by itself? I've never used anything below 16xAF since the X1950 Pro. On a powerful HD 4850 I never even thought of lowering this setting, let alone disabling it. The performance hit is negligible, but the image quality gain is off the charts.
Maybe that's why NVIDIA comes out looking so much better here: because AF hits them so badly?
I'd like to see someone actually point out bias where it can't be put down to a bad reviewer. Lots of claims, no evidence. I understand the "THG paid ad" bandwagon has a lot of room though, so I forgive you for jumping on.
Well, the past is meaningless; nobody goes back to read Athlon 64 vs. P4 reviews now.