lol no thanks gonna wait for some proper sites to do that
Great review SKYMTL, will you be doing any SLI results?
Those of you who run 3+ monitors and are fully convinced 2GB is not enough.... BEHOLD!!! EVGA GTX680 FTW 4GB
I'm trying to download eVGA PrecisionX for when my cards arrive, but... the download keeps trailing off and getting interrupted due to how slammed their site is haha. /likes to have all software waiting on desktop. :D
Nvidia said that the left and right displays off of the center display in a triple monitor setup would run at a slightly lower FPS since they aren't constantly being looked at. This was to improve performance. I think the idea is super smart in most cases but there are some games like RTS games where I would prefer them to be all at the same FPS.
Anyone have a link to the article that talked about this?
this: http://tpucdn.com/reviews/NVIDIA/GeF...fwatt_2560.gif
Don't take me wrong, it is great to see an nVidia card among the AMD ones. FINALLY. But still some way to go for nV.
this: http://www.xtremesystems.org/forums/...=1#post5072621
A few frames give or take is dead even for me, especially if it is not playable on either.
I'm confused... the chart you link shows the GTX 680 as better perf/watt than the Radeon 7970.
When a few frames are technically 15-20%, it shows that scalability will be better in SLI overall for playability. Hypothetically: BF3 runs at 24fps on a Radeon 7970 and 29fps on a GTX 680. Add a second card and the numbers look more like 44fps for 7970 CFX and 54fps for GTX 680 SLI (assuming 85% SLI and CF scaling). The gap widens in absolute numbers even though the percentage stays the same. Otherwise, you are correct... but those numbers are the difference between playable and not enjoyable in these edge cases. Hopefully that helps explain my thinking.
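To make the arithmetic above explicit, here's a minimal sketch (Python) of the dual-GPU estimate, assuming the post's hypothetical 85% SLI/CFX scaling figure:

```python
def dual_gpu_fps(single_fps, scaling=0.85):
    """Estimate dual-GPU FPS: a second card adds `scaling` x one card's performance."""
    return single_fps * (1 + scaling)

# Hypothetical single-card numbers from the post above
print(round(dual_gpu_fps(24)))  # 7970 CFX estimate: 44 fps
print(round(dual_gpu_fps(29)))  # GTX 680 SLI estimate: 54 fps
```

Same percentage gap either way, but the absolute fps difference roughly doubles with the second card.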
I didn't state 7970 anywhere, and I meant AMD cards in general.
I understand your logic and agree to some extent. We can continue this dispute once we see SLI/CF numbers.
There are two resolutions I commonly use: the native res of my 3x24" LCDs and the native res of my 30". I don't care about a million FPS @ 1024x768. If you spend $500 on a GPU, you usually don't pair it with an $80 19" monitor. Anything under 1920x1200 isn't useful information for me or my customers.
Very interesting piece of SW. :up:
If anyone still wants EVGA Precision X - http://www.mediafire.com/?fh19bn8ac0bzuws
I've been curious about that as well, but I don't think nVidia has said that. TPU made the claim in this article - but I can't find anything in any of the reviews I've read (or on nVidia's site) that confirms what TPU reported.
Mate, looking at the bench runs you do... are you playing with a pad or something? The aiming around seems very sloppy, and the only explanation I could find was that you were using a pad for those runs. :p:
BTW, have you ever tried measuring the performance difference between BF3 single-player and multiplayer runs? I've found BF3 to be super taxing in multiplayer... but of course, it's a PITA to test the game properly in that mode, since there are many variables you can't control, and minimising that variance would require a ton of extra testing.
Anyway, good job with your review. Yours are probably the ones I enjoy most, since you put the important stuff in perspective while doing a good testing job altogether. Thanks :)
One of the main issues is the fact that in order to save some space and minimize processing / uploading time, I recorded all of the videos @ 30FPS with FRAPS. In an FPS, that seriously messes up aim. ;)
As for MP versus SP, I have tried but the issue is that MP introduces far too many variables into the equation. My benchmark runs ended up being all over the place.
Going to throw one in my HTPC to replace my 570. I've found myself using it a lot more than my desktop lately, ever since I got this home theater thing in the bag. I tend to use the desktop for flight sims, online FPS and RTS games, and use the TV and HTPC for everything else.
I should probably wait for an aftermarket sink, as I quite like the performance and low noise level of Asus' DCII sink, but I'm sure someone will release a decent aftermarket sink (perhaps the Accelero Xtreme is compatible already...?)
I was curious Sky, is adaptive vsync a feature of the 300 series drivers themselves, or is it exclusive to the 680? (I'm assuming the former.) That, along with optional 3rd party frame limiters, is a huge attraction for me, as I tend to use my TV with vsync on; interestingly, tearing detracts from it a lot more than on my desktop's display. However, I recently played through The Darkness 2 and the performance dropped much too often with vsync on, so I had to turn it off.
I play a lot of single-player games with a 360 controller, so I'm a little more tolerant of the added input lag. On my desktop I tend to keep it off at all costs.
http://www.xtremesystems.org/forums/...2&d=1331802528
http://alienbabeltech.com/main/wp-co..._No_Therma.jpg
I've noticed some parts were removed in the final PCB. Early leaked pics had those.
Why is the thread title still GTX780??
Good job Nvidia. The 7970 and 680 should be neck and neck when overclocked to the limit. TPU did a max-overclock 7970 vs 680 comparison, but the 7970 was run without voltage mods and came in around 5% slower; with them, it should be equal. Great power numbers for the 680, though.