A $450 7970 would be enough to make it attractive, that's for sure. $400 would be insane.
As usual, shops are making a few extra bucks ^^
http://i119.photobucket.com/albums/o...cis/gtx680.png
Yeah min price $700AUD here :(
Keen for GTX 685 however.
-PB
I couldn't help myself..... P10694 best so far.
:up:
GIGABYTE GTX 680 2GB - $600 or €450.
Quite reasonable
http://desmond.imageshack.us/Himg837...jpg&res=medium
Here in Spain they are more costly, and we have no availability...
http://skinflint.co.uk/eu/?cat=gra16...GTX+680#xf_top
For EU buyers.
Cheapest option is MSI from 455€ (tax included) + shipping.
Where are these "reports"? I've never seen anything to indicate a game will lower the base clock and I had the clock speed monitor running through every benchmark I did.
As for the clock frequency: yes, it does constantly move as the resources within the GPU shift; however, most games "lock in" at a standard clock speed. Case in point, Dirt 3:
http://images.hardwarecanucks.com/im...TX-680-121.gif
All right, this is a bit of a complicated answer so I will try to make it as short as possible.
First of all, increasing the Offset WILL NOT guarantee that GPU Boost will increase clock speeds to that level all the time. It will only do so when there is ample TDP / power headroom, which, as we have been discussing, is largely based on the usage characteristics of each game / benchmark. A good example of this is Vince's 1800MHz overclock. If you look carefully at the clock speed, there are several instances where the GPU throttles down to drastically lower frequencies as it hits a power capacity wall.
This is why increasing the Power Target is so important; it basically gives you additional overhead to work with. However, the Power Target can only get you so far.
I'll give you an example:
Say you increase the Offset by 150MHz.
This means that games that normally Boosted to 1150MHz would now be running at 1300MHz, and games that ran at 1058MHz would essentially run at 1208MHz - hence the linear offset NVIDIA showed in their presentation slides.
HOWEVER, it will only increase clocks to those levels if there is extra overhead for it to go above the default TDP. That's why you need to increase the Power Target.
Without increasing the Power Target, there is a very good chance your Offset clock won't translate into actual in-game overclocks.
This is also why increasing ONLY the Power Target can lead to higher performance. Here's why:
Typically, the GPU will dynamically boost its clock speeds to stay as close to the default TDP ceiling as possible. Now what happens if you INCREASE the level of that ceiling? Well, the clock speeds have that much more headroom. I have seen situations where raising the Power Target to 120% will increase some in-game frequencies by 7-10%.
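The interaction described above can be sketched as a toy model. To be clear: this is NOT NVIDIA's actual boost algorithm - the linear power-per-MHz assumption, the 195W TDP figure, and the `power_per_mhz_w` workload constant are all made up for illustration. It just shows why an Offset alone can hit the power wall while a higher Power Target restores the headroom.

```python
def effective_clock(boost_clock_mhz, offset_mhz, power_target_pct,
                    power_per_mhz_w, tdp_w=195.0):
    """Return the clock a card could actually sustain in this toy model.

    The card aims for (boost + offset), but throttles until its power
    draw fits under tdp * power_target. power_per_mhz_w is a made-up
    per-game constant describing how power-hungry the workload is.
    """
    requested = boost_clock_mhz + offset_mhz
    budget_w = tdp_w * power_target_pct / 100.0
    # Assume power draw scales linearly with clock; throttle down
    # until the draw fits inside the power budget.
    sustainable = budget_w / power_per_mhz_w
    return min(requested, sustainable)

# A +150MHz offset at the stock 100% Power Target: a power-hungry game
# hits the wall well below the requested 1300MHz...
stock = effective_clock(1150, 150, 100, power_per_mhz_w=0.16)

# ...while raising the Power Target to 120% lets the full offset through.
raised = effective_clock(1150, 150, 120, power_per_mhz_w=0.16)

print(stock, raised)
```

With these made-up numbers the stock Power Target caps the card below its requested clock, and the 120% target lifts the ceiling enough for the full +150MHz offset to apply - the same behaviour described in the post above.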
Hope that helps. :)
Hah, I even recognised this shop based on that layout. xD If you check my location it makes sense. :) The other cards are more reasonably priced - a bit high, but no higher than typical Finnish pricing. This one's different because it's actually in stock, though... talk about price gouging, haha.
Yikes! Check this out!
GeForce GTX 680 Release Driver Limits PCI-Express to Gen 2.0 on X79/SNB-E Systems
http://www.maximumpc.com/article/new...ci-e_20_speeds
If this is true, then Nvidia = FAIL on this launch
301.10 fixes that.. also there is a registry fix for 300.99..
http://www.4gamer.net/games/022/G002210/20120323002/
Nonetheless I'm disappointed by the GTX 680, especially because of GPU Boost. You can't disable it, and that's enough to make me never buy any Nvidia card unless someone manages to hack it. After a 4-year break, hello to the red team again..
well we all know that pcie3 only helps cards with 3GB of memory or more
/sarcasm
This affects only a few people, but it would be nice if they fixed it; then we could see how much performance was lost on Ivy Bridge systems. Chances are it's 5% or less.
Everything was nice and dandy until I learnt there is no way to disable it.. I want full control over my card. You are stuck with what the vendor decided for your card; even if the old way was similar in the end, it still feels like there was more freedom. For starters, they should have given me at least +60% power, not +32%. AFAIK there is no way to undervolt either..
Try X8/X8/X8(/X8) Tri- or Quad-Fire multi-monitor setups along with some SSDs in RAID and you'll soon see the difference. I understand it's only 2% of users, and I understand the differences might not be huge at this point. But if the lame-ass driver support team at AMD can figure out SB-E support, Nvidia should have too. It is simply sloppy and does not inspire confidence when the new high-end card cannot run at full speed on the current high-end platform at launch. I'm not a fanboy of any brand camp, so I find stuff like this disappointing and don't make excuses for it. It's not end-of-the-world stuff... just disappointing.
I frequent your site, visit it a few times a day, you guys made an impressive site. Good to know more of the big guns who participate in our community.
I read the review; you did a pretty good job, and the follow-up article was quite a good read too. I read the Hardwarecanucks review as well - Sky delivered as usual and I liked his conclusion. I was curious about the patched Shogun 2 results, as that's the only game I play these days.
I didn't even have to look at the [H] FPS graphs after those reviews; it was quite obvious the 680 is the card to get. I expected to be reading reviews for hours, but I didn't even look at Anand or Guru3D - the picture was clear, NV won on all fronts.
Really good to see NV made a turnaround and delivered this efficient, silent, small and still the fastest card ever.
I'm glad I waited to see what to get, and boy, was it worth the wait.
just arrived---
great driver
low temperature and noise
great performance
http://www.hwmaster.com/forum/T-Thre...79270#pid79270
i am really happy now!