This may be true, but two separate cards nearly always outperform the single-card/dual-GPU solution too.
:)
I think the question is more whether they want to release a reference card at 375W. Just because there are two 8-pin connectors on the card (like every dual-GPU card: 590, 6990, 5970, etc.) doesn't mean they're aiming for the full 375W. Nor does it mean they need to underclock it to stay well below that: the 680 draws far less power than a GTX 580, so two of them at full speed shouldn't be a problem at all.
Don't forget the 680 has that turbo, which is ideal for dual cards (the core clocks can go up and down depending on the TDP limit, so they can play with it to preserve maximum performance whatever the base clock is).
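The boost behavior described above can be sketched as a toy feedback loop. This is only a rough illustration with made-up clock and step values, not NVIDIA's actual GPU Boost algorithm:

```python
def boost_clock(current_mhz, power_draw_w, tdp_limit_w,
                base_mhz=1006, max_mhz=1110, step_mhz=13):
    """Toy model of a boost loop: step the core clock up while power
    draw is under the TDP limit, step it down when over the limit,
    and never go below the base clock or above the max bin."""
    if power_draw_w < tdp_limit_w and current_mhz + step_mhz <= max_mhz:
        return current_mhz + step_mhz
    if power_draw_w > tdp_limit_w and current_mhz - step_mhz >= base_mhz:
        return current_mhz - step_mhz
    return current_mhz

# Under the limit: clock steps up one bin.
print(boost_clock(1006, 150, 195))  # 1019
# Over the limit: clock steps back down.
print(boost_clock(1100, 210, 195))  # 1087
```

The point for a dual-GPU card is that each GPU can throttle itself against a shared power budget, so a conservative base clock still recovers performance whenever there is headroom.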
http://en.expreview.com/2012/04/11/m...ked/22383.html
Quote:
After the release of the GeForce GTX 680, what new product will follow? We heard from a source that the dual-GPU GTX 690 will debut in May.
The GTX 690 will be based on two GK104 cores, feature dual 8-pin external power connectors, three DVI ports and a mini DisplayPort, and support PCI-E 3.0, Adaptive Vertical Synchronization, GPU Boost and TXAA technology.
Additionally, to secure stable operation of the system, a PSU of at least 650W is necessary. The other specifications are still awaited.
Am I the only one still wondering what's going on with the 7990? I would have expected AMD to release it by now to steal some of the GTX 680's thunder - isn't it odd that we haven't heard anything?
Disagree. There is a safety margin per GPU (e.g. 20W) baked into that 195W number.
Doubling the GPUs and subtracting a nominal 20W saving from combining the PCBs, we get 2*195-20 = 370W. The 20W-per-GPU margin hasn't changed.
If you drop the clocks, the per-GPU margin only goes up.
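The arithmetic above, written out explicitly (the 20W margin and 20W shared-PCB saving are the poster's assumptions, not official figures):

```python
per_gpu_tdp = 195       # W, GTX 680 board power
per_gpu_margin = 20     # W, assumed safety headroom already inside 195W
shared_pcb_saving = 20  # W, assumed saving from sharing one PCB

# Two GPUs on one board, minus the saving from shared components.
dual_board_power = 2 * per_gpu_tdp - shared_pcb_saving
print(dual_board_power)  # 370, just under the 375W connector spec limit
```

375W is the nominal ceiling for a card fed by the PCI-E slot (75W) plus two 8-pin connectors (150W each), which is why 370W lands so close to it.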
In terms of PCB power, you are much closer to pushing the theoretical limit. Even though each GPU has its own set of capacitors, all the power is still fed through the same entry points. Why would the per-GPU safety margin even be a concern in the first place?
To prevent AMD from starting crap like this:
http://www.geeks3d.com/public/jegx/2...be_grilled.jpg
and this:
http://www.legitreviews.com/images/r.../egg_start.jpg
While funny, I was specifically talking about the GTX 680 lol. We all know how hot the 480 ran.
As for the Catleap monitors, here's a 680 pushing 133Hz @ 2560x1440....
http://cdn.overclock.net/6/65/65b4f2a9_17.jpeg
Is that yours? So it's a 2B build?
:D
I wish, I still haven't gotten anyone to buy my 3x 24" monitors :( Not that 7950s will run past 85Hz anyway - software limits them there, and hardware caps them at 100Hz, lol.
Yeah, he has a 2B. He could only do 126Hz before, but it's apparently warmer now, so he thinks he can go higher.
I want to buy a Catleap, but I don't like the idea of taking a chance on not getting a 2B without a decent warranty/return policy. If the new batches are also quality overclocking models, I might splurge.
I'm happy with my 60Hz QH270. For just over $300 you can't go wrong. It's just such a nice display. That said, I'm not a huge multiplayer FPS gamer. IDK, I'm happy with my purchase.
I'll seriously think about it then. I heard the next batch is supposed to come in about a month
Any idea about response times, input lag, etc. on these? While I certainly wouldn't need or even want 2560x1440 @ 120Hz+ for gaming (the cost of keeping FPS that high just isn't worth it to me), I do wonder how it does at 1920x1080. Has anyone with one of these monitors tested it? I'd guess most Catleap monitors should run 1920x1080 @ 120Hz fine. Also, does it really display a true 120Hz, or do frames just get dropped?
Well put, RPG. I don't have the graphics card to push 120 frames at that resolution with modern games, but input lag would certainly be of interest regardless. I've also heard stories of gamma and brightness gradients, because these are A-/A panels that effectively failed the specifications required by Dell and Apple. I don't really think stuck pixels will be as much of an issue as correct color representation, etc.
Any real information on the 780? I keep opening this thread every time someone posts, hoping for actual info, and I get frustrated when I see there's nothing new...
Well, question is what is "real".
Latest info I have is
4GB GDDR5
512-bit bus, 64 ROPs
3072 Cuda Cores
128 TMUs
"Cuda Next" abilities
2+ TeraFLOPs of DP performance
And of course much lower clocks than GK104. Maybe 800 MHz or so.
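For what it's worth, a quick back-of-envelope check shows the rumored numbers above are only self-consistent under some assumed DP rate. The 1:2 DP:SP ratio below is purely an assumption for illustration, not part of the rumor:

```python
cuda_cores = 3072
core_clock_hz = 800e6          # rumored ~800 MHz clock
flops_per_core_per_cycle = 2   # one FMA = 2 SP FLOPs per cycle
dp_sp_ratio = 0.5              # assumed 1:2 DP rate (unconfirmed)

sp_tflops = cuda_cores * flops_per_core_per_cycle * core_clock_hz / 1e12
dp_tflops = sp_tflops * dp_sp_ratio
print(round(sp_tflops, 2), round(dp_tflops, 2))  # 4.92 2.46
```

At a 1:2 rate those specs would land at roughly 2.5 TFLOPs of DP, matching the "2+ TFLOPs" claim; a 1:3 rate like some other Kepler parts would fall just short of 2.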
they should place a holder for fats to run.
The even worse part is that you can't do the 120Hz @ 2560x1440 in SLI or CF :rofl:
I don't think anyone has really done 1080P on the 2B models because I'm pretty sure they lack a scaler.
It really does 120Hz if you can push 120FPS :) After that you get interesting lines and artifacts because the panel's limitation is being reached.
That seems to be the general consensus - basically all of GK104 doubled. Any idea when it'll come out? I really want to pick up a GK104 chip as a midrange card.
To be fair, the monitors that do have a scaler cost $1k. You get what you pay for in the end, lol. Honestly, my concerns lie purely with making sure I don't get a dud, as these are literally subpar panels. If I can get one with zero dead pixels, no gamma problems and low backlight bleed, I'd be very content for just $350. 2ms and 100Hz would certainly be really nice, but as I understand it, those who really want to overclock can just buy the additional PCB and cables anyway (though I'm not sure why only the 2B came with them in the first place).