http://www.pcper.com/news/Graphics-C...eForce-GTX-690
Same shader count, same ram speed, only SLIGHTLY cut down core speed. Looks like AMD have their work cut out for them.
915 MHz with turbo surely aimed at 1000 MHz... no worry for AMD. Maybe this explains why they released it now instead of waiting for AMD to release its own dual 7970.
I really don't understand why they released it with a lower clock speed; with the fixed TDP of the 680, they had a good margin.
CPU: - I7 4930K (EK Supremacy )
GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
Motherboard: Asus x79 Deluxe
RAM: G-skill Ares C9 2133mhz 16GB
Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0
damn card looks gorgeous
2x Asus P8Z68-V PRO Bios 0501
i7 2600K @ 4.6GHz 1.325v / i5 2500K @ 4.4GHz 1.300v
2x G.SKILL Ripjaws X Series 8GB DDR3 1600
Plextor M5P 256GB SSD / Samsung 840 Pro 256GB SSD
Seasonic X-1050 PSU / SeaSonic X Series X650 Gold PSU
EVGA GTX 690 (+135%/+100MHz/+200MHz/75%) / EVGA GTX 680 SC Signature+ (+130%/+80MHz/+200MHz/70%)
A bit more info: http://www.hardwarecanucks.com/forum...0-preview.html
AMD won't be able to match this for the foreseeable future. But then again, New Zealand could become a lower priced alternative.
I have a feeling AMD is not going to be able to top NVidia this time. That card looks crazy good.
PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.
We knew it.
\Project\ Triple Surround Fury
Case: Mountain Mods Ascension (modded)
CPU: i7 920 @ 4GHz + EK Supreme HF (plate #1)
GPU: GTX 670 3-Way SLI + XSPC Razor GTX670 water blocks
Mobo: ASUS Rampage III Extreme + EK FB R3E water block
RAM: 3x 2GB Mushkin Enhanced Ridgeback DDR3 @ 6-8-6-24 1T
SSD: Crucial M4 256GB, 0309 firmware
PSU: 2x Corsair HX1000s on separate circuits
LCD: 3x ASUS VW266H 26" Nvidia Surround @ 6030 x 1200
OS: Windows 7 64-bit Home Premium
Games: AoE II: HD, BF4, MKKE, MW2 via FourDeltaOne (Domination all day!)
Time to retire my 570 SLI for a single 690 WC'd? I think so!
Oh, $999... I guess not, LOL!
You must [not] advance.
Current Rig: i7 4790k @ stock (**** TIM!) , Zotac GTX 1080 WC'd 2214mhz core / 5528mhz Mem, Asus z-97 Deluxe
Heatware
Anyone need a kidney? Damn that is just AWESOME!!!
CPUID http://valid.canardpc.com/show_oc.php?id=484051
http://valid.canardpc.com/show_oc.php?id=484051
http://valid.canardpc.com/show_oc.php?id=554982
New DO Stepping http://valid.canardpc.com/show_oc.php?id=555012
4.8Ghz - http://valid.canardpc.com/show_oc.php?id=794165
Desk Build
FX8120 @ 4.6Ghz 24/7 / Asus Crosshair V /HD7970/ 8Gb (4x2Gb) Gskill 2133Mhz / Intel 320 160Gb OS Drive, WD 256GB Game Storage
W/C System
(CPU) Swiftech HD (GPU) EK HD7970 with backplate (RAM) MIPS Ram block (Rad/Pump) 3 x Thermochill 120.3 triple rads and Dual MCP355's with Heatkiller dual top and Cyberdruid Prism res / B*P/Koolance Compression Fittings and Quick Disconnects.
What I find interesting is that they went back to 300 W instead of staying at 375 W. Given the 690's clocks, I would think a 375 W variant could be like the 3870 X2, a dual-card with even higher clocks than the corresponding single chip. Then AMD would really have their work cut out for them (at least performance-wise).
5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
The GTX 590 and HD 6990 were both 375-watt max cards, and no one seemed to complain about it.
I think PCIe spec is just a theoretical target. The companies themselves can dictate what sort of power draw they want from their devices. The issue comes when a nice power supply isn't good enough for a decent card. That's when people start balking about power requirements. As it stands right now, a Corsair 750-watt or more is pretty standard fare for most enthusiasts, so even 375-watt power ceilings are well within reason.
Nice looking card. Can't wait to see how it performs when the full review comes out.
"If the representatives of the people betray their constituents, there is then no resource left but in the exertion of that original right of self-defense which is paramount to all positive forms of government"
-- Alexander Hamilton
Finally Nvidia openly acknowledges the stuttering (so far I have only seen them do it in one interview with Nvidia Germany). I was under the impression that Nvidia already employed some kind of frame metering technology (see the TechReport article on this issue, for example). Maybe it was software-based before and is now improved and hardware-based, utilizing double the bandwidth thanks to the PCIe 3.0 chip between the GPUs.
Improved Frame Rate Metering
Kepler introduces hardware based frame rate metering, a technology that helps to minimize stuttering. In SLI mode, two GPUs share the workload by operating on successive frames; one GPU works on the current frame while the other GPU works on the next frame. But because the workload of each frame is different, the two GPUs will complete their frames at different times. Sending the frames to the monitor at varying intervals can result in perceived stuttering.
The GeForce GTX 690 features a metering mechanism (similar to a traffic meter for a freeway entrance) to regulate the flow of frames. By monitoring and smoothing out any disparities in how frames are issued to the monitor, frame rates feel smoother and more consistent.
I hope all reviews will touch on this subject and will continue to do so in the future. It's high time that the methodology of looking exclusively at fps is thrown overboard. Reviews should also focus on the quality of said fps.
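The metering idea quoted above can be sketched in a few lines: frames finish rendering at uneven intervals (typical of AFR SLI, where two GPUs complete their frames in close pairs), and a small release delay evens out the intervals at which they reach the display. This is a minimal, hypothetical illustration; `meter_frames`, the timing numbers, and the fixed target interval are my own assumptions, not NVIDIA's actual algorithm.

```python
def meter_frames(completion_times, target_interval):
    """Release each frame no earlier than one target interval after the
    previous release, smoothing the presentation cadence."""
    release_times = []
    next_release = completion_times[0]
    for t in completion_times:
        release = max(t, next_release)   # hold a frame that finished too early
        release_times.append(release)
        next_release = release + target_interval
    return release_times

# Uneven AFR completion times (ms): frames arrive in close pairs.
completions = [0.0, 2.0, 33.3, 35.3, 66.7, 68.7]
released = meter_frames(completions, 16.7)
intervals = [round(b - a, 1) for a, b in zip(released, released[1:])]
print(intervals)  # every presentation interval is now ~16.7 ms
```

The trade-off, as with a freeway entrance meter, is a little added latency on the frames that finished early in exchange for a uniform cadence at the display.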
Posted this in the Nvidia section; a little more info:
http://www.geforce.com./whats-new/ar...ticle-keynote/
I don't particularly like this decision for a whole host of reasons. It's a sweet card, but consider everything that went on with the GTX 680: not enough supply, still not enough cards for partners, and the fact that the 680 launched only last month. Not to mention, NVIDIA stands to make their board partners happier by selling them 680s and letting them customize and sell them at higher profit margins.
This sounds like a decision that came too soon, and I think they're trying to preempt AMD with it rather than make good business decisions. That is, unless they didn't think they would be able to compete on price.
Did you guys see the keynote speech live? For Nvidia, it was done pretty sloppily and felt entirely like a marketing show. They had camera problems with the teleprompter at first, as well as no audio and an unstable video stream. Even with these problems, the speech started over 20 minutes late. I was also let down by how they basically paraded developers on stage, showing almost silly scenes/demos, throwing out buzzwords and completely ignoring everything interesting about the games in question. What sealed the deal for me (in terms of cheesiness) was that some of the "world first" videos they showed really weren't world firsts at all...
That being said, it looks like they are actually serious about the GTX 690. The heatsink looks very well designed compared to previous dual-GPU cards, and I also like that there is still room for overclocking. Now the question is what pricing we're looking at...
http://www.hardware.fr/news/12267/nv...e-gtx-690.html
Although the TDP is 300 W and the default GPU Boost power target is 263 W (the maximum value at which it will still raise the frequency, though it does not reduce it until the card exceeds 300 W), the GeForce GTX 690 is fed through two 8-pin PCI Express connectors (375 W total), leaving plenty of headroom for overclocking. Drivers allow the GPU Boost power target to be raised by up to 35%, to a maximum of 355 W.
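As a sanity check on those numbers, a quick back-of-the-envelope calculation, assuming the usual PCIe budgets of 75 W from the slot and 150 W per 8-pin connector:

```python
# Power figures quoted from hardware.fr, checked against PCIe budgets.
slot_w = 75                       # PCIe x16 slot budget
eight_pin_w = 150                 # per 8-pin PEG connector
connector_budget = slot_w + 2 * eight_pin_w
print(connector_budget)           # 375 W available to the card

boost_target_w = 263              # default GPU Boost power target
raised_target = boost_target_w * 1.35   # driver slider allows +35%
print(round(raised_target))       # 355 W maximum
```

So the quoted 355 W ceiling is indeed +35% on the 263 W boost target, and it still sits comfortably under the 375 W the slot plus two 8-pin connectors can deliver.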
At that price, two 680s look a lot more tempting. Not to mention faster ^^
Asus Crosshair IV Extreme
AMD FX-8350
AMD ref. HD 6950 2Gb x 2
4x4Gb HyperX T1
Corsair AX1200
3 x Alphacool triple, 2 x Alphacool ATXP 6970/50, EK D5 dual top, EK Supreme HF
That cooler does look nice, first time I think I've seen such a reference card with a nice stylish looking cooler. Wonder how loud and how hot it is at load.
AMD Threadripper 12 core 1920x CPU OC at 4Ghz | ASUS ROG Zenith Extreme X399 motherboard | 32GB G.Skill Trident RGB 3200Mhz DDR4 RAM | Gigabyte 11GB GTX 1080 Ti Aorus Xtreme GPU | SilverStone Strider Platinum 1000W Power Supply | Crucial 1050GB MX300 SSD | 4TB Western Digital HDD | 60" Samsung JU7000 4K UHD TV at 3840x2160
So, first the midrange-sized die of the GTX 680 for $499, and now we get two of them on a single PCB with a single cooler for double the price?
Isn't this going to be the most expensive dual GPU card ever (reference, I am not talking about custom solutions)?
It's a really nice card, but the prices... damn. Nvidia is ripping us off as hard as it can.
Sexiest card I've seen in years. Better than any custom card. Amazing job by NVIDIA.
If the card is within 2~4% of the performance of 2x GTX 680 SLI, it's my next card, since two GTX 680s = $1100~$1300.
Gaming rig;
ASUS RAMPAGE IV BLACK EDITION
I7-4390K
G.SKILL Trident X 16GB 2400
Intel 530 240GB
2x Asus GTX780
Corsair AX1200
HP ZR30w 30
Win 8.1 pro
Sound rig;
Auzen X-Fi H.T. HD --> Yulong D100 MKII --> D-7100