I'm following this ! Very interesting and you have done some hard work Vega.
-k0nsl
ASUS RAMPAGE IV EXTREME or EVGA X79 Classified?
Both have MB water blocks available from EK.
I have no opinion on which MB to get because I'm not sure about the performance and overclocking of those chips. But if we know that IB info is coming soon, I'd wait to see what happens. Then again, if you're already spending this much, I guess it's OK to take a slight risk, be wrong, and deal with the cost of correcting it.
Also, have your other GPUs come in yet? Mine is due tomorrow, and it feels like I ordered it forever ago.
No, they are still inbound.
I wonder if X79 with its native 16x/8x/8x/8x bandwidth would edge out the Z77 Sniper 3's PLX 8x/8x/8x/8x when I am running four highly overclocked 680's at high resolutions and refresh rates.
I guess I could get an X79 and 3960X now, get a Sniper 3 and 3770K when they release, pit them against each other, and then sell the loser LOL.
I did some research a while ago on X58 NF200 vs native vs P67 NF200, and the results varied, but not by more than 2%, and sometimes in favour of one or the other depending on the workload/title.
It has to do with the fact that the NF200 (and now the PLX chips) can actually reduce CPU/PCIe traffic as long as you have enough width (i.e. 16x). In some cases it will be faster because the data is sent once from the CPU to the bridge chip, and the bridge chip then broadcasts the data to the GPUs, removing traffic from the CPU/PCIe bus. But only in some instances!
Differences are very small and will probably be within the margin of error on Z77 PCIe 3.0 vs X79 PCIe 3.0.
I also think, from memory, that the results will vary slightly because of the different SLI/Crossfire communication protocols. I can't remember which, but I think Nvidia now uses parallel and ATI/AMD uses serial comms over the SLI/Crossfire connectors, which can cause differences too. I can't find anything current on the SLI/CrossfireX bridge configs from Nvidia/ATI, so forgive me if I'm wrong or out of date! ;)
Put simply, I don't think you will see much difference. The different CPU architectures and other onboard/BIOS latencies are more likely to have an influence.
https://www.evga.com/forums/tm.aspx?...e=1&print=true
http://www.tomshardware.com/reviews/...re,2910-9.html
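The broadcast idea can be roughed out with back-of-envelope numbers. The per-frame figures below are made-up assumptions, purely to show the shape of the effect:

```python
# Rough, illustrative model of CPU-side PCIe traffic with and without a
# bridge chip (NF200/PLX) that can broadcast one upload to all GPUs.
# All numbers are assumptions for the sake of the example, not measurements.

GPUS = 4
SHARED_MB_PER_FRAME = 20.0   # assumed geometry/textures/constants sent to every GPU
UNIQUE_MB_PER_FRAME = 2.0    # assumed per-GPU data (commands, per-card buffers)
FPS = 60

# Without a bridge: the CPU pushes the shared data once per GPU.
no_bridge = (SHARED_MB_PER_FRAME * GPUS + UNIQUE_MB_PER_FRAME * GPUS) * FPS

# With a broadcasting bridge: shared data crosses the CPU link once,
# the bridge then fans it out to the GPUs on its downstream links.
with_bridge = (SHARED_MB_PER_FRAME * 1 + UNIQUE_MB_PER_FRAME * GPUS) * FPS

print(f"CPU-link traffic, no bridge:   {no_bridge / 1024:.2f} GB/s")
print(f"CPU-link traffic, with bridge: {with_bridge / 1024:.2f} GB/s")
# A PCIe 2.0 x16 link moves ~8 GB/s per direction, so the broadcast trick
# only matters once the shared upload is a real fraction of that.
```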
I say go with the Z77 just because the overall cost of the MB and chip should be less, and I bet the performance difference will be very little, if any. If I had to go X79 today, though, I would get the Rampage over the EVGA just because Shamino is behind it; it should overclock quite well and BIOS support should be better. Personally, I am not too impressed with the X79 series. The X58 series was way better than the P55 series, but I don't think the same holds for X79 versus the Z68/Z77 series, especially if the Z77 series is able to do 4-way SLI/CF.
Good info guys. :up:
Sweet, got a Rampage IV Extreme and a nice clocking 3960X inbound and the other three GTX 680's in the mail. Now I just need to find some of that 2666 MHz DDR3 4x 4GB. Or do you guys think 2400MHz will suffice?
I might have to re-think my cooling loop on the CPU side to give it more breathing room now. ;)
2400MHz will suffice, but when you've already got the best of everything for your project it would be nice to see the best DDR in it too. Of course, that's only if availability and cash aren't big problems...
you are xtreme :D
But I just love what you are doing, so I will keep an eye on this build ;)
For SB-E, from what sin was saying, the IMC usually struggles above 2133, so yeah, 2400 should be fine. When you move to IB it should be able to handle a good bit more, but then you can always upgrade anyway. My 3930 has been sitting here for a month and I haven't had time to finish the build lol :(
I went with the Team Group 2400 MHz 4x 4GB kit that runs 9-10-10-28. I think that should be pretty good.
Well I broke down and have four of these inbound:
http://www.overclock.net/image/id/20...326/height/245
I can't help it, EK water blocks draw me in like crack to a crack-whore! :D
I am going to hold off on the Rampage IV Extreme motherboard VRM and chip-set liquid blocks and the four RAM blocks until I am sure that is the route I want to take for my permanent setup.
The VRMs on the Rampage IV get burning hot when overclocking. I even had some throttling over 5.0 with the stock heatsink.
If you don't install a waterblock, then you will need a fan directly on those, or they will get toasty pretty fast. Installing the EK block on my RIV solved that problem. No more throttling.
So be sure to compare your results at, let's say, 4.6 vs 5.2. On air, 4.6 was faster for me because of the toasty VRMs. With waterblocks, 5.2 was a lot faster. :)
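For a sense of why the VRMs struggle so much more at 5.2, here is a first-order dynamic-power estimate. The voltages are assumed examples, not anyone's actual settings:

```python
# First-order CPU dynamic power scaling: P ~ f * V^2 (capacitance assumed
# constant). The voltages below are assumed examples for illustration.
f1, v1 = 4.6, 1.35   # GHz, volts (assumed)
f2, v2 = 5.2, 1.47   # GHz, volts (assumed)

ratio = (f2 / f1) * (v2 / v1) ** 2
print(f"~{(ratio - 1) * 100:.0f}% more power (and VRM heat) at 5.2 GHz")
# (5.2/4.6) * (1.47/1.35)^2 is about 1.34, so roughly a third more heat
# for the VRMs to dump, which is why the stock heatsink throttles.
```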
Ya, I will keep the clocks down until I am sure I want to keep the RIVE/3960X instead of the Z77 setup that I will test. If I keep the RIVE it will get the EK blocks.
My 680's run extremely cool. But then again, I run the fans at 100% as I don't care about the noise, with the computer being in a different part of the house from where I sit to use it. The GTX 680's also max out on stock air at 57 C. Pretty crazy!
I bought Loud_Silence's 3960X:
http://hwbot.org/submission/2236457_...in_57sec_610ms
Hopefully I will be able to get it to at least 5.2GHz under chilled water.
Double Post
It has a pretty decent IMC; 2400 runs with no problems on multiple sets of memory :up:
http://hwbot.org/submission/2233841_...sdram_1320_mhz
Working on the ambient side of the loop today. Setting up to test the 3960X and RIVE under water.
http://i119.photobucket.com/albums/o...SANY0014-3.jpg
Mocking up components to test routing of liquid lines. 1/2" ID - 3/4" OD Norprene tubing is hard to bend and fit in tight places.
http://i119.photobucket.com/albums/o...SANY0015-2.jpg
The Team Group 2400 9-11-11-28 RAM came in (4x 4GB).
http://i119.photobucket.com/albums/o...SANY0017-2.jpg
Working on some of the supply/return valve systems.
http://i119.photobucket.com/albums/o...SANY0018-1.jpg
Made a custom stand out of some old Ikea speaker stands.
http://i119.photobucket.com/albums/o...SANY0011-4.jpg
Reservoir up top with a couple silver kill coils.
http://i119.photobucket.com/albums/o...SANY0012-3.jpg
Liquid line routing. The open ended valves that are shut will attach to the Geo-thermal section of the cooling loop.
http://i119.photobucket.com/albums/o...SANY0019-1.jpg
Testing out the loop and checking for leaks.
http://i119.photobucket.com/albums/o...SANY0020-2.jpg
Getting rid of air in the system has been a huge PITA. I am going to have to come up with some sort of custom pump system to force water through the loop and flush all the air out under pressure. The Iwaki RD-30 is a beast of a pump in a closed system, but if there is some air in the lines it has a hard time getting going. The system already used 1 gallon of distilled water and I ran out, so I wasn't able to fire the rig up. Tomorrow is another day.
Manual control? I'd have thought, with the amount of skill and technology you're throwing at this thing, you'd be using solenoid valves controlled by an Arduino or the like... actuated by capacitive switches, or something equally swish.
LOL, I was thinking about that until I realized that would be a lot of trouble and expense for something I wouldn't be moving but a couple times per year ;) I can always upgrade in the future though if I get bored. I was thinking along the lines of:
http://singervalve.com/sites/default...a-CA-1_800.jpg
I am thinking of re-arranging the cooling loop so that:
Branch #1 = CPU > X79 > VRM/MOSFET
Branch #2 = 680 > 680 > 2x RAM Sticks
Branch #3 = 680 > 680 > 2x RAM Sticks
I think the balance between those would be fairly close. The VRM/MOSFET coolers are really low restriction and would pretty much balance the resistance of the 2x RAM sticks. So essentially it will be one CPU block versus two GPU blocks to balance resistance. Anyone think the resistance balance would be way off on the above configuration? (Doesn't need to be perfect)
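As a sanity check, a split like this can be roughed out by treating each block as a lumped flow restriction (dP ≈ k·Q²). The k values below are pure guesses for illustration, not measured figures for these EK blocks:

```python
import math

# Crude parallel-branch balance check: treat each block as a lumped
# restriction with dP = k * Q^2. The k values are pure guesses for
# illustration -- not measured figures for these blocks.
K = {"cpu": 1.00, "gpu": 0.55, "ram_pair": 0.25, "vrm": 0.10, "chipset": 0.15}

branches = {
    "CPU > X79 > VRM/MOSFET": K["cpu"] + K["chipset"] + K["vrm"],
    "680 > 680 > 2x RAM (A)": 2 * K["gpu"] + K["ram_pair"],
    "680 > 680 > 2x RAM (B)": 2 * K["gpu"] + K["ram_pair"],
}

# Series restrictions add. With equal pressure drop across parallel
# branches, each branch's flow goes as 1/sqrt(k_total).
weights = {name: 1 / math.sqrt(k) for name, k in branches.items()}
total = sum(weights.values())
for name, w in weights.items():
    print(f"{name}: {w / total:.0%} of total flow")
# With these guesses the split lands near 34/33/33, close enough to
# balanced that it shouldn't need per-branch valves.
```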
Subbing in, just happened to stumble upon this build log from your last build log.
Lately, I've been wondering if Quad SLI/CF PCI-e 3.0 x16 would have a considerable advantage over x8. I will definitely keep an eye on this project as it has many similarities to my next high-end gaming/workstation build coming soon.
Keep up the beast work, Vega. :up:
Got my RAID 0 setup (boy that was a nightmare on X79), Win 7 installed. Games are downloading. Got three GTX 680's now. This is why I love nVidia:
http://i119.photobucket.com/albums/o...00Surround.jpg
Even something as complicated as running three CRT's in portrait is a snap. Install the driver, hit configure Surround displays, bam - organize the screens and you're done. Even these early drivers work really well. Thankfully nVidia allows each card to use its own RAMDAC for each FW900, something AMD cannot do.
Setup is kinda a mess while I install and test stuff:
http://i119.photobucket.com/albums/o...ANY0001-17.jpg
The 3960X at stock setting under maximum-Intel Burn Test only reaches a max temp of 41 C on the cores using the ambient radiator. I used Thermaltake Chill Factor III this time around and it appears to be doing quite well.
Expect PCI-E 3.0 vs 2.0 tests, tests of what is required to reach 2GB of VRAM and what happens when that VRAM limit is reached, etc. So far in BF3 the results are pretty bad news once memory usage reaches 2048MB! (Although it takes quite a bit to surpass the 2GB VRAM amount, even at extremely high resolution and settings.) More to follow...
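As a rough illustration of why high-res surround chews through VRAM, here is some napkin math. The buffer counts and formats are assumptions, not BF3's actual internals:

```python
# Napkin-math VRAM estimate for render targets at a surround resolution.
# Buffer counts and formats are assumptions for illustration only; a real
# engine like BF3's deferred renderer differs in the details.
w, h = 3600, 1920            # example surround resolution (~6.9 MP)
msaa = 4                     # assumed MSAA level
gbuffer_targets = 4          # assumed deferred-shading targets, 4 B/px each

backbuffers = w * h * 4 * 3              # triple-buffered swap chain (assumed)
depth = w * h * 4 * msaa                 # depth/stencil with MSAA samples
gbuffer = w * h * 4 * gbuffer_targets    # deferred-shading render targets
total_mb = (backbuffers + depth + gbuffer) / 2**20

print(f"~{total_mb:.0f} MB just for render targets")   # ~290 MB here
# Textures, shadow maps and streaming pools sit on top of that, which is
# how high-res surround pushes past a 2048 MB card.
```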
I love this new nVidia Surround. It keeps the desktop taskbar only on the center monitor, and when I maximize windows it only maximizes on the center screen. Awesome features! With the simple registry edit, I've got all of the cards running at PCI-E 3.0. In the BIOS I can switch between 1.0/2.0/3.0 at will, so this will make for some nice tests.
Who wants to bet there will be appreciable differences on my setup? ;)
Coming together nicely, Vega :yepp:
Hi Vega,
Mega setup you have there, and I must say I love your builds!
I'm just ripping my old system apart (i7 920 and 2x 3GB 580's in SLI). I've seen 2150MB memory usage in BF3 at default Ultra settings on my single 30" Dell. I was looking at replacing the two 580's with the 2GB 680's, but have been worried about the 2GB of memory they have. Here in the UK we will soon have the 4GB versions (a Palit GeForce GTX 680 Jetstream 4GB), though I would hang on for the EVGA's, but the price is going to be scary :(...
Barjoysee
You will be fine in BF3 with the 2GB cards. I saw the prices on the Palit 4GB, pretty steep! But if you want to be really future-proof then yes, the 4GB might be worth it. With the 3-4GB "big Kepler" cards supposedly coming out later this year, I'm not sure it's worth paying a premium for the 4GB 680's. Depends on how deep your wallet is, I guess. ;)
I'll probably sell the 3GB 580's on while they're worth something :) then look at the reviews for the 4GB's. It is a tough one with the 110 coming out in a few months, as it will be a good jump going on specs!
What's this?
Only 2GB of VRAM on a glorious multi-display setup? Yech, I just hurled all over my Logitech Comfort Wave 450.
SAID THE MAN USING THREE GTX 480s CIRCA 2010 WITH A MERE 1.5GB OF VRAM FOR HIS NVIDIA SURROUND SETUP, ROFL!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Ohh, the irony...
http://www.learnersdictionary.com/art/ld/iron.gif
Well, the results of my PCI Express 2.0 versus 3.0 on my 4-way SLI GTX 680 FW900 Surround setup are in. The results are so incredible I had to start the tests over from scratch and run them multiple times for confirmation! :eek:
Test setup:
3960X @ 5.0 GHz (temporarily at a lower speed)
Asus Rampage IV Extreme with PCI-E slots running 16x/8x/8x/8x
(4) EVGA GTX 680's running 1191MHz core, 3402 MHz Memory
nVidia Driver 301.10 with the PCI-E 3.0 registry adjustment turned on and off for each applicable test
GPU-Z 0.6.0
http://i119.photobucket.com/albums/o...ANY0003-12.jpg
After PCI-E settings changed, confirmed with GPU-Z:
http://i119.photobucket.com/albums/o...ga/PCI-E20.gif
http://i119.photobucket.com/albums/o...ga/PCI-E30.gif
All settings in the nVidia control panel, in-game, in the benchmark, and in EVGA Precision are UNTOUCHED between benchmark runs. The only setting adjusted is PCI-E 2.0 to 3.0, back and forth for confirmation (with reboots, obviously, for the registry edit).
http://i119.photobucket.com/albums/o...PCI-ETests.jpg
I kid you not, that is how much PCI-E 2.0 running at 16x/8x/8x/8x versus PCI-E 3.0 bottlenecks BF3 and Heaven 2.5 at these resolutions. I attribute this to the massive bandwidth being transferred over the PCI-E bus. We are talking 4-way SLI at up to 10 megapixels in alternate frame rendering. Entire frames at high FPS are being swapped, and PCI-E 2.0 falls on its face.
The interesting part was that while running PCI-E 2.0, GPU utilization dropped way down, as would typically be seen if you are CPU limited. In this instance I am not CPU limited, nor GPU limited. We are really at a point now where you can be PCI-E limited unless you go PCI-E 3.0 8x (equivalent to 16x PCI-E 2.0) or faster on all GPU's in the system. GPU utilization dropped into the ~50% range due to PCI-E 2.0 choking the cards to death. As soon as I enabled PCI-E 3.0, GPU utilization skyrocketed to 95+% on all cards. I was going to run more benchmarks and games, but the results are such blow-outs it seems pretty pointless to do any more. This may interest those out there running these new PCI-E 3.0 GPU's who think they are CPU limited (below 95% GPU utilization) yet might actually have PCI-E bandwidth issues.
Down to the nitty gritty: if you run a single GPU, yes, a single 16x PCI-E 2.0 slot will be fine. When you start to run multiple GPU's and/or run these new cards at 8x speed, especially in Surround/Eyefinity, make sure to get PCI-E 3.0. ;)
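For scale, here are the raw link rates behind that conclusion. The per-lane figures follow from the PCIe 2.0/3.0 signalling rates and encodings; the per-frame traffic is an assumed illustration:

```python
# Per-direction PCIe throughput: 2.0 signals at 5 GT/s with 8b/10b
# encoding, 3.0 at 8 GT/s with 128b/130b, so a lane nearly doubles.
def pcie_gb_s(gen, lanes):
    bits_per_lane = {2: 5e9 * 8 / 10, 3: 8e9 * 128 / 130}[gen]
    return bits_per_lane * lanes / 8 / 1e9   # GB/s

for gen in (2, 3):
    for lanes in (8, 16):
        print(f"PCIe {gen}.0 x{lanes}: {pcie_gb_s(gen, lanes):.1f} GB/s")
# PCIe 2.0 x8: 4.0 GB/s   PCIe 3.0 x8: 7.9 GB/s

# Assumed illustration of AFR surround traffic: one 10 MP frame at
# 4 B/px crossing the link per rendered frame, at 90 FPS.
print(f"~{10e6 * 4 * 90 / 1e9:.1f} GB/s of frame traffic")   # ~3.6 GB/s
# That alone nearly saturates a 2.0 x8 link, while 3.0 x8 has headroom,
# which is consistent with the utilization behaviour described above.
```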
I have no words!
WOW!
I'll buy an Ivy Bridge on day one for my 680 SLI!
PS: Do you want to flash your card to the EVGA SC+ BIOS? (here: http://www.techpowerup.com/vgabios/1...48.120329.html)
Wow indeed! I'm starting to wonder if my 2-way SLI is bottlenecked by having x8 2.0 for each card then... it looks possible (2560x1600). My GPU utilization is high (90%+) at almost all times in games though. Regardless, looking like a day-1 buy for Ivy Bridge for me to enable PCI-E 3.0 on my Z68 Gen3 board!
Is there any chance I could trouble you for a 2-way SLI run at 3600x1920 (PCI-E 2.0 vs 3.0)? Pretty please? :p: That's most comparable to my resolution... though still far greater (4mp vs 7mp).
Callsign_Vega: Can you post the PCI-E 3.0 reg fix here in this thread? Can someone make a .reg file for this :D
Great findings about the bandwidth !
Now that I'm watching more closely I am seeing GPU usage hit as low as 80% each GPU sometimes even 75% in SLI in BF3... x8 2.0 for each card. Usually 80-85%, still a lot of lost usage it seems like. :eek:
Funny. How does the registry key modification work for you but so many people have been reporting major issues? Did you do something different?
Do you know if that will allow any tweaking of the settings above what I can already tweak them to with precision?
Sorry, you need three GTX 680's minimum to run my three CRT's due to RAMDAC connection limitations.
A registry file will be hard because everyone's device ID is different, and it changes based on whether you are in SLI or not. The manual way only takes a few seconds. ;)
http://api.viglink.com/api/click?for...13338147983421
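For reference, here is a sketch of what the manual tweak amounts to, as it was circulating for the 301.10 drivers. The RMPcieLinkSpeed name and value are taken from forum posts of the era, so treat them as an assumption; back up your registry and run as admin:

```python
# Sketch of the manual PCIe 3.0 tweak as circulated for driver 301.10:
# add RMPcieLinkSpeed = 4 under each NVIDIA display-adapter subkey of the
# display class key. The value name/number are assumptions from forum
# posts of the era -- back up your registry first. Windows only.
import winreg

CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
         r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS) as root:
    i = 0
    while True:
        try:
            sub = winreg.EnumKey(root, i)
        except OSError:
            break   # no more subkeys
        i += 1
        if not sub.isdigit():          # skip "Properties" etc.
            continue
        path = f"{CLASS}\\{sub}"
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
                            winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
            try:
                desc, _ = winreg.QueryValueEx(key, "DriverDesc")
            except OSError:
                continue               # subkey without a driver description
            if "NVIDIA" in str(desc):
                winreg.SetValueEx(key, "RMPcieLinkSpeed", 0,
                                  winreg.REG_DWORD, 4)
                print(f"Set RMPcieLinkSpeed=4 on {path} ({desc})")
```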
Whenever your GPU utilization is below the high 90's, you either have an artificial cap applied (like VSync or a frame-rate limiter), or your CPU and/or PCI-E bus is a bottleneck. You could test by turning your CPU frequency up and down; if the numbers don't change, it's the PCI-E bus.
I have major issues with it. Every time I power down my machine it won't boot up into windows again. I have to remove one of the monitors, then reboot and reconfigure my Surround setup every single time. Such a PITA that's why it took me like 5 hours to get these benchmarks lol.
Interesting. I get 96 FPS average at 6048x1080 in Heaven with default graphics settings and the GPUs between 78-99% utilisation. That's about 380K fewer pixels than your 3600x1920 setting. So it ends up at about the same performance with GTX 580s at PCIe 2.0 16x/8x/8x/8x.
What are the GTX 680 temps under load with the stock cooling option?
Hitting the VRAM limit, plus PCI-E 2.0 vs 3.0 battle video tests. Make sure to watch in 480P instead of lower (I forgot to set the camera back to 720P and I don't feel like recording everything all over again lol). Sorry about the video quality, but it is still viewable.
http://www.youtube.com/watch?v=S0-xc...cWAKgkFdz3LBo=
http://www.youtube.com/watch?v=tkZzs...Txt4tbGt5H9Yk=
Wow, that's crazy! :eek: I know you are really pushing the resolution to extremely high settings, but how do you see it for a person running no more than 1920x1080 with only two cards (big Kepler) in 2-way SLI at PCIe 3.0 8x/8x? Do you foresee any PCIe bandwidth limitations in 4-5 years for 2-way SLI? Crystal ball required.
Question. These are my benchmarks that I have taken in the past 24 hours. I just got my 680 Friday afternoon.
https://docs.google.com/spreadsheet/...kJZaElRdTM3cFE
Is my PCIe 2.0 a bottleneck for the games where the GPU usage is under 80%? I started benchmarking some of my games and noticed the GPU wouldn't try so hard in certain games. Especially in WoW: when I'm standing around in Stormwind I get 150+ FPS with 100% GPU usage, then I go into raid fights and it drops down to 70% and the FPS drops to a 70+ average. I don't know if this is due to GPU Boost clocking itself down or something else. If it's something else, how can I force my GPU to run at 100% so I can benchmark these titles?
CPU: i7 2700k OC to 4.8 GHZ, MOBO: ASUS P8Z68-V Pro, RAM: 8 GB of G.Skill Ripjaws X Series 8-8-8-24, SSD: 256GB M4 Crucial Solid State Drive
People said I was mad (mad, I tell you) when I bought the 990X/Asus Rampage Formula with three x16 slots early last year, when there were no multiple-x16-slot SB boards out yet.
Who's laughing now?! Who's laughing now?!
(in Ash/Evil Dead 2 voice)
More seriously, kudos and good work Callsign_Vega, it looks like with 680s and very high resolutions one can indeed hit the limits of the PCIE2 x8. Very nicely done.
Nice, glad to see someone else backing up my tests. Although the differences aren't as dramatic as mine due to the lower resolution and card count, they are still very impressive. :thumb:
http://cdn.overclock.net/1/10/10b499c8_DIFFERENCE.jpeg
From user: psikeiro.
Save the image to your PC and you will see the difference :D
Right click, View Image should give you full size without saving to the PC.
What does each color stand for???
Here is his post guys:
http://www.overclock.net/t/1232473/o...ners-club/1690
You can see that at single-monitor it doesn't make a big difference, but it sure does at multi-monitor, as I've always said.
Yes 8x 2.0 is garbage :)
I forgot how much work it is to get these large water setups done properly. :thumb:
Assembly line done.
http://i119.photobucket.com/albums/o...SANY0007-9.jpg
X79 chip-set, RAM and MOSFET/VRM blocks installed.
http://i119.photobucket.com/albums/o...SANY0009-8.jpg
Setting up GPU's.
http://i119.photobucket.com/albums/o...SANY0011-5.jpg
http://i119.photobucket.com/albums/o...SANY0012-4.jpg
One GPU's temperature is a lot higher than the other three. It is quite strange, as I assembled all the blocks exactly the same, with the same TIM and quantity. So far the good ones are idling at 25-27C and under load are around the high 30's C. Not looking forward to draining and re-doing card #2. Hopefully I don't have a bad block. :mad:
how much higher is it?
could be air bubbles stuck in there
It was a stand-off that wasn't fully seated from the factory. Corrected, and now all GPU's idle at 21-25 C and hit 33-38 C under full overclocked load. All buttoned up and ready to roar:
http://i119.photobucket.com/albums/o...SANY0008-8.jpg
I've had success getting a custom resolution and frequency working in Surround! Running 3600x1877 @ 95Hz. That corrects to a perfect FW900 aspect ratio for each screen, increases the Hz a nice bit above stock settings, and the whole display setup is almost a 2:1 width/height ratio, which is just about perfect.
I am running these settings right now and they seem to be doing pretty well:
3960X @ 5 GHz @ 1.47v.
105 MHz Base-clock for a little more PCI-E "oomph".
2240 MHz @ 9-11-11-28 on the 16GB RAM. May try to get those timings a bit lower in the future.
Idle @ 28 C and game load 45-55 C. Prime95 full load temps are ~57 C. I also purchased the Intel overclock warranty for $35, a small price I think for some peace of mind, and they cross-ship if you let them put a reserve amount on your credit card.
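As a sanity check on what that custom Surround mode asks of each card, here is the pixel-clock arithmetic; the blanking overhead is an assumed typical CRT figure:

```python
# Pixel-clock check for the 3600x1877 @ 95 Hz surround mode: each FW900
# shows a 1200x1877 slice driven by its own card's RAMDAC. The ~30%
# blanking overhead is an assumed typical CRT (GTF-style) figure.
w_total, h, hz = 3600, 1877, 95
per_screen_w = w_total // 3
blanking = 1.30                      # assumed total blanking overhead

pixel_clock_mhz = per_screen_w * h * hz * blanking / 1e6
print(f"~{pixel_clock_mhz:.0f} MHz per RAMDAC")   # ~278 MHz
# Comfortably under the 400 MHz RAMDAC ceiling on these cards, so the
# mode still has headroom.
```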
Nice numbers update, Vega! Thank you.
Now, time for some more videos, please? :D
On my 4-way GTX 680 setup using EVGA reference cards with EK water blocks, my stable overclock is 1202 MHz core and 3534 MHz memory. I pretty much spent all day testing those clocks using Heaven 3.0, BF3, Crysis 2, Metro 2033 and Skyrim. Skyrim with some mods and the HD texture pack would crash at the lowest frequency, so modded Skyrim set my ceiling. I found, though, that performance scales linearly with increased memory clocks and there is no "sweet spot". Get that memory frequency as high as it can go!
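That linear scaling lines up with the raw bandwidth arithmetic. The 256-bit bus is the 680's actual spec; the stock figure below uses the same doubled convention the monitoring tools report:

```python
# GTX 680 memory bandwidth, stock vs. this overclock. The MHz figures use
# the doubled convention the monitoring tools report (half the effective
# GDDR5 data rate); 256-bit is the 680's actual bus width.
BUS_BITS = 256

def bandwidth_gb_s(reported_mhz):
    effective_mt_s = reported_mhz * 2        # GDDR5 effective rate
    return effective_mt_s * 1e6 * BUS_BITS / 8 / 1e9

for label, mhz in (("stock", 3004), ("overclocked", 3534)):
    print(f"{label}: {bandwidth_gb_s(mhz):.0f} GB/s")
# ~192 -> ~226 GB/s: bandwidth grows one-for-one with memory clock, which
# is why bandwidth-bound frame rates scale almost linearly with it.
```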
Those numbers are still pretty good, considering 4-way SLI is usually a bit harder to overclock high. 4-way 680 just destroys Skyrim in Surround:
http://www.youtube.com/watch?v=FvShh...Txt4tbGt5H9Yk=
I am just astounded by how smooth the GTX 680 handles Surround using PCI-E 3.0 slots. There is no micro-stuttering, pauses or jitters. I don't think I could tell the difference between a single GPU running a single monitor and this 4-Way SLI Surround setup. Combine that with the CRT's and this is by far the smoothest and best Surround/Eyefinity setup of mine to date. ;)
It will be interesting to see if any of these new EVGA non-reference cards will allow the voltage to rise above the 1.175 software cap (I think that's really the only way to get more clocks out of these, not just more power phases).
The great news is that with driver 301.25 I no longer have the Surround setup not working on cold boot-up problem. Everything is working great now in Surround mode.
I've noticed that Witcher 2 has a problem when I load up the game, instead of it filling the screen, it's just a narrow landscape type view into the world. Anyone familiar with Surround/Eyefinity in Witcher 2 know how to fix that?
Can you show us what general-purpose PC usage looks like on that type of setup? Web browsing, the Windows desktop, f00bar, GiMP, Steam, etc, etc?
If these guys at WSGF can't fix it, then it's an inherent problem with the game. It works in landscape but not portrait... sorry! http://widescreengamingforum.com/for...er-box?page=44
Hey guys, found this thread the other day and am curious about the PCIe bottleneck thing you guys are talking about. I am thinking of upgrading my X58 rig and I'm not sure whether to get X79 or Z77.
I plan to upgrade to 2 or 3 GTX 680s later on and was wondering about this PCIe 3.0 bottleneck thing. I've currently got 3 screens in surround at 5760 x 1200, would Z77 be a bottleneck for this if I used 2 cards with PCIe 3.0 x8 x8? How about the Z77 MBs with the PLX chip that can do PCIe 3.0 3way SLI at x16 x8 x8?
Is this PCIe bottleneck thing mainly for 3 - 4 cards or does it affect even 2 cards in Surround resolutions?
Two cards I would get Z77. Three cards I would get X79.
Still the X79!
The 2011 socket (X79 chipset currently) is still to get the Ivy Bridge enthusiast treatment. Z77 is the mainstream refresh of Sandy Bridge. So you probably won't get another CPU refresh for Z77, whereas X79 is still to get Ivy-E CPUs later this year or early next, and they will be pin compatible with X79 (hopefully a BIOS flash will support them on the current X79 chipset). I can't see anything that Z77 really has over X79 feature-wise, so I don't understand why you would go Z77. Just because it's new doesn't make it better... See how long X58 lasted as the best platform. It wasn't until Sandy Bridge last year that X58 stopped being competitive, at least for 2- and 3-way SLI. It still ruled for 4-way, and even now the reality is X79 is only 20% better performance-wise than X58 at best. Most games are still GPU bound at the resolutions we play at these days, so a lot of games just can't take advantage of the extra CPU speed, or even of more than 4 cores.
If you don't mind spending the cash now, I still say go X79; it will have at least another 2 years left in it. If you want to go the slightly cheaper route, go Z77, but you'll not get quite the performance of X79 in 3-way and 4-way setups.
In theory the X79 should last longer.
...
Yes, Vega?
I'm still unsure if the PCI-E lanes in Z77 (PCI-E 3.0 x8/x8 in SLI) will be enough bandwidth for two big Kepler GPU's at 1920x1080...?
Yes it will be. 1080P is not demanding on the PCI-E bus. You will be fine.
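Quick scale check, using the same one-frame-crossing-the-bus assumption as the earlier numbers:

```python
# Why 1080p barely touches the bus: even assuming a generous 120 FPS with
# every frame crossing the link once (worst-case assumption), the traffic
# is a fraction of what a PCIe 3.0 x8 slot (~7.9 GB/s) can move.
traffic_gb_s = 1920 * 1080 * 4 * 120 / 1e9
print(f"~{traffic_gb_s:.1f} GB/s at 1080p/120 vs ~7.9 GB/s for 3.0 x8")
```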
On another note:
Ordered this to test out:
http://store.sony.com/webapp/wcs/sto...erview/theater
Cross-talk, ghosting and glasses flicker are what have always turned me off of 3D. This supposedly eliminates all three of those problems. I thought it would be cool to test out. Plus it will be cool to see 0.7 inch OLED screens.
Glad to see OLED making some headway.
so when are you going to upgrade to these:
http://www.pccasegear.com/index.php?...ducts_id=20122
? :D :D :D
http://www.youtube.com/watch?v=jsX7ogE8Nhg
Make sure to view in 720P.
Guild Wars 2 is as good as I thought it would be, and one of the main reasons I created this unique NVIDIA Surround setup.
Sorry about the black borders between the screens; I do not have a wide-angle video camera lens to accurately capture the seamless image of the lenses at this time. To see that effect in action, please view the Fresnel lens video on my channel. In a normal playing/seating position, the three images come together to create one "world" image, and the Fresnel lenses add a slight depth effect. It is like looking through a large window out into the world and makes for one amazing gaming experience.
As you can see in the video, SLI is not working properly (the developers are working on a fix). The game is only using around one GTX 680's worth of processing power, but it will read as roughly 3x 33% because the frames have to be sent to three different cards in this Surround setup.
My 4th EVGA GTX 680 malfunctioned and I have a replacement on the way. Amazingly, as you can see in the video it runs pretty darn smooth.
I do a quick tour of Divinity's Reach, the most impressive city I've ever seen in any MMORPG. Then I head out to the World vs World vs World area called the "Mists". Here you get bumped up to level 80 but you can see my Ranger still has noob clothes and skills. You still need to level up to properly play WvWvW. I chase some player down and they go hide in the fort as we try and smash the door down. Those doors are pretty strong and take a good while to destroy so I switch over to a PvP "mini-game".
Now in the PvP mini-games, you get leveled to 80 but you also get 80 PvP armor and skills. Here you can see some of the cool effects. I was playing off to the side craning my neck so the camera had a good view. Don't laugh at my playing skill because of this! ;)
The PvP mini-game plays very well and I think I even managed a kill in there somewhere. I am glad I pre-purchased the collectors edition. GW2 is shaping up to be quite the game.
Guild Wars 2? Nahh, bro. It's all about TERA.
I noticed my eyes starting to get pretty tired viewing such a large image through those Fresnel lenses, so I am taking my build in a much scaled-back and different direction. You really can have too large a display setup for gaming! I also find myself playing more competitively in games like BF3 and Counter-Strike on a single-display system (CRT).
Single FW900 @ 2048x1280 / 90 Hz, no Fresnel, single water-cooled GTX 680 overclocked to the max, Asus Maximus V Gene and 3770K, 2x Vertex 4's for RAID 0 inbound. Everything will be water cooled; hopefully they come out with a MB water block. Phase change won't be a part of the loop anymore, just the current ambient side, and the geothermal side is still a go in the future.
It's amazing how well a single overclocked GTX 680 runs games on a FW900 @ 2048x1280 / 90 Hz. The smoothness and lack of any sort of display or input lag is incredible. I can tell the difference between the frame lag that 4-way SLI introduces and a single GPU. I am back in the single GPU / single display fan club until those high-resolution / high-refresh-rate OLED screens come out.
I will make a post in the market section and should have some good stuff for sale. Everything from A+ FW900's, to 680's with EK blocks on them (and I have one sealed box EVGA GTX 680), Vertex 3's, a really nice 3960X, RIVE, etc etc.
I know, quite a 180 deg turn.. I may have ADD. :D
And now you can say "yes, I really tried that" and have an awesome story to tell about how well it worked (plus the pros and cons of both ways).
I agree.
The immersion factor in games like GW2 on that setup was just crazy awesome, far beyond any of my LCD Eyefinity/Surround setups. It was also incredibly smooth. The magnified image was so large, though, and in order to get the aspect ratio with the lenses just right I had to sit pretty close. It really is like being in the world, but I found myself having to move my eyes and head way too much, and it was actually a hindrance in PvP. With a single screen like the FW900 you can get virtually all of the action in central vision, and you can perceive and react to things faster. That's why I think you can have too large an image for gaming unless you are going for pure immersion.
Really there are two ways you can go: immersion or competitiveness. It is very hard to do both. I am quite a competitive person, so I am going with the new, simpler build for that. The instant response and lack of lag of a single GPU and FW900 are really incredible to game on.
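Rough numbers on the frame-lag difference being described here; the queue depth is the usual AFR rule-of-thumb assumption:

```python
# Rough AFR input-lag estimate: N GPUs in alternate frame rendering queue
# roughly N-1 extra frames ahead compared to a single card (rule-of-thumb
# assumption; drivers and pre-render settings shift the exact figure).
def extra_lag_ms(gpus, fps):
    return (gpus - 1) * 1000 / fps

for gpus in (2, 4):
    print(f"{gpus}-way AFR @ 90 FPS: +{extra_lag_ms(gpus, 90):.0f} ms")
# 4-way adds on the order of 33 ms (~3 frames), exactly the kind of lag
# a competitive player on a zero-latency CRT will feel.
```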
While I was testing all of my GTX 680's to find which was the best overclocker, I found out I have a beast card. I always wondered why this card, all the way at the end in my 4-way SLI config, operated so much cooler than the others. It is crazy efficient.
1328 MHz core, 3580 MHz memory at stock voltage! Also EVGA Scanner X and OCCT stable at these frequencies. My other three GTX 680's would top out around 1240-1260 MHz core when used on their own.
This core is so efficient it idles at 18 C! Anyone else see clocks like these? It will make a nice single GPU for my new mini-build.
http://www.youtube.com/watch?v=th4pN...ature=youtu.be
wish I could afford one of those 24" CRTs :(
You can; pick one up at Accurate IT. It may not be as good as the ones calibrated by Unkle Vito @ hard forum, but you can calibrate it yourself with Sony WinDAS (runs in Windows XP... I run it in a virtual machine) and a USB -> TTL cable. I bought one of these from eBay for about $550 shipped 2 years ago when Accurate IT was out of stock. It was totally worth it.
I gamed for quite a while on a 92" projector setup; the immersion for me was superb. The biggest con was the dark-room requirement, which was fairly depressing. I recall reading that newer screens and HD projectors have come a long way.
I tried Eyefinity myself, and the few games I tried just didn't have the correct aspect despite fiddling with them, so I gave up.
When I have the space again, I will definitely set up another projection-based gaming room with something that will work with a bit of ambient light.
man, you have a monster GTX680. Any update on your rig?