Yeah... I understand where you're coming from dude. It's a tough decision, for sure.
http://www.theinquirer.net/gb/inquir...v-release-woes
"Oh yes, we probably should mention that the 9800X2, or whatever they call it, has also moved from CeBIT to the end of March, for now. Before you get all hot and bothered, this is basically a couple of downclocked G92s on 2 PCBs".
Looks like the end of March for now, instead of March 18th?
And "basically a couple of downclocked G92s on 2 PCBs" dosen't sound good as the intro for the new King's arrivial...
I don't think this is what any of us was expecting when the new GTX and GX2 arrived.... :eek:
Yup, it's looking like 8800 Ultras or the new ATI cards are the way to go.
There are a few new products I am waiting for, like a 790i mobo for SLI or Tri-SLI, and if Nvidia drops the ball with garbage releases, then I am taking a hard look at ATI on an Intel board.
I have never had an ATI card before, so that alone is a major step for me.
http://biz.yahoo.com/bw/080222/20080222005768.html?.v=1
SUNNYVALE, Calif.--(BUSINESS WIRE)--AMD (NYSE:AMD - News) today announced that systems utilizing ATI Radeon™ HD graphics cards established new records in Futuremark 3DMark05 and 3DMark06 benchmarking standards, demonstrating once again that ATI Radeon™ HD graphics cards are preferred by overclockers around the globe.
On Feb. 16, MemoryExtreme Team Italy, a group renowned for record-breaking overclocking achievements and better known online as giampa, Leghorn and giorgioprimo, set a new world record 3DMark05 score of 39,133, surpassing the previous record also set using ATI Radeon™ HD graphics cards. The same day, the team captured the 3DMark06 record by posting a score of 30,662, edging the previous world record by 56 points.
The benchmark scores may be accessed at http://www.xtremesystems.org/forums/...d.php?t=177374.
“The scalability, speed and stability of dual-card ATI Radeon™ HD configurations are critical to the success of our record-setting attempts,” said Gian Paolo Collalto, (known as giampa), a MemoryExtreme Team Italy founder. “The continued improvements in Crossfire™ multi-GPU performance backed by industry-leading stability make ATI Radeon™ HD the best choice for overclocking.”
The ATI Radeon™ HD graphics line is a clear favorite of the overclocker community. As of Feb. 20, 2008, ATI Radeon™ graphics cards take a clean sweep of the current top 20 records in 3DMark05 and 16 of the top 20 3DMark06 scores, for an aggregate 36 of the 40 best 3DMark05 and 3DMark06 scores posted on the Futuremark website.
“The enthusiast community continues to be an incredibly important audience for AMD,” said Matt Skynner, vice president, Marketing, AMD Graphics Products Group. “Our unmatched results in industry benchmarks demonstrate that we listen and deliver when enthusiasts ask for improved multi-GPU scaling, greater overclockability and industry-leading stability.”
Note: AMD is not liable for damages caused by overclocking (even when overclocking is enabled with AMD Overdrive™ software).
About AMD
Advanced Micro Devices (NYSE:AMD - News) is a leading global provider of innovative processing solutions in the computing, graphics and consumer electronics markets. AMD is dedicated to driving open innovation, choice and industry growth by delivering superior customer-centric solutions that empower consumers and businesses worldwide. For more information, visit http://www.amd.com.
AMD, the AMD Arrow logo, ATI, the ATI logo, Radeon, CrossFire, ATI CrossFireX and combinations thereof, are trademarks of Advanced Micro Devices, Inc. Other names are for informational purposes only and may be trademarks of their respective owners.
Contact:
AMD Public Relations
Dave Erskine, 905-882-2600, ext. 8477
dave.erskine@amd.com
Impressive.... :)
The order of cards on the nVidia side of the fence is supposed to look like this, fastest to slowest:
9800 GX2
9800 GTX
8800 Ultra
9800 GT >= 8800 GTX
8800 GTS 512
8800 GT
I need to see where the ATI cards fall in there, and if the GTX and GX2 can actually beat an Ultra...
I have never owned an ATI card myself... :)
We both could be changing camps soon.... ;)
So ATI takes the new speed records, and nVidia is having trouble getting more than 14K out of their new GPUs....
Man how things change...
http://www.vr-zone.com/articles/GeFo...ared/5614.html
http://img170.imageshack.us/img170/1...s3870x2od4.jpg
VR-Zone has gotten some preliminary 3DMark06 scores on the upcoming GeForce 9800 GX2 and 9800 GTX cards. The setup : Core 2 Quad Q6700 2.66GHz processor on a P965 board with Forceware 173.67 drivers and Catalyst driver version 8.451. GeForce 9800 GX2 scored 14225, 9800 GTX scored 13167 while Radeon HD 3870 X2 scored 14301. Nvidia is still tuning up the drivers and the clock speeds aren't finalized yet so we should be seeing some improvements when launched. Currently, 9800 GX2 is slated for March 18th launch while 9800 GTX is slated for end March.
Not looking good :(
I wish they would throw an 8800 GTX in there so we can see how it stands up to those 3.
Yep...
Unless we see some miracle drivers released, the 8800 Ultra will still be king (as the fastest single GPU) until the true new-tech (non-G92) GPUs get released.
I am amazed. :eek:
Could be the end of the year....
It could very well rule for an unmatched 2 year reign.
Well for now I can just throw my 2 8800 GTX's in SLI and wait this thing out...
That is exactly what I would do if I were in your situation... :up:
I finally ordered my waterblocks for my 3870 X2's last night!
The build is coming together.
I should post up a log. (I've got a couple of logs going in a couple of forums)
I'll definitely post up pics of my loop in your other thread Talonman.
Yeah I would like to see that also :clap:
and how those 3870 X2's perform !
I look forward to your pics...:up:
FYI - He is talking about this one. ;)
http://www.xtremesystems.org/forums/...d.php?t=162085
I guess they played a few tricks with the 9600 GT's clocks to boost performance...
http://www.techpowerup.com/reviews/N...Shady_9600_GT/
This is probably part of the reason the 9600 GT (G94) looks so close to the new 8800 GT 512MB (G92) GPU's level of performance...
ATI's 3870 X2 card is based on A11 silicon.
The card pictured above will be based on A12 silicon, which should bring higher operating frequencies, greater overclocking headroom, improved power consumption, and better temperature control. Obviously, the new revised version of the 3870 X2 could compete with Nvidia's G92, or do even better with GDDR4.
The new A12 revision of the RV670 will use a PLX (or rather PEX 86xx series) PCIe 2.0 bridge, and the core and memory clocks will be increased. I would not be surprised if ATI even upgraded the memory capacity to 2GB.
The A12 revision of the RV670 core is expected to be completed in the second quarter.
http://phyxion.net/hardware/news/ati...ision-a12.html
The price will be higher than the current HD 3870 X2's.
http://www.guru3d.com/newsitem.php?id=6572
"It seems that MSI has been goofing up (well they have the reputation for it) by releasing photo's and details on NVIDIA's upcoming GeForce 9800 GX2, this is NVIDIA's top-of-the-line to-be, so it's never too early to start nerding out. The best way to think of the GeForce 9800 GX2 is a 8800 that's been shrunk down to 65nm and SLI'd onto a "single" card. The card is supposed to be at least 30% faster than a 8800 Ultra, and will apparently support Quad SLI. The details over at MSI show the product with 1GB of GDDR3 memory, MSI's GeForce 9800 GX2 should get a GPU core clock of 660 MHz, the Shaders domain at 1650 MHz and memory at a beefy 2400 MHz".
So now our best guess is:
Hypothetical 9800 GTX --------------------------- Hypothetical 9800 GX2
Stream Processors= 128 ---------------------------- A full 128 stream processors on each GPU, 256 total.
Core= 675MHz ------------------------------------- Core= 600MHz
Shaders= 1683MHz --------------------------------- Shaders= 1500MHz
Memory= 2200MHz 512MB GDDR3 (136-pin BGA) ------ Memory= 2000MHz 512MB x 2 (1000MHz GDDR3)
256-BIT BUS --------------------------------------- 256-BIT BUS
Two DVI-I and one HDTV-out ----------------------- Two DVI Out
Two SLI Connectors -------------------------------- Supports Quad SLI Configs.
Two 6-Pin PCIe Power Connectors ------------------- Not sure.
Total Board Power= 168W.
CoolerMaster TM67 Cooler (I believe it's this one: http://en.expreview.com/?p=276)
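For what it's worth, you can sanity-check the memory bandwidth those rumored specs imply. A minimal sketch; all clocks and bus widths are the unconfirmed numbers from the guesses above (peak bandwidth = effective memory clock x bus width / 8):

Code:
# Peak theoretical memory bandwidth from the rumored specs above.
def bandwidth_gbs(effective_mem_mhz: float, bus_bits: int) -> float:
    """Effective memory clock (MHz) * bus width (bits) / 8 -> GB/s."""
    return effective_mem_mhz * 1e6 * bus_bits / 8 / 1e9

print(bandwidth_gbs(2200, 256))      # Hypothetical 9800 GTX: ~70.4 GB/s
print(bandwidth_gbs(2000, 256))      # Hypothetical 9800 GX2: ~64.0 GB/s per GPU
print(bandwidth_gbs(2000, 256) * 2)  # GX2 combined: ~128 GB/s across both PCBs

If those numbers hold, each GPU on the GX2 actually gets a bit less bandwidth than the GTX; the GX2's edge would come from having two GPUs, not a wider bus.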
Man...those new ATI chips are interesting.
I don't know if I'll go for the revision or not, because I'm very close to getting my second card already, and if they're more expensive they'll probably exceed my budget...
Excellent job man. Love the Silverstone case.:up:
Thanks Snatch... ;)
zlojack, that new ATI chip is making me think a lot more about the new revised version of the 3870 X2...
I wonder how much more performance that revision would offer? :)
I can't help it... I keep going back to the GX2 as my favorite choice...
My god.. I think I am a nVidia fan boy and didn't know it... :)
I think I have GX2 fever or something??
I can't get past the fact that the 9600 GT was faster at 1920 x 1200 than the HD 3870 in Call of Duty 4. It makes me think nVidia is still actually ahead in the speed game, no matter what 3DMark06 says...
Look at the size of this Monster, and tell me it should play a mean Crysis...
It's just gotta...:p:
New pictures: http://forums.vr-zone.com/showthread.php?t=243866
It looks like the Hummer of Video Cards?
I also believe the thinking that OC'ed video cards are actually tested to ensure that the chips on the card can take the OC. I further think that a certain % of cards don't accept the OC and get sold as stock-speed cards. That is why I think buying an OC'ed card is not a waste of money; they can be considered 'cherry picked' cards, or at least guaranteed to take the factory OC, and maybe more...
My ideal GX2 would be factory OC'ed, and with a high end waterblock pre-installed with 1/2" fittings! ;)
Let's hope somebody's working on that already.. lol :up:
Realistically, unless this thing needs it, I am starting to accept that my flamethrower of a NB and OC'ed Q6600 will probably end up being all the blocks in my loop.
At least I know I didn't overload the Keg? :rofl:
Yup it's huge :)
I just want to be able to run 3 of those cards hehehe :ROTF:
Will need a big case to run those puppies that's for sure !
My God ... 3! You would have to keep a bucket of water near, just to keep the smoke down... :shocked:
Do you think I should get a decent amount of techno shock stepping into that from my 8600GT? :)
I wonder if I could keep my SB fan, and the GX2 both, if I install it on the top blue slot?
Do you think a GX2 will fit in my rig?
http://img182.imageshack.us/img182/6571/pic115ee2.jpg
Man...something about that card just screams "oven" to me.
I hope it turns out to be a beast...
Oven!!! hehehe I hear you.... :rofl:
Maybe I can put 1 super hot air thing in my TJ09 without taking too much of a performance hit? ;)
Yes Talon, I believe that our cases will handle these with NP; they should, because I don't think there are many cases that are bigger :rofl:
Mine will be going in a different case, a TJ07 for my new build. Once I get that one up and running I will work on this one. I have to keep this machine running for work until I can get things all sorted out with the new build.
I hope some WC setup can be worked out with these GX2's because they do sound like they will get warm.
I installed a digital thermometer in my case. It reads 3 or 4 degrees F above my room at all times while the system is running...
In my cool room I should be golden with 1 hot card? :D
I will easily be able to tell how much the 'Oven Card' moves my case temp...
Here is the unit: http://www.lacrossetechnology.com/9124/
The remote sensor is the one in my case, and the base unit is what has been giving me my ambient temps, on my OC'ing reports.
I just hope the distance between my RAM and SB fan doesn't bite me in the butt with the GX2!!
I have grown attached to the fan, and would like to keep it. ;)
I pretty much think we are going to have some GX2 numbers from this member...
I still see some sites holding to the 18th. I don't know if the release is the end of the month, or the 18th now...
Man that temp sensor is a good idea :up:
Thanks :)
Currently the unit reports: Room temp is at 65.1F, and Case is hovering at 68.5F...
Ya just have to love data!
One new picture of the EVGA GX2.
http://forums.vr-zone.com/showthread.php?t=244211
I do like the looks OK. :)
FYI for you ATI Boys - the 3870 X2 performed up to 25% better on PCIe 2.0 than on 1.1? Could be BS too...
http://www.pczilla.net/post/89.html
I don't know if the GX2 would be the same or not, but my Maximus is up for the job... :)
Check out this thread, CEBIT pictures :)
http://www.xtremesystems.org/forums/...d.php?t=179209
Thanks for the link!!
This thread has now become Video Card Central!! :)
Wow.. The waterblock is out!! :up:
http://www.webshop-innovatek.de/asse...530bfea01.html
And it's on a Card... http://www.hexus.net/content/item.php?item=12130
I wonder how good of a block that is? :)
Speed tests...: http://www.hardware-aktuell.com/view...cle=42&seite=3
Yeah I like that WC option :up:
Actually I thought this was the all things Talonman thread :rofl:
It's all about me baby!! This is my thread to be bad!! :p: :rofl:
One of the Beasts on a mobo: http://www.hardware-aktuell.com/view...0_GX2_Test.jpg
I hope his SB is higher on his mobo than mine, or I am going to have to say bye-bye to my fan.
I think it looks like it will fit with my SB fan installed, judging from the room above the SATA ports.
It will be tight though...
I need to see a GX2 mounted to a Maximus, with a straight down angle to see how close the card gets to the SB. :)
Lol !
A bit more evidence that it may be an Oven Card? :)
http://www.hexus.net/content/item.php?item=12124
If heat is an issue... At least it's G92 heat. It could be worse...
I just may need to get the GX2 on water?
That last Dude in the thread reported:
"it is ment to perform 1.3x better then a single 8800GTX, i believe the report hasnt changed".
That would be incorrect. It is reported to be at least 30% faster than an 8800 Ultra, just for the record! :)
A picture of a 9800 GTX next to an 8800 Ultra:
http://nebulamods.com/terraphantmftp...Kb9AWoaypi.jpg
The bottom card with the smaller chip is the 9800 GTX.
The odd thing about it is that its PCB is simply much more complex than the GTS 512's...
It also has two power connectors, and its power delivery system is even heavier than the 8800 Ultra's.
We are currently hoping we are missing some feature about this card...
That is a lot of changes for a GPU that is supposed to be nothing more than a faster 9800 GTS... :shrug:
The 9800 GTS with fewer capacitors in the power section:
http://bbs.unihw.com/attachments/for....jpg.thumb.jpg
If some secret thing about the 9800 GTX came out that made it darn near as fast as the GX2, with no SLI and a thinner card...
I could flop back over to the GTX camp.. ;)
Decisions... decisions...
Man we need reviews!!
Note that the GTX was still on revision 2A of G92: :(
http://bbs.chiphell.com/attachments/...EdLjA66RM7.jpg
9800 GX2 Video: :) Not in action though... :(
http://www.hexus.tv/show/2008/03/EVG...at_CeBIT_2008/
More video's:
http://www.youtube.com/watch?v=R0BujA55Thw
http://www.youtube.com/watch?v=a4xXwkUwlpw&NR=1
http://www.youtube.com/watch?v=8w8vgtKKu0A
Better GX2 Picture:
http://img.hexus.net/v2/internationa...lbaGX2-big.jpg
Another GX2:
http://www.engadget.com/tag/9800%20GX2/
Trivia: This report indicates that a 1GB 8800 GT would run no faster than a 512MB GT. http://techreport.com/articles.x/14230/1
Some feel this supports the idea that the BUS is the bottleneck on these cards.
The GX2 will have an extra advantage here, with 2 cards working together...
A neat chart...
http://img260.imageshack.us/img260/74/pricesak7.png
Note that the 8800 GT and the 9800 GT will be about the same card...
Confirmation of the GX2 price of $599.00.
http://www.nordichardware.com/news,7396.html
"Considering the Radeon HD 3870X2 costs little more than half that, and with the upcoming launch of CrossFireX, it does seem a bit expensive. Especially when you realize that this is basically two down-clocked GeForce 8800GTS cards slapped together and underclocked. The G92-450 core does sport some architectural improvements, such as OpenGL 2.1 support, but the 600MHz core, 1500MHz shader and 2000MHz memory clock is not going to impress many unless SLI delivers some kind of magic".
Makes me think the GX2 might be faster than we know? http://xtreview.com/addcomment-id-43...rk-result.html
"Today associate published new data about the speed level of geForce 9800 GX2. Tests were in 64-bit version Windows vista, supposedly with the four core processor core 2 Extreme QX9650 (3.0 GHz). The default video card frequencies was 600/1500/2000 MHz and obtained the following results":
3DMark'05 - > 17 600 points;
3DMark'06 - > 14 400 points.
Overclocked to 730/1500/2080 MHz, the GeForce 9800 GX2 improved on those results:
3DMark'05 - > 19 400 points;
3DMark'06 - > 16 100 points.
Note that the frequency of the shader domain was not increased.
News Flash!!
http://www.techarp.com/article/Deskt...idia_4_big.png
The 9800 GTX is now listed with a 512-bit bus, but the GX2 is listed as 256 bits x 2? If this were true... it could make the GTX faster than the GX2?
The info could be wrong on the site... but it's interesting nonetheless.
Nice work GPU Reporter on the go Talonman !!
I am finding out all I can... :)
I need to see if the GX2 or GTX will do better in 1920x1200 res, but it is going to be hard waiting on the GTX....
I think the smart money would tell me that for sheer animal speed at 1920x1200, the GX2 would do better.
I really don't want to wait for the GTX's release if it takes too long. It will take all the restraining power I have to not click Buy on the first GX2 I see for sale....
I will get the EVGA card, so I have the full purchase price to step up if the GX2 turns out to be a bad card. It is kind of a way to cover my bet.
2x EVGA 9800gx2's ordered today :)
No way!!! You Rock!! I am waiting 2 weeks until EVGA makes me the OC model.
You know, I think we have a winner on our hands...
Using this review's numbers: http://publish.it168.com/2008/0315/20080315000604.shtml
It still holds true that the 3870x2 does well in 3DMark06, but slower in actual game play.
If my calculations are correct, and looking only at 1920 x 1200 res... (What I need)
-------------------------------------- 3870x2 ------- 9800 GX2 --- %Diff
3DMark06 ------------------------------ 14,190 -------- 14,167 ------ 23 points lower
Crysis --------------------------------- 13FPS -------- 22FPS ----- 69% faster
Lost Planet ---------------------------- 24FPS -------- 31FPS ----- 29% faster
Lost Planet 8xAA/16xAF---------------- 11FPS --------- 13FPS ----- 18% faster
Company of Heroes -------------------- 56FPS --------- 77FPS ----- 37% faster
World in Conflict ----------------------- 34FPS --------- 41FPS ----- 20% faster
World in Conflict 4xAA/16xAF ----------- 14FPS --------- 26FPS ----- 85% faster
Half-Life ----------------------------- 111FPS -------- 115FPS ------ 3% faster
Half-Life 8xAA/16xAF ------------------ 73FPS --------- 76FPS ------ 4% faster
Need for Speed ------------------------ 48FPS -------- 102FPS --- 112% faster
Need for Speed 4xAA ------------------ 33FPS ---------- 91FPS --- 175% faster
Unreal Tournament ------------------ 112FPS --------- 109FPS ----- 2% slower
Quake 4 ----------------------------- 115FPS --------- 120FPS ----- 4% faster
Lost Planet -------------------------- 11FPS ----------- 13FPS --- 18% faster
FEAR 4xAA/16xAF ------------------- 87FPS ------------ 79FPS --- 10% slower
On average, the GX2 is currently 41% faster than the 3870x2 at 1920 x 1200 res.
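If you want to check my math, this is all the %Diff column is; the same helper applies to the Ultra comparison a couple of posts down. A minimal sketch, with the FPS pairs typed in from the table above:

Code:
# Each entry: (game/test, 3870 X2 FPS, 9800 GX2 FPS) at 1920 x 1200.
results = [
    ("Crysis", 13, 22),
    ("Lost Planet", 24, 31),
    ("Lost Planet 8xAA/16xAF", 11, 13),
    ("Company of Heroes", 56, 77),
    ("World in Conflict", 34, 41),
    ("World in Conflict 4xAA/16xAF", 14, 26),
    ("Half-Life", 111, 115),
    ("Half-Life 8xAA/16xAF", 73, 76),
    ("Need for Speed", 48, 102),
    ("Need for Speed 4xAA", 33, 91),
    ("Unreal Tournament", 112, 109),
    ("Quake 4", 115, 120),
    ("Lost Planet (repeat run)", 11, 13),
    ("FEAR 4xAA/16xAF", 87, 79),
]

diffs = []
for name, x2_fps, gx2_fps in results:
    pct = (gx2_fps - x2_fps) / x2_fps * 100  # positive = GX2 faster
    diffs.append(pct)
    print(f"{name}: {pct:+.1f}%")

# Plain arithmetic mean over the game tests (3DMark06 excluded):
print(f"Average: {sum(diffs) / len(diffs):+.1f}%")

That prints an average of about +40.5%, which is where the ~41% comes from, give or take rounding.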
I still think the GX2 is the best single card to have right now...
Using the anandtech review's numbers: http://anandtech.com/video/showdoc.aspx?i=3266&p=1
Putting the 8800 Ultra against the 9800 GX2...
If my calculations are correct, and looking only at 1920 x 1200 res... (What I need)
----------------------------------------- 8800 Ultra --------- 9800 GX2 --- %Diff
Call of Duty ------------------------------- 70FPS ------------- 110FPS ---- 57% faster
Call of Duty 4xAA------------------------- 55.7FPS ----------- 83.1FPS ----- 49% faster
Crysis High Quality------------------------ 25.8FPS ----------- 39.4FPS ----- 52% faster
Oblivion ----------------------------------- 46FPS ----------- 84.3FPS ----- 83% faster
Oblivion 4XAA ---------------------------- 36.1FPS ---------- 63.9FPS ------ 77% faster
Quake Wars ------------------------------ 85.2FPS -------- 120.4FPS ------- 41% faster
Stalker ---------------------------------- 48.8FPS ---------- 73.3FPS ------- 50% faster
World in Conflict --------------------------- 30FPS ----------- 33FPS ------- 10% faster.
On average, the GX2 is currently 52% faster than the 8800 Ultra at 1920 x 1200 res.
The 8800 Ultra is no longer the single fastest card, and nVidia hit their 30% minimum speed increase over the Ultra.
Look at my PM I sent you. I have many more surprises hehe :)
Let me try and make a list of upgrades, it might take a minute LOL
Ok, I had it already, I forgot haha
Ordered and being shipped so far.
Silverstone TJ07 – receive tomorrow
QX9650 – receive tomorrow
Ordered today:
2 each EVGA 01G-P3-N891-AR GeForce 9800 GX2 1GB 512-bit GDDR3 PCI Express 2.0 x16 HDCP Ready SLI Supported Video Card
EVGA 132-CK-NF79-A1 LGA 775 NVIDIA nForce 790i Ultra SLI ATX Intel Motherboard
G.SKILL 4GB(2 x 2GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800) Dual Channel Kit Desktop Memory
Thermaltake Toughpower W0133RU ATX12V / EPS12V 1200W Power Supply
HITACHI Deskstar 7K1000 HDS721010KLA330 (0A34915) 1TB 7200 RPM 32MB Cache SATA 3.0Gb/s Hard Drive
1 Single Stage Phase Unit - Ordered, 2-3 weeks lead time
Now just waiting on:
OCZ OCZSSD64GB 2.5" 64GB SATA Internal Solid state disk (SSD) - Waiting for release
Wow!! heavy duty stuff!!
Sweet!!
Oh shoot I forgot a floppy disk drive :rofl:
lol... you will get by without it!:p:
Well it's all here :)
Still to come... OCZ 64gig SSD and SS Phase unit.
Still no reply from the WaterKeg guy :( I will hold off on that for a later project if this system seems to run cool enough.
Gasp!!!
Wow man... just Wow!
Quality supplies for the making of a monster!
Yeah a Monster it is so far HAHA !
So many new toys, it's hard to decide what to look at first :)
That's insane!!!!!!!!!! Post more pictures please!!! :clap: Oh man, beautiful components :D
Buckeye, you will be posting a build log, right? Please? :D
We will need to see pictures.
hehe oh man, it takes a brave man to post a build log :) I will be a total newbie at some of this build, first time Phase user but I want to learn it all.
But yes I will do that. I still have some lead times on a few components, SSD and Phase unit, still a few weeks off for those. I cannot mod the case until the Phase unit arrives so I can see where holes and cuts need to be made.
I was planning on holding out for the new OCZ SSD unit and I believe it will not be out until the 31st.
If that falls through then I will go with a MTRON unit from here http://www.dvnation.com/ as they seem to be the only ones who stock these babies.
Then the build will go like..
First build: 1 9800 GX2, everything at stock to get baseline numbers.
Start OC'ing to see what that can do.
Back to stock settings, then add the second 9800 GX2 for baseline numbers.
Then finally 2x 9800 GX2's and OC.
Ok, talked with the guys over at http://www.dvnation.com/ about their SSDs today; very cool people and very helpful!
I decided on a MTRON PRO 3.5 32GB SATA SSD to see how that goes; option two would be to order a second SSD and RAID 0 them together if it appears that it is needed. He said his 64GB unit loads Microsoft Flight Simulator in like 4 seconds, so I am thinking that one unit will be enough for speed, but if it looks like I will need more space I will order a second unit later.
So 1x MTRON PRO 3.5 32GB SATA SSD on order today, Woot :up:
LOL, I also talked with RamSan and received a quote.
But after receiving the quote, I have decided that I might have to pass on these units. I may have deep pockets, but for the price of these units I could put a down payment on a house, buy a new car, or some other more practical item.
So here you go, take a look and see what you can get if you wanted to spend the cash
http://www.superssd.com/products/ramsan-400/
RamSan-300
The starting capacity of a RamSan-300 (16GB) is $20,000. It includes:
-16GB DDRRAM storage
-one dual-ported 4Gb Fibre Channel controller
-hot swappable hard disk drives
-hot swappable and redundant power supplies
-redundant battery and fans
-IBM Chipkill in memory (redundant RAM)
-1 year return to factory warranty
Each additional 4Gb FC controller is $3,000 (up to 2 in each chassis).
The RamSan-300 can upgrade to 32GB for an additional $11,200.
RamSan-400
The starting capacity of a RamSan-400 (32GB) is $35,000. It includes:
-32GB DDRRAM storage
-one dual-ported 4Gb Fibre Channel controller
-hot swappable RAID 3 hard disk drives
-hot swappable and redundant power supplies
-redundant battery and fans
-IBM Chipkill in memory (redundant RAM)
-1 year return to factory warranty
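Just for grins, the cost-per-GB math on those quotes (prices straight from the lists above):

Code:
# USD per GB at the quoted starting prices.
quotes = {
    "RamSan-300 (16GB @ $20,000)": (16, 20_000),
    "RamSan-400 (32GB @ $35,000)": (32, 35_000),
    "RamSan-300 upgrade (+16GB @ $11,200)": (16, 11_200),
}

for name, (gb, usd) in quotes.items():
    print(f"{name}: ${usd / gb:,.0f} per GB")

Roughly $1,250, $1,094, and $700 per GB. And I thought the GX2 was expensive...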
Man, the 400 has:
Over 400,000 random I/Os per second.
3000 MB/s random sustained external throughput.
Gasp..
I can't imagine what that would be like, nor could I afford one! :)
Fun to look though... :D
LOL yeah they are a bit over the top price wise :ROTF:
I had a little box waiting for me when I got home today.
Oh Yeah ! :up:
You got one? :D :shocked:
Super monster!!
I stepped out of my GX2, and into a GTX 280. I currently have a single loop with two blocks cooling my system. I was concerned about how my temps would be after adding a 280 into the same loop.
My Cooler: The WaterKegIII Xtreme with ThermoChill PA120.3 RAD running three 70.5 CFM Yate Loons, and a Laing D5 pump:
http://img525.imageshack.us/img525/213/pic083vj7.jpg
My two existing blocks: D-TEK FuZion CPU, and EK S-MAX Acetal for my NB. (D-Tek Quad nozzle and gasket installed in CPU block.)
http://img207.imageshack.us/img207/3924/pic108og8.jpg
My simple loop with the air cooled GX2 installed.
http://img114.imageshack.us/img114/2016/gx22vj8.jpg
My PC next to the cooler:
http://img527.imageshack.us/img527/6796/pic142so0.jpg
My current CPU temps that I enjoy with a Q6600 @ 3.8GHz:
http://img516.imageshack.us/img516/4846/run30py7.jpg
My EVGA Hydro Copper 16 Waterblock, waiting for my 280...
It came with the 3/8" and 1/2" fittings. I will be using the 1/2"ers.
http://img360.imageshack.us/img360/530/280blockwb4.jpg
The full cover block with backplate is $179.99
http://www.evga.com/products/moreInf...%20Waterblocks
BTW: If you would like to see the inside of the block, look here: http://www.legitreviews.com/article/726/4/
It has tiny E's inside for EVGA! :p:
Note: With this block, the backplate helps to absorb heat from the memory chips and transfer it to the waterblock:
You put some TIM on the backplate to better transfer the heat to the front.
Description:
Featuring a sleek-modern look and a full copper design, EVGA continues to incorporate only the best attributes that make up the Hydro Copper Waterblock Series. An extreme high flow path design with a unique, integrated pressure point allows the Hydro Copper 16 Waterblock to keep your GTX 280 or GTX 260 as cool as can be while under even the heaviest of graphical loads. Exclusive only to EVGA, patent-pending flow technology aids in dispersing heat from the GTX 280 and GTX 260 graphics processing unit (GPU). EVGA also provides full coverage with the addition of a heat-piped back plate that links with the Hydro Copper 16 Waterblock using patent pending "Co-op" technology.
Here is the official EVGA Hydro Copper 16 Waterblock install guide: http://www.evga.com/products/pdf/200-CU-HC16-B1.pdf
This is where my system speed currently stood:
NOTE: System had to Prime 2Hrs error free before numbers were recorded.
Note: The only thing OC'ed on my GX2 SSC is the fan! ;)
FSB -- DDR2 ----- BIOS ----- CPU-Z ------ BIOS ----- Everest - CPU-Z --- 3DMark06 --- SM2.0 -- SM3.0 --- CPU ------ Vantage --- CPU --- Graphics
420 -- 840MHz -- 3.79GHz -- 3.7801GHz -- 1.51875v -- 1.50v -- 1.496v ---- 19,296 ----- 7,661 --- 9,062 --- 5,375 ----- P10,553 -- 13,167 -- 9,898
421 -- 842MHz -- 3.80GHz -- 3.7895GHz -- 1.51875v -- 1.50v -- 1.496v ---- 19,305 ----- 7,647 --- 9,078 --- 5,385 ----- P10,553 -- 13,164 -- 9,899
422 -- 844MHz -- 3.81GHz -- 3.7979GHz -- 1.51875v -- 1.50v -- 1.496v ---- 19,364 ----- 7,690 --- 9,113 --- 5,369 ----- P10,583 -- 13,281 -- 9,912
423 -- 846MHz -- 3.82GHz -- 3.8074GHz -- 1.53125v -- 1.51v -- 1.512v ---- 19,164 ----- 7,550 --- 9,042 --- 5,358 ----- P10,595 -- 13,441 -- 9,896
424 -- 848MHz -- 3.83GHz -- 3.8164GHz -- 1.53125v -- 1.51v -- 1.512v ---- 19,226 ----- 7,570 --- 9,047 --- 5,419 ----- P10,574 -- 13,254 -- 9,906
425 -- 850MHz -- 3.84GHz -- 3.8252GHz -- 1.53125v -- 1.51v -- 1.512v ---- 19,503 ----- 7,766 --- 9,141 --- 5,428 ----- P10,574 -- 13,262 -- 9,905
426 -- 852MHz -- 3.85GHz -- 3.8346GHz -- 1.53750v -- 1.53v -- 1.528v ---- 19,499 ----- 7,762 --- 9,123 --- 5,447 ----- P10,564 -- 13,232 -- 9,899
427 -- 854MHz -- 3.86GHz -- 3.8430GHz -- 1.53750v -- 1.52v -- 1.520v ---- 19,237 ----- 7,598 --- 9,015 --- 5,385 ----- P10,475 -- 13,515 -- 9,745
428 -- 856MHz -- 3.87GHz -- 3.8525GHz -- 1.53750v -- 1.52v -- 1.520v ---- 19,592 ----- 7,795 --- 9,185 --- 5,457 ----- P10,453 -- 13,486 -- 9,724
For each FSB test, I rebooted and waited for all disk activity to stop. I ran Vantage first, then 3DMark06 second. Each program only received 1 run, and data was recorded.
Chart data recorded:
FSB selected - What DDR2 Frequency will be required - How fast the BIOS reports when booting - How fast CPU-Z reports from Vista - Lowest voltage required to Prime 2hrs (BIOS setting) - Everest reported Core voltage - CPU-Z reported core voltage - 3DMark06 score using default settings - 3DMark Vantage using default settings.
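If the DDR2 and CPU clock columns look like magic, they aren't; both fall straight out of the FSB. A minimal sketch, assuming my Q6600's 9x multiplier and a 1:1 FSB:DRAM divider (DDR2 is double-pumped), which lines up with the CPU-Z column above:

Code:
# DDR2 and CPU clock as a function of FSB for the chart above.
CPU_MULTIPLIER = 9  # Q6600 default multiplier

for fsb_mhz in range(420, 429):
    ddr2_mhz = fsb_mhz * 2                     # 1:1 divider, double data rate
    cpu_ghz = fsb_mhz * CPU_MULTIPLIER / 1000  # core clock = FSB * multiplier
    print(f"FSB {fsb_mhz} -> DDR2 {ddr2_mhz}MHz, CPU ~{cpu_ghz:.3f}GHz")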
Currently, with just my CPU and NB in my loop, my max temps don't cross 60C two hours into Prime! Usually around 55C most of the time on all 4 cores. It only spikes to about 59C for a few seconds, even when I'm hitting my chip with 1.5V.
My 280:
http://img329.imageshack.us/img329/8023/280bpq2.jpg
http://img329.imageshack.us/img329/6846/280cfn4.jpg
http://img329.imageshack.us/img329/5428/280emq6.jpg
http://img329.imageshack.us/img329/1998/280gnx0.jpg
I figured I better just install her as is, to make sure she worked before I took it apart for the block installation.
She installed just fine... :cool:
http://img360.imageshack.us/img360/2...stalledce8.jpg
How she looks idle, with stock clock settings, and fan set to 75%. Room temp was 70.3F.
http://img360.imageshack.us/img360/6997/280idlevy5.jpg
15 minutes of some smooth COD4!!
It ran my temp up to 64C.
http://img382.imageshack.us/img382/2118/cod4wz7.jpg
One Vantage run using stock settings, just to see... :)
http://img61.imageshack.us/img61/3509/280vantageej3.jpg
My cover is on the PC, and I have moved my fan up to 100%.
I want to know the best the fan can do before I go water cooled.
I'm trying to give it every advantage, to see how much my block can actually lower my temps beyond the stock cooling's ability.
I ran 3DMark06 at default settings, and kept Precision open taking a GPU temp reading after each test completed.
Temp after Return to Proxycon: 59C
Temp after Firefly Forest: 59C
Temp after CPU1 test: 51C
Temp after CPU test2: 49C
Temp after Canyon Flight 59C
Temp after Deep Freeze: 60C
Not too bad... :)
1 hour of UT3, max detail settings, 1920x1200, and V-Sync on: 57C.
I do have the UT3 PhysX mod installed too. ;)
62 FPS non-VSync Mode
60 FPS with V-Sync on.
For testing I went after the big dog of heat generation, Crysis.
I set it up at 1920x1200, all high except Object Quality was Medium. (4xAA no V-sync)
With the fan set to 100%, and playing from the beginning for 1 hour, 70C was the max I could get GPU temp up to.
I was convinced my card was functioning within normal parameters. (Time to mod it!)
Install went well:
Removed the 2 black DVI bracket screws, 2 black rubber plugs, and 10 screws from the back. You have to pry off the heatsink:
This is what I found underneath.
http://img92.imageshack.us/img92/854/001xe5.jpg
I cleaned off the memory chips:
http://img92.imageshack.us/img92/1567/010am9.jpg
We now remove the heatsink from the GPU.
http://img92.imageshack.us/img92/5764/018wc2.jpg
Cleaned the gunk off:
http://img92.imageshack.us/img92/7176/026gn6.jpg
I put some MX-2 on, and I know, most like to just put a dot...
I can't help myself, I had to distribute it on the face of the chips too!! :eek:
You attach the block with the provided mounting bolts, ensuring you line it up correctly when it contacts the card. You want your TIM to contact the raised spots on the block.
http://img92.imageshack.us/img92/6209/043er9.jpg
For the memory chips, and under side of GPU, I used the supplied thermal pads:
http://img92.imageshack.us/img92/151/047jw4.jpg
Attached the backplate:
TIM is applied where the 2 bolts join backplate to block, near the barbs.
http://img92.imageshack.us/img92/3612/049ht2.jpg
I set mine up like the example in the lower right corner of the page.
As you look at the barbs on the block, OUT should be on the left-hand side, and IN on the right. Having the barbs pointing up or down makes no difference. :)
http://img240.imageshack.us/img240/6751/087rc9.jpg
or, Out is E, In is A. ;)
I did use the 1/2" barbs, and it's all ready for install: :cool:
http://img92.imageshack.us/img92/3651/052ep5.jpg
Added er into the loop!:
http://img380.imageshack.us/img380/8610/080bg2.jpg
How she looks idle, on water:
http://img380.imageshack.us/img380/2...lewateror4.jpg
Looks like at idle:
CPU Core 1 through 4 are all checking in at 1C higher. No big deal...
NB had no change... (I have always liked this NB block.)
SB is also reading 1C higher... (And not in the loop...)
GPU was 38C, and now is 34C. I lost 4C off my GPU temp at idle... :p:
On to load testing! :cool:
3DMark06 on water... Current basement temp: 71.6F
I kept Precision's on screen display active, and watched the screen like a hawk recording the highest temp I reached during each test.
3DMark06 at default settings:
Temp during Return to Proxycon: 43C (16C cooler!)
Temp during Firefly Forest: 44C (15C cooler!)
Temp during CPU1 test: 39C (12C cooler!)
Temp during CPU test2: 39C (10C cooler!)
Temp during Canyon Flight 44C (15C cooler!)
Temp during Deep Freeze: 44C (16C cooler!)
I have to say that I am impressed getting up to 16C cooler temps, with no fan noise under load. I had to run the fan at 100% to generate my first set of numbers. :p:
I am thinking that keeping my loaded temps below 50C means I'm doing pretty good. ;)
Another thing I noticed when re-filling my loop was that my flow still seemed plenty strong, jetting my water through the lines. I was concerned about having enough flow for the 3 blocks. I now think it's a non-issue.
It still looked plenty strong to me... :D
If the question is asked if you can cool an over-clocked Q6600, a 280, and NB all on a Thermochill PA 120.3, my answer would be YES...
It might also be said that, as expected, the full-cover block shines at reducing temps under load, rather than just at idle.
The 4C cooler temp at idle is nice, but the 16C drop under load stands out to me as more valuable to the OC'er... :D
30 minutes of Prime let me hit a high temp of 60C.
http://img159.imageshack.us/img159/986/primees3.jpg
Before the block install 59C was my max on the same test... :cool:
To be honest though, I was used to seeing my CPU hover at about 54C while Priming. Now it hovers at 59 or 60C. My max temp may be showing only 1C higher so far, but my 'normal Prime operating temp' went up about 5C. Adding the 280's block into the loop didn't come for free as far as the CPU is concerned. ;)
Also note that my 280 is now showing an idle temp of 40C instead of the 34C I see when not Priming. Having all 4 cores at 100% is dumping some extra heat into my loop. :D
I guess I can roll with that!! :p:
My new MAX GPU temp for UT3 with the same settings is 44C.
That's 13C lower, than what I had on the stock cooling. :)
Using the same settings I had in the air test, I fired Crysis up again. 48C looks to be about the most heat Crysis will now generate...
This is right after exiting the game, and the highest GPU temp I have been able to generate since the install:
http://img519.imageshack.us/img519/581/crysisye5.jpg
That's a Mighty 22C lower than my air test playing Crysis. :D
Impressive for sure!! :clap:
OC results:
I used Precision's on screen display and recorded the highest temp I could hit during each test of Vantage.
Even if the temp only displayed for a micro-jiffie, it was recorded if it was a new high.
I kept upping my clock settings, to see how it would change my load temps.
Vantage using default settings:
280 Set To: -- Stock --- SC --- SSC --- FTW ---- HC
____________ Speed _ Speed _ Speed _ Speed _ Speed
Jane Nash ----- 44C ---- 45C -- 46C ---- 46C --- 47C
New Calico ---- 44C ---- 46C -- 46C ---- 47C --- 47C
Airplanes1 ----- 39C ---- 40C --- 41C --- 41C --- 41C
Airplanes2 ----- 42C ---- 42C --- 43C --- 43C --- 43C
Colors1 -------- 44C ---- 45C --- 45C --- 46C --- 46C
Colors2 -------- 43C ---- 44C --- 44C --- 45C --- 45C
Mountains ----- 45C ---- 46C --- 47C --- 47C --- 47C
Flags ---------- 42C --- 42C ---- 43C --- 43C --- 43C
Dots ---------- 42C ---- 43C --- 43C --- 43C --- 44C
Fog ----------- 46C ---- 47C --- 47C --- 47C --- 48C
Stock 280 scored= P12178 (602/2214)
280 to SC speed scored = P12567 (621/2268)
280 to SSC speed scored = P13013 (648/2322)
280 to FTW speed scored = P13586 (670/2430)
280 to HC speed scored = P13878 (691/2430)
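Curious how close those Vantage gains track the clocks, I ran the numbers. A minimal sketch using the preset scores above (note the presets bump shaders and memory along with the core, so this is scaling with the whole preset, not the core alone):

Code:
# (core MHz, Vantage P-score) for each preset, from the results above.
presets = {
    "Stock": (602, 12178),
    "SC":    (621, 12567),
    "SSC":   (648, 13013),
    "FTW":   (670, 13586),
    "HC":    (691, 13878),
}

base_clock, base_score = presets["Stock"]
for name, (clock, score) in presets.items():
    clock_gain = (clock / base_clock - 1) * 100
    score_gain = (score / base_score - 1) * 100
    print(f"{name}: +{clock_gain:.1f}% core -> +{score_gain:.1f}% Vantage")

Nearly 1:1 scaling all the way up to HC speed, which tells me the OC is actually being put to work.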
I haven't noticed any screen artifacts so far... :cool:
My 280's Vantage score set to HC speed:
http://img377.imageshack.us/img377/523/wcscorezz4.jpg
I have added almost 2K on to my Vantage score just from the OC!
Still running at HC speed, I ran 3DMark06.
45C was the highest temp it could generate on my GPU during the entire run. :p:
Most of the time it ran around 40C.
http://img398.imageshack.us/img398/2...hcscorehy7.jpg
I decided to run Core=702MHz, Shaders=1512MHz, and Memory=1269MHz on my 280.
Still no artifacts, and Vantage is still not taking my GPU temp above 48C. :cool:
My new 'Keeper' settings gives me a 14K score in Vantage. :D
http://img67.imageshack.us/img67/7478/14key9.jpg
Once again, what my 280 scored using the default clock speeds:
http://img61.imageshack.us/img61/3509/280vantageej3.jpg
A difference of 1,823 Vantage Points! :D:cool:
Ya gotta love the OC!
I love the block. Understand too that keeping your 280 below 50C is not this block's performance limit. One of the boys at the EVGA site is keeping his OC'ed 280's load temp below 40C. He is running in the 30s under load, but doesn't have a CPU in the same loop.
Keeping an OC'ed 280 below 50C is just my limit. Well, it is for now, until my cold a$$ basement returns to about 59F come winter! (Snicker...)
The RAD is really going to love it this year.
I am ok with this setup and my 1 loop. I think it should keep me out of trouble...
Core=702MHz, Shaders=1512MHz, and Memory=1269MHz producing 14K in Vantage with loaded temps under 50C is the end of the story for me... :D
NOTE: It is my opinion that even though OC'ing from stock speed to HC speed only changed my load temp by 2 to 3C, I don't believe the stock sink/fan cooling would produce the same result.
Crysis, for example, was much hotter on air than on water. I think this is a direct result of that huge slab of Hydro Copper. It flat out handles the heat better, and mine, for example, HATES to go over 48C.
Crysis on the stock cooling with the fan set to 100% took me to 70C.
I would expect the same OC on the stock cooling to move your max temp more than it did on the water rig. If you are running the stock cooling, you probably shouldn't expect the same tiny 2 to 3C change in load temp when you OC from stock to HC speed.
For trivia: My TJ09 has (5) 120mm Scythe S-FLEX case fans, a Corsair CMXAF1 RAM cooler, a 60mm fan on top of that pointing at my coils, and (1) 120mm Silverstone fan that came in my case, suspended from my GPU's power cables, pointing directly up at the GPU. My Ohio basement was also a cool 70F.
All of that to let you know that my case gets good cool airflow. The stock cooling had a lot of things going its way.
I just wouldn't want to NOT give the block its due credit when it comes to reducing your loaded temps. That is, after all, its specialty. :)
The stock fan can't run with this sucker!
The End...
http://img362.imageshack.us/img362/1839/094pg8.jpg
Wow...great stuff Talonman. I hope you enjoy your 280!
Thanks... :)
Still playing with OC'ing my card...
These are some of the valid numbers I can lock my core into:
691
702
713
729
My issue is that when I set my core to 702, it would cause Vantage's Crash'n'Burn physics test to crash my system about every fifth run. I fired up Vantage and turned all other tests off, just to speed the process up. I also set the test up to run 3 times when I click Go. With the core set to 702, I would keep running that single test to see how many successful runs I could complete in a row. Five was about it before the system would re-boot.
Setting the core back down, the next lowest number that would lock in was 691 (HC speed). With the core set back to 691, Vantage's Crash'n'Burn test ran clean 15 runs in a row for me. When my core was 702 it would crash about every 5th run, and with the core at 713 it would crash about every other run...
Like it or not, if I want an error free Vantage, 691 has to be my core's number. :p: (So far...)
Shaders were successfully working for me at 1512. I went up to the next higher number that I could lock in, which is 1566. With my victory number 691 locked into the core, I gave the 1566 shader setting a try. Jane Nash had a big problem with this, and promptly caused the system to re-boot about 20 seconds into the action.
Setting the shaders back to 1512 corrected all issues.
My memory was successfully OC'ed to 1323, and is still working fine for me at that setting. :up:
Core=691, Shaders=1512, and Memory=1323 are my current settings.
Note: A stock 280 HC is set to Core=691, Shaders=1458, and Memory=1215. I still am running a wee bit over that... :)
Still getting 14K in Vantage, with cool GPU temps under load, and still lovin my Vanilla OC'ed Water Cooled 280! :surf:
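For the record, the trial-and-error I just described boils down to a simple search. A minimal sketch of my manual process (set_core_clock and run_crash_n_burn are hypothetical stand-ins for what I do by hand in Precision and Vantage, not a real API):

Code:
# Walk the lockable core clocks from the top; keep the first one that
# survives a streak of clean Crash'n'Burn runs.
LOCKABLE_CORES = [729, 713, 702, 691]  # MHz, highest first
CLEAN_RUNS_REQUIRED = 15               # my bar for calling it stable

def find_stable_core(set_core_clock, run_crash_n_burn) -> int:
    for core_mhz in LOCKABLE_CORES:
        set_core_clock(core_mhz)
        # 713 failed about every other run, 702 around run 5; 691 went 15-for-15.
        if all(run_crash_n_burn() for _ in range(CLEAN_RUNS_REQUIRED)):
            return core_mhz
    raise RuntimeError("Nothing stable; back to stock speed.")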
Give us a look at the X preset, if you get a chance. I'd like to see what it does when it's pushed a bit!
Sweet benchies.
Nice clean rig :eek:
Perkam
Thanks! :cool:
I have what I think is more favorable info on the block, and watercooling in general.
I was reading a thread about some users having 280 heat issues, and one post recommended running FurMark to test how well your 280 handles heat.
http://www.ozone3d.net/benchmarks/fur/
I did just that, and on my system it was able to generate more heat than Crysis. :eek:
Current basement temp was, 71.2F.
NOTE: I was still playing with 702 core here...
First I ran the two Benchmarking tests, Time Based, and Frame Based.
Using Precision's OSD, I recorded my temps.
Options: I checked the Fullscreen box, and selected 1920x1200.
Both the Time Based and Frame Based ran my temps up to 49C. (Still darn good I think..) :cool:
http://img516.imageshack.us/img516/8017/furmarkzt7.jpg
Next, I went after the Stability test set at 1920x1200, Fullscreen, and 2xMSAA Samples.
From close to the beginning, to 134 sec= 50C
135 to 230 sec= 51C
231 to 331 sec= 52C
332 to 500 sec= 53C
http://img516.imageshack.us/img516/2...abilityty2.jpg
53C is my new highest GPU temp that I have hit on water...
Considering the GPU temps I hear this program can generate, I think this is still doing very well... :cool:
You have Aero On? :eek: :shakes:
Perkam
Yes, should I not? :)
Thanks for the tip... Now turned off!! :p:
As requested... My lone 280 with Vantage on X setting.
Core=691, Shaders=1512, and Memory=1323 are my current settings.
http://img516.imageshack.us/img516/3822/lone280ew4.jpg
Sweet score!
Thanks.. :p:
I gave "Video Card Stability Test" a run to see just to see how I would do on my temp...
http://freestone-group.com/video-car...ility-test.htm
It ran my water cooled 280 clean up to 46C! ;)
http://img147.imageshack.us/img147/3690/heatzw3.jpg
Ya gotta love watercooling... :D
I scored 1277...
http://img183.imageshack.us/img183/5249/1277hh2.jpg
Very Nice Talon !
Glad to see you were able to step up. I purchased mine on release day, and the new cards came out 1 day after my step-up window expired; EVGA said no to the step-up for me!
Glad to see that your rig is doing very well. How did you manage to keep the insides dust free all this time? :rofl:
Air compressor... :) 1 freakin' day... Man.... That is rough.
Holding out for the SSC saved my butt, or I would have been right there with you.
The GX2 SSC released April 1st; I bought April 1st. :p:
I knew I would step-up if I could... That's why I was holding off on buying a GPU block. I wanted to buy for the card I would end up with.
I figure I will keep my 280 for a year or so, might as well keep er running cool.
I think EVGA did a good job with this block.
I am also glad that my RAD can handle the load... :up:
I wonder within the next 12 months, how many games will be made that a single 280 won't be able to provide an enjoyable gaming experience on?
I'm thinking probably 0.
The ATI cards are attractive, but I can be happy for a while sitting on my H20 280.
I wonder what will be 'THE' card to have in July of 2009? :)
Put me down for 1!!
Update: Using the new version of Precision, I was able to set my 280 to higher clock settings and still be stable. I believe this is directly related to the new version. It works better for me, and for others now too. :)
Now currently running error free @ Core= 738, Shaders= 1512, and Memory= 1323, I went for more.
For my core: The next highest number it would lock into was 741. The odd thing was I needed to enter 742 before the core's clock would actually jump to that speed. After the core jumped to 741, I changed the 742 to 741, and the core returned to 738. Unexpected.. :D (I put the 742 back in!)
At the same time I moved my shaders up from 1512 to 1566. The last time I tried my shaders at this speed, Jane Nash rebooted my system. That was on the old version of Precision.
So with Core= 741, Shaders= 1566, and Memory= 1323, I launched Vantage for a run. Jane Nash rebooted my system again, not too far into the action.
I kept my core at 741, backed my shaders down to 1512, and kept my memory at 1323. I suspect my shaders just won't run at 1566 on my 280. I have tried a few times, and have never made it past Jane Nash even once. I ran Vantage again to see if any test would take issue with my 741 core.
I am happy to report that so far, Vantage likes it!! :clap:
P14863 with Core= 741, Shaders= 1512, and Memory= 1323:
http://img295.imageshack.us/img295/7226/p14863ll1.jpg
With those settings it ran my 280 clean up to 40C!! I'm gonna have to keep an eye on that... NOT! :D
I have no intention of upping my memory any higher. It has run error free for me the entire time at 1323. I think I already have more bandwidth than my GPU can probably use.
I also wonder if the shaders jumping from 1512 to 1566 is controlled by the driver, or by the hardware itself. I would like to try out a shader speed in between the two. If the valid shader speed increments available to lock into are controlled by the driver, make a driver with maybe a 1525 shader speed option. :)
I would like to take my shaders out for a spin at that speed. I think 1566 is just out for me. :)
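Here's my guess at why 1525 isn't on the menu. Every shader speed I can actually lock in (1296, 1350, 1404, 1458, 1512, 1566) is an exact multiple of 54MHz, so it looks like the shader domain gets quantized in 54MHz steps. A sketch of that assumption (the step size is inferred from my lockable values, not confirmed anywhere):

Code:
# If the shader clock snaps to 54MHz steps, a 1525 request can't stick.
SHADER_STEP_MHZ = 54  # assumption: inferred from 1296/1350/1404/1458/1512/1566

def snap_shader_clock(requested_mhz: float) -> int:
    """Snap a requested shader clock to the nearest 54MHz step."""
    return round(requested_mhz / SHADER_STEP_MHZ) * SHADER_STEP_MHZ

print(snap_shader_clock(1525))  # -> 1512; a 1525 request just falls back
print(snap_shader_clock(1540))  # -> 1566; past ~1539 it jumps to the next step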
Still way happy. I am going to run this way for a day or two to ensure it's stable in games too. I still may try to coax more out of my core down the line, but not today... :p:
How do I reconcile this with the system rebooting at core 702 and 720 using the old Precision?
Same driver, and no BIOS changes have been made to the system in any way....
I was running unlinked Core and Shaders in both versions of Precision too.
Again, I have no clue why it now works for me, it just does... :D
Wow Talonman, pretty sweet stuff there!
You've really ended up with quite the system there.
I am happy with it for sure... Thanks. :p:
Update: After 1 hour of some COD4, my system re-booted on me while playing.
I think having my core set to 741 may be a wee bit too much.
Going back to Core= 738, Shaders= 1512, and Memory= 1323.
Time for more testing! ;)
http://img386.imageshack.us/img386/5151/1268rm3.jpg
I love the option to save multiple profiles in the new Precision. They have also been smartly implemented into the GUI, keeping ease of use in mind....
Well done! You just set your clocks, hit Apply, and right-click on the profile number you want to remember your current settings. Done...
The Blue Edition skin has a handy lock icon to protect your saved profiles' settings too...
http://img137.imageshack.us/img137/5746/lockvq0.jpg
I currently have my 10 slots set up like this for my 280:
Profile ___ Core ___ Shaders __ Memory __ Comments:
Profile 1-- 602MHz - 1296MHz - 1107MHz - Stock 280 speed.
Profile 2-- 621MHz - 1350MHz - 1134MHz - SC speed
Profile 3-- 648MHz - 1404MHz - 1161MHz - SSC speed
Profile 4-- 670MHz - 1458MHz - 1215MHz - FTW speed
Profile 5-- 691MHz - 1512MHz - 1215MHz - HC speed
Profile 6-- 702MHz - 1512MHz - 1323MHz - For testing.
Profile 7-- 720MHz - 1512MHz - 1323MHz - For testing.
Profile 8-- 738MHz - 1512MHz - 1323MHz - For testing.
Profile 9-- 741MHz - 1512MHz - 1323MHz - Max I can bench, but COD4 rebooted.
Profile 0-- 738MHz - 1512MHz - 1323MHz - Current settings I am running.
My 0 profile is the one that I will be running at 99% of the time, but as new drivers come out, I may do some re-testing on the faster settings...
I did successfully play COD4 for 1 hour before profile 9 rebooted my system.
Maybe the next driver won't?
Note: At a 720MHz core speed with linked shaders, shader speed tries to jump to 1566MHz. I don't believe my 280's shaders will run that fast error free.
I keep my shaders speed set to 1512MHz, even when I pass a 720MHz Core speed. That darn Jane Nash keeps rebooting my system consistently, when I allow my shaders to run at 1566MHz. ;)
When running with Core and Shaders linked, my shaders will fail first...
Precision Art: I call this one, "She's A Runner!!" :D
http://img244.imageshack.us/img244/1310/artdc6.jpg