Any joy with the drivers touGe ??
Thanks for your reply, that makes a lot of sense - you know when you just get that feeling but need to hear it from somebody else.
I was so close to buying an Enermax 1000W PSU today when I got a reply from Asus Taiwan telling me to use Award BIOS Flash to flash the old-fashioned way.
But it gets worse: I have tried to update the BIOS using awdflash version 1.33 [latest version] but it has frozen at 50%. I have not turned off the power or reset the system, as I am awaiting instructions from Asus or any other experienced user who can get me out of this loop without frying the BIOS chip with a restart mid-upgrade.
If I have to corrupt the BIOS, will I have to send this 3rd SIIE mobo for RMA, or do you guys know a way of saving this flash upgrade? The thought of disconnecting my W/C loop for the 4th time is getting me down.
Any AWARD FLASH SAVING TRICKS appreciated .
A photograph of the AWARD BIOS FLASH frozen at 50%:
&*(^%$ what a predicament, sorry to hear - good luck!
never scrimp on a psu - get a good one - a beefy single 12v rail takes the guesswork outta figuring out what to connect to which rail on multi-rail psu's
holy crap! i'd be interested to find out if there is a solution for this problem - all you need now is a blackout
i gotta say this is one heckuv a video card once you start getting the hang of it -
it may be a pain switching modes back and forth
but 1 24" panel in sli on intel board can really rock in a deathmatch game - and thats stock clocks for sli
using hardware raid so sli with 2 cards is not an option for me
frigg what will i do if they cut out the 1 slot 2gpu configuration - go back to dating 60 year old chicks
No, I haven't tried yet. I ended up going into work a little early yesterday so didn't get around to it yet. I've got some testing to do on other things and, hopefully, get a few hours of sleep so I probably wont get around to it until later tonight.
Did you try what was suggested earlier, extract the files to your C directory before installing?
I have 2x EVGA 9800GX2's and i have no clue as to what is going on. Basically, i get the performance of slightly less than that of a Single GX2.
I have tested so much it's not even funny...
Installed and tested various drivers ranging from 174.53 up to 174.9x, changed chipset drivers, flashed my mobo's BIOS, tested each card individually, and tested each PCI-E slot too. I even ran 2x PSUs just out of curiosity.....nothing, nada.....still performs the same.
And before anyone mentions my CPU clockspeed....i have several ppl posting results with the same clockspeed and they get more than 14 FPS AVG in the Crysis bench @ 1920x1200 NO AA, ALL VERY HIGH DX10.
whats your 3dmark06 score !
3Dmark06 isn't really relevant to be honest as it won't show much. But here's what i get, Single Card 1 and 2 represent individual testing of a particular GX2
3Dmark06 Defaults 1280x1024
Quad SLI
SM2.0 Score: 5682
SM3.0 Score: 7853
CPU Score: 4023
Overall: 15,348
Single 9800GX2 1
SM2.0 Score: 5708
SM3.0 Score: 7370
CPU Score: 4144
Overall: 15,043
Single 9800GX2 2
SM2.0 Score: 5750
SM3.0 Score: 7353
CPU Score: 4085
Overall: 15,018
1920x1200 NO AA, 16x AF
Quad SLI
SM2.0 Score: 5392
SM3.0 Score: 7123
CPU Score: 4136
Overall: 14,525
Single 9800GX2 1
SM2.0 Score: 5475
SM3.0 Score: 6136
CPU Score: 4200
Overall: 13,726
Single 9800GX2 2
SM2.0 Score: 5475
SM3.0 Score: 6148
CPU Score: 4222
Overall: 13,753
up that q6600 to 3.5 - 3.6ghz
i wasn't too impressed with the gx2 at first but once the quad hit 3.5ghz it was playoff time, regular season was over
still seem to be driver issues but what the heck - could be worse
hey guys, how are the x64 drivers for the gx2? any good driver out there? im having problems benchmarking with 174.74 on vista 32bit, and now im adding 2x1gb so im moving to 64bit Vista...
which driver would you suggest that performs well in gaming and benchmarking?
Thanks!
For those using quad sli (dual GX2) and not getting good results, this is why:
- No forceware yet natively supports quad SLI.
- No CPU to date can keep up with the GX2, so it becomes a bottleneck.
Even if newer forceware improves quad SLI, you'll only get a minor performance increase.
The single GX2's main purpose is so people don't have to go SLI in the first place - hence it works on Intel boards.
Sorry to bust your bubble :)
- Oh, try overclocking both the cards. that should get you an extra 4,000 points in 3dmark 06.
Loving mine :D
Clocks.. 750/1150/1900 on air with a 120 delta fan going like lordie at it (mods happening next week) cpu @ 5.3-5.4ghz. Check my hwbot for scores.
Quad sli won't really happen unless you push 5.4ghz+ ;)
no quad sli for me = love my hardware raid too much
single gx2's as close to sli i'm gettin
you're right u need a honkin cpu for throughput
I use Lavalys Everest version 4.50 to monitor gpu temps. It can also monitor the VRM and mem and everything else (diodes, etc).
What setting do u guys use for the pre rendered frame option in the nvidia options?? Do u use the default 3 or not...
pre-rendered? what benefit is that?
Not sure !! Have a look in the nvidia control panel under the 3d options and see what you make of it..
I've gone back to 174.74
174.83 was unstable, getting the nv4 dll error.
this may not be true but ive read that 0 is a performance hit and 8 will help with performance. But XP only supports 3 or below, and that's why nvidia has set it at 3. Another thing is that the higher you go, the more it can start to cause lag with the mouse and movement in game. I have set mine to 0 and i can't tell any difference between that and 3. I think it's more for low end cards to help with low fps, but i may be wrong....
The GX2 doesn't use anything like 300W. TDP is 197W; actual power used would be less. I've clocked a GX2 @ 780/2250 and a QX quad @ 4.7GHz on a 500W (34A@12v) PSU with no problems at all. A 60A PCP&C should be able to power 2 of those without any effort.
Likewise you should recalculate the CPU usage; my Q6600 used approx 200W @ 3.5GHz.
http://img167.imageshack.us/img167/7...quationpg6.png
My experience thus far with my EVGA GX2.
I am running XP Pro, QX6700 3.4ghz, 1200W TTPSU, 4GB Crucial Ballistix Tracers 4-4-4-12
Latest drivers from EVGA site.
When I run Multi GPU, I get stuttering and random FPS drops.
When set to multi-display............oddly enough my FPS holds up and games run fine.
Not too pleased with not being able to set multi-GPU.
I saw mentioned in a previous post that the +12v amperage is something important, and I am saying from experience that this is very often overlooked. Especially when someone goes out and buys a 700+W PSU, only to get one that barely supports 17-22 amps - which seems like a lot, but actually isn't.
When looking for a PSU, please take the time to double/triple check the minimum amps for the card. It will save you having to call in, and save people headaches, frustration and aggravation.
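Not official guidance, just a back-of-the-envelope sketch of that amps check (the 197W figure is the GX2 TDP quoted later in the thread, and the 20% headroom margin is an assumption, not a spec):

```python
# Rough estimate of the minimum 12V rail amperage a PSU needs
# for a given card's power draw: watts / volts = amps,
# plus some headroom for transient spikes.
def min_12v_amps(card_watts, headroom=1.2):
    """Return estimated 12V rail amps for a card, with ~20% margin."""
    return card_watts * headroom / 12.0

# e.g. a 197W card wants roughly 19.7A available on the 12V rail(s)
print(round(min_12v_amps(197), 1))
```

So a "700W" PSU that only delivers 17-22A on 12V really is marginal for one of these cards plus the rest of the system, which is the point being made above.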
174.93 xp64 beta installed - little wonky at install but working so far - after playing around with minor settings
- went first to multi-gpu mode
- now multi monitor mode
the drivers show a lot of promise - they need work, but if you know what you're doing they could be really good
- I do not have the experience, but I think I'll keep them
- I'm limited to multi-monitor mode because of it
there is a way to get multi-gpu by modifying the inf files, but I do not have that confidence
if you can do the registry mods, I think multi-gpu mode could be very good with these 174.93 x64 XP drivers
Is it me or does this card and the .74 drivers not play nice with AA enabled in most games ?
I play cod4 with 4x AA and full AF and sometimes i get stuttering despite good fps, and when i drop the AA down to 2x it seems to be better, so i think it is a driver issue for now.
I reverted back to 174.74 x64 - seems the only stable set so far for me
yeap me too...on XP.
and also finished my tests with a mini SS (-19C) on the E8500..
that is a SWEET score dude, looks like i'll be aiming for a e8500 since u got 5ghz on 1.55v? that's insane.
I got 4.4ghz at 1.6v, jeez.
-----------------------------
On another note, I have gone back to stock settings for my GX2. I keep getting the NV4 display error while running D3D (NFS carbon).
I think my core was oc'ed too high, first thought it was the memory.
Anybody else having issues oc'ing this card? maybe i was oc'ing too high..
Nope, no issues for me. When I oc too high I just artifact then crash, not even able to load into a game.
Who here has the highest artifact-free clocks? so far my first GX2 did 750/1680/1050
i`m at 720/1050 with the fan at 55%...no need to stress the card more..
you and me too bro
takes 1.48v to get 4.45, but over that it takes a lot more.
as for my card, my clock at like 45c load can't get the 229 G.T has in gt3
have it up to 799/2000/1135 too with no arty's at all; if warmer it will arty at 777 core / 1940 shader
http://img296.imageshack.us/img296/8012/db05cp8.th.jpg
card cover is off and i have 1/2 pound of dice on each side of the card intake area, fan at 100%
ill test the card more today since its cool.
I think 680 on the core is the highest retail card on sale that is the XFX XXX but i have mine at 700 and all seems well
http://benchmarkreviews.com/index.ph...&limitstart=15
is your Qx 65nm, or 45nm?
What voltage are you using for your Qx? I had to use over 1.5v to stabilize my crappy G0 at 3.5ghz.
the GX2 oc'ed does use close to 300w on LOAD.
http://www.techspot.com/review/91-as...x2/page10.html
Maybe for your system, it's okay. but with my settings, it was not.
65nm quads use up a lot of power which is why they are now primitive.
AA bug is present with 174.74
AA bug is fixed with 174.83 but, unstable beta drivers.
How much of an improvement is there with AA enabled in games ?
image quality was way better, but i found 174.83 to be unstable. It could have been my OC though, but i think i'll wait for the next official release.
when the heck is that scheduled for anyways? damn nvidia takes so long.
ATI comes out with new catalyst pretty quickly.
Official Nvidia Beta 175.12 are out now...
Vista64: http://www.nvidia.com/object/winvista_x64_175.12.html
Vista32: http://www.nvidia.com/object/winvista_x86_175.12.html
any new drivers for Win XP 32? Im using now 174.74 but they are slower than 174.54...
Vista only unfortunately. These drivers are supposed to be the 3DVantage optimized drivers and as you know, 3DVantage is for Vista only.
Well guys i've been experimenting with forceware 174.93 again, XP-32bit.
I'm trying to see if these new beta drivers actually fixed the AA bug.
In 174.74, AA would not apply no matter what setting you tried.
But in 174.83, the edges looked smoother. 174.93 is kinda unstable - getting stickiness on the desktop - but runs pretty damn fast. I'm going to post screens of Team Fortress 2 with 16xCSAA and I want you guys to tell me if AA is actually applied or not.
More Pics
BF2 everything high 8xAA
TF2 16xQSAA
Cool screenshots:up:
I initially tried 174.74, then 174.93. The 93's botched up sli (dual gpu), I could no longer enable it, so I dumped them and installed 174.88. So far the 174.88's have been the best driver for me, very stable, no issues and AA is working great.
As i stated before, I doubt that clock is stable at all. The highest I was ever able to bench at was 760/1890/1080 but that wasn't stable at all.
I'm currently trying out 750/1780/1080.
If you clock too high, the card actually drops in performance. Editing the bios to add more volts would help. I'm going to attempt that probably next month :up:
I try to find the max stable OC
and im at 750/1875/1050 , very stable in all games
Ok a noob question now but do you leave ..
Multi-GPU performance mode on "NVIDIA recommended (Multi-GPU)"
or do you use AF1 or AF2 ?
Never tried the other options, i doubt it'll perform better than 2 cores though (Internal SLI)
I'm up to 19447 in 3D06 on XP, but I'm stuck at 3.6GHz for now - that's at 750/1750/1100. I'm surprised how little clocking the card up helps my scores - is this thing still cpu bottlenecked?
CPU clocks make a large difference in my scores.
:up:
I know i've asked before, but what temps are you getting on the cores, memory and vrm?
I'm getting some artifacts but don't know if it's heat or driver related.
On average @ load GPU 1 -72 GPU 2 -68 Memory 68 VRM 56
idle 50 45 45 40
But i can hit 70-75 on the memory and vrm when gaming for 2 hours or so or when im on a large/full mp map on cod4.
cores go up to 70's C.
For the GPU VRMs, i'm sure everest is reading them wrong, because GPU1's VRM goes up to 60-ish and VRM2's goes up to 90-ish, which i doubt is a correct reading.
Mem hits 60-ish.
What clocks are you using that's getting you artifacting? force fan to 80%.
EDIT- nvm i'm assuming the clocks you're using are the one in your sig.
try forcing fan to 80%. COD4 is the same as the COD2 engine, and the COD2 engine was very very sensitive to artifacting.
You could get no artifacts in any of your other games, but COD2 will artifact on any OC'd hardware, so it could be that.
Try playing other games that are just as sensitive to artifacting such as NFS carbon, see if you still notice artifacting.
or better yet, download ATITool and scan for artifacts.
Unless you get your Q6600 past 3.7ghz, it will severely bottleneck the GX2.
As i've stated before, most tests are single threaded, or dual threaded.
the Q6600 is nothing but 4 e6600's, THUS:
most tests will run on your CPU and recognize it as an e6600 clocked to 3.5-3.6 or whatever GHz you're clocked at.
Thus, Q6600 = primitive, and will bottleneck the GX2.
i heard there's like 150+ screws to remove the cooler, so screw that!
one thing i hate about this card is that the air intake is from the BACK, and it blows the air out INTO the case.
The 45nm quads are much better than the 65nm ones; however, the multis are only 8x.
the intake is from the fan, m8..
the exhaust is from the back of the card..
Been through all sorts of changes with this card.
First I removed the cover. I'm pretty cack handed, so I have to say really it's not that difficult.
Then I took the cards apart, cleaned up the cores and applied mx-2. I had to re-mount the second card once as temps weren't level initially.
I then tested it again, without putting the cover back on and seeing the difference, decided to leave it off. I'd say leaving the cover off really did help with temps.
Then I decided to do the volt mods, and upped vgpu cores to 1.2v.
Even just with air cooling, the bare cards held up pretty well. With the case open and aircon on I managed to do a 756/1944/1150 run in 3d mark 06. Previously I was limited to 730/1860/1150 or thereabouts.
However there was artifacting and i couldn't complete a 3dmark 03 with the shader clock this high. What I would say though is that I could now play GRAW2 with a reasonable overclock.
Now I've installed my watercooling. I did a few more tests. Case on, radiator fans on low. I managed to do 790/1998/1150 3d Mark 06 and 03. That's with warm ambients and no artifacts.
So in conclusion the temps make a big difference, and a little voltage helps.
RLM
I wouldn't bother. I tried it and it made no difference. Readings taken with a multimeter.
Quote:
If you clock too high, the card actually drops in performance. Editing the bios to add more volts would help. I'm going to attempt that probably next month
RLM
Deleted....
how much better is it with the case off?? I did repaste mine with mx-2, but found that it needed a lot on the ram to make good contact; everything else was ok. Also, what is the other chip-type thing on the 2nd card, as that has its own heatsink as well?
Well, referring to my initial post.
Quote:
how much better is it with the case off ??
With the case off and mx-2 (I left the thermal stuff on the memory initially) it kept me in the 70s, maybe upper 70s, whilst playing graw. With aircon on and a fan blowing into the case I got idle temps down to 43-45c. That's with 1.2v going to the cores instead of the default 1.15v. Note you have to take into account it's 30-35c all year round here.
Quote:
Initially chuffed, but not so now. Fan set directly to 85% in Rivatuner. Sides off the case and aircon on. Ambient still a bit warm. OC at 675/1676/1050.
Tried playing Graw2 at 1920x1200 and it crashes out with "nvspdll" or similar. Used to get the same error on my 8800 gtx in the early days, before I found upping the fan speed sorted it.
Checked the rivatuner monitor and it crashed out at 92 degrees. The bottom of the card is roasting. Idle is 55 with the sides off the case, and 60+ on.
Certainly worth it in my opinion.
As for 'Also what is the other chip type thing on the 2nd card as that has its own heatsink aswell.?', haven't a clue mate. I can tell you cheese is a type of meat though.
RLM
Need to squeeze some more out of the q6600 and memory. Certainly a bottleneck in the equation. 22k+ would be nice.
http://img147.imageshack.us/img147/9...k06rlm3di0.jpg
RLM
Q6600 @ 4GHz, lol, that's very nice, and you have the same w/c kit as me. Have you got the swiftech gt water block or something else? Also, what temps have you got on the cores? I'm thinking about putting a 2nd rad in so i can run my fans at a lower speed and get better cooling.
Yes I like the swiftech kit. Had to have it brought out from the UK at the time, it was all thermaltake here.
A few changes though over time. EK Supreme Block(CPU) -> EK 9800 Gx2 Full Cover -> Res -> Pump -> PA 120.3 Rad -> MCR 220 Rad. One loop.
Temps depend greatly on aircon and fan speed, but more than sufficient for my sig overclock 24/7. Certainly 60/sub 60 full load for my cpu. 30-35 idle and sub 40 load for the GX2. I would have to do a proper test really though.
Here's a test I did a while back http://www.xtremesystems.org/forums/...&postcount=366, with the 8800gtx and stealth in the loop instead. Looking at the temps I could do with maybe a bit of a remount on my cpu block. Seems to be a bit hotter now.
RLM
Guys stick to the topic, this is about the GX2, not the Q6600.
I will say though, get an e8400 with that setup and you'll hit 22k.
Now, back to the GX2. Anybody going to take some pictures of the case off? I'm curious how some of you got such high clocks, how about a screenshot of ATITool showing no artifacts?
No it's not.
Not on mine anyways - take a Kleenex and stick it behind the GX2 where your DVI ports are; it'll STICK.
that means the air is being pulled in from the back.
put the Kleenex on top of one of the fans on the GX2 in your case and it'll float, or fall off. that means the air is being pushed OUT.
Completely wrong there mate. I said that I took volt readings with a multimeter. Give it a try though.
Quote:
It probably made no difference because your Q is bottlenecking the card. If you were able to achieve higher clocks (performance gain or not) with the new 1.2v edit, then it does work, does it not?
'new 1.2v edit?' are we getting crossed wires? I soldered volt mods onto my cards. Nothing bios related for 1.2v.
The vmod thread for 9800 Gx2 http://www.xtremesystems.org/forums/...d.php?t=180992
Probably, but not so good for 3d rendering.
Quote:
Guys stick to the topic, this is about the GX2, not the Q6600.
I will say though, get an e8400 with that setup and you'll hit 22k.
RLM
Firstly, the GX2 300W in those links is system load - that's the ENTIRE PC. As I said, the TDP is less than 200W, and the actual power used is even less. And note: if they measure from the power socket, the figures they quote include psu losses.
http://img404.imageshack.us/img404/7...0gx2speze6.jpg
Secondly, the Q6600 - I even gave you the formula. I'll do it for you; I'll guess your VID at 1.2625v, correct it to suit. Your Q6600 is not using anything like 300W.
Pd = 95*(3500/2400)*(1.5/1.2625)² = 195W
With the clocks I've tried, mine's using a lot of power; it's still quite happy on a 500W psu.
http://i180.photobucket.com/albums/x...nowagain/3.jpg
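A quick sketch of the scaling rule used in that post - dynamic power roughly scales linearly with clock and with the square of voltage (the 1.2625V VID is the post's own guess, not a measured value):

```python
# Rule-of-thumb CPU power scaling: P ~ TDP * (f/f0) * (V/V0)^2
def scaled_power(tdp_w, f_new, f_stock, v_new, v_stock):
    """Estimate power draw at a new clock/voltage from the stock TDP."""
    return tdp_w * (f_new / f_stock) * (v_new / v_stock) ** 2

# Q6600: 95W TDP at 2400MHz stock, assumed 1.2625V VID, run at 3500MHz/1.5V
print(scaled_power(95, 3500, 2400, 1.5, 1.2625))  # roughly 195-196W
```

Which is why an overclocked 65nm quad at 1.5V+ draws around double its rated TDP, yet still nowhere near 300W.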
fornowagain what motherboard are you using?
How are you guys getting lucky with your cards' cores running at least at 740 and up? Mine tops out at 725 before it freezes up.
you should be able to clock higher
Quote:
How are you guys getting lucky with your cards' cores running at least at 740 and up? Mine tops out at 725 before it freezes up.
what is the load temp. ?
Does anyone know if the BFG warranty would allow me to remove the stock hs/fan?
I only ask as EVGA and XFX allow this without voiding the warranty, and BFG have put a nice sticker over one of the screws that i had to remove to get the cover open....
BFG does not allow you to remove the HSF. that would void warranty.
XFX does though :up:
I will be getting my block soon, in a week or so, then I can start playing with my card and see what I can get out of her.
hey a-r and rlm what up!
man, i have the cooling down for benches - maybe it's time for a volt mod?
03 stable, no artys: 792/2000/1130
but not all cards are made the same, it seems... a friend's card won't go over 730/1800/1140 even in the arctic circle (temps 28c 30c)
http://img247.imageshack.us/img247/7676/013fj7.png
well, the temp is OK.
Quote:
After about an hour and a half of cod4 online my temps are at 80c, While running 3dmark they hover around 75 with the fan on auto
try 740/1850/1050 and set the fan speed to 75% it should work fine
Well i got one, but it doesn't work.
I can see bios and post information.
After that, all i get is a flashing cursor.
Can't get the safe mode menu either, and DVD's don't autoboot, like vistax64 dvd.
The power LED'S on the back are Green on the top, and Blue on the bottom.
Is that fine?
The PCIe connectors are both green.
Running it on a D975XBX2 PCIe 1.0 mobo with 16x bandwidth.
Flashed to latest 2831 bios, still did the same lockup thing.
I experienced the same problem the first time i used the 9800GX2, and i found out that there is a conflict between my sound card and the 9800GX2 :shrug:
Quote:
Well i got one, but it don't work.
I can see bios and post information.
After that, all i get is a flashing cursor.
Can't get the safe mode menu either, and DVD's don't autoboot, like vistax64 dvd.
The power LED'S on the back are Green on the top, and Blue on the bottom.
Is that fine?
The PCIe connectors are both green.
Running it on a D975XBX2 PCIe 1.0 mobo with 16x bandwidth.
Flashed to latest 2831 bios, still did the same lockup thing.
and when i remove the sound card the system boots normally
put back the sound card and the system won't boot :rofl::rofl:
what sound card is this, that is conflicting with the GX2?
Well i will give that a try, and just noticed one thing in the mobo manual.
The board has 2 pci slots.
The first one is close to the first PCIe, the second one is on the lowest part of the mobo.
The second PCI is SMBUS routed, whereas the first PCI is just pci.
I have the X-Fi in the SMBUS routed PCI slot, so im going to move it up to the standard PCI slot.
It might be a fix, but now ive gotta do a lot of mucking around again......
Well, you were right.
After taking out the X-Fi, the 9800GX2 now works.
WEIRD.
Well, you're right - i got my reply from BFG....
BFG Tech.
Unfortunately it would.
Any "tampering" (defined as any physical or software -firmware- modifications
to the card including, but not limited to, replacing the fan/heat sink and
further overclocking or underclocking the card) will void your lifetime
warranty (which covers the card for YOUR lifetime, not just the "lifetime of
the product").
Well, let's hope i don't need to send it back lol
yes man, it's weird - i don't know what the hell the problem is between the GX2 and the X-Fi sound card :(
Quote:
Well, you were right.
After taking out the X-Fi, the 9800GX2 now works.
WEIRD.
I think it has something to do with the
Nvidia GPU SMBUS conflicting.
Because now in everest, i can't see most of the system activity, like fan speeds, mobo and ram temps, and mobo voltages.
My rig with the 9800gx2 worked fine with XP, but now with Vista Ultimate I can't run 3D06 or Vantage. 06 had errors for OpenAL, which were corrected by running openalinst as admin, but now it gives an error about 3direct or something - but now it's running, so I don't know.
Vantage gives an error about fullscreenstate failed; it doesn't seem to like my common 22" LCD. I think it's to do with my KVM, as it seems to error when I change focus away from the benching pc...
:down:
I've also had this problem with OpenAL under Vista x64, but try to install this http://files.filefront.com/OALInstex.../fileinfo.html. It solved my problem and 3DMark is running ok.
Just a bit of info regarding my E8400 and my Q6600 in 3dmark 06. With the Q6600 i get 19363, but with the E8400 i got 17860. Now, i did notice that i was getting higher fps in the tests than with the quad, but i guess i lost points in the cpu test. My point is that the higher the oc on the cpu, the better the card runs, so a non-overclocked cpu will probably bottleneck the card's performance.
fornowagain, those are top results you're getting there. 4.8GHz on water:eek:
Sweet result man. The cooling certainly makes the difference.;)
Quote:
hey a-r and rlm what up!
man i have the cooling down for benches,maybe its time for a volt mod?
03 stable no artys 792/2000/1130
but not all cards are made the same it seems...a friends card wont go over 730/1800/1140 even in the artic circle(temps 28c 30c)
Finally, with a bit of tweaking and one OS corruption (due to some over-ambitious memset entries), I hit the 22K
http://img265.imageshack.us/img265/8...k06rlm5yc7.jpg
http://img141.imageshack.us/img141/4...k03rlm5fq9.jpg
RLM