need help to set volt to MEM in ATITOOL
MVDDC is 2.200V an MVDDQ is 2.099V
Last edited by Martin.v.r; 10-19-2007 at 09:58 AM.
Dark Star IV i5 3570K | Asrock Z77 Extreme4 | 8GB Samsung 30nm | GTX 670 FTW 4GB + XSPC Razer 680 | 128GB Samsung 830 Apogee HD White | 140MM UT60/120MM UT45/120MM ST30 | XSPC D5 V2
First of all, welcome to XS!
That's not a *horrible* score, but yeah, it should be higher with your card at those clocks.
Are you sure it's not something simple/stupid like having FSAA forced on in CCC? Even putting Mipmap detail on High Performance will help a bit. I also quit all unneeded Windows services etc. before running any bench. Read a Vista tweak guide for sure.
Also try something like 3DMark 01, which is not so GPU-bound and is more sensitive to overall system speed (CPU and mem clocks and timings). How that scores will provide more info as to where the bottleneck is.
Can you overclock your CPU at all with that mobo?
FSAA is set to application preference and Mipmap is on highest setting.
Trying it with '01 I get a result of 30090.
It looks as if I can. I haven't messed with it, but the settings are in there to manually set voltage, multiplier, etc.
Hmm, that seems a bit low too. I am not done tweaking/testing myself, but my high so far in 01 is 57,800.
Good news on the mobo BIOS settings; see what you can tweak there, such as mem clocks and timings. Other than that, you're just gonna have to run various other benchies and keep testing. Oh, and install RivaTuner and set up the hardware monitoring. You can make sure you're properly switching to 3D clocks, for one thing.
You have 33A on the 12v rail:
http://www.amdzone.com/pics/powersup...rstream520.jpg
That's 5 more amps, which is a lot for a GFX card.
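For anyone comparing rails, the wattage math behind those amp figures is just volts times amps. A minimal sketch (the 38A figure is inferred from "5 more amps" on top of the 33A quoted above — check your PSU's label for the real combined rating):

```python
def rail_watts(volts, amps):
    """Continuous power available on a single rail (W = V * A)."""
    return volts * amps

old_capacity = rail_watts(12.0, 33)  # 396.0 W on the current PSU's 12v rail
new_capacity = rail_watts(12.0, 38)  # 456.0 W, assuming 5 more amps
print(new_capacity - old_capacity)   # 60.0 W of extra headroom
```

A modded 2900pro under load can pull well over 100W from the 12v rails, so an extra 60W of headroom is not trivial.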
ASUS P5K-E/WIFI-AP, Q6600 @ 3.4GHz 1.3v, Tuniq Tower 120, 4Gb OCZ PC2 8500 Reaper HPC, Asus HD5870, 320Gb Seagate 7200.10 SATAII, 500Gb Sammy F1, NEC ND4550A DVD burner, Corsair TX 750w PSU, Antec P182.
How long do you test with ATI Tool?
As I've just finished running the artifact tool for 1 hour at 800/1000 and it found no problems.
But what I will say is I've noticed the 12v drop to 11.58v at that clock. So maybe it is my PSU?
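To put that 11.58v reading in context: the ATX12V spec allows roughly ±5% on the +12V rail, so a quick sanity check looks like this (a sketch, assuming a 12.0V nominal and the 5% tolerance from the ATX12V design guide):

```python
ATX_NOMINAL = 12.0
ATX_TOLERANCE = 0.05  # ATX12V allows about +/-5% on the +12V rail

def within_spec(reading, nominal=ATX_NOMINAL, tol=ATX_TOLERANCE):
    """Return (in_spec, droop_percent) for a measured rail voltage."""
    droop_pct = (nominal - reading) / nominal * 100
    return reading >= nominal * (1 - tol), droop_pct

ok, droop = within_spec(11.58)
print(ok, round(droop, 2))  # True 3.5 -> sagging, but still inside the band
```

So 11.58v is within spec on paper; it's the droop *under load* trending toward 11.4v that would point at the PSU. Note also that software sensor readings are often inaccurate, so a multimeter reading is worth more.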
Last edited by WrigleyVillain; 10-19-2007 at 11:42 AM. Reason: typo
"There's only 2 jobs in the Army, infantry...and those who support the infantry"
Intel 2600k @ 4ghz
Asus P8P67 Deluxe
16GB GSKill 1600mhz
MSI GF 560Ti 2GB
2x OCZ Vertex 3
2x WD Black 640GB RAID0
Windows 7 Professional x64
Tried that method a few mins ago. I set stock XT clocks of 740/825 on both cards (fan @ 100% just to be safe) and verified the clock change with GPU-Z. The clocks were set, and the pixel fillrate, texture fillrate and mem bandwidth increased on both cards, but the fps in 3DMark06 dropped by 10. I didn't run the full benchmark because the first few seconds were enough to see that the OC wasn't good.
I really hope there will be a clocking tool out soon that fully supports this card.
MSI 790FX-GD70 (BIOS 1.D4)//PhenomII 1090T
2x 2GB G.Skill F3-12800CL7D-4GBRH//ASUS EAH5970
OCZ Agility 120GB//2x Hitachi Deskstar (2x500GB) RAID0//ZALMAN ZM850-HP 850W
DFI LanpartyUT RDX200 CF-DR (BIOS 12/23/05)//AMD Opteron 165 CCBBE 0616 XPMW 334x9 1.375Vx112%
2x 1024 MB G.SKILL F1-4000BIU2-2GBHV PC4000//2x Sapphire HD2900PRO(modded bios 845/950) 512mb CrossFire
2x WD Caviar RE2 WD4000YR (400 GB) RAID0//OCZ GameXStream 700W
Motorola Milestone CyanogenMOD 6.1.0 RC0 Android2.2.1
So I had loads of trouble getting one of these cards, and I will now publicly thank NickS for all the help he provided, not just with figuring out the issue, but also in helping me to activate my membership on here.
I learned the hard way that these cards for some reason don't output over the DVI ports until the OS loads, at least on my Sceptre X20WC-Gamer. I almost returned the second card, but I tried an old CRT and it worked! I feel bad for Newegg because I probably unnecessarily RMA'd a working card (I'll never know). They were a real class act about the whole thing. How was I to know that I wasn't supposed to see a display at POST? I had previously been on a BFG Tech 7600GT that did not have this issue. This is my first ATI card since the original All-In-Wonder.
It's a great card, but I found Sapphire's documentation completely unacceptable. They don't mention the power connectors at all!
I got things worked out now by hooking one DVI port on the card to the DVI port on the monitor, while the other uses a DVI-VGA adapter with a VGA cable to the VGA port on the monitor. When Windows loads, the monitor automatically selects the DVI input. This is less than ideal, because if I had two LCDs hooked up (each using DVI), I would potentially be forced to switch to VGA whenever I needed to do something in the BIOS or change the boot order to get into Vista or Ubuntu.
I can only imagine the headache this issue is causing both end-users and distributors who are seeing lots of these cards seeming to be DOA.
It was really difficult to find information on this issue, but with enough googling I found that ppl have had this same issue with other ATI cards, dating back to at least 2005. I really hope they make a BIOS update to fix this.
I'm not here to just complain. These cards are amazing! I can run every game I have so far at 1680x1050 Max EVERYTHING. This includes Company of Heroes, Oblivion, and Bioshock (which looks especially nice I might add). This was in XP, Vista might change things a bit. This card even beats out my Core 2 Duo for the price/performance crown! $350 for the performance of ~$600 card, now that's a deal!
Does anyone recommend using ATI Tray Tools? I have so far tried RT and CCC, and for me CCC is the best option since 850MHz didn't work (so no need to exceed the limit in CCC), but I'm wondering if ATT might have some other features I would want.
Also, does anyone think I'm shortening the life of the card if the core hits the mid-80s Celsius? What about 90 or 95? I live in SoCal, so my temps will probably be higher than most ppl's at the same settings. Unless ATI is now using something other than SiO2, like hafnium or something, it's hard for me to imagine that this core can operate safely up to 100C while CPUs have a much lower threshold. IIRC, the R600 has more transistors than the Core 2 Duo, so I would surmise (see: Foolishly Assume) that the tolerance for overheating is even lower. I suppose the chip's internal layout also factors in. I would love to get that cleared up.
Thanks to all for all the info!
Core 2 Duo E6300 @2.8GHz(400x7)1.31v Gigabyte GA-965P-DS3 rev. 2.0 Thermaltake TMG i1 2GB Geil DDR2-800 (4-4-4-12) @1.8vEVGA 8800GT @ 675/1687/1950 Thermaltake Toughpower Cable Management 750W PSU
"Fool me once, shame on, shame on you. Fool me...can't get fooled again."
Just adding another owner to the thread... sys in sig..
Sapphire 2900pro 512 gddr3 846/900
both pwr connections/ ati overdrive oc
3dmark06
Last edited by MikeB12; 10-19-2007 at 04:26 PM.
I see this card disappearing from almost all shops around here.
What I am wondering is: what if a card becomes faulty and needs to be RMA'd? What card will we get back in case they need to replace it? Since they only had a limited amount of pro chips, we might get an XT card with a pro BIOS on it, or maybe even a full XT card if they don't bother flashing it to pro.
I am picking my card up this weekend; next week I will test.
Last edited by CrimInalA; 10-19-2007 at 11:51 PM.
Main rig 1: Corsair Carbide 400R 4x120mm Papst 4412GL - 1x120mm Noctua NF-12P -!- PC Power&Cooling Silencer MK III 750W Semi-Passive PSU -!- Gigabyte Z97X-UD5H -!- Intel i7 4790K -!- Swiftech H220 pull 2x Papst 4412 F/2GP -!- 4x4gb Crucial Ballistix Tactical 1866Mhz CAS9 1.5V (D9PFJ) -!- 1Tb Samsung 840 EVO SSD -!- AMD RX 480 to come -!- Windows 10 pro x64 -!- Samsung S27A850D 27" + Samsung 2443BW 24" -!- Sennheiser HD590 -!- Logitech G19 -!- Microsoft Sidewinder Mouse -!- Fragpedal -!- Eaton Ellipse MAX 1500 UPS .
According to Newegg's return policy, if that happened, they would just turn it into a refund RMA instead. This is what happened to the first 2900 512MB I bought. I sent it back on Friday and by Monday they were sold out. I was so pissed. I had even called on Friday to make a special request that they hold one, but they apparently don't have that option, as customer service is completely separate from the warehouse. If I had been absolutely sure they would sell out before the RMA processed, I would have bought a second one on Friday, but I was afraid to be out double the money and have an extra card I couldn't use. I guess I could have eBayed it, but oh well.
Some retail places would end up sending your card to the manufacturer, and in that case it would probably come back refurbished (I think this applies to Ewiz). Ditto if you RMA directly with the manufacturer, although I suppose it would be possible for them to take an XT and flash it like that. That would only happen where they couldn't refurb it, I suppose, since they would be losing a big chunk of money based on the price difference.
I'm just really glad I was able to snag one of these before they were completely gone. Good luck with the new card!
Well, I have some interesting news.
As I had previously reported, I was getting a benchmark score of around 9500-9600. This was on a newly installed Vista 32-bit.
My DVD for the 64-bit version arrived yesterday, so I did a re-install with that, installed all the stuff that was on it before, benchmarked, and it came out at 10325. So almost a 1000-point difference just from going to 64-bit Vista.
The reports 3DMark gives me are good on the video side, but the CPU is only scoring 1700. The two CPU tests it runs in the benchmark are so choppy it's ridiculous: 0-1fps. This is what is really dragging the score down.
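For what it's worth, the 32-bit to 64-bit jump works out to roughly an 8% gain (a quick sketch; 9550 is just the midpoint of the 9500-9600 range quoted above, an assumption on my part):

```python
# Rough check of the 32-bit -> 64-bit Vista score gain described above.
before, after = 9550, 10325          # midpoint of old range vs new score
gain_pct = (after - before) / before * 100
print(round(gain_pct, 1))            # ~8.1% improvement
```

Not bad for a free OS reinstall, assuming nothing else changed between runs.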
Last edited by TheJaxx; 10-20-2007 at 11:01 AM.
I think the problem is RivaTuner. Although GPU-Z shows the clockspeeds I set for both cards in RivaTuner the way I set them, RivaTuner sets the second card back to standard 2D clocks the moment I enable CrossFire. This clock change isn't shown by GPU-Z, and I can't get the second card back to 740/825 after that. The funny thing is that RivaTuner reports the stock clocks of the secondary card as 740/825 when I disable CrossFire and extend my desktop to that card again; GPU-Z, on the other hand, then reports the stock pro clocks for the card again. RivaTuner is buggy as hell; that's the only reason I can think of why my fps in 3DMark are that much lower than at stock pro speed.
I so wish I had a working ATITool, like I had with my X1800s. Nothing better than clocking both cards at the same time with CF enabled. I hope they get it done soon.
Flashed 2900pro to 2900xt
However, I inadvertently deleted my backup of my original Sapphire 2900pro 512 BIOS, and I can't find it online. Anyone know where I can get it?
Thanks
Can somebody chime in on my temperatures? Running 875 core at 1.225v core (ATITool .27b). There are no ramsinks on the mosfets or the top set of memory chips. The memory chips closest to the mosfets are showing 60C; the ones towards the DVI outputs are at 45C. The big gray box mosfets are nearing 70C, and the motherboard just past them is at 60C from the presence of the heat alone. I can feel the heat from the system when I get within 12 inches of the card. GPU core is at 53C per ATITool (water). Am I going to damage something with these temperatures?
You won't damage anything, but it may cause instability down the road if your ambient temperatures rise or something. I'd set a 120mm fan next to your card to blow across it. That'd solve your problems EZ.
Core i3-550 Clarkdale @ 4.2GHz, 1.36v (Corsair A50 HS/F) LinX Stable
MSI H55-GD65 Motherboard
G.Skill 4GBRL DDR3-1600 @ 1755, CL9, 1.55v
Sapphire Radeon 5750 1GB
Samsung F4 320GB - WD Green 1TB
Xigmatek Utgard Case - Corsair VX550
Hey there.
First off, to all the people talking about the CPU tests really bogging down: they're supposed to run at 0-5fps. They render the graphics on your processor and don't use your video card at all, from what I understand. That's normal; the card isn't going to help there. That's why it's a CPU test.
Just bought my first 2900pro and am quite happy with my purchase. For $279 you can't beat what I have basically upgraded to.
Here is my brief story.
OK, I am running an MSI K-Neo 4 or something like that. I had to buy a replacement board for my 939 when my Abit AN8 Ultra blew up.
I had an x800xl 256MB GPU and an X2 3800+ running at stock 2.0GHz.
My score on 3dmark06 was an abysmal 1863.
Bought my new parts and with the 2900pro stock and the 3800+ stock I jumped up to 7094.
I then overclocked the 3800+ to a measly 2.25GHz and upped my 2900pro to 833 core and 891 memory.
I then ran 3dmark06 and got
and a gpu-z validation at
http://www.techpowerup.com/gpuz/w7uv8/
I don't know a whole bunch about overclocking, but I am learning. I have tried to use nTune to change my GPU clocks but can't figure out how to use it, so I'm stuck with using ATITool.
If anyone has any good overclocking utilities that they can link me I'd greatly appreciate it.
I'm looking for a good program to use for the video card (and maybe how to use it, lol) and some stress-testing utilities, so that after I overclock the GPU I can run some tests to make sure it's stable.
I'll be talking to others in a different section to help me with the OC'ing of my 3800+. I have an original big typhoon and I'm sure I can reach a higher OC than 2.25 lol.
Thanks in advance.
Last edited by Tsaroth; 10-20-2007 at 04:37 PM.