Page 19 of 51
Results 451 to 475 of 1274

Thread: 2900 Pro Owners Thread

  1. #451
    Xtreme Enthusiast
    Join Date
    Feb 2007
    Location
    Denmark
    Posts
    547
Need help setting the memory voltage in ATITool.
MVDDC is 2.200V and MVDDQ is 2.099V.
    Last edited by Martin.v.r; 10-19-2007 at 09:58 AM.

  2. #452
    Love will tear us apart
    Join Date
    Mar 2006
    Location
    wrigleyville.chi.il.us
    Posts
    2,350
    Quote Originally Posted by Draxx View Post
    How can you tell your PSU isn't up to the job?

I have an OCZ 520W ModStream with 28A on the 12v rail, so it's a little under spec even running XT speeds.

It does run fine at XT speeds, but when I take it to, say, 810/950, it runs for a while with no artifacts, then the PC just hard locks without a sign of an artifact.

    Is this just a poor R600, or lack of amps?
    Well I'm running fine with a PowerStream 520 but it has 30A on the 12V rail. Does your modstream have the adjustable rails like mine?
    Dark Star IV
    i5 3570K | Asrock Z77 Extreme4 | 8GB Samsung 30nm | GTX 670 FTW 4GB + XSPC Razer 680 | 128GB Samsung 830
    Apogee HD White | 140MM UT60/120MM UT45/120MM ST30 | XSPC D5 V2

  3. #453
    Love will tear us apart
    Join Date
    Mar 2006
    Location
    wrigleyville.chi.il.us
    Posts
    2,350
    Quote Originally Posted by SteveLord View Post
    My 2900pro is on the way. I've heard conflicting reports about its power requirements.

    Does anyone have an OCZ PowerStream 520 or something similar? It does 33a on the 12v.


    I've had this baby for 2.5 years now. I am hoping it will be enough!
    Yes I run fine with the Powerstream 520. I thought it only had 30A? 33A after adjustment perhaps?

  4. #454
    Love will tear us apart
    Join Date
    Mar 2006
    Location
    wrigleyville.chi.il.us
    Posts
    2,350
    Quote Originally Posted by TheJaxx View Post

    I've got a new Vista OS installed and only scored around 9600. The Vid is OC'd to 740/2000 (effective). The CPU test is where the system REALLY bogged down during the benchmarking.

I've a feeling a lot of it has to do with the mobo itself and other settings I've not made or tweaked yet, both BIOS and OS.

What I'd like to know is: are there any really good guides on making adjustments? Searching gives you so many different opinions that it actually gets confusing.

    Thanks.
    First of all welcome to XS.

That's not a *horrible* score, but yeah, it should be higher with your card at those clocks.

Are you sure it's not something simple/stupid like having FSAA forced on in CCC? Even putting Mipmap detail on High Performance will help a bit. I also quit all unneeded Windows services before running any bench. Read a Vista tweak guide for sure.

Also try something like 3DMark01, which is not so GPU-bound and is more sensitive to overall system speed (CPU and memory clocks and timings). How that scores will provide more info as to where the bottleneck is.

Can you overclock your CPU at all with that mobo?

  5. #455
    Xtreme Member
    Join Date
    Oct 2006
    Location
    S.California
    Posts
    380
    Quote Originally Posted by WrigleyVillain View Post
    First of all welcome to XS.

That's not a *horrible* score, but yeah, it should be higher with your card at those clocks.

Are you sure it's not something simple/stupid like having FSAA forced on in CCC? Even putting Mipmap detail on High Performance will help a bit. I also quit all unneeded Windows services before running any bench. Read a Vista tweak guide for sure.

Also try something like 3DMark01, which is not so GPU-bound and is more sensitive to overall system speed (CPU and memory clocks and timings). How that scores will provide more info as to where the bottleneck is.

Can you overclock your CPU at all with that mobo?
no.. that score sounds about right for 3DMark.. I get 10200 at 887/1037 with my FX-62 at 2.8GHz
    Cpu: Intel Core i7 920 @ 3.9 ghz (cooled w/ Apogee GTZ)
    Mobo: Gigabyte EX58 UD5
    Ram: G.SKill 3x1 GB DDR3 1600
    GPU: GTX 280
    PSU: E Power 1000 Watt

  6. #456
    Registered User
    Join Date
    Oct 2007
    Posts
    3
    Quote Originally Posted by WrigleyVillain View Post
    First of all welcome to XS.

Are you sure it's not something simple/stupid like having FSAA forced on in CCC? Even putting Mipmap detail on High Performance will help a bit. I also quit all unneeded Windows services before running any bench. Read a Vista tweak guide for sure.
    FSAA is set to application preference and Mipmap is on highest setting.

    Quote Originally Posted by WrigleyVillain View Post
Also try something like 3DMark01, which is not so GPU-bound and is more sensitive to overall system speed (CPU and memory clocks and timings). How that scores will provide more info as to where the bottleneck is.
    Trying it with '01 I get a result of 30090.

    Quote Originally Posted by WrigleyVillain View Post
    Can overclock your CPU at all with that mobo?
It looks as if I can. I haven't messed with it, but the settings are in there to manually set voltage, multiplier, etc.

  7. #457
    Love will tear us apart
    Join Date
    Mar 2006
    Location
    wrigleyville.chi.il.us
    Posts
    2,350
    Quote Originally Posted by TheJaxx View Post

    Trying it with '01 I get a result of 30090.

    It looks as if I can, I haven't messed with it, but the settings are in there to manually set voltage, multiplier, etc.
Hmm, that seems a bit low too. I'm not done tweaking/testing myself, but my high so far in '01 is 57,800.

Good news on the mobo BIOS settings; see what you can tweak there, such as memory clocks and timings. Other than that you're just gonna have to run various other benchies and keep testing. Oh, and install RivaTuner and set up the hardware monitoring. You can make sure you're properly switching to 3D clocks, for one thing.

  8. #458
    Love will tear us apart
    Join Date
    Mar 2006
    Location
    wrigleyville.chi.il.us
    Posts
    2,350
    Quote Originally Posted by nosboost300 View Post
no.. that score sounds about right for 3DMark.. I get 10200 at 887/1037 with my FX-62 at 2.8GHz
Oops, I didn't see this. Damn, I'm surprised. I've almost broken 12,500 at 825/950.

  9. #459
    Xtreme Member
    Join Date
    Mar 2004
    Location
    UK
    Posts
    315
    Quote Originally Posted by WrigleyVillain View Post
    Yes I run fine with the Powerstream 520. I thought it only had 30A? 33A after adjustment perhaps?

You have 33A on the 12v rail:

http://www.amdzone.com/pics/powersup...rstream520.jpg

That's 5 more amps, which is a lot for a GFX card.
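For anyone comparing rails, here's a rough back-of-the-envelope sketch of what those amp ratings mean in watts (it assumes the full rated current is actually usable on the 12V rail, which ignores the combined-rail limits and thermal derating real PSUs have):

```python
# Rough 12V-rail capacity comparison using P = V * I.
# Assumption: the rated amps are fully available on the 12V rail.

def rail_watts(volts: float, amps: float) -> float:
    """Maximum power a rail can deliver at its rated current."""
    return volts * amps

powerstream = rail_watts(12, 33)  # PowerStream 520: 396 W on the 12V rail
modstream = rail_watts(12, 28)    # ModStream 520:   336 W on the 12V rail

print(powerstream - modstream)    # 60 W of extra 12V headroom
```

So those 5 extra amps work out to about 60W more on the 12V side, which is roughly what an overclocked R600 can swing between loads.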
    ASUS P5K-E/WIFI-AP, Q6600 @ 3.4GHz 1.3v, Tuniq Tower 120, 4Gb OCZ PC2 8500 Reaper HPC, Asus HD5870, 320Gb Seagate 7200.10 SATAII, 500Gb Sammy F1, NEC ND4550A DVD burner, Corsair TX 750w PSU, Antec P182.

  10. #460
    Xtreme Member
    Join Date
    Mar 2004
    Location
    UK
    Posts
    315
    How long do you test with ATI Tool?

I've just finished running the artifact tool for an hour at 800/1000 and it found no problems.

But what I will say is I've noticed the 12v drop to 11.58v at that clock. So maybe it is my PSU?

  11. #461
    Love will tear us apart
    Join Date
    Mar 2006
    Location
    wrigleyville.chi.il.us
    Posts
    2,350
    Quote Originally Posted by Draxx View Post
    How long do you test with ATI Tool?

I've just finished running the artifact tool for an hour at 800/1000 and it found no problems.

But what I will say is I've noticed the 12v drop to 11.58v at that clock. So maybe it is my PSU?
At least an hour; some would say a lot longer. I believe that voltage drop is normal under load, but I might be thinking of something else.
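For what it's worth, the ATX spec allows +/-5% on the 12V rail, so a quick sketch of the check (using the 11.58V reading from the post above) shows that droop is still inside the 11.4-12.6V window:

```python
# Check a measured 12V reading against the ATX +/-5% tolerance window.

def within_atx_tolerance(measured: float, nominal: float = 12.0,
                         tolerance: float = 0.05) -> bool:
    """True if the reading falls inside nominal +/- tolerance."""
    return nominal * (1 - tolerance) <= measured <= nominal * (1 + tolerance)

droop_pct = (12.0 - 11.58) / 12.0 * 100  # sag relative to nominal

print(within_atx_tolerance(11.58))  # True: 11.58V is within 11.4-12.6V
print(round(droop_pct, 1))          # 3.5 (% droop under load)
```

So about 3.5% of sag, within spec on paper, though a rail that droops that far under GPU load can still be a sign the PSU is near its limit.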
    Last edited by WrigleyVillain; 10-19-2007 at 11:42 AM. Reason: typo

  12. #462
    Xtreme Member
    Join Date
    Nov 2005
    Location
    Iowa
    Posts
    108
    Quote Originally Posted by WrigleyVillain View Post
    Yes I run fine with the Powerstream 520.
Good to hear. Although I plan to replace my Core 2 with a Q6600 here very soon. And that thing's gotta be at least 3GHz.
    "There's only 2 jobs in the Army, infantry...and those who support the infantry"

    Intel 2600k @ 4ghz
    Asus P8P67 Deluxe
    16GB GSKill 1600mhz
    MSI GF 560Ti 2GB
    2x OCZ Vertex 3
    2x WD Black 640GB RAID0
    Windows 7 Professional x64

  13. #463
    Xtreme Member
    Join Date
    May 2006
    Location
    Heilbronx, Germany
    Posts
    397
    Quote Originally Posted by Trike View Post
    I found an easier way to OC in crossfire mode, if you run dual monitors. I have one monitor hooked up to the primary card, and the second monitor hooked up to the secondary card.

    Disable crossfire.
    Go into display settings and extend desktop to the 2nd monitor.
    Launch rivatuner or whatever apps you use to OC. (I have only tried rivatuner)
    Select your cards, and overclock them. Shouldn't matter which card you choose first.
    Then enable crossfire. (this will disable the second monitor)
Tried that method a few mins ago. I set stock XT clocks of 740/825 on both cards (fan @ 100% just to be safe) and verified the clock change with GPU-Z. The clocks were set and the pixel fillrate, texture fillrate, and memory bandwidth increased on both cards, but the fps in 3DMark06 dropped by 10fps. I didn't run the full benchmark because the first few seconds were enough to see that the OC wasn't good.
I really hope a clocking tool that fully supports this card comes out soon.


    MSI 790FX-GD70 (BIOS 1.D4)//PhenomII 1090T
    2x 2GB G.Skill F3-12800CL7D-4GBRH//ASUS EAH5970
    OCZ Agility 120GB//2x Hitachi Deskstar (2x500GB) RAID0//ZALMAN ZM850-HP 850W

    DFI LanpartyUT RDX200 CF-DR (BIOS 12/23/05)//AMD Opteron 165 CCBBE 0616 XPMW 334x9 1.375Vx112%
    2x 1024 MB G.SKILL F1-4000BIU2-2GBHV PC4000//2x Sapphire HD2900PRO(modded bios 845/950) 512mb CrossFire
    2x WD Caviar RE2 WD4000YR (400 GB) RAID0//OCZ GameXStream 700W

    Motorola Milestone CyanogenMOD 6.1.0 RC0 Android2.2.1

  14. #464
    Registered User
    Join Date
    Oct 2007
    Posts
    40

Quite an experience...

    So I had loads of trouble getting one of these cards, and I will now publicly thank NickS for all the help he provided, not just with figuring out the issue, but also in helping me to activate my membership on here.

I learned the hard way that these cards, for some reason, don't output over the DVI ports until the OS loads, at least on my Sceptre X20WC-Gamer. I almost returned the second card, but then I tried an old CRT and it worked! I feel bad for Newegg b/c I probably unnecessarily RMA'd a working card (I'll never know); they were a real class act about the whole thing. How was I to know that I wasn't supposed to see a display at POST? I had previously been on a BFG Tech 7600GT that did not have this issue. This is my first ATI card since the original All-In-Wonder.

It's a great card, but I found Sapphire's documentation completely unacceptable. They don't mention the power connectors at all!

    I got things worked out now by hooking one DVI port on the card to the DVI port on the monitor, and the other uses a DVI-VGA adapter with a vga cable to the VGA port on the monitor. When windows loads, the monitor automatically selects the DVI input. This is less than ideal, because it means I potentially would be forced to switch to VGA whenever I needed to do something different in BIOS or change boot order to get into Vista or Ubuntu, IF I had 2 LCD's hooked up (each using the DVI).

    I can only imagine the headache this issue is causing both end-users and distributors who are seeing lots of these cards seeming to be DOA.

It was really difficult to find information on this issue, but with enough googling I found that people have had this same issue with other ATI cards, dating back to at least 2005. I really hope they make a BIOS update to fix this.

I'm not here just to complain. These cards are amazing! I can run every game I have so far at 1680x1050 with max EVERYTHING. This includes Company of Heroes, Oblivion, and BioShock (which looks especially nice, I might add). This was in XP; Vista might change things a bit. This card even beats out my Core 2 Duo for the price/performance crown! $350 for the performance of a ~$600 card, now that's a deal!

Does anyone recommend using ATI Tray Tools? I have so far tried RT and CCC, and for me CCC is the best option since 850MHz didn't work (so no need to exceed the limit in CCC), but I'm wondering if ATT might have some other features that I would want.

Also, does anyone think I'm shortening the life of the card if the core hits the mid-80s Celsius? What about 90 or 95? I live in SoCal, so my temps will probably be higher than most people's at the same settings. Unless ATI is now using something other than SiO2, like hafnium or something, it's hard for me to imagine that this core can operate safely up to 100C while CPUs have a much lower threshold. IIRC, the R600 has more transistors than the Core 2 Duo, so I would surmise (see: Foolishly Assume) that the tolerance for overheating is even lower. I suppose the chip's internal layout also factors in. I would love to get that cleared up.

    Thanks to all for all the info!
    Core 2 Duo E6300 @2.8GHz(400x7)1.31v Gigabyte GA-965P-DS3 rev. 2.0 Thermaltake TMG i1 2GB Geil DDR2-800 (4-4-4-12) @1.8vEVGA 8800GT @ 675/1687/1950 Thermaltake Toughpower Cable Management 750W PSU

    "Fool me once, shame on, shame on you. Fool me...can't get fooled again."

  15. #465
    Attack Dachshund
    Join Date
    Jul 2007
    Location
    South Carolina USA
    Posts
    3,161
    Just adding another owner to the thread... sys in sig..
    Sapphire 2900pro 512 gddr3 846/900
    both pwr connections/ ati overdrive oc

    3dmark06
    Last edited by MikeB12; 10-19-2007 at 04:26 PM.

  16. #466
    Registered User
    Join Date
    Feb 2007
    Posts
    41
    Quote Originally Posted by MadDias View Post
Tried that method a few mins ago. I set stock XT clocks of 740/825 on both cards (fan @ 100% just to be safe) and verified the clock change with GPU-Z. The clocks were set and the pixel fillrate, texture fillrate, and memory bandwidth increased on both cards, but the fps in 3DMark06 dropped by 10fps. I didn't run the full benchmark because the first few seconds were enough to see that the OC wasn't good.
I really hope a clocking tool that fully supports this card comes out soon.
Weird... I find it a little odd. I didn't get any major fps differences doing it both ways.

  17. #467
    Xtreme Cruncher
    Join Date
    Nov 2002
    Location
    Belgium
    Posts
    605
I see this card disappearing from almost all the shops around here.
What I am wondering is: what if a card becomes faulty and needs to be RMA'd? What card will we get back in case they need to replace it? Since they only had a limited amount of Pro chips, we might get an XT card with a Pro BIOS on it, or maybe even an XT card if they don't bother flashing it to Pro.

I am picking my card up this weekend; next week I will test.
    Last edited by CrimInalA; 10-19-2007 at 11:51 PM.


    Main rig 1: Corsair Carbide 400R 4x120mm Papst 4412GL - 1x120mm Noctua NF-12P -!- PC Power&Cooling Silencer MK III 750W Semi-Passive PSU -!- Gigabyte Z97X-UD5H -!- Intel i7 4790K -!- Swiftech H220 pull 2x Papst 4412 F/2GP -!- 4x4gb Crucial Ballistix Tactical 1866Mhz CAS9 1.5V (D9PFJ) -!- 1Tb Samsung 840 EVO SSD -!- AMD RX 480 to come -!- Windows 10 pro x64 -!- Samsung S27A850D 27" + Samsung 2443BW 24" -!- Sennheiser HD590 -!- Logitech G19 -!- Microsoft Sidewinder Mouse -!- Fragpedal -!- Eaton Ellipse MAX 1500 UPS .





  18. #468
    Registered User
    Join Date
    Oct 2007
    Posts
    40

    Depends on who does the RMA

    Quote Originally Posted by CrimInalA View Post
I see this card disappearing from almost all the shops around here.
What I am wondering is: what if a card becomes faulty and needs to be RMA'd? What card will we get back in case they need to replace it? Since they only had a limited amount of Pro chips, we might get an XT card with a Pro BIOS on it, or maybe even an XT card if they don't bother flashing it to Pro.

I am picking my card up this weekend; next week I will test.
According to Newegg's return policy, if that happened they would just turn it into a refund RMA instead. This is what happened to the first 2900 512MB I bought: I sent it back on Friday and by Monday they were sold out. I was so pissed. I had even called on Friday to make a special request that they hold one, but apparently they don't have that option, as customer service is completely separate from the warehouse. If I had been absolutely sure they would sell out before the RMA processed, I would have bought a second one on Friday, but I was afraid of being out double the money with an extra card I couldn't use. I guess I could have eBayed it, but...

Some retail places would end up sending your card to the manufacturer, and in that case it would probably come back refurbished (I think this applies to Ewiz). Ditto if you RMA directly with the manufacturer, although I suppose it would be possible for them to take an XT and flash it like that. That would probably only happen when they couldn't refurb it, since they would be losing a big chunk of money on the price difference.

    I'm just really glad I was able to snag one of these before they were completely gone. Good luck with the new card!

  19. #469
    Registered User
    Join Date
    Oct 2007
    Posts
    3
    Well, have some interesting news.

As I had previously reported, I was getting a benchmark score of around 9500-9600. This was on a newly installed Vista 32-bit.

My DVD for the 64-bit version arrived yesterday, so I did a re-install with that, installed all the previous stuff that was on it earlier, benchmarked, and it came out with 10325. So almost a 1000-point difference just from going to 64-bit Vista.

The reports 3DMark gives me are good on the video side, but the CPU is only giving a score of 1700. The two tests it does during the benchmarking are so choppy it's ridiculous: 0-1fps. This is what is really dragging the score down.
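For reference, that 32-bit to 64-bit jump works out to roughly 7-8% (using the 9600 figure from the earlier post):

```python
# Relative 3DMark06 gain from 32-bit to 64-bit Vista, per the scores above.
before, after = 9600, 10325

gain = after - before
print(gain)                    # 725 points
print(f"{gain / before:.1%}")  # 7.6%
```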
    Last edited by TheJaxx; 10-20-2007 at 11:01 AM.

  20. #470
    Xtreme Member
    Join Date
    May 2006
    Location
    Heilbronx, Germany
    Posts
    397
    Quote Originally Posted by Trike View Post
Weird... I find it a little odd. I didn't get any major fps differences doing it both ways.
I think the problem is RivaTuner. Although GPU-Z shows the clock speeds I set for both cards in RivaTuner the way I set them, RivaTuner sets the second card back to standard 2D clocks the moment I enable Crossfire. This clock change isn't shown by GPU-Z, and I can't get the second card back to 740/825 again after that. The funny thing is that RivaTuner reports the stock clocks of the secondary card as 740/825 when I disable Crossfire and extend my desktop to that card again; GPU-Z, on the other hand, then reports the stock Pro clocks for that card again. RivaTuner is buggy as hell; that's the only reason I can think of why my fps in 3DMark are that much lower than at stock Pro speed.
I so wish I had a working ATITool, like I had with my X1800s. Nothing better than clocking both cards at the same time with CF enabled. I hope they get it done soon.



  21. #471
    Registered User
    Join Date
    Oct 2007
    Posts
    1

    sapphire 2900pro 512 bios

I flashed my 2900pro to a 2900xt. However, I inadvertently deleted my backup of my original Sapphire 2900pro 512 BIOS, and I can't find it online. Anyone know where I can get it?
Thanks

  22. #472
    Xtreme Member
    Join Date
    May 2006
    Location
    Heilbronx, Germany
    Posts
    397
    Quote Originally Posted by Chri$ch View Post
    Try this Bios -> Download

    This is from my Sapphire HD2900Pro 512MB, def.Voltage is 1.20v
lol... cgluec, read the thread from the beginning and after a few minutes you'll come across things like the 2900pro BIOS.



  23. #473
    Xtreme Cruncher
    Join Date
    Jan 2007
    Location
    Massachusetts
    Posts
    715
Can somebody chime in on my temperatures? I'm running an 875 core at 1.225v core (ATITool 0.27b). There are no ramsinks on the mosfets or the top set of memory chips. The memory chips closest to the mosfets are showing 60C; the ones towards the DVI inputs are at 45C. The big gray box mosfets are nearing 70C, and the motherboard just past them is at 60C from the presence of the heat alone. I can feel the heat from the system when I get within 12 inches of the card. The GPU core is at 53C per ATITool (water). Am I going to damage something with these temperatures?

  24. #474
    I am Xtreme
    Join Date
    Apr 2005
    Location
    Upstate, NY
    Posts
    5,425
You won't damage anything, but it may cause instability down the road if your ambient temperatures rise or something. I'd set a 120mm fan next to your card to blow across it. That'd solve your problems EZ.
    Core i3-550 Clarkdale @ 4.2GHz, 1.36v (Corsair A50 HS/F) LinX Stable
    MSI H55-GD65 Motherboard
    G.Skill 4GBRL DDR3-1600 @ 1755, CL9, 1.55v
    Sapphire Radeon 5750 1GB
    Samsung F4 320GB - WD Green 1TB
    Xigmatek Utgard Case - Corsair VX550

  25. #475
    Xtreme Member
    Join Date
    Jul 2007
    Posts
    406
    Hey there.

First off, to all the people talking about the CPU tests really bogging down: they're supposed to run at 0-5fps. The test renders the graphics on your processor and doesn't use your video card at all, from what I understand. That's normal, and the card isn't going to help it. That's why it's a CPU test.

Just bought my first 2900pro and I'm quite happy with my purchase. For $279 you can't beat what I have basically upgraded to.

    Here is my brief story.

Ok, I am running an MSI K-Neo 4 or something like that. I had to buy a replacement board for my 939 when my Abit AN8 Ultra blew up.

I had an X800XL 256MB GPU and an X2 3800+ running at stock 2.0GHz.

    My score on 3dmark06 was an abysmal 1863.

    Bought my new parts and with the 2900pro stock and the 3800+ stock I jumped up to 7094.

I then overclocked the 3800+ to a measly 2.25GHz and upped my 2900pro to 833 core and 891 memory.

I then ran 3DMark06 and got the score shown in my screenshot (image not preserved), and a GPU-Z validation at

http://www.techpowerup.com/gpuz/w7uv8/

I don't know a whole lot about overclocking, but I am learning. I have tried to use nTune to change my GPU clocks but can't figure out how to use it, so I'm stuck with using ATITool.

    If anyone has any good overclocking utilities that they can link me I'd greatly appreciate it.

I'm looking for a good program to use for the video card (and maybe how to use it lol) and some stress-testing utilities, so that after I overclock the GPU I can run some tests to make sure it's stable.

I'll be talking to others in a different section to help me with the OC'ing of my 3800+. I have an original Big Typhoon and I'm sure I can reach a higher OC than 2.25 lol.

    Thanks in advance.
    Last edited by Tsaroth; 10-20-2007 at 04:37 PM.

