
Thread: Official HD 2900 Discussion Thread

  1. #326
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by Lifthanger View Post
I'm using Ultimate x64 and Overdrive is there, and always has been since 7.2.
With the X2900XT 512MB??

I'm talking about the X2900XT. My friend received one today and Overdrive is not there, maybe because he's using the 6-pin? I read somewhere that Overdrive only shows up when using the 8-pin???
    Last edited by mascaras; 04-30-2007 at 03:52 PM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  2. #327
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
6-pin + 6-pin + PCIe slot = 225W
6-pin + 8-pin + PCIe slot = 250W

You need 250W to overclock.

  3. #328
    Registered User
    Join Date
    Jul 2004
    Posts
    82
    Quote Originally Posted by mascaras View Post
With the X2900XT 512MB??

I'm talking about the X2900XT. My friend received one today and Overdrive is not there, maybe because he's using the 6-pin? I read somewhere that Overdrive only shows up when using the 8-pin???
Ah, sorry, I didn't understand that correctly.

  4. #329
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by cadaveca View Post
6-pin + 6-pin + PCIe slot = 225W
6-pin + 8-pin + PCIe slot = 250W

You need 250W to overclock.
Are all PSUs like this? Is that the standard?

  5. #330
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
It's not the PSU, it's the spec for the plug. A 6-pin PCIe connector is 75W, supposedly. The slot provides 75W. Add the three and you get 225W.

An 8-pin PCIe plug is 100W (figure 25W per wire pair). This provides the extra power needed when overclocking.
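For reference, here's that arithmetic spelled out as a quick Python sketch, using the connector wattages from the post above (note the 100W 8-pin figure is the poster's estimate of 25W per wire pair, not the official PCI-SIG rating, which puts the 8-pin connector at 150W):

[CODE]
# Power budget arithmetic from the post above; the per-connector figures
# are the poster's (official PCI-SIG ratings differ, e.g. 150 W for the 8-pin).
SLOT_W = 75        # power delivered through the PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PCIe connector
EIGHT_PIN_W = 100  # poster's figure: 25 W per wire pair

def board_power(*connectors):
    """Slot power plus whatever the supplementary connectors provide."""
    return SLOT_W + sum(connectors)

print(board_power(SIX_PIN_W, SIX_PIN_W))    # 6-pin + 6-pin + slot = 225
print(board_power(SIX_PIN_W, EIGHT_PIN_W))  # 6-pin + 8-pin + slot = 250
[/CODE]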

  6. #331
    Xtreme Mentor
    Join Date
    Feb 2004
    Location
    The Netherlands
    Posts
    2,984
    Quote Originally Posted by cadaveca View Post
It's not the PSU, it's the spec for the plug. A 6-pin PCIe connector is 75W, supposedly. The slot provides 75W. Add the three and you get 225W.

An 8-pin PCIe plug is 100W (figure 25W per wire pair). This provides the extra power needed when overclocking.
Yeah, but with as much as 56A on my 12V line I think my PCIe connectors can do a little more than just 75W.

Maybe I missed this, but does the HD 2900 XT come with a 6-pin to 8-pin adapter? I've only got 6-pin PCIe connectors.

    Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
    32 GB Patriot Viper Steel 3733 CL14 (1.51v)
    RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
    Tons of NVMe & SATA SSDs
    LG 27GL850 + Asus MG279Q
    Meshify C white

  7. #332
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    105
ATi HD 2900 XT - AMD Athlon 64 X2 3800+ @ 2009MHz - 1GB DDR2
3DMark06: 11335
    http://www.generation-3d.com/11335-s...-XT,ac8886.htm

  8. #333
    Xtreme Member
    Join Date
    May 2006
    Location
    prospekt Veteranov, Saint-Petersburg, Russia
    Posts
    494
I can't believe it.
The score is too high.

If true, this is obviously an overclocked Radeon.
    Last edited by MAS; 04-30-2007 at 10:07 PM.

  9. #334
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    105
    Maybe that R600XT has the real proper drivers.

  10. #335
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Hong Kong
    Posts
    1,905
    Quote Originally Posted by MAS View Post
I can't believe it.
The score is too high.

If true, this is obviously an overclocked Radeon.
    I'm going to hope it's true
    -


    "Language cuts the grooves in which our thoughts must move" | Frank Herbert, The Santaroga Barrier
    2600K | GTX 580 SLI | Asus MIV Gene-Z | 16GB @ 1600 | Silverstone Strider 1200W Gold | Crucial C300 64 | Crucial M4 64 | Intel X25-M 160 G2 | OCZ Vertex 60 | Hitachi 2TB | WD 320

  11. #336
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    105
The Radeon HD 2600 XT has 25-140% more performance than the 1950 XTX.
    http://www.generation-3d.com/La-rade...XTX,ac8885.htm

  12. #337
    Xtreme Member
    Join Date
    May 2006
    Location
    prospekt Veteranov, Saint-Petersburg, Russia
    Posts
    494
If the screenshot is true, then a system with a stock QX6700 would reach 5*6963*4150 / (1.7*4150 + 0.3*6963) = 15802 marks.
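For anyone who wants to check it, a quick sketch of that calculation as written (6963 is assumed to be the graphics score from the screenshot and 4150 the CPU score of a stock QX6700; the 5, 1.7 and 0.3 constants are taken from the post as written):

[CODE]
# The 3DMark06 extrapolation from the post above, exactly as written.
# Assumed inputs: 6963 = graphics score from the screenshot,
# 4150 = CPU score of a stock QX6700.
gpu_score = 6963
cpu_score = 4150

total = 5 * gpu_score * cpu_score / (1.7 * cpu_score + 0.3 * gpu_score)
print(round(total))  # ~15801, i.e. roughly the 15802 marks quoted
[/CODE]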

  13. #338
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    105
    Screens of the R600 DX10 Demo


    http://www.hardspell.com/doc/hardware/36897.html

  14. #339
    Xtreme Member
    Join Date
    Jun 2006
    Posts
    246
Real-time dynamic GI and transparent raytracing? Now that's cool!!
If this could be used as an accelerator for render engines... wow.

  15. #340
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    Quote Originally Posted by MAS View Post
If the screenshot is true, then a system with a stock QX6700 would reach 5*6963*4150 / (1.7*4150 + 0.3*6963) = 15802 marks.
If it's an overclocked XT and it's true, then it has some potential for overclockers.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  16. #341
    Xtreme Mentor
    Join Date
    Nov 2005
    Location
    Devon
    Posts
    3,437
    Quote Originally Posted by alayashu View Post
Real-time dynamic GI and transparent raytracing? Now that's cool!!
If this could be used as an accelerator for render engines... wow.
It's a radiosity engine!!

Now I know the graphics industry is going in the right direction...
    RiG1: Ryzen 7 1700 @4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SDD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W

    RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingstone HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU

    SmartPhone Samsung Galaxy S7 EDGE
    XBONE paired with 55'' Samsung LED 3D TV

  17. #342
    D.F.I Pimp Daddy
    Join Date
    Jan 2007
    Location
    Still Lost At The Dead Show Parking Lot
    Posts
    5,182
I hope for ATI's sake that this is true and the previous crap was wrong!
    SuperMicro X8SAX
    Xeon 5620
    12GB - Crucial ECC DDR3 1333
    Intel 520 180GB Cherryville
    Areca 1231ML ~ 2~ 250GB Seagate ES.2 ~ Raid 0 ~ 4~ Hitachi 5K3000 2TB ~ Raid 6 ~

  18. #343
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
One more quite possibly fake screenshot, and everyone goes nuts with hype again. It's like when one hype is shot down, a new one emerges in some rapid hope.

The sound didn't turn out well either. And if it was such a killer, it would be a $699 card!

It's beyond a tech discussion; it's a "want to believe" thread now.
    Crunching for Comrades and the Common good of the People.

  19. #344
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    404
Shhht, Shintai, please let us dream.... lol

  20. #345
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    Quote Originally Posted by Shintai View Post
One more quite possibly fake screenshot, and everyone goes nuts with hype again. It's like when one hype is shot down, a new one emerges in some rapid hope.
It's beyond a tech discussion; it's a "want to believe" thread now.
Actually, no.
People who seem to have the card already have commented on the DailyTech benches, saying that they are off.

The tech is highly adaptive for DX10, not old, archaeological DX9.
It's more adaptive than Nvidia's cards.
It uses drivers to enhance image quality down the line.
So ATI can enhance image quality, while Nvidia is stuck at its current patterns until G90.

There are a ton of people out there who want to play a Crysis gen 2 engine game when it is out.
If ATI has better image quality, a better card for DX10, and better future capability, well, I don't know, but to me it seems stupid to even buy an Nvidia card from the 8800 series based on that tech stuff alone.

If I were an ATI employee, I would market it like this:
AMD/ATI delivers you a stunning visual experience with Windows Vista and Crysis 2 that will get even better down the line when we upgrade your eye candy with drivers alone; yes, the card you buy will become better and better looking and you will not believe your eyes!
(here then comes a text on how they will be able to tune eye candy and image quality over time)

You can always buy a card other than one from ATI/AMD, but then you're stuck at current image quality, since ATI delivers the industry's adaptive card for the next gaming platform.

Windows Vista and DX10.1.

Oh, and by the way, we even support the next installment of DirectX, something we are proud of.

If you want to stay with a card for the next 2 to 3 years, then AMD/ATI is the way to go.

AMD/ATI: the gamers' choice for today's and tomorrow's games.

(in fact, if AMD and ATI want some advice, I am open for work)
    Last edited by flopper; 05-01-2007 at 01:04 AM.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  21. #346
    Xtreme Addict
    Join Date
    Dec 2005
    Location
    UK
    Posts
    1,713
    Quote Originally Posted by flopper View Post

...There are a ton of people out there who want to play a Crysis gen 2 engine game when it is out...

...AMD/ATI delivers you a stunning visual experience with Windows Vista and Crysis 2 that will get even better down the line when we upgrade your eye candy with drivers alone; yes, the card you buy will become better and better looking and you will not believe your eyes!
(here then comes a text on how they will be able to tune eye candy and image quality over time)...
Crytek is the developer of FarCry and Crysis; their graphics engine is called CryENGINE. CryENGINE v1 was used for FarCry and CryENGINE v2 is used for Crysis.

Crysis is the game, not the engine lol
    TAMGc5: PhII X4 945, Gigabyte GA-MA790X-UD3P, 2x Kingston PC2-6400 HyperX CL4 2GB, 2x ASUS HD 5770 CUcore Xfire, Razer Barracuda AC1, Win8 Pro x64 (Current)

    TAMGc6: AMD FX, Gigabyte GA-xxxx-UDx, 8GB/16GB DDR3, Nvidia 680 GTX, ASUS Xonar, 2x 120/160GB SSD, 1x WD Caviar Black 1TB SATA 6Gb/s, Win8 Pro x64 (Planned)

  22. #347
    D.F.I Pimp Daddy
    Join Date
    Jan 2007
    Location
    Still Lost At The Dead Show Parking Lot
    Posts
    5,182
    Quote Originally Posted by Shintai View Post
One more quite possibly fake screenshot, and everyone goes nuts with hype again. It's like when one hype is shot down, a new one emerges in some rapid hope.

The sound didn't turn out well either. And if it was such a killer, it would be a $699 card!

It's beyond a tech discussion; it's a "want to believe" thread now.
Personally, I couldn't care less about that card! I am more interested in their success with it, due to AMD's involvement with owning ATI, and the outcome for AMD's well-being.

Besides, I have better things to spend my cash on, like my new server board, RAM, and two chips
    Last edited by Brother Esau; 05-01-2007 at 01:17 AM.
    SuperMicro X8SAX
    Xeon 5620
    12GB - Crucial ECC DDR3 1333
    Intel 520 180GB Cherryville
    Areca 1231ML ~ 2~ 250GB Seagate ES.2 ~ Raid 0 ~ 4~ Hitachi 5K3000 2TB ~ Raid 6 ~

  23. #348
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    Quote Originally Posted by Syn. View Post
Crytek is the developer of FarCry and Crysis; their graphics engine is called CryENGINE. CryENGINE v1 was used for FarCry and CryENGINE v2 is used for Crysis.

Crysis is the game, not the engine lol
Well, semantics.
You know what I mean.

It's based on CryENGINE 2, sure; Crysis is also the game built on that engine.
It might not have been clear to a new reader, but for Xtreme people it is ;>)

It's all in how you measure it; Crysis is the second game done by the same people, who more or less built the same engine twice.
It's just prettier.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  24. #349
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
flopper, you are a dreamer.

And no card is DX10.1 ready, since it would require SM5.0 among other things. How can they make a card for specs that aren't finalized? Again, snap out of the dream.
    Crunching for Comrades and the Common good of the People.

  25. #350
    D.F.I Pimp Daddy
    Join Date
    Jan 2007
    Location
    Still Lost At The Dead Show Parking Lot
    Posts
    5,182
@Shintai..... I agree with you about all the gossip on this card; I myself am sick and tired of it too, but don't be such a killjoy all the time.
    SuperMicro X8SAX
    Xeon 5620
    12GB - Crucial ECC DDR3 1333
    Intel 520 180GB Cherryville
    Areca 1231ML ~ 2~ 250GB Seagate ES.2 ~ Raid 0 ~ 4~ Hitachi 5K3000 2TB ~ Raid 6 ~
