Results 1 to 8 of 8

Thread: What next after GTX 295 ?

  1. #1
    Registered User
    Join Date
    Jul 2006
    Location
    Klaten, Indonesia
    Posts
    24

    What next after GTX 295 ?

Man, I'm so tired of DX10; we've been stuck with it too long. A few years ago we got a new architecture every two years: DX8, DX9, DX9.0c... I need a GT300 GPU, because dual-GPU cards like the GTX 295 have so many drawbacks, like:

1. Microstuttering
2. Non-linear performance across games; we need a new driver or SLI profile for each one
3. Very hot
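(For anyone unfamiliar with point 1: in alternate-frame rendering, the two GPUs can deliver frames at uneven intervals, so the average FPS looks fine while the motion feels jerky. A rough Python sketch with made-up frame times, purely to illustrate the idea, not measured from any card:)

```python
# Two hypothetical 40-FPS-average runs: one smooth, one microstuttering.
# Frame times are in milliseconds; the values are illustrative, not measured.
smooth = [25.0] * 8          # every frame arrives 25 ms apart
stutter = [10.0, 40.0] * 4   # AFR pair: short gap, long gap, repeating

def avg_fps(frame_times_ms):
    """Average frames per second over the run."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def max_gap(frame_times_ms):
    """Worst single frame interval -- what the eye actually notices."""
    return max(frame_times_ms)

print(avg_fps(smooth))   # 40.0 FPS
print(avg_fps(stutter))  # 40.0 FPS -- same average...
print(max_gap(smooth))   # 25.0 ms
print(max_gap(stutter))  # 40.0 ms -- feels like stutter despite "40 FPS"
```

Same average frame rate on paper, very different feel in practice.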

I don't care if GT300 supports DX11; the DX11 interface arrives in 2010 with Win7. I just want a fast single-GPU card. Maybe Q2 2009 with GT300?

    How about you ?

  2. #2
    Xtreme Enthusiast
    Join Date
    Oct 2008
    Location
    Campbellsville, Kentucky
    Posts
    896
There have only been a few previews of the 295, on beta drivers. You're being kind of prejudgmental about the card. Microstuttering? How do you know?
    Main Rig
    • Intel Core i7 4790K CPU Stock @ 4.4Ghz
    • Asus Maximus VI Extreme Motherboard
    • 32GB GSKILL Trident X 2400MHZ RAM
    • EVGA GTX 980 Superclocked 4GB GDDR5
    • Corsair TX850W v2 TX Power Supply 70A 12V Rail
    • Swiftech Apex Ultima w/ Apogee Drive II & Dual 120 RAD w/integrated res
    • 2X Seagate 333AS 1TB 7,200 32MB HD's in RAID 0
    • 2X Samsung 830's 128GB in RAID 0
    • Windows 8.1 Pro x64
    • Coolermaster HAF-XB
    • Dual Asus ProArt PA248Q 24" IPS LED Monitors
    • Samsung 46" 5600 Series Smart HDTV
    • iPhone 6 Plus 64GB AT&T & Xbox One


    UNOFFICIAL Rampage II Extreme Thread

  3. #3
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,476
    Quote Originally Posted by xlimit View Post
Man, I'm so tired of DX10; we've been stuck with it too long. A few years ago we got a new architecture every two years: DX8, DX9, DX9.0c... I need a GT300 GPU, because dual-GPU cards like the GTX 295 have so many drawbacks, like:

1. Microstuttering
2. Non-linear performance across games; we need a new driver or SLI profile for each one
3. Very hot

I don't care if GT300 supports DX11; the DX11 interface arrives in 2010 with Win7. I just want a fast single-GPU card. Maybe Q2 2009 with GT300?

    How about you ?
Oh wow... wow... There are so many things wrong with this that I'm going to let someone else have the fun.
For reference, DX9 has been around for six years and counting for all the non-Vista people.
DX10 has been out for just two...
    i3 2100, MSI H61M-E33. 8GB G.Skill Ripjaws.
    MSI GTX 460 Twin Frozr II. 1TB Caviar Blue.
    Corsair HX 620, CM 690, Win 7 Ultimate 64bit.

  4. #4
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
To assume that microstuttering must still be an issue on the 295 is to assume that extra memory and a faster bus mean nothing to GPUs running in SLI mode...

    Keep it real...

So far, no preview that I have seen posted has indicated that microstuttering is an issue with the 295...
    In fact, most say they are extremely pleased or impressed.

Maybe they all just missed it, and never heard of the 9800 GX2.

    Best to keep an open mind on the 295.

    The 780i mobo had the A1 revision of the SLI chip...
    The 9800 GX2 had the A2 revision...
The 295 gets the A3 revision...
    Pic of the 295 with it's A3 SLI chip: http://en.expreview.com/2008/12/19/m...html#more-1662

    I believe with each new revision of this chip, we get a faster (lower latency) pci express controller.
    It's probably a wee bit faster of an SLI chip. Even if only by a little, we'll take it!

    http://www.bit-tech.net/hardware/200...tecture-dive/8
    "However that's not to say Nvidia won't get it's buck out of it - the certification process is rumoured to cost $5 a board - this is still far cheaper than the $20 it wanted to charge for the latest A3 revision of the NF200 chip, but it does mean that the consumer pays for SLI regardless of whether they want to use it or not. Nvidia's SLI brand is so strong that most companies are opting in, though".

    "NF200 can boost one x16 into two x16 lanes, offering a full set of x16-x16-x16. Nvidia likes to further elaborate and even suggests that two NF200s will offer four PCI-Express 2.0 x16 lanes, just in case you've lost the ability to judge good value and you want to invest in three-way SLI and another card for PhysX, or perhaps you need eight monitors? Either way, actually fitting that many lanes on a board and squeezing performance cards into the limited space is surely a task for the good Doctor and his TARDIS".
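As I understand it (someone correct me if I'm wrong), the NF200 is basically a PCIe switch: it takes one x16 upstream link and fans it out into two x16 downstream links, so each card negotiates a full x16 but both share the upstream bandwidth to the CPU. Rough back-of-the-envelope numbers, assuming PCIe 2.0 at roughly 500 MB/s per lane per direction:

```python
# Back-of-the-envelope PCIe 2.0 bandwidth: ~500 MB/s per lane per direction.
MB_PER_LANE = 500

upstream_lanes = 16          # the single x16 link feeding the NF200
downstream_links = 2         # NF200 fans out two x16 links
downstream_lanes_each = 16

upstream_bw = upstream_lanes * MB_PER_LANE               # shared ceiling
per_card_link_bw = downstream_lanes_each * MB_PER_LANE   # per-card link rate

print(upstream_bw)       # 8000 MB/s total to the CPU, shared by both cards
print(per_card_link_bw)  # 8000 MB/s -- each card still sees a full x16

# Both cards can't pull 8000 MB/s from the CPU at once (they share the
# upstream x16), but card-to-card traffic can stay on the switch itself.
```

So "boosting one x16 into two x16" is real at the slot, but the total pipe to the rest of the system doesn't get any wider.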


I think the A3 has wider lanes for data transfer than the A2 revision...
    Not sure myself... Opinions are invited!

We also know how the Nvidia driver boys have been doing as of late...
    http://www.bjorn3d.com/read.php?cID=1403
    From the Conclusion...
    "With Far Cry testing done and SLI scaling topping 60-70 percent average, and under well optimized conditions even higher than that, SLI could be coming into its prime. Nvidia has made a concentrated effort to improve SLI and the effort is paying off".
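To put that 60-70 percent scaling into plain numbers (the single-card FPS here is made up for illustration, not a benchmark result):

```python
# What "60-70% SLI scaling" means in frame rates; single_fps is hypothetical.
single_fps = 60.0

for scaling in (0.60, 0.70):
    sli_fps = single_fps * (1.0 + scaling)
    print(f"{scaling:.0%} scaling: {sli_fps:.0f} FPS vs {single_fps:.0f} single")
# 60% scaling -> 96 FPS; 70% scaling -> 102 FPS from the same GPU pair
```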

    Sounds like we have some solid improvements there too.

    Some of us feel that the 295 will be fun to play with until the 300 is released, then hopefully catch a step up into the 300.
    April is our current best guess as to when the 300 will be released. If Jan 8th is the 295's release day, we might want to hold off a few days before clicking buy, to give us more time in April to step into a 300.

    I stepped out of my 9800 GX2 into a 280...
    I think I just may be able to step from a 295, into a 300 too?
    Last edited by Talonman; 12-22-2008 at 02:36 AM.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  5. #5
    Xtreme Enthusiast
    Join Date
    Oct 2005
    Posts
    567
    Which SLI chip is on the SLI-capable x58 boards?

  6. #6
    Xtreme Addict
    Join Date
    Feb 2008
    Location
    London Ontario Canada
    Posts
    1,157
Most X58 boards don't use a chip; it's software SLI. The ones that do, like MSI and I think EVGA, use the NForce 200.

The next card after the 295 is supposed to be 40nm, last I checked. However, the 295 will be a beast, and for you to overlook it is absurd.
    Case: Corsair 400R
    PSU: Corsair HX1000W
    mobo: Maximus IV Gene
    CPU: 2500K @ 4.2ghz 1.19 volts
    RAM: Gskill Ripjaws 1866mhz 2 x 4 gigs
    OS Drive: Kingston Hyper X ssd 120 gig
    Graphics: XFX HD5850
    Cooling: Corsair H100
    OS: Windows 7 Pro 64 bit

  7. #7
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
    As far as heat goes...

    http://guru3d.com/article/geforce-gtx-295-preview/
    "Cooling - though we cannot disclose numbers on noise and temperatures just yet (power management is not yet finalized) we can already tell you that we were at the very least impressed by its cooling".

    I expect it to be cooler than the 4870X2... (Time will tell.)

    One more good scrap of info...
    http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
    "One thing we can tell you is this video card is quiet. In fact, when I first turned it on I had to look at the fan on the video card and make sure it was spinning because I didn’t hear it at all at idle".

This card is looking like a winner to us X38 boys!

I won't even consider buying it a water block until my 90-day Step-Up window is over, so hearing that the stock cooling was doing the job was nice!

Water blocks only come after I know that I am playing for keeps! (Water blocks = marriage rings for GPUs!)

    I still am not sure if we will be together forever, or just going to have a 90 day fling...

    I will have to work hard at not getting too emotionally attached to this card right away... It does look easy to love!

    I thought I wanted a 285, but seeing what this new girl can do, and me not running SLI, I am being seduced by the 295.
    My 280 will just sit by her side and run in dedicated PhysX mode.
    Last edited by Talonman; 12-20-2008 at 07:00 PM.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  8. #8
    Xtreme Enthusiast
    Join Date
    Oct 2005
    Posts
    567
    Quote Originally Posted by Ozzfest05 View Post
    most X58 dont use a chip its Software SLI and the ones that do like MSI and I think EVGA are Nforce 200
    How does an NForce 200 compare? How do you tell which motherboards have which SLI implementation?
