Page 1 of 12
Results 1 to 25 of 297

Thread: Nvidia GT200-successor on 22nd October?

  1. #1
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374

    Nvidia GT200-successor in Q4 2008?

    The fact that Nvidia is dismissing employees these days is due, among other things, to the company having had to sell large and expensive graphics processors at relatively low prices in recent weeks and months. To get out of this situation, the GPU maker is currently preparing a successor to the GT200 in 55 nm technology, with which costs can be saved.

    According to information published on Nvnews, that successor may arrive on 22 October 2008. That it is also said to replace the current GTX 280 contradicts the idea that this is the GT206, which a roadmap had shown us as the successor to the GTX 260.

    It is therefore likely that this is a direct optical shrink of the GT200, a "classic" GT200b. At Nvision 2008, slides had already made it into the public domain that incorrectly credited the GTX 200 series with up to 1080 Gflops but otherwise matched the technical specifications of the GTX 280.

    If what launches on 22 October is a "GTX 280b" in 55 nm, it could ship with clock rates of 600 MHz for the GPU and 1500 MHz for the shaders, restoring the 2.5x ratio of shader clock to core clock known from the G9x graphics chips.
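A quick back-of-the-envelope on the cost point above (my own numbers, not from the article): die area scales roughly with the square of the linear feature size, and per-chip cost roughly tracks die area, which is why a 65 nm to 55 nm optical shrink saves money. The 576 mm² figure is the widely reported GT200 die size; the 55 nm number below is the ideal-shrink case, not a measured value.

```python
# Ideal optical-shrink area scaling from 65 nm to 55 nm.
die_area_65nm = 576.0          # mm^2, GT200 at 65 nm (widely reported)
scale = (55.0 / 65.0) ** 2     # area scales with the square of feature size
die_area_55nm = die_area_65nm * scale

print(f"ideal 55 nm die area: {die_area_55nm:.0f} mm^2 "
      f"({(1 - scale) * 100:.0f}% smaller)")
```

A real shrink rarely hits the ideal number (pads and analog blocks don't scale), but even so the smaller die means more chips per wafer.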
    Source: hardware-infos VIA VR-Zone
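The "1080 Gflops" slide figure mentioned above is easy to reproduce from the rumored 1500 MHz shader clock. A minimal sketch, assuming GT200's 240 stream processors and Nvidia's usual counting of 3 flops per SP per clock (dual-issue MAD + MUL):

```python
# Theoretical shader throughput: SPs x flops-per-clock x shader clock (GHz).
def gflops(sps, shader_clock_ghz, flops_per_clock=3):
    return sps * flops_per_clock * shader_clock_ghz

print(gflops(240, 1.296))  # stock GTX 280 (1296 MHz shaders): ~933 GFLOPS
print(gflops(240, 1.5))    # rumored 1500 MHz GT200b: 1080.0 GFLOPS
```

So the slides' 1080 Gflops is exactly what a GTX 280 with its shaders at 1500 MHz would rate, which fits the GT200b reading.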


    In my personal opinion, I don't expect it to be much faster than the current GTX 280. I could be wrong though... anyways, /discuss!


    *UPDATE*


    Word is going around that NVIDIA's 55nm version of the GT200 GPU will arrive next month in the form of replacements for the GeForce GTX 260 and GeForce GTX 280 cards.

    We heard that they might be out by November, but this is not something we can confirm at the moment. There should be two products: one to replace the GTX 280 and one to replace the GTX 260.

    From the original prognosis this new card might actually end up marginally faster than Radeon 4870 and 4870 X2, but it won’t leave them in the dust which means that Nvidia will have to wait for 40 nm chips in 2009 to turn the situation around.
    Source: dvhardware

    regards,

    Blacky
    Last edited by Blacky; 10-23-2008 at 06:12 AM.

  2. #2
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    Northern California
    Posts
    2,144
    Hmm interesting, it almost seems like progress has stopped as far as real performance improvements in games go. An 8800 Ultra rivals the performance of a GTX 260 in most applications. It's almost sad. My fantastic 2006 gaming rig (PD820, 3GB RAM, 8800GT, with the performance of a 2006 card, think 2900 Pro or something) still blows away most games on max settings....
    |-------Conner-------|



    RIP JimmyMoonDog

    2,147,222 F@H Points - My F@H Statistics:
    http://fah-web.stanford.edu/cgi-bin/...e=Conman%5F530

  3. #3
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    It won't, really. The biggest change is supposed to be shader clock right? Bringing it back to the ~2.4x ratio with the core clock, you might see something like 1500MHz for the SPs. Compared to what the GTX 280 has right now, that's a hair over 15% better, similar to the improvement the Core 216 version of the GTX 260 gave over the original, except the Core 216 had the added advantage of extra TEX units.
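The "hair over 15%" figure above checks out, taking the stock GTX 280 shader clock of 1296 MHz against the rumored 1500 MHz:

```python
# Relative shader-clock gain of the rumored GT200b over the stock GTX 280.
old, new = 1296, 1500          # MHz, stock vs. rumored shader clocks
gain = (new / old - 1) * 100

print(f"{gain:.1f}% faster shaders")  # -> 15.7% faster shaders
```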

    The big thing about this chip will be power and heat savings, so it won't look so bloated compared to the 4870 (which has always had the benefit of 55nm), especially in size.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  4. #4
    Muslim Overclocker
    Join Date
    May 2005
    Location
    Canada
    Posts
    2,786
    I am guessing they will be skipping the alpha and beta stages, from the engineers to the factory to your door step.

    My watercooling experience

    Water
    Scythe Gentle Typhoons 120mm 1850RPM
    Thermochill PA120.3 Radiator
    Enzotech Sapphire Rev.A CPU Block
    Laing DDC 3.2
    XSPC Dual Pump Reservoir
    Primochill Pro LRT Red 1/2"
    Bitspower fittings + water temp sensor

    Rig
    E8400 | 4GB HyperX PC8500 | Corsair HX620W | ATI HD4870 512MB


    I see what I see, and you see what you see. I can't make you see what I see, but I can tell you what I see is not what you see. Truth is, we see what we want to see, and what we want to see is what those around us see. And what we don't see is... well, conspiracies.



  5. #5
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Germany
    Posts
    1,592
    Awesome if true. I was waiting for the GT200b, mainly because I'm hoping for lower wattage. Higher shader clocks will be beneficial for F@H, and the release date would be purrfect for a purchase right after Christmas... =)
    The XS Folding@Home team needs your help! Join us and help fight diseases with your CPU and GPU!!


  6. #6
    I am Xtreme
    Join Date
    Feb 2005
    Location
    SiliCORN Valley
    Posts
    5,543
    did i just read an article written by Yoda ?? sure sounded like it
    "These are the rules. Everybody fights, nobody quits. If you don't do your job I'll kill you myself.
    Welcome to the Roughnecks"

    "Anytime you think I'm being too rough, anytime you think I'm being too tough, anytime you miss-your-mommy, QUIT!
    You sign your 1248, you get your gear, and you take a stroll down washout lane. Do you get me?"

    Heat Ebay Feedback

  7. #7
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    The original rumor that the GTX 200b (55nm revision) was pushed back to October, and perhaps even as far as December, seems to fit with this current one.

    Nvidia desperately wants to take the performance crown back. They can try to do it with a GTX 290 (55nm, higher clocks, GDDR5), but to be honest it will still have trouble being a dominant winner.

    Nvidia would be smarter to make a revision of the GT200 core by reducing what it can do, cranking up the clocks, and making a dual-chip board. Something like a GTX 260 with its 8 'blocks' (vs. the 10 on the GTX 280), except instead of just deactivating clusters, building the die without two of them. This should help with yields and costs, which would allow them to make a GX2 card, as well as get the clocks pretty high.

    To compete with AMD they don't need faster hardware; they need hardware that's cheaper to produce.
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  8. #8
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Phoenix, AZ
    Posts
    866
    Quote Originally Posted by Lestat View Post
    did i just read an article written by Yoda ?? sure sounded like it
    a drunk Yoda that needs AA.
    This post above was delayed 90 times by Nvidia. Cause that's their thing, thats what they do.
    This Announcement of the delayed post above has been brought to you by Nvidia Inc.

    RIGGY
    case:Antec 1200
    MB: XFX Nforce 750I SLI 72D9
    CPU:E8400 (1651/4x9) 3712.48
    MEM:4gb Gskill DDR21000 (5-5-5-15)
    GPU: NVIDIA GTX260 EVGA SSC (X2 in SLI) both 652/1403
    PS:Corsair 650TX
    OS: Windows 7 64-bit Ultimate
    --Cooling--
    5x120mm 1x200mm
    Zalman 9700LED
    Displays: Samsung LN32B650/Samsung 2243BWX/samsung P2350


  9. #9
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,247
    Quote Originally Posted by Decami View Post
    a drunk Yoda that needs AA.
    hey, that's just the Google translation; no Yodas were harmed during the writing of this article!

    On topic: after OBR's tease I'm really looking forward to the 22nd of October.
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  10. #10
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    Quote Originally Posted by BiFfMaN View Post
    I think Nvidia is also making a card to beat that also...powered with a tachyon wave dispersion field, Allowing faster then light transfer of information.
    OH shiizz! I will wait for Oct 24th for that one for sure!
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  11. #11
    Xtreme Member
    Join Date
    Feb 2008
    Location
    Jakarta, Indonesia
    Posts
    244
    Should I expect another round of Nvidia price slashing?

  12. #12
    Xtreme Member
    Join Date
    Nov 2004
    Location
    Germany, Munich
    Posts
    201
    Quote Originally Posted by Lestat View Post
    did i just read an article written by Yoda ?? sure sounded like it
    hahaha
    seeing this comment i read the text again
    thanks for the good laugh, its true
    Q6600 G0 | 4GB Mushkin | HD 4890 | X-Fi Fatal1ty
    Canon EOS 7D | 1000D | 70-200 F4 | Sigma 50 1.4 | 100 2.8 L IS Macro | 18-55 IS

  13. #13
    Xtreme Addict
    Join Date
    Jul 2008
    Location
    SF, CA
    Posts
    1,294
    this site should add a forum rule about BS news.
    it's starting to get annoying watching people shooting off about the "next-gen shizz"
    if i wanted PR stunts I'd join Quantum Force...

    ooh that was a little too mean. i take it back =].

  14. #14
    Xtreme Mentor
    Join Date
    Jul 2004
    Location
    Ontario
    Posts
    2,780
    Quote Originally Posted by [cTx]Raptor22 View Post
    Hmm interesting, almost seems like progress has stopped as far as real performance improvements in games. An 8800Ultra rivals the performance of a GTX260 in most applications. Its almost sad. My fantastic 2006 gaming rig (PD820, 3GB RAM, 8800GT (performance of a 2006 card, think 2900Pro or something) still blows away most games on max settings....
    Huh? How is this possible? The GTX 260 is higher in every aspect over the 8800 Ultra. More rops, memory, faster memory, faster memory bus, more shaders etc. I went from an 8800 Ultra overclocked to a GTX 260 and the difference is noticeable.
    Silverstone Temjin TJ-09BW w/ Silverstone DA750
    Asus P8P67
    2600K w/ Thermalright Venomous X Black w/ Sanyo Denki San Ace 109R1212H1011
    8GB G.Skill DDR-1600 7-8-7-24
    Gigabyte GTX 460 1G
    Modded Creative X-Fi Fatal1ty w/ Klipsch Promedia 2.1
    1 X 120GB OCZ Vertex
    1 X 300GB WD Velociraptor HLFS
    1 X Hitachi 7K1000 1TB
    Pioneer DVR-216L DVD-RW
    Windows 7 Ultimate 64


    Quote Originally Posted by alexio View Post
    From the hip and aim at the kitchen if she doesn't approve your purchases. She'll know better next time.

  15. #15
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    Northern California
    Posts
    2,144
    Quote Originally Posted by cantankerous View Post
    Huh? How is this possible? The GTX 260 is higher in every aspect over the 8800 Ultra. More rops, memory, faster memory, faster memory bus, more shaders etc. I went from an 8800 Ultra overclocked to a GTX 260 and the difference is noticeable.
    Take a look at the THG review: http://www.tomshardware.com/reviews/...0,1953-18.html

    The GTX260 is never more than 5FPS or so ahead of the 8800Ultra...

    5FPS? Thats like an overclock, gimme a break nVidia...
    |-------Conner-------|



    RIP JimmyMoonDog

    2,147,222 F@H Points - My F@H Statistics:
    http://fah-web.stanford.edu/cgi-bin/...e=Conman%5F530

  16. #16
    Xtreme Addict
    Join Date
    Feb 2008
    Posts
    1,565
    wonder if these will be using DDR5 or DDR3 like the normal 280's.
    EVGA X58 Classified
    Intel i7 965
    Corsair Dominator 1600mhz 3x2gb
    Nvidia GTX 295

  17. #17
    Xtreme Member EternityZX9's Avatar
    Join Date
    Sep 2006
    Location
    Nursing Student -or- Beta Testing Escape From Tarkov
    Posts
    421
    Back on topic....does anyone else know anything that can confirm this information discussed in the original post?

    Cause I'd really like to get a GTX280 to play Clear Sky...but if a 55nm refresh is a month away I can see waiting...

    Bench? Perkam? Diltech?
    Last edited by Cooper; 09-20-2008 at 05:56 AM.
    Intel Core i7 7700K | MSI Z270 XPOWER G.T. | EVGA 1080Ti SC2 | 16GB DDR4 G.Skill Trident Z 3200 | Samsung S27A950D | 3 x Samsung 850 EVO (250GB, 2 x 2TB) | EVGA Supernova P2 1200w | Coolermaster Cosmos II

  18. #18
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Still hard to believe that Nvidia could come up with something to dethrone the 4870 X2 so easily before the end of '08. Anyways, back on topic: if this is the GT200 successor, what happened to the roadmap revealed a few weeks ago, where the first 55nm GT200 was going to be the 260 and the next GTX 280 refresh was skipping 55nm to go directly to 40nm? Anyone?

  19. #19
    Xtreme Mentor
    Join Date
    Jul 2004
    Location
    Ontario
    Posts
    2,780
    Quote Originally Posted by [cTx]Raptor22 View Post
    Take a look at the THG review: http://www.tomshardware.com/reviews/...0,1953-18.html

    The GTX260 is never more than 5FPS or so ahead of the 8800Ultra...

    5FPS? Thats like an overclock, gimme a break nVidia...
    I can see where you are getting those numbers; however, that is ONE site out of many. Tom's is nothing to hold a candle to in terms of their findings during tests. The 5fps is the WORST case scenario as well, on 30" monitors, which next to no one uses. Go down to 1920 and below and suddenly that number turns into 8+ fps. Still not a lot, I know, but it is faster nonetheless, and the GTX 260 is less than half the money the 8800 Ultra was this time last year. Those results also ONLY show up in Crysis, which we know is a pain in the rear to get decent scores from anyway. Any other game shows quite a sizable increase in performance. What I have noticed going from my 8800 Ultra to the GTX 260 is the 'smoothness' of the gameplay. Bench scores may not be that much higher (although they still are across the board), but gameplay certainly has a much smoother feel on the new card. If you are into overclocking, the GTX 260 clocks better than the 8800 Ultra ever would, for even greater performance. It also runs cooler and consumes less power to boot.
    Silverstone Temjin TJ-09BW w/ Silverstone DA750
    Asus P8P67
    2600K w/ Thermalright Venomous X Black w/ Sanyo Denki San Ace 109R1212H1011
    8GB G.Skill DDR-1600 7-8-7-24
    Gigabyte GTX 460 1G
    Modded Creative X-Fi Fatal1ty w/ Klipsch Promedia 2.1
    1 X 120GB OCZ Vertex
    1 X 300GB WD Velociraptor HLFS
    1 X Hitachi 7K1000 1TB
    Pioneer DVR-216L DVD-RW
    Windows 7 Ultimate 64


    Quote Originally Posted by alexio View Post
    From the hip and aim at the kitchen if she doesn't approve your purchases. She'll know better next time.

  20. #20
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    I don't have anything concrete to add, so I won't say a word, sorry.
    Only thing I know for sure ( 99.9% ) is that we'll see the GeForce 180.xx drivers in October.
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  21. #21
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    San Antonio, TX
    Posts
    836
    Quote Originally Posted by Zaskar View Post
    wonder if these will be using DDR5 or DDR3 like the normal 280's.
    I would bet money on DDR3. Nvidia has already expressed a lot of faith in the fact that DDR3 still has plenty of life left in it.

    Ryzen 3800X @ 4.4Ghz
    MSI X570 Unify
    32GB G.Skill 3600Mhz CL14
    Sapphire Nitro Vega 64
    OCZ Gold 850W ZX Series
    Thermaltake LV10

  22. #22
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Quote Originally Posted by Zaskar View Post
    wonder if these will be using DDR5 or DDR3 like the normal 280's.
    Going by the calculations Benchz showed in the last Nvidia thread, I don't see why Nvidia would jump to GDDR5, since they can achieve the same memory bandwidth with GDDR3 @ 2200 on a 512-bit bus as with GDDR5 @ 4400 on a 256-bit bus, unless Nvidia wants to go overkill with 512-bit + GDDR5 @ 4400+. Isn't GDDR3 latency lower than GDDR5's as well? Please correct me if I'm wrong.
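The bandwidth equivalence claimed above is just arithmetic: peak memory bandwidth is bus width (in bytes) times effective data rate. A quick check, using the thread's hypothetical clocks rather than any announced specs:

```python
# Peak memory bandwidth = (bus width / 8 bytes) x effective data rate.
def bandwidth_gbs(bus_bits, effective_mts):
    """Return peak bandwidth in GB/s given bus width in bits and
    effective transfer rate in MT/s."""
    return bus_bits / 8 * effective_mts / 1000

print(bandwidth_gbs(512, 2200))  # GDDR3 @ 2200 MT/s on 512-bit: 140.8 GB/s
print(bandwidth_gbs(256, 4400))  # GDDR5 @ 4400 MT/s on 256-bit: 140.8 GB/s
```

So the two configurations really are equal on paper; the difference would come down to board cost (wide bus, many memory chips) versus memory cost and controller work for GDDR5.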

  23. #23
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by Blacky View Post
    Going by the calculations Benchz showed in the last Nvidia thread, I don't see why Nvidia would jump to GDDR5, since they can achieve the same memory bandwidth with GDDR3 @ 2200 on a 512-bit bus as with GDDR5 @ 4400 on a 256-bit bus, unless Nvidia wants to go overkill with 512-bit + GDDR5 @ 4400+. Isn't GDDR3 latency lower than GDDR5's as well? Please correct me if I'm wrong.
    GDDR5 for nVidia would only make sense if they made a 256-320bit bus or so. Not with 448-512bit.
    Crunching for Comrades and the Common good of the People.

  24. #24
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
    I hope this date holds out and EVGA gets good GPU's. I'll buy the refresh for my new system in my sig.

    I like the info that was posted about the shaders and shader clocks. I read in the benching section that the shaders are the wall on these cards; with a raise in the shader clock, one can go higher on the core. If that's all we get out of this new refresh, that will be wonderful, guys.

    These are good cards. They're power eaters, but they are good cards. What I really like about them is the rate that they fold. My god man! They are folding monsters. They are just as competitive as the other brand in games also.

    These are great single GPU cards.

  25. #25
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by Shintai View Post
    GDDR5 for nVidia would only make sense if they made a 256-320bit bus or so. Not with 448-512bit.
    I don't think they even had a GDDR5 memory controller finalized back in July. Whatever the reason they didn't do it before, I'm very inclined to say it's technical; otherwise, even with existing GDDR4, they wouldn't have been so shortsighted with the GT200.


    Nvidia really did NOT know about this one. They thought the 4870 was going to be like the 3870: an excessive waste of bandwidth.

    Plus they have a lot of other fish to fry too, including shader density, keeping the core clock above 700 MHz, and, oh wait, where's CUDA now?
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

