
Thread: nVidia GT200 is ready & will launch in 3 months

  1. #26
    Registered User
    Join Date
    Jan 2008
    Location
    UK
    Posts
    33
    Quote Originally Posted by Natalia View Post
    I really do hope this is true, and not another tease.
Me too. They've got to have been doing something for the last year!
    i7 920 @3.8, HD5850 Xfire @900/1100, Intel X25-M
    Water Cooled Home Theatre PC

  2. #27
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    Melbourne, Australia
    Posts
    1,478
    Hopefully the GT200 is the card we've all been waiting for. I wonder what it'll be called, 9900GTX maybe?

  3. #28
    Xtreme Gamer
    Join Date
    Jun 2006
    Location
    Des Moines, IA
    Posts
    879
    Quote Originally Posted by twwen2 View Post
    Hopefully the GT200 is the card we've all been waiting for. I wonder what it'll be called, 9900GTX maybe?
    I'd hope it will either be a 10xxx series or G1000.

  4. #29
    Xtreme Gamer
    Join Date
    Jun 2006
    Location
    Des Moines, IA
    Posts
    879
    Quote Originally Posted by DerekFSE View Post
I hope they come out with a new naming convention.
Go back to their purchased roots (3dfx), go back to GeForce 1, lol.

  5. #30
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    Northern California
    Posts
    2,144
    Quote Originally Posted by DerekFSE View Post
I hope they come out with a new naming convention.
They won't; they can't think past X200, X600, and X800 for the cards' names. The same naming conventions laid down by the Radeon 9000 line and the GF4 line will stay...
    |-------Conner-------|



    RIP JimmyMoonDog

    2,147,222 F@H Points - My F@H Statistics:
    http://fah-web.stanford.edu/cgi-bin/...e=Conman%5F530

  6. #31
    Xtreme Mentor
    Join Date
    Mar 2007
    Posts
    2,588
June, huh... let the countdown begin then: 84 days!

On another note, I have to ask: what is the current or soon-to-be best workstation-flavored video card from Nvidia? (Palit seems to be the best brand, since they are the only ones offering DisplayPort capability plus HDMI and dual DVI on their 9600GT cards. The reason I ask is that I'm in the market for a DisplayPort-capable monitor anyway, so I might as well take advantage of the new technology.)
    Last edited by hecktic; 03-07-2008 at 11:58 PM.

  7. #32
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Phoenix, AZ
    Posts
    866
    Quote Originally Posted by Anemone View Post
Doubtful they will get it moving that fast. I'd guess that's an internal target date. If it was going to be ready that fast, there would have been little need to upgrade to the 9800 GTX.

The R700 will probably be ready by then, which is why these "leaks" happen, mostly because AMD has a lot to regain and has been working quite a while on the R700.
So what you're saying is that ATI has been working hard on the R700, while Nvidia has been taking three-hour lunches and spending the rest of the work day sitting in utmost concentration, staring at the G92 and thinking, "could we clock it higher?"

...sounds about right to me. But I doubt it; they have been doing something for a year and a half, and I'm sure they will be ready. Can you not see the marketing in this? It's pretty obvious what Nvidia is doing. If the R700 rolls around and they aren't ready, someone is getting fired, believe that.

As for the name, I'm guessing GeForce 9800GTX-R Ultra FX-9.
    Last edited by Decami; 03-08-2008 at 05:10 AM.
The post above was delayed 90 times by Nvidia. 'Cause that's their thing; that's what they do.
This announcement of the delayed post above has been brought to you by Nvidia Inc.

    RIGGY
    case:Antec 1200
    MB: XFX Nforce 750I SLI 72D9
    CPU:E8400 (1651/4x9) 3712.48
    MEM:4gb Gskill DDR21000 (5-5-5-15)
    GPU: NVIDIA GTX260 EVGA SSC (X2 in SLI) both 652/1403
    PS:Corsair 650TX
    OS: Windows 7 64-bit Ultimate
    --Cooling--
    5x120mm 1x200mm
    Zalman 9700LED
    Displays: Samsung LN32B650/Samsung 2243BWX/samsung P2350


  8. #33
    Xtreme Mentor dengyong's Avatar
    Join Date
    Nov 2006
    Location
    A great place again
    Posts
    2,589
It will be called the 8800GTS G92 the Third.

  9. #34
    Xtreme Member
    Join Date
    Jan 2008
    Location
    Down Under
    Posts
    125
    Quote Originally Posted by dengyong View Post
It will be called the 8800GTS G92 the Third.

  10. #35
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    464
9800GTX 400+, 9800GX2 600+... what's the GT200 going to cost, 1000+?

  11. #36
    Xtreme Enthusiast
    Join Date
    Sep 2004
    Posts
    650
I bet it won't go over 15k 3DMark06 marks.

  12. #37
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by Loque View Post
I bet it won't go over 15k 3DMark06 marks.
    I bet it will do around 20k 06 marks stock (serious answer).
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  13. #38
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    Northern California
    Posts
    2,144
    Quote Originally Posted by RPGWiZaRD View Post
    I bet it will do around 20k 06 marks stock (serious answer).
I think so too; this isn't just another 'product refresh' chip.
    |-------Conner-------|



    RIP JimmyMoonDog

    2,147,222 F@H Points - My F@H Statistics:
    http://fah-web.stanford.edu/cgi-bin/...e=Conman%5F530

  14. #39
    Xtreme Member
    Join Date
    Jun 2004
    Posts
    436
    Quote Originally Posted by RPGWiZaRD View Post
    I bet it will do around 20k 06 marks stock (serious answer).
Elaborate, please.
Home PC: Intel i7 4770K @ 4.6ghz l Asus Maximus VI Hero l Corsair Dominator Platinum 2400mhz (4x4GB) l Asus GTX 690 l Samsung 840 Pro 256gb l 2 x WD Black 1T storage drive l WD MyBook 500gb External l Samsung SH-S203N DVD l Creative X-Fi Titanium HD l Corsair AX1200 PSU l Planar SA2311W23 3D LCD Monitor l Corsair 800D Case l Windows 7 Ultimate 64 bit l Sennheiser HD-590

    Water Cooling Setup: Swiftech 320 Radiator (3 X Gentle Typhoons 1450rpm 3 x Gentle Typhoons 1850 rpm) l Swiftech Pump w/XSPC Res Top l Heatkiller 3.0 CPU Block l Heatkiller GPU-X GTX 690 "Hole Edition" Nickel l Heatkiller Geforce GTX 690 GPU Backplate l Koolance 140mm Radiator l Danger Den 1/2ID UV Green tubing l EK EKoolant UV Green Liquid


    -Impossible is not a word

  15. #40
    XS News
    Join Date
    Aug 2004
    Location
    Sweden
    Posts
    2,010
My guess is it's just a tweaked 8800... nothing amazing.
And they will release midrange first and wait for ATI.
    Everything extra is bad!

  16. #41
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
Well, my take on the GT200 is that you can't look at the G92 and the 9800 cards to know what to expect at all. Those are cards that appeared as a reaction to ATI's lackluster release and that just further spun out the cash-milking GeForce 8 series sales, and eventually gave NVIDIA something in response to ATI's X2 offering. The GT200, however, is probably what should have been the next high-end series if ATI weren't doing so poorly.

Furthermore, NVIDIA needs something to battle the R700 too, right? It certainly won't be able to with the 9800 series, and not offering the competitor any competition is suicide in this business, unless you want to sell cards at utterly low prices just to get some sales with no profit (though having the sales numbers would still be better than getting no sales or profit at all). Since NVIDIA shouldn't be able to know how the R700 performs (though perhaps NVIDIA's spies already know quite a lot about it), they just have to aim for a reasonable performance boost over the current gen, which in 3DMark06 numbers alone I think would be in the 20-21k range at stock. What that translates to in actual gaming performance, I don't know. lol
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  17. #42
    Xtreme Enthusiast
    Join Date
    Sep 2004
    Posts
    650
Most likely the GT200 will be a new-gen budget card in the ~$200 price range.

  18. #43
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
It looks like the GT200 will *NOT* feature DX10.1 support.

That would really suck, because DX10.1 has my dream features, like built-in (improved) FSAA support in all games, and many more.

Yeah, perhaps the GT200 was preemptively designed against ATI's upcoming R600, and then Nvidia just decided to hold on to that design (since the HD2900XT sucked so badly) and work on a lower-profile 65nm G92 design to replace the 8800GTX's high manufacturing costs. And then the R670 was released by surprise with DX10.1 support.

Lazy Nvidia, resting on its laurels, was not prepared for this, so it had to release its current "prototype", the 8800GT. Notice the naming scheme? Just a GT, not even a GTS. The 9800GTX was supposed to be just a die shrink that replaced the 8800GTX with equal performance--to be called something like 8800GTX 512 or 8800GTX 1GB.

Why is Nvidia being so slow right now, after the 8800GTS 512 in December? The design of that G92 was not yet ready for Nvidia's planned dual-chip GX2 card. ATI jumping the gun with 850MHz 3870X2 cards showed Nvidia that they could not clock their GX2's G92 chips at 500-550MHz as planned. Now that Nvidia has to up the voltage and clock it to at least 600MHz, Nvidia is pretty stuck on how to cool it properly and make the design stable enough at those speeds without enormous power consumption.

That, along with Nvidia's policy that the GX2 *has* to be released before the new GTX, pretty much explains it all, IMHO.

Meanwhile, Nvidia has been tweaking and optimizing the GT200 core ever since, but perhaps they did not have enough time to incorporate DX10.1 support due to the headaches with that 9800GX2? Let's see...

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  19. #44
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Washington State
    Posts
    1,315
    Quote Originally Posted by Bo_Fox View Post
It looks like the GT200 will *NOT* feature DX10.1 support. [...]



I honestly see the GT200 being a G92-based card with GDDR5 and a few more SPs added. Throw in a three-slot cooler, add two inches to the PCB length and width, and you're set.
    Phenom 9950BE @ 3.24Ghz| ASUS M3A78-T | ASUS 4870 | 4gb G.SKILL DDR2-1000 |Silverstone Strider 600w ST60F| XFI Xtremegamer | Seagate 7200.10 320gb | Maxtor 200gb 7200rpm 16mb | Samsung 206BW | MCP655 | MCR320 | Apogee | MCW60 | MM U2-UFO |

    A64 3800+ X2 AM2 @3.2Ghz| Biostar TF560 A2+ | 2gb Crucial Ballistix DDR2-800 | Sapphire 3870 512mb | Aircooled inside a White MM-UFO Horizon |

    Current Phenom overclock


    Max Phenom overclock

  20. #45
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by Bo_Fox View Post
It looks like the GT200 will *NOT* feature DX10.1 support. [...]
I like your answer a lot; it explains many things rationally, unlike some of the other possibilities. I especially agree that Nvidia was planning on riding the G92 chip as an 8800GTX refresh. Remember how there has been talk of a 65nm GTX since the launch of the R600? Nvidia most likely expected the R600 to perform better than the 8800GTX, since it was taking so long and was to be released on a 65nm manufacturing process, and thus needed a cheaper way to produce the G80 die. But with the R600 flopping, they decided to pull an AMD and ride the wave, until ATI surprised them with a 55nm R600 variant priced at a mere $220, and they were forced to release a prototype they never planned on launching to keep the cash coming.

Think about it: G92, not G90. The G92 was never meant to be the next high-end part, IMO. My guess is that they wanted a low-clocked dual card that used slightly weaker, less power-hungry dies for the GX2, like they did with the 7950GX2, until ATI revealed they could put in full-fledged RV670 dies at full clocks without any problems.


A possibility is that the GT200 ended up being more powerful than the dual-G92 card, so they made a souped-up version of the G92 card the 9800GTX (instead of the GT200), since no one would waste their money on the 9800GX2 if the 9800GTX were more powerful (unless they priced it lower, but knowing Nvidia, they want to capitalize on profits).
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  21. #46
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
^ Yeah, your reasoning sounds very plausible, I have to agree. Oh well, I'm sure we will hear more about this chip within three months' time and see how it turns out, but I hope you're wrong, lol.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  22. #47
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
Yeah... G92 is usually the performance number, whereas G90 should have been the enthusiast one! The G90 must have been designed in case the R600/R670 turned out to be faster than an 8800GTX.

It would be interesting to see how many "design" strategies Nvidia had against ATI in 2007. Seeing how Nvidia, like other tech companies, does not want to talk about its plans for the future, why not let us know what they worked on in the *past*, just for the heck of it? What was your Plan B? Plan C? LOL... history buffs would love to know!

Now that I think about it, the 8800GT was most likely designed to be an 8900GTS with 112 SPs, due out in December (if ATI had either released the R670 on 65nm instead of 55nm, or delayed the R670 to around Feb. 2008). And the upcoming 9800GTX was supposed to be an 8900GTX, in 512MB and 1GB variants. Plus, the 8900GX2 cards would have been released in January or February at 500-550MHz clocks, a few weeks before the 8900GTX (now wrongly called the 9800GTX).

There's a history lesson for you, folks! LOL... not that history teachers are ever 100% correct!


HONESTLY, I think Nvidia was pissed off at ATI's naming scheme when ATI decided to name those R670 cards the 38xx series, even though they were not any faster than an HD2900XT. So Nvidia just decided to name their cut-down G92 an 8800GT despite it actually being faster than an 8800GTS 640MB, as some kind of revenge on the naming scheme. There is a small chance Nvidia actually thought 8800GT sounded better than 8900GT, but that goes against the pattern of all those years, so I can safely assume Nvidia was genuinely pissed.

And Nvidia has not yet cooled off, deciding to give their upcoming cards a new-generation 98xx naming scheme (maybe because they did not like how well ATI's 38xx cards were selling and thought it had a lot to do with the number change). WAIT A MINUTE, NVIDIA... ATI actually incorporated several new features, including DX10.1 support. What new feature did Nvidia include in their G92 chips that warrants a whole generation leap in numbering? Nothing worth mentioning, except a die shrink to 65nm, which was never enough for a generation change unless the transistor count doubled or something like that.

Bottom line: the 9xxx series from Nvidia is the most disappointing one ever. And the lamest one ever.
    Last edited by Bo_Fox; 03-08-2008 at 11:46 PM.

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  23. #48
    Registered User
    Join Date
    Mar 2007
    Location
    Canada
    Posts
    89
Honestly, I am glad to see ATI finally looking good since they were acquired by AMD. This is really a full-circle slap in the face after seeing what AMD did when they sat on a lead >.< Competition is always a good thing for us =D All we can do is wait and see, I suppose. Speculation will only get us so far.
    Amd athlon X2 3800+ @ 2.5ghz
    Corsair Value Select 2GB @ 208mhz 3-3-3-8 1t
    Asus A8N-E
    XFX 7800gt
    1x160gb sata
    17" samsung lcd
