
Thread: NVIDIA GT200 now known as GeForce GTX 280/260?

  1. #26
    Xtreme 3D Team Member
    Join Date
    Jun 2006
    Location
    small town in Indiana
    Posts
    2,285
    I vote for no numbers, just names. And if any of them contain the word "leet", "uber", or "ultra", the CEOs should be fired.
    QX 9650 5ghz with 1.55v 4.8ghz with 1.5v 24/7 in a VAPOLI V-2000B+ Single stage phase cooling.
    DFI LP LT X-38 T2R
    2X HD4850's water cooled , volt modded
    Thermaltake 1KW Psu
    4x Seagate 250GB in RAID 0
    8GB crucial ballistix ram

  2. #27
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    United States
    Posts
    1,546
    I really just wish they would create a simple naming system and stick to it; their current one is getting out of hand with all this renaming going on.

  3. #28
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    Los Angeles, CA
    Posts
    628
    I don't care what they name them, as long as they really do come out next month and two of them rock my 30" like I've never seen before.
    2600k @ 5ghz / z68 Pro / 8gb Ripjaws X / GTX 580 SLi
    2x Inferno raid 0 / WD 1TB Black / Thermaltake 1200w
    Dell 3008 WFP / Dual Loop WC MM UFO

  4. #29
    Xtreme Guru
    Join Date
    Aug 2005
    Location
    Burbank, CA
    Posts
    3,766
    Yeah, I'm getting one for sure... hope it's really launching in June.

  5. #30
    Xtreme Enthusiast
    Join Date
    Mar 2005
    Location
    North West London
    Posts
    776
    Quote Originally Posted by GAR View Post
    Yeah, I'm getting one for sure... hope it's really launching in June.
    If this is launching in June, then what happened to the 9900?

  6. #31
    Xtreme Guru
    Join Date
    Aug 2005
    Location
    Burbank, CA
    Posts
    3,766
    Quote Originally Posted by DFI pit bull View Post
    If this is launching in June, then what happened to the 9900?
    I don't think there is a 9900 anymore. NVIDIA had stated before that they were going to simplify the naming scheme of their cards, hence the GTX 280/260; the higher the number, the better.

  7. #32
    Xtreme Member
    Join Date
    Jun 2007
    Posts
    287
    haha BBQ
    CPU:X3210 G0@3.6 (8x450 1.48v)
    MOBO: Asus P5K Premium, bios 402
    RAM: 4x 1GB DDR2-1066 @ 1080 5-5-5-15 2.3v
    CASE: antec 900
    GPU: XFX GeForce 7900gt @600/1900
    PSU: silverstone OP1000
    Cooling: MCP655, Apogee GT, MCR320-QP-K

  8. #33
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Phoenix, AZ
    Posts
    866
    I don't see why people think this is so confusing to the normal consumer. In most cases, if you look at the back of the box, it lists the series of cards and ranks how much "better" each one is. And when it comes to comparing old-series cards to new-series cards, there really isn't any other way to do it. What NVIDIA has picked is about the least confusing option possible for the market, I think. Maybe we should have a brainstorm-a-GeForce-naming-scheme thread.

    The only thing I can think of is GS V1, V2, V3; GT V1, V2, V3; and so on. But that sounds dumb and is still pretty much the same thing as what NVIDIA is doing with this new number-and-letter scheme. If you really think about it, you'd have to be an idiot not to figure out the new scheme. I mean, S comes before T, so obviously the GS is slower than the GT, GTX is obviously faster than GT, and GTS is obviously between GT and GTX because of the combining of the S and T. Then you have numbers such as 240-260-280 to tell which is faster within that letter series. How much more plainly could you put it? I know it's hard for the normal consumer to tell which card is faster than one from a past series, but there really is no other way of doing it, besides them looking it up, unless they do a comparison on the back of the box against the older series of cards.
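
    Purely as an illustration of that ordering (a minimal sketch; the suffix ranking and the example card names are assumptions taken from the post above, not any official NVIDIA definition), it could be written out in Python like this:

        # Rank cards first by suffix (GS < GT < GTS < GTX), then by model number
        # within that suffix, exactly as described above.
        SUFFIX_RANK = {"GS": 0, "GT": 1, "GTS": 2, "GTX": 3}

        def card_rank(name: str) -> tuple:
            """Return a sortable (suffix rank, model number) key for names like 'GTX 280'."""
            suffix, number = name.split()
            return (SUFFIX_RANK[suffix], int(number))

        cards = ["GT 240", "GTX 260", "GS 250", "GTX 280", "GTS 250"]
        print(sorted(cards, key=card_rank))
        # ['GS 250', 'GT 240', 'GTS 250', 'GTX 260', 'GTX 280']
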
    Last edited by Decami; 05-16-2008 at 06:05 PM.
    This post above was delayed 90 times by Nvidia. 'Cause that's their thing, that's what they do.
    This announcement of the delayed post above has been brought to you by Nvidia Inc.

    RIGGY
    case:Antec 1200
    MB: XFX Nforce 750I SLI 72D9
    CPU:E8400 (1651/4x9) 3712.48
    MEM:4gb Gskill DDR21000 (5-5-5-15)
    GPU: NVIDIA GTX260 EVGA SSC (X2 in SLI) both 652/1403
    PS:Corsair 650TX
    OS: Windows 7 64-bit Ultimate
    --Cooling--
    5x120mm 1x200mm
    Zalman 9700LED
    Displays: Samsung LN32B650/Samsung 2243BWX/samsung P2350


  9. #34
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    Los Angeles, CA
    Posts
    628
    Boy, I wish we had some actual specs or performance numbers so we could stop talking about something as meaningless as naming schemes :p
    2600k @ 5ghz / z68 Pro / 8gb Ripjaws X / GTX 580 SLi
    2x Inferno raid 0 / WD 1TB Black / Thermaltake 1200w
    Dell 3008 WFP / Dual Loop WC MM UFO

  10. #35
    Registered User
    Join Date
    Feb 2008
    Posts
    46
    Yeah... GT200 is exciting but we don't even know if it's going to be worth it.
    Enermax Infiniti 720W || ASUS P5K Deluxe || Intel Core 2 Quad Q6700 || 4GB OCZ Reaper DDR2 1066Mhz RAM || eVGA GeForce GTX 280 SC || WD Velociraptor 300GB || 2x Seagate 7200.11 1TB || WD Caviar SE16 640GB || Auzentech X-Fi Prelude || 2x LG DVD-RW || Samsung 245BW || Windows Vista Ultimate x64

    http://www.cuddlewar.com/

  11. #36
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,247
    Quote Originally Posted by Decami View Post
    I don't see why people think this is so confusing to the normal consumer. In most cases, if you look at the back of the box, it lists the series of cards and ranks how much "better" each one is. And when it comes to comparing old-series cards to new-series cards, there really isn't any other way to do it. What NVIDIA has picked is about the least confusing option possible for the market, I think. Maybe we should have a brainstorm-a-GeForce-naming-scheme thread.
    How many people get the opportunity to look at the retail box before they buy the product?
    Personally, I haven't had a retail box in my hands before buying for over five years...

    I mean, S comes before T, so obviously the GS is slower than the GT, GTX is obviously faster than GT, and GTS is obviously between GT and GTX because of the combining of the S and T.
    Obvious? Sorry, but it's not obvious at all... S comes before T? So why didn't they choose a GeForce A1, A2, etc. and a GeForce B1, B2, etc.? THAT would be obvious, but not GT, GS, GTX and GTS.
    If you had absolutely no clue about NVIDIA's naming scheme, it would be anything but obvious what GT, GS, GTS and GTX mean...
    Last edited by RaZz!; 05-17-2008 at 10:36 AM.
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  12. #37
    Banned
    Join Date
    Dec 2005
    Location
    Everywhere
    Posts
    1,715

  13. #38
    Xtreme Member
    Join Date
    Nov 2005
    Posts
    143
    Quote Originally Posted by OBR View Post
    Exactly how accurate are those stats? They sound a little too good to be true.

  14. #39
    Xtreme Enthusiast
    Join Date
    Feb 2008
    Location
    Cancun
    Posts
    713
    Quote Originally Posted by tranceaddict View Post
    Exactly how accurate are those stats? They sound a little too good to be true.

    No way to be certain. IMO, with all the speculation surrounding the new cards, it's better to just wait and see what the specs are when the cards are released rather than get hyped up by rumors.

  15. #40
    Registered User
    Join Date
    Feb 2008
    Posts
    46
    I'm hoping that the 280 will run cooler than most high end cards do, though if it has those specs I doubt it will. The 9800GTX is just too hot! 60C at idle for me.

    But if the card is coming out June 18, then we should be finding out some solid information soon enough.
    Enermax Infiniti 720W || ASUS P5K Deluxe || Intel Core 2 Quad Q6700 || 4GB OCZ Reaper DDR2 1066Mhz RAM || eVGA GeForce GTX 280 SC || WD Velociraptor 300GB || 2x Seagate 7200.11 1TB || WD Caviar SE16 640GB || Auzentech X-Fi Prelude || 2x LG DVD-RW || Samsung 245BW || Windows Vista Ultimate x64

    http://www.cuddlewar.com/

  16. #41
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,247
    Quote Originally Posted by OBR View Post
    Why 448-bit and 896MB? Sounds a bit too uneven to me.
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  17. #42
    Xtreme 3D Team Member
    Join Date
    Jun 2006
    Location
    small town in Indiana
    Posts
    2,285
    Still no DX10.1 support? So just a respin of old tech? Not flaming, just asking. Are they not going to support DX10.1 for this generation?
    QX 9650 5ghz with 1.55v 4.8ghz with 1.5v 24/7 in a VAPOLI V-2000B+ Single stage phase cooling.
    DFI LP LT X-38 T2R
    2X HD4850's water cooled , volt modded
    Thermaltake 1KW Psu
    4x Seagate 250GB in RAID 0
    8GB crucial ballistix ram

  18. #43
    Xtreme Addict
    Join Date
    Sep 2006
    Location
    Surat, India.
    Posts
    1,309
    Quote Originally Posted by RaZz! View Post
    Why 448-bit and 896MB? Sounds a bit too uneven to me.
    What about the 384-bit and 768MB on the 8800GTX, then?

  19. #44
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,247
    Quote Originally Posted by ANP !!! View Post
    What about the 384-bit and 768MB on the 8800GTX, then?
    768MB = 64MB * 12 (twelve chips). 896MB = 64MB * 14?? Or 128MB * 7?? I don't know, it sounds odd to me.

    But you're right about the 384-bit, so never mind.
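
    To sanity-check those numbers, here's a minimal sketch (assuming one 32-bit-wide GDDR3 chip per memory channel and the 64MB-per-chip capacity implied above; the GTX 260/280 figures are the rumored specs being discussed in this thread, not confirmed):

        # Framebuffer size from bus width and per-chip capacity.
        # Each GDDR3 chip sits on its own 32-bit channel, so chip count = bus width / 32.
        def total_memory_mb(bus_width_bits: int, chip_capacity_mb: int) -> int:
            chips = bus_width_bits // 32
            return chips * chip_capacity_mb

        print(total_memory_mb(384, 64))  # 8800 GTX: 12 chips x 64MB = 768MB
        print(total_memory_mb(448, 64))  # rumored GTX 260: 14 chips x 64MB = 896MB
        print(total_memory_mb(512, 64))  # rumored GTX 280: 16 chips x 64MB = 1024MB
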
    Last edited by RaZz!; 05-18-2008 at 04:46 AM.
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  20. #45
    Banned
    Join Date
    Dec 2005
    Location
    Everywhere
    Posts
    1,715
    DX10.1 is for nothing... forget about that crap AMD implemented badly in their famous HD3800...

    This GeForce GTX 280 spec is very, very close to reality, but one thing is missing... I cannot tell what, because of the NDA... Also, the launch is much earlier than we expected...

  21. #46
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    Quote Originally Posted by Origin_Unknown View Post
    they should use the ABC naming scheme, i.e.
    1a - low end
    1b - midrange
    1c - omg-uber-price

    then when a new gen is out, just go to

    2a
    2b
    2c

    and repeat over and over
    I think that's a boring naming scheme: "look at XFX's uberclocked 1b ULTRA!!1".

    OBR sounds like a biased nV fanboy; don't listen to his pro-nV crap.
    Last edited by Nuker_; 05-18-2008 at 05:17 AM.

  22. #47
    Xtreme Enthusiast
    Join Date
    Feb 2008
    Location
    Cancun
    Posts
    713
    Quote Originally Posted by Lightning_Rider View Post
    I'm hoping that the 280 will run cooler than most high end cards do, though if it has those specs I doubt it will. The 9800GTX is just too hot! 60C at idle for me.

    But if the card is coming out June 18, then we should be finding out some solid information soon enough.


    Mine doesn't even get that high under load when overclocked to the max it can go on stock air, with the fan at 40%. But I'd expect the GT200 to be around the same temps as the 8800GTX, or maybe a bit higher.

  23. #48
    Registered User
    Join Date
    Feb 2008
    Posts
    46
    Really? What kind of temps do you get and what program do you use to view your temps?

    I just checked again with eVGA's Precision tool and it reads 62C. Fan is at 39%.

    BTW... I have a question about eVGA's Step-Up program. Say we step up to the 280: do they send us a brand-new card still in the box, or is it refurbished or used, etc.?
    Enermax Infiniti 720W || ASUS P5K Deluxe || Intel Core 2 Quad Q6700 || 4GB OCZ Reaper DDR2 1066Mhz RAM || eVGA GeForce GTX 280 SC || WD Velociraptor 300GB || 2x Seagate 7200.11 1TB || WD Caviar SE16 640GB || Auzentech X-Fi Prelude || 2x LG DVD-RW || Samsung 245BW || Windows Vista Ultimate x64

    http://www.cuddlewar.com/

  24. #49
    Xtreme Enthusiast
    Join Date
    Feb 2008
    Location
    Cancun
    Posts
    713
    Quote Originally Posted by Lightning_Rider View Post
    Really? What kind of temps do you get and what program do you use to view your temps?

    I just checked again with eVGA's Precision tool and it reads 62C. Fan is at 39%.
    RivaTuner and the EVGA Precision tool both give me around 43C idle and 58C load at 825/2000/2400 (core/shader/memory), and the fan is set to 35%, not 40% like I thought.

  25. #50
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by OBR View Post
    DX10.1 is for nothing... forget about that crap AMD implemented badly in their famous HD3800...
    Lol, DX10.1 is made by Microsoft. For AMD to support DX10.1, they had to change 0.0 (zero point zero) of their architecture.
    Your comment, yes, is crap.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufacturer due to stolen technology and making clones.

