
Thread: Nvidia GT200 successor on 22nd October?

  1. #26
    Xtreme Member
    Join Date
    Oct 2005
    Posts
    197
    There is always Cuda-pi, or whatever it will be called.

  2. #27
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,838
    I have the feeling it's just gonna be two GTX 280s or GTX 260s in some form.
    DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis

  3. #28
    Xtreme Mentor
    Join Date
    Sep 2006
    Posts
    2,834
    Quote Originally Posted by Lestat View Post
    Did I just read an article written by Yoda?? Sure sounded like it.
    Yoda overdosing on amphetamines, with a poor track record in English classes.

    For my part I know nothing with any certainty, but the sight of the stars makes me dream.

    ..

  4. #29
    Xtreme Addict
    Join Date
    Oct 2006
    Location
    new jersey
    Posts
    1,100
    I thought NV only has to get close to the price/performance of the 4870X2 to beat it, the same way the 4870 compares to the 280. But anyway, we'll see.

  5. #30
    Muslim Overclocker
    Join Date
    May 2005
    Location
    Canada
    Posts
    2,786
    Quote Originally Posted by Macadamia View Post
    oh wait, where's CUDA now?
    What do you mean exactly by this statement?

    My watercooling experience

    Water
    Scythe Gentle Typhoons 120mm 1850RPM
    Thermochill PA120.3 Radiator
    Enzotech Sapphire Rev.A CPU Block
    Laing DDC 3.2
    XSPC Dual Pump Reservoir
    Primochill Pro LRT Red 1/2"
    Bitspower fittings + water temp sensor

    Rig
    E8400 | 4GB HyperX PC8500 | Corsair HX620W | ATI HD4870 512MB


    I see what I see, and you see what you see. I can't make you see what I see, but I can tell you what I see is not what you see. Truth is, we see what we want to see, and what we want to see is what those around us see. And what we don't see is... well, conspiracies.



  6. #31
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by Macadamia View Post
    I don't think they even had the GDDR5 memory controller finalized as of July. Whatever the reason they didn't do it before, I'm very inclined to say it's technical. Otherwise, even with existing GDDR4, they obviously wouldn't have been so shortsighted with the GT200.


    nVidia really did NOT know about this one. They thought the 4870 was going to be like the 3870: an excessive waste of bandwidth.

    Plus they have a lot of other fish to fry too, including shader density, keeping the core clock >700MHz, and oh wait, where's CUDA now?
    GDDR4 is and was a failure, so let's drop that part.

    GDDR5 is still in low supply. If nVidia had gone GDDR5 as well, there would have been a huge supply issue, and AMD already has all the supply. In short, until GDDR5 gets enough volume and gets cheap enough, 512-bit GDDR3 and 256-bit GDDR5 are basically the same in all respects. 256-bit GDDR5 is the winner in the long run, though, once GDDR5 prices come down and supply increases.

    So it's not a matter of whether GT200 supports GDDR5 or not. Bandwidth doesn't matter atm.

    nVidia failed because they thought AMD would go the cheap, small route again. So they put their money on 480 SPs or so. Not 800.
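
    That "basically the same" claim is easy to sanity-check. A minimal back-of-the-envelope sketch, assuming the stock GTX 280 (512-bit GDDR3 at 1107 MHz, double data rate) and HD 4870 (256-bit GDDR5 at 900 MHz, quad data rate) memory configurations:

        # Peak memory bandwidth = bus width in bytes * effective transfer rate.
        def bandwidth_gbs(bus_width_bits, rate_gtps):
            """Peak bandwidth in GB/s from bus width (bits) and effective rate (GT/s)."""
            return bus_width_bits / 8 * rate_gtps

        # GTX 280: 512-bit GDDR3, 1107 MHz double-pumped -> ~2.214 GT/s effective.
        print(bandwidth_gbs(512, 2.214))  # ~141.7 GB/s
        # HD 4870: 256-bit GDDR5, 900 MHz quad-pumped -> 3.6 GT/s effective.
        print(bandwidth_gbs(256, 3.6))    # 115.2 GB/s

    Same ballpark: the narrow bus with faster memory roughly matches the wide bus with slower memory.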
    Crunching for Comrades and the Common good of the People.

  7. #32
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by Shintai View Post
    GDDR4 is and was a failure, so let's drop that part.

    GDDR5 is still in low supply. If nVidia had gone GDDR5 as well, there would have been a huge supply issue, and AMD already has all the supply. In short, until GDDR5 gets enough volume and gets cheap enough, 512-bit GDDR3 and 256-bit GDDR5 are basically the same in all respects. 256-bit GDDR5 is the winner in the long run, though, once GDDR5 prices come down and supply increases.

    So it's not a matter of whether GT200 supports GDDR5 or not. Bandwidth doesn't matter atm.

    nVidia failed because they thought AMD would go the cheap, small route again. So they put their money on 480 SPs or so. Not 800.
    Totally correct, and agreed.
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  8. #33
    Xtreme Member
    Join Date
    Feb 2008
    Location
    Bulgaria
    Posts
    238
    It really does not matter how the new video card will perform. The interesting thing is whether AMD still has some hidden cards to play: the PLX chip on the 4870X2 board and its inactive functions.

  9. #34
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Quote Originally Posted by Shintai View Post
    GDDR4 is and was a failure, so let's drop that part.

    GDDR5 is still in low supply. If nVidia had gone GDDR5 as well, there would have been a huge supply issue, and AMD already has all the supply. In short, until GDDR5 gets enough volume and gets cheap enough, 512-bit GDDR3 and 256-bit GDDR5 are basically the same in all respects. 256-bit GDDR5 is the winner in the long run, though, once GDDR5 prices come down and supply increases.

    So it's not a matter of whether GT200 supports GDDR5 or not. Bandwidth doesn't matter atm.

    nVidia failed because they thought AMD would go the cheap, small route again. So they put their money on 480 SPs or so. Not 800.
    ^^ this

  10. #35
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by Shintai View Post
    GDDR5 is still in low supply.
    AMD gets all the GDDR5 stock, so there is no low supply.
    The HD 4870 512MB GDDR5, despite launching later than the GTX 260, reached the market first.

    AMD is selling right now:
    - HD 4870 512MB GDDR5
    - HD 4870 1GB GDDR5
    - HD 4870X2 2GB GDDR5

    And there is supply for everyone, in every market, under many brands.
    256-bit + GDDR5 was a very, very smart move....

    For Q1 2009 they can simply shrink RV770 to 40nm, keep the 256-bit bus, and jump to 5GHz+ GDDR5.
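
    For scale, plugging that hoped-for 5 GT/s GDDR5 rate into the same back-of-the-envelope formula as the sketch above:

        # 256-bit bus at a hypothetical 5 GT/s effective GDDR5 rate:
        print(256 / 8 * 5.0)  # 160.0 GB/s, vs 115.2 GB/s on today's HD 4870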
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  11. #36
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by v_rr View Post
    AMD gets all the GDDR5 stock, so there is no low supply.
    The HD 4870 512MB GDDR5, despite launching later than the GTX 260, reached the market first.
    You are arguing against yourself: "no low supply", yet AMD gets all there is, and there are only three low-volume cards with GDDR5.

    See it?

    And right now, even with a shrink, faster GDDR5 speeds don't matter much, if at all. Maybe on RV870+.

    But hopefully AMD and nVidia will jump on the same bandwagon as Intel's Larrabee.

    Last edited by Shintai; 09-20-2008 at 08:15 AM.
    Crunching for Comrades and the Common good of the People.

  12. #37
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by Shintai View Post
    You are arguing against yourself: "no low supply", yet AMD gets all there is, and there are only three low-volume cards with GDDR5.
    They are not such low-volume cards, because the HD 4870 sits almost at the sweet spot of 200 euros (in Euroland).
    The HD 4870X2 is the truly low-volume card.

    Despite the lower supply, it's more than enough for AMD's purposes and strategy.
    As I said, AMD's strategy worked to perfection.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  13. #38
    Registered User
    Join Date
    Sep 2007
    Posts
    90
    Am I the only one who doesn't want a major performance increase in terms of numbers in apps and games?

    I would rather see current performance levels stay roughly the same and get something that focuses instead on significantly reduced heat and power consumption. Games aren't increasing in performance requirements at the moment. Surely, with reduced heat and power consumption, this would in the end provide much more profit for the companies?

  14. #39
    Xtreme Mentor
    Join Date
    Sep 2006
    Posts
    2,834
    Quote Originally Posted by r4gm4n View Post
    Games aren't increasing in performance requirements at the moment.


    Games are increasing in performance requirements all the time.

    For my part I know nothing with any certainty, but the sight of the stars makes me dream.

    ..

  15. #40
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by r4gm4n View Post
    Am I the only one who doesn't want a major performance increase in terms of numbers in apps and games?

    I would rather see current performance levels stay roughly the same and get something that focuses instead on significantly reduced heat and power consumption. Games aren't increasing in performance requirements at the moment. Surely, with reduced heat and power consumption, this would in the end provide much more profit for the companies?
    Me too. It's pretty sad that you can have a 50W rig and then have to stick some semi-crap 100-150W GFX card in it.
    Crunching for Comrades and the Common good of the People.

  16. #41
    I am Xtreme
    Join Date
    Sep 2006
    Posts
    10,374
    Quote Originally Posted by v_rr View Post
    As I said, AMD's strategy worked to perfection.
    It has to work once in a while... good for them; otherwise they would have been the ones sacking employees instead of Nvidia.
    Question : Why do some overclockers switch into d*ckmode when money is involved

    Remark : They call me Pro Asus Saaya yupp, I agree

  17. #42
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Quote Originally Posted by Shintai View Post
    It's pretty sad that you can have a 50W rig and then have to stick some semi-crap 100-150W GFX card in it.
    True.
    Hehe, I remember back in the day when fans first started appearing on video cards. Now look at where we are: single-GPU abominations with 180W of actual power consumption.
    You were not supposed to see this.

  18. #43
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by largon View Post
    True.
    Hehe, I remember back in the day when fans first started appearing on video cards. Now look at where we are: single-GPU abominations with 180W of actual power consumption.
    Oh yeah... I want the old cards back.

    TNT2 ULTRA!



    GFX cards are also the only thing keeping us in large cases.
    Last edited by Shintai; 09-20-2008 at 02:16 PM.
    Crunching for Comrades and the Common good of the People.

  19. #44
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    170
    GFX cards are also the only thing keeping us in large cases.
    Well, I think that CPU coolers and three memory channels do that quite well.

  20. #45
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Someone missed the HD 4600 series.
    Low power consumption, high performance, no additional power connector, DX 10.1, tessellation, UVD 2.0, OpenGL 3.0, etc.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  21. #46
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by v_rr View Post
    Someone missed the HD 4600 series.
    (a lot of commercial crap)
    Nope: still too much power draw, still too big a cooler, still too big a card.

    Some HD4550/9400GT cards...

    http://www.techpowerup.com/img/08-09-11/29a.png
    Last edited by Shintai; 09-20-2008 at 02:22 PM.
    Crunching for Comrades and the Common good of the People.

  22. #47
    Xtreme Enthusiast
    Join Date
    Oct 2007
    Location
    Rochester, MN
    Posts
    718
    Quote Originally Posted by Shintai View Post
    Oh yeah... I want the old cards back.

    TNT2 ULTRA!

    GFX cards are also the only thing keeping us in large cases.
    That thing was the best! I had one in an old Dell XPS and it played my games for at least 5 years.
    Thermaltake Armor Series Black
    GIGABYTE GA-P35-DS3R
    Q6600 3.6 GHZ Thermalright Ultra 120 eXtreme
    4 GB Corsair XMS2 w/ OCZ XTX Ram Cooler 2 x 60mm
    9800GT 512MB
    18X Pioneer DVD-RW Burner
    720 Watt Enermax Infiniti
    4x640GB RAID 10
    Windows 7

  23. #48
    Xtreme Member
    Join Date
    Mar 2007
    Location
    Pilipinas
    Posts
    445
    Quote Originally Posted by Leeghoofd View Post
    It has to work once in a while... good for them; otherwise they would have been the ones sacking employees instead of Nvidia.
    So it's a fluke! LMAO, and you were saying you were objective, funneh.

  24. #49
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by Shintai View Post
    But hopefully AMD and nVidia will jump on the same bandwagon as Intel's Larrabee.

    http://www.techarp.com/article/Intel.../bandwidth.jpg
    Pardon my ignorance, but is that pic illustrating the difference in bandwidth usage between tile-based rendering and conventional immediate-mode rendering?
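
    For reference: in a tile-based (binning) renderer the screen is divided into small tiles, triangles are first sorted into the tiles they touch, and each tile's color/depth buffer then stays in on-chip memory while the tile is shaded; an immediate-mode GPU instead streams the whole frame's color/Z traffic through external memory. That locality is where the bandwidth saving in charts like that comes from. A minimal sketch of the binning step (the tile size and triangle representation here are made up for illustration):

        # Minimal tile-binning sketch: assign each triangle to every screen
        # tile its bounding box overlaps. Tile size and layout are hypothetical.
        TILE = 64  # 64x64-pixel tiles

        def bin_triangles(triangles, screen_w, screen_h):
            """triangles: list of ((x0,y0),(x1,y1),(x2,y2)) in pixel coords.
            Returns {(tile_x, tile_y): [triangle, ...]}."""
            bins = {}
            for tri in triangles:
                xs = [v[0] for v in tri]
                ys = [v[1] for v in tri]
                # Conservative bounding-box test, clamped to the screen.
                tx0 = max(int(min(xs)) // TILE, 0)
                tx1 = min(int(max(xs)) // TILE, (screen_w - 1) // TILE)
                ty0 = max(int(min(ys)) // TILE, 0)
                ty1 = min(int(max(ys)) // TILE, (screen_h - 1) // TILE)
                for ty in range(ty0, ty1 + 1):
                    for tx in range(tx0, tx1 + 1):
                        bins.setdefault((tx, ty), []).append(tri)
            return bins

        # Each tile can then be rasterized and shaded from on-chip storage,
        # so color/Z never round-trips through external memory per triangle.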
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  25. #50
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by r4gm4n View Post
    Am I the only one who doesn't want a major performance increase in terms of numbers in apps and games?

    I would rather see current performance levels stay roughly the same and get something that focuses instead on significantly reduced heat and power consumption. Games aren't increasing in performance requirements at the moment. Surely, with reduced heat and power consumption, this would in the end provide much more profit for the companies?
    I will want more performance until they reach the level of 4xAA or 8xAA plus sustained 60fps (on a 60Hz monitor), 75fps (on a 75Hz monitor), etc. in all games. Only then will I, and probably most people on this forum, be happy with our gaming performance. Don't misunderstand me: I want the lowest possible power consumption, especially at idle, since most of the time is spent idling, but when gaming I only care about those dreamed-of constant 60/75fps.
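
    Those targets are fixed per-frame time budgets that have to be met on every single frame; a trivial illustration:

        # Per-frame time budget for a sustained, refresh-locked frame rate.
        for hz in (60, 75):
            print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame, every frame")
        # 60 Hz -> 16.67 ms; 75 Hz -> 13.33 ms. One slow frame breaks "sustained".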
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.
