
Thread: Nvidia GT200 successor on 22nd October?

  1. #51
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    170
    Quote Originally Posted by Cybercat View Post
Pardon my ignorance, but is that pic illustrating the bandwidth usage difference between tile-based rendering and total fragment rendering?
    Yep.

  2. #52
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
Man, that's a huge difference. I wonder if it would even be possible for NVIDIA and ATI to move to that model without too much driver reworking.

I've heard, though, that the G80 does some form of tile rendering.

    http://www.icare3d.org/GPU/CN08
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  3. #53
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    170
Well, probably not.
It's possible on Larrabee because it keeps tiles in the L2 cache, so each pixel is read only once.
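
    A back-of-the-envelope comparison of the framebuffer traffic involved, with purely illustrative numbers for resolution, overdraw, and frame rate (none of these are measured figures):

        # Framebuffer DRAM traffic: immediate-mode vs. tile-based rendering.
        # All parameters below are illustrative assumptions.
        BYTES_PER_PIXEL = 4            # 32-bit color; depth traffic ignored
        WIDTH, HEIGHT = 1920, 1200
        OVERDRAW = 3.0                 # avg. times each pixel is shaded (assumed)
        FPS = 60

        pixels = WIDTH * HEIGHT

        # Immediate mode: each shaded fragment read-modify-writes DRAM.
        immediate = pixels * OVERDRAW * BYTES_PER_PIXEL * 2 * FPS   # read + write

        # Tile-based: overdraw is resolved in on-chip tile memory (Larrabee's L2),
        # so DRAM sees roughly one final write per pixel.
        tiled = pixels * BYTES_PER_PIXEL * FPS

        print(f"immediate: {immediate / 1e9:.1f} GB/s, tiled: {tiled / 1e9:.1f} GB/s")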

  4. #54
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by Shintai View Post
You are counterarguing yourself. There's no low supply; AMD gets all there is, and only 3 low-volume cards use GDDR5.

See it?

And right now, even with a shrink, faster GDDR5 speeds don't matter much, if at all. Maybe on RV870+.

But hopefully AMD and nVidia will jump on the same wagon as Intel's Larrabee.

You mean the card that's gonna be 1/2 the speed of R900? Good!
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  5. #55
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by Shintai View Post
GDDR4 is and was a failure, so let's drop that part.

GDDR5 is still in low supply. If nVidia had gone GDDR5 as well, there would be a huge supply issue, and AMD already got all the supply. In short, until GDDR5 gets enough volume and gets cheap enough, 512-bit GDDR3 and 256-bit GDDR5 are basically the same in all aspects. 256-bit GDDR5 is the winner in the long run, though, once GDDR5 prices come down and supply increases.

So it's not a matter of whether GT200 supports GDDR5 or not. Bandwidth doesn't matter at the moment.

nVidia failed because they thought AMD would go the cheap small route again, so they put their money on 480SP or so, not 800.



Bandwidth has always been a problem! Ever since EverQuest, video cards have had a hard time dealing with the game world, simply due to the limitations of bandwidth and memory size.

All the stuttering that goes on is almost solely due to lack of bandwidth and/or only 1GB of memory. I'm sick of it. Who cares about max fps or even shaders when every game glitches on entering a building or some awesome spot... totally ruining the experience.

  7. #57
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
Quote Originally Posted by Xoulz View Post
Bandwidth has always been a problem! Ever since EverQuest, video cards have had a hard time dealing with the game world, simply due to the limitations of bandwidth and memory size.

All the stuttering that goes on is almost solely due to lack of bandwidth and/or only 1GB of memory. I'm sick of it. Who cares about max fps or even shaders when every game glitches on entering a building or some awesome spot... totally ruining the experience.
And a lot of games use streaming textures, so even cards with 1GB or more aren't taken full advantage of; you still have to load the textures over the system bus to the card.

Every game should have the option of putting everything in RAM/VRAM.
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  8. #58
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by Stukov View Post
And a lot of games use streaming textures, so even cards with 1GB or more aren't taken full advantage of; you still have to load the textures over the system bus to the card.

Every game should have the option of putting everything in RAM/VRAM.
Partially true.

Xoulz man... that game and the previous Elder Scrolls games suck big time, and their programmers even more.

Just because you see stuttering doesn't make your assumption true.
That's not an issue of the graphics card's memory bandwidth (which is godly high for those games), nor of the card's framebuffer capacity (512MB is more than enough for those games).
It's just careless programmers not future-proofing their game with advanced "options" to preload the whole world into VRAM.

You'd be surprised if you checked the VRAM consumption of today's games (past games aren't even worth checking... ~200MB max) and how much of the VRAM bandwidth the cards actually utilize.
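
    If a game did expose such an option, the gating logic would be simple. Here's a minimal sketch against an entirely hypothetical engine API (load_world and upload_to_vram are made-up names, not any real engine's calls):

        # Hypothetical "preload everything" option: if the whole world's textures
        # fit the card's VRAM, upload them once at load time instead of streaming.
        def upload_to_vram(texture):
            # Stand-in for a real driver upload; just marks the texture resident.
            return dict(texture, resident=True)

        def load_world(textures, vram_budget_mb, preload=True):
            total_mb = sum(t["size_mb"] for t in textures)
            if preload and total_mb <= vram_budget_mb:
                # One-time cost over the bus at load; no mid-game streaming hitches.
                return [upload_to_vram(t) for t in textures]
            # Otherwise stream on demand, risking hitches at buildings/zone lines.
            return textures

        world = load_world([{"name": "castle", "size_mb": 300},
                            {"name": "terrain", "size_mb": 450}], vram_budget_mb=1024)
        print([t.get("resident", False) for t in world])   # [True, True]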
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  9. #59
    Registered User
    Join Date
    Sep 2007
    Posts
    90
ooooh - GTX 350.... It had better not require more power, else it's a flop in my case no matter what performance numbers it brings! But with 2GB of GDDR5, surely that's gotta be an X2 card of some sort. Hopefully they've gone the ATI route with X2 cards.

  10. #60
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    290
    Quote Originally Posted by r4gm4n View Post
ooooh - GTX 350.... It had better not require more power, else it's a flop in my case no matter what performance numbers it brings! But with 2GB of GDDR5, surely that's gotta be an X2 card of some sort. Hopefully they've gone the ATI route with X2 cards.
Considering the HD 4870 X2 consumes around 100W more than the GTX 280, nVidia has plenty of room to increase power consumption, and I am sure they will do so. I hope the GTX 350 doesn't consume as much as the 4870 X2, though; the insane power consumption is what's keeping me away. I could probably handle the X2, but my 520W PSU would be crying.
    Intel Core i7 920 @ 3.8GHz - Asus P6T Deluxe X58 - 6GB (2GBx3) G. SKILL DDR3-1600 @ 8-8-8-20 - 2 x EVGA GTX 280 1GB SLI - Corsair TX750 PSU - Windows Vista HP 64-bit

  11. #61
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by Shintai View Post
Nope: still too much power draw, still too big a cooler, still too big a card.

    Some HD4550/9400GT cards...

    http://www.techpowerup.com/img/08-09-11/29a.png
Hmm?

This is a small card with plenty of technology and speed.
The HD4550/9400GT are micro cards with trivial performance for standalone cards; they decode HD video and not much more.

The HD 4600, on the other hand, plays every game at good resolutions and settings. Performance is between the HD 3850 and HD 3870.
    Last edited by v_rr; 09-21-2008 at 04:38 AM.
    Quote Originally Posted by Shintai View Post
And AMD is only a CPU manufacturer due to stolen technology and making clones.

  12. #62
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
I find it hard to believe that the GTX 280+ and the GT300 will BOTH be on 55nm, with the GT300 using GDDR5 memory (when a 512-bit bus should already provide enough bandwidth). Perhaps the so-called "GTX 350" will just be a dual-chip card using the same 55nm chip, since that is what 2GB of GDDR5 memory suggests. Then it makes sense that both use the same chips after all (and that the GT300 is not a new generation).

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  13. #63
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    Quote Originally Posted by Extelleron View Post
    Considering the HD 4870 X2 consumes around 100W more than the GTX 280, nVidia has plenty of room to increase power consumption and I am sure that they will do so. I hope that the GTX 350 doesn't consume as much as the 4870 X2 though, the insane power consumption is something that is keeping me away. I could probably handle the X2 but my 520W PSU would be crying.
The X2 is not that bad; my 700W works fine with 3 hard drives. I think in Guru3D's review the whole system with an E8400 and an X2 only pulled ~450 watts at the wall.
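
    As a rough sanity check on those numbers, assuming a typical ~80% PSU efficiency (the review doesn't state the actual figure):

        # ~450 W measured at the wall includes PSU losses; the DC load the PSU
        # actually delivers is lower. The efficiency figure is an assumption.
        wall_watts = 450
        efficiency = 0.80
        print(f"~{wall_watts * efficiency:.0f} W DC load")   # ~360 W, fine for a 520 W PSU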
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  14. #64
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    that card would be overkill

  15. #65
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,412
GTX 350 could easily be a medium-powered 55nm GT200 derivative on a 256-bit bus, with 2 chips totaling 512 bits of GDDR5 bandwidth. This would be a very good option for Nvidia to pursue; given the "350" name, I doubt it'll be a GPU that requires both a 512-bit bus and GDDR5 for a single core. After G80 I would have thought Nvidia had learned about the problem of huge, hot cores; maybe GT300 will be smaller and scale like RV770.
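
    A quick sketch of the aggregate-bandwidth idea, assuming a purely illustrative 4.0 Gbps GDDR5 data rate (no actual memory speed for such a card has been confirmed):

        # Per-chip and dual-card GDDR5 bandwidth for a 256-bit bus.
        bus_bits = 256
        data_rate_gbps = 4.0                           # per pin, assumed
        per_chip = bus_bits * data_rate_gbps / 8       # GB/s per chip
        print(f"per chip {per_chip:.0f} GB/s, dual card {2 * per_chip:.0f} GB/s")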

  16. #66
    Xtreme X.I.P.
    Join Date
    Aug 2004
    Location
    Chile
    Posts
    4,151
That looks like a bad joke; they misspelled NVIDIA ("Nidvia").

    Quote Originally Posted by [XC] gomeler View Post
GTX 350 could easily be a medium-powered 55nm GT200 derivative on a 256-bit bus, with 2 chips totaling 512 bits of GDDR5 bandwidth. This would be a very good option for Nvidia to pursue; given the "350" name, I doubt it'll be a GPU that requires both a 512-bit bus and GDDR5 for a single core. After G80 I would have thought Nvidia had learned about the problem of huge, hot cores; maybe GT300 will be smaller and scale like RV770.
Yup, with 55nm they can go with a narrower bus and an X2 model with GDDR5.

  17. #67
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Vancouver
    Posts
    1,073
That Tesla board looks pretty interesting too. I wonder if it will have a Hybrid SLI-type config, like AMD has Hybrid CrossFire? If that's legit, the 350 has to be a dual card.
    " Business is Binary, your either a 1 or a 0, alive or dead." - Gary Winston ^^



    Asus rampage III formula,i7 980xm, H70, Silverstone Ft02, Gigabyte Windforce 580 GTX SLI, Corsair AX1200, intel x-25m 160gb, 2 x OCZ vertex 2 180gb, hp zr30w, 12gb corsair vengeance

    Rig 2
    i7 980x ,h70, Antec Lanboy Air, Samsung md230x3 ,Saphhire 6970 Xfired, Antec ax1200w, x-25m 160gb, 2 x OCZ vertex 2 180gb,12gb Corsair Vengence MSI Big Bang Xpower

  18. #68
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Please, let's quote that picture 500 more times just in case someone missed it!
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  19. #69
    Xtreme Enthusiast
    Join Date
    Aug 2007
    Location
    Orange County, Southern California
    Posts
    583
    Quote Originally Posted by Cybercat View Post
    Please, let's quote that picture 500 more times just in case someone missed it!
I just gave it a head start and posted it on 6 other forums!


    I stumbled upon a Hardspell article written July 18, 2008 that speaks of a GTX 350 engineering sample with the same specs as listed above:

    HardSpell.com - NVIDIA GTX 350 ES version is ready and the specs revealed?!

We have learned the related news, but we are not so sure about it:

NVIDIA GTX 350
GT300 core
55nm process
576 mm² die
512-bit memory bus
2GB GDDR5 memory, double the GTX 280
480 SPs, double the GTX 280
Raster operation units: 64, same as the GTX 280
216 GB/s memory bandwidth
Default clocks 830/2075/3360 MHz
Pixel fill rate 36.3 Gpixels/s
Texture fill rate 84.4 Gtexels/s
Does not support DX10.1; DX10.0/SM4.0 only
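
    For what it's worth, the one number in that list that at least checks out internally is the bandwidth: a 512-bit bus at the quoted 3360 MHz effective memory clock gives almost exactly the claimed figure:

        # 512-bit bus at the claimed 3360 MHz effective memory clock:
        print(512 / 8 * 3360 / 1000, "GB/s")   # 215.04, i.e. the claimed ~216 GB/s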
    Last edited by AuDioFreaK39; 09-21-2008 at 01:52 PM.
    EVGA X58 SLI Classified E759 Limited Edition
    Intel Core i7 Extreme 980X Gulftown six-core
    Thermalright TRUE Copper w/ 2x Noctua NF-P12s (push-pull)
    2x EVGA GeForce GTX 590 Classified [Quad-SLI]
    6GB Mushkin XP Series DDR3 1600MHz 7-8-7-20
    SilverStone Strider ST1500 1500W
    OCZ RevoDrive 3 240GB 1.0GB/s PCI-Express SSD
    Creative X-Fi Fatal1ty Professional / Logitech G51 5.1 Surround
    SilverStone Raven RV02
    Windows 7 Ultimate x64 RTM



  20. #70
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by [XC] gomeler View Post
GTX 350 could easily be a medium-powered 55nm GT200 derivative on a 256-bit bus, with 2 chips totaling 512 bits of GDDR5 bandwidth. This would be a very good option for Nvidia to pursue; given the "350" name, I doubt it'll be a GPU that requires both a 512-bit bus and GDDR5 for a single core. After G80 I would have thought Nvidia had learned about the problem of huge, hot cores; maybe GT300 will be smaller and scale like RV770.
Yeah, 256-bit x2 for a dual-chip configuration makes perfect sense, paired with GDDR5.

However, do not expect Nvidia to maintain integrity in its naming scheme. Remember the 9800GX2? It was sold as a new "generation" derived from the 8800GT and 8800GTS 512MB. Nvidia might well do the same thing with the GTX 350, just to make it sound more attractive.

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  21. #71
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
Tell me, what's the best way to make your computer store world-renowned overnight? Post false GPU SKUs.

  22. #72
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by AuDioFreaK39 View Post
I just gave it a head start and posted it on 6 other forums!


    I stumbled upon a Hardspell article written July 18, 2008 that speaks of a GTX 350 engineering sample with the same specs as listed above:

    HardSpell.com - NVIDIA GTX 350 ES version is ready and the specs revealed?!

We have learned the related news, but we are not so sure about it:

NVIDIA GTX 350
GT300 core
55nm process
576 mm² die
512-bit memory bus
2GB GDDR5 memory, double the GTX 280
480 SPs, double the GTX 280
Raster operation units: 64, same as the GTX 280
216 GB/s memory bandwidth
Default clocks 830/2075/3360 MHz
Pixel fill rate 36.3 Gpixels/s
Texture fill rate 84.4 Gtexels/s
Does not support DX10.1; DX10.0/SM4.0 only
Interesting how the specs had already been out there for like 2 months!

But I thought 576 mm² was the die size of the 65nm version, not the smaller 55nm one?!? I wonder if it would need a far more exotic cooling solution than the 9800GX2?

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  23. #73
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    above USA...below USSR
    Posts
    1,186
    Quote Originally Posted by AuDioFreaK39 View Post
I just gave it a head start and posted it on 6 other forums!


    I stumbled upon a Hardspell article written July 18, 2008 that speaks of a GTX 350 engineering sample with the same specs as listed above:

    HardSpell.com - NVIDIA GTX 350 ES version is ready and the specs revealed?!

We have learned the related news, but we are not so sure about it:

NVIDIA GTX 350
GT300 core
55nm process
576 mm² die
512-bit memory bus
2GB GDDR5 memory, double the GTX 280
480 SPs, double the GTX 280
Raster operation units: 64, same as the GTX 280
216 GB/s memory bandwidth
Default clocks 830/2075/3360 MHz
Pixel fill rate 36.3 Gpixels/s
Texture fill rate 84.4 Gtexels/s
Does not support DX10.1; DX10.0/SM4.0 only
Wow, if that's real, what's ATI going to have? Better drivers, lol?
    Case-Coolermaster Cosmos S
    MoBo- ASUS Crosshair IV
    Graphics Card-XFX R9 280X [out for RMA] using HD5870
    Hard Drive-Kingston 240Gig V300 master Seagate 160Gb slave Seagate 250Gb slave Seagate 500Gb slave Western Digital 500Gb
    CPU-AMD FX-8320 5Ghz
    RAM 8Gig Corshair c8
    Logitech 5.1 Z5500 BOOST22
    300Gb of MUSICA!!


    Steam ID: alphamonkeywoman
    http://www.techpowerup.com/gpuz/933ab/

  24. #74
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by SKYMTL View Post
Tell me, what's the best way to make your computer store world-renowned overnight? Post false GPU SKUs.
    Yep.

36.3 GP/s fill rate? How is that achieved? With 43.7 ROPs? lol

    Totally bogus.
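
    To spell out the arithmetic behind that jab (pixel fill rate = ROP count × core clock, both numbers taken straight from the quoted spec):

        # Implied ROP count from the claimed fill rate and core clock.
        fill_mpix = 36300        # 36.3 Gpixels/s in Mpixels/s
        core_mhz = 830           # claimed core clock
        print(f"{fill_mpix / core_mhz:.1f} ROPs")   # ~43.7 -- not a real configuration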
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  25. #75
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
Oh yeah, I overlooked this absurd claim: an 830 MHz core clock and 2075 MHz for the 240 shaders... on a dual-chip configuration, that's far-fetched.

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

