Results 326 to 350 of 1917

Thread: GeForce 9900 GTX & GTS Slated For July Launch

  1. #326
    Registered User
    Join Date
    Dec 2007
    Posts
    41
    Quote Originally Posted by DilTech View Post
    2nd gen means more optimized shaders, as in 50% increased shader performance per shader.
    If one shader out of 128 equals 4 units, you mean that one shader out of 240 equals 6 units.

    Then:

    128 * 4 = 512 units

    240 * 6 = 1440 units

    Now:

    1440 / 512 * 100 = 281.25%

    Do you mean that the GTX 280 will perform 2.8x better than the 9800 GTX?

    NO

    It isn't SPARTAAAAAAAAAA

    It is MADNESS
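
    For what it's worth, here is a minimal sketch of the arithmetic above (Python; the 4-unit and 6-unit "per shader" figures are the poster's hypothetical reading of the "+50% per shader" claim, not confirmed specs):

    Code:
    # Back-of-envelope total shader throughput, using the poster's hypothetical
    # "units per shader" figures (not confirmed G92/GT200 specs).
    g92_shaders, g92_units = 128, 4      # 9800 GTX (G92)
    gt200_shaders, gt200_units = 240, 6  # GTX 280 (GT200), assuming +50% per shader

    g92_total = g92_shaders * g92_units        # 512
    gt200_total = gt200_shaders * gt200_units  # 1440

    ratio = gt200_total / g92_total
    print(f"Theoretical shader throughput ratio: {ratio:.4f}x ({ratio * 100:.2f}%)")
    # -> 2.8125x (281.25%), which is why reading "+50% per shader" literally
    #    on top of 240 shaders leads to a number nobody actually expects.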

  2. #327
    Xtreme Addict
    Join Date
    Jun 2004
    Location
    near Boston, MA, USA
    Posts
    1,955
    Failing to include 10.1 support for now a second product in a row (the 9xxx series being the previous one that "could" have had it) will end up being a mistake. Given the strength of the 8800/9800, not a lot of folks are going to bite, because they won't need the extra power. With the lack of additional features further souring the soup, the initial sales will be good as the first-adopter wave hits and passes. After that, things will stagnate, and they will be surprised when they see that mistake in action.

    The holiday season and discounts may save things somewhat through Christmas. After December '08, sales of the 280/260 series will go into the tank.

    $.02

  3. #328
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Why would you need DX10.1 anyway?

    If nVidia doesn't support it, I very much doubt you'll see it in many games, since TWIMTBP is very, very strong. ATI really dropped the ball there.

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  4. #329
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    10.1 supposedly adds a lot of features, and FWIW, whatever MS does, it tends to get people to change.

  5. #330
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Exactly why many people, including Alienware, believe that Microsoft should have released Vista as x64 only: software would have been forced to catch up, and just about every CPU on the market is 64-bit right now anyway. I'm pretty sure that eventually Microsoft will force its hand and DX10.1 will show up. It's a shame nVidia caused Assassin's Creed to lose DX10.1; that brought a good performance boost that ATI users paid for when they bought their cards. Not to mention DX10.1 offers more than just extra performance, it also affects the minimum image quality, etc.
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  6. #331
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by zerazax View Post
    10.1 supposedly adds a lot of features, and FWIW, whatever MS does, it tends to get people to change.
    Even Microsoft isn't shouting about DX10.1, as they don't really see the merit in it themselves.

    Quote Originally Posted by AliG View Post
    Exactly why many people, including Alienware, believe that Microsoft should have released Vista as x64 only: software would have been forced to catch up, and just about every CPU on the market is 64-bit right now anyway. I'm pretty sure that eventually Microsoft will force its hand and DX10.1 will show up. It's a shame nVidia caused Assassin's Creed to lose DX10.1; that brought a good performance boost that ATI users paid for when they bought their cards. Not to mention DX10.1 offers more than just extra performance, it also affects the minimum image quality, etc.
    Can you prove nVidia made 10.1 disappear? No, you can't, but like the other "fanATIcs" (to coin an old term) you'll happily spout conspiracy theories & conjecture.

  7. #332
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    It will either be a GTX 280 or an HD 4870 X2 for me this summer. I wonder which of them will perform at the utmost top end?

  8. #333
    Xtreme Addict
    Join Date
    Nov 2006
    Posts
    1,402
    dx10 is born to die ^^

  9. #334
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Quote Originally Posted by zerazax View Post
    10.1 supposedly adds a lot of features, and FWIW, whatever MS does, it tends to get people to change.
    Vista, you mean?

    (That's taking its merry time)

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  10. #335
    Xtreme Enthusiast
    Join Date
    Aug 2007
    Location
    Orange County, Southern California
    Posts
    583
    Fudzilla - Geforce GTX 280 to launch on June 18th

    AWESOME. It looks like the GTX 280 launches on June 18th, exactly 3 months after I bought my 9800GX2.

    But the only thing I question from Fud is that they claim the launch will take place on a Wednesday (June 18th), whereas Nvidia always tends to release its cards on a Tuesday. So perhaps they will correct the date to June 17th?
    EVGA X58 SLI Classified E759 Limited Edition
    Intel Core i7 Extreme 980X Gulftown six-core
    Thermalright TRUE Copper w/ 2x Noctua NF-P12s (push-pull)
    2x EVGA GeForce GTX 590 Classified [Quad-SLI]
    6GB Mushkin XP Series DDR3 1600MHz 7-8-7-20
    SilverStone Strider ST1500 1500W
    OCZ RevoDrive 3 240GB 1.0GB/s PCI-Express SSD
    Creative X-Fi Fatal1ty Professional / Logitech G51 5.1 Surround
    SilverStone Raven RV02
    Windows 7 Ultimate x64 RTM



  11. #336
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by DilTech View Post
    2nd gen means more optimized shaders, as in 50% increased shader performance per shader.


    If you actually read the image, it says that the extra performance is obtained through the use of 240 SPs. Nothing about them being optimized, though they have surely done some kind of optimization.

    50% more performance per shader? SURE.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  12. #337
    Xtreme Member
    Join Date
    Mar 2008
    Posts
    332
    Quote Originally Posted by AuDioFreaK39 View Post
    Fudzilla - Geforce GTX 280 to launch on June 18th

    AWESOME. It looks like the GTX 280 launches on June 18th, exactly 3 months after I bought my 9800GX2.

    But the only thing I question from Fud is that they claim the launch will take place on a Wednesday (June 18th), whereas Nvidia always tends to release its cards on a Tuesday. So perhaps they will correct the date to June 17th?
    Not an expert, but I doubt the GTX 280 will be faster than a 9800 GX2 in SLI-optimized games; that would make this card 100% faster than an 8800 GTX.

  13. #338
    Xtreme Member
    Join Date
    Mar 2008
    Location
    germany-münster
    Posts
    375
    Why?

    Looking at the specs, it's not only possible, but certain
    (if the drivers are good enough, even right on launch day; but after optimisation, for sure).
    system:

    Phenom II 920 3.5Ghz @ 1.4v, benchstable @ over 3,6Ghz (didnt test higher)
    xigmatek achilles
    sapphire hd4870 1gb @ 820 1020
    Gigabyte GA-MA790GP-DS4H
    8gb a-data 4-4-4-12 800
    x-fi xtrememusic
    rip 2x 160gb maxtor(now that adds up to 4...)
    320gb/250gb/500gb samsung

  14. #339
    Xtreme Member
    Join Date
    Jul 2007
    Posts
    216
    I'm gonna wait for the new 55nm 300 Ultra

  15. #340
    Xtreme Mentor
    Join Date
    Jan 2005
    Posts
    3,080
    I'm rather stunned that 10.1 has not been implemented, yet ATI have... there must be some logic to this. Perhaps they're holding off for another gen?
    Gigabyte EP45-DQ6 - rev 1.0, F13a bios | Intel Q9450 Yorkfield 413x8=3.3GHz | OCZ ProXStream 1000W PSU | Azuen X-Fi Prelude 64MB X-RAM| WD VelociRaptor 74HLFS-01G6U0 16MB cache 74GB - 2 drive RAID 0 64k stripe | ASUS 9800GT Ultimate 512MB RAM (128 SP!!) | G.SKILL PC2-8800 4GB kit @ 1100MHz | OCZ ATV Turbo 4GB USB flash | Scythe Ninja Copper + Scythe 120mm fan | BenQ M2400HD 24" 16:9 LCD | Plextor 716SA 0308; firmware 1.11 | Microsoft Wireless Entertainment Desktop 8000 | Netgear RangeMax DG834PN 108mbps; firmware 1.03.39 + HAWKING HWUG1 108mbps USB dongle | Digital Doc 5+ | 7 CoolerMaster 80mm blue LED fans | Aopen H700A tower case | Vista Home Premium - 32bit, SP1

  16. #341
    Xtreme Addict
    Join Date
    Jun 2004
    Location
    near Boston, MA, USA
    Posts
    1,955
    Why not implement 10.1? Because the die shrink of the GT200 is going to need a selling point. It might be DX11 or 10.1, but since 11 is a long way off, they are hedging their bets. Remember the changeover from 9.0 to 9.1? Based on NV's experience, I think they are saving it as a nugget for a future product bullet point, hoping instead that pure performance will sell this model. I believe they are correct: this strategy will work, it will just be short-term.

    The other thought is an attempt to implement 10.1 via drivers. I have no idea if this is possible anymore, but it's an age-old approach, if software can duplicate the needed commands.

  17. #342
    Xtreme Guru
    Join Date
    Dec 2002
    Posts
    4,046
    they better do 1GB mem per gpu this time around

    512MB: total joke @ high rez

    i aint buying no freakn 512MB/gpu no more!!
    Last edited by NapalmV5; 05-18-2008 at 11:17 AM.
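
    For context, a rough back-of-envelope of why 512MB gets tight at high resolution (Python; the resolution and AA settings are illustrative assumptions, not any specific card's actual memory layout):

    Code:
    # Rough illustration of framebuffer memory at high res with AA.
    # Assumptions (illustrative only): 2560x1600, 32-bit color,
    # 32-bit depth/stencil, 4x MSAA.
    width, height = 2560, 1600
    bpp = 4            # bytes per pixel, 32-bit color
    samples = 4        # 4x MSAA

    color = width * height * bpp * samples   # multisampled color buffer
    depth = width * height * 4 * samples     # multisampled depth/stencil
    resolve = width * height * bpp           # resolved back buffer

    total_mb = (color + depth + resolve) / (1024 ** 2)
    print(f"Framebuffer + AA buffers alone: ~{total_mb:.0f} MB")
    # ~141 MB before any textures or extra render targets are loaded, which is
    # the kind of math behind the "512MB is a joke at high rez" complaint.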

  18. #343
    Xtreme Enthusiast
    Join Date
    Aug 2007
    Location
    Orange County, Southern California
    Posts
    583
    Quote Originally Posted by Richard Dower View Post
    I'm rather stunned that 10.1 has not been implemented, yet ATI have... there must be some logic to this. Perhaps they're holding off for another gen?
    DirectX 10.1 is basically a white-paper spec. There's no true benefit to having it over DirectX 10; it just enforces several DX implementations and serves as a "boundary guideline" for coders ("you must include this, must include that," etc.).
    EVGA X58 SLI Classified E759 Limited Edition
    Intel Core i7 Extreme 980X Gulftown six-core
    Thermalright TRUE Copper w/ 2x Noctua NF-P12s (push-pull)
    2x EVGA GeForce GTX 590 Classified [Quad-SLI]
    6GB Mushkin XP Series DDR3 1600MHz 7-8-7-20
    SilverStone Strider ST1500 1500W
    OCZ RevoDrive 3 240GB 1.0GB/s PCI-Express SSD
    Creative X-Fi Fatal1ty Professional / Logitech G51 5.1 Surround
    SilverStone Raven RV02
    Windows 7 Ultimate x64 RTM



  19. #344
    Xtreme Guru
    Join Date
    Aug 2005
    Location
    Burbank, CA
    Posts
    3,766
    Quote Originally Posted by NapalmV5 View Post
    they better do 1GB mem per gpu this time around

    512MB: total joke @ high rez

    i aint buying no freakn 512MB/gpu no more!!
    I agree!!

  20. #345
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by AuDioFreaK39 View Post
    DirectX 10.1 is basically a white-paper spec. There's no true benefit to having it over DirectX 10; it just enforces several DX implementations and serves as a "boundary guideline" for coders ("you must include this, must include that," etc.).
    Go read the white paper and then come back and say there is "no true benefit."

  21. #346
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by STaRGaZeR View Post


    If you actually read the image, it says that the extra performance is obtained through the use of 240 SPs. Nothing about them being optimized, though they have surely done some kind of optimization.

    50% more performance per shader? SURE.
    I don't think there is any PhysX, or any wide memory bus, etc. AMD already showed a 512-bit bus was a neckbreaker.
    Crunching for Comrades and the Common good of the People.

  22. #347
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Shintai View Post
    I don't think there is any PhysX, or any wide memory bus, etc. AMD already showed a 512-bit bus was a neckbreaker.
    Explain how it could possibly *hurt* performance...

  23. #348
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Quote Originally Posted by Shintai View Post
    I don't think there is any PhysX, or any wide memory bus, etc. AMD already showed a 512-bit bus was a neckbreaker.
    AMD's problem wasn't the 512-bit bus.

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  24. #349
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by Sr7 View Post
    Explain how it could possibly *hurt* performance...
    Did I say it hurt performance? Try $.

    There is a reason the 9800GTX/HD3870 etc. are 256-bit.
    Crunching for Comrades and the Common good of the People.

  25. #350
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Shintai View Post
    Did I say it hurt performance? Try $.

    There is a reason the 9800GTX/HD3870 etc. are 256-bit.
    When you say neckbreaker, you are making it sound (even if unintentionally) like it is going to hurt performance. Sorry, your words, not mine.

    You say 512-bit is expensive, but do you think GDDR5 is cheap for ATI?
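
    A quick sketch of the bandwidth math behind this bus-width vs. memory-type argument (Python; the clock figures are illustrative, roughly period-typical assumptions, not confirmed specs for these cards):

    Code:
    # Peak memory bandwidth = (bus width in bits / 8) * effective memory clock.
    def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
        return bus_bits / 8 * effective_mhz * 1e6 / 1e9  # bytes/s -> GB/s

    # 512-bit bus with ~2200 MHz effective GDDR3 (assumed GT200-class setup)
    print(f"512-bit GDDR3: {bandwidth_gb_s(512, 2200):.0f} GB/s")   # ~141 GB/s
    # 256-bit bus with ~3600 MHz effective GDDR5 (assumed RV770-class setup)
    print(f"256-bit GDDR5: {bandwidth_gb_s(256, 3600):.0f} GB/s")   # ~115 GB/s
    # A narrower bus with faster memory closes much of the gap: a wide bus costs
    # die perimeter, pins and PCB traces, while GDDR5 costs more per chip.
    # That cost-vs-bandwidth trade-off is what the last few posts are arguing about.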

