
Thread: Nvidia Fermi GF100 working desktop card spotted on Facebook

  1. #326
    Xtreme Member
    Join Date
    Oct 2009
    Location
    Santos(São Paulo), Brasil.
    Posts
    202
    Well, for people who really care about games with GPU PhysX, you can always do something like this:

    [benchmark screenshot: HD5870 + GTX275 (dedicated PhysX) vs. 2x GTX285 SLI]

    Hmm, HD5870 + GTX275 is faster than 2x GTX285 SLI in a GPU PhysX game.

    AMD Phenom II X6 1055T @ 4009MHz
    NB @ 2673MHz
    Corsair H50 + Scythe Ultra Kaze 3k
    Gigabyte GA-MA790X-UD4P
    2X2GB DDR2 OCZ Gold
    XFX Radeon HD5850 XXX @ 900MHz Core
    OCZ Agility2 60GB
    2x500GB HDD WD Blue
    250GB Samsung
    SevenTeam 620W PAF
    CoolerMaster CM690

  2. #327
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    I wrote Physics.
    Does it matter if I wrote Physics or PhysX? It's OBVIOUS we're talking about one setting.

    The point is, thousands of games (minus the dozen that use PhysX) didn't need nVidia's fancy PhysX to do rigid bodies, cloth, and blowing papers.
    In 2002, Hitman 2 had cloth simulation. Half-Life 2's floating barrels and colliding objects didn't require an nVidia card. And of course Crysis runs just as well on a Radeon as on a GeForce.

    But Batman is a TWIMTBP game. nVidia wants only their loyal gamers to have access to all the features. I'm all for acceleration of existing content, but requiring PhysX for additional game content (cloth, debris, papers, etc.) is just wrong. And if Crysis managed to model village houses collapsing from exploding barrels on a CPU, why, other than "cheating", would some crumpled paper in Batman always drag it down to 15fps with a Radeon?

    FYI: in an older post I have a link to Tom's Hardware showing roughly 30% CPU usage (i7) both WITH AND WITHOUT PhysX, with most cores idle... clearly it's artificially generating low results.
    Last edited by ***Deimos***; 11-28-2009 at 06:55 PM.
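    (Illustrative aside: a minimal sketch, not from the Tom's Hardware article, of how one could check the "most cores idle" claim for yourself by logging per-core load while the benchmark runs. It assumes the third-party psutil package.)

    Code:
    import psutil

    def log_per_core_load(duration_s=30, interval_s=1.0):
        """Print per-core utilization once per interval while the game runs."""
        for _ in range(int(duration_s / interval_s)):
            # percpu=True returns one utilization figure per logical core
            loads = psutil.cpu_percent(interval=interval_s, percpu=True)
            print(" ".join("{:5.1f}%".format(load) for load in loads))

    if __name__ == "__main__":
        log_per_core_load()

    If the physics work were spread across the CPU, every column should climb under load; one busy core with the rest near zero supports the single-threaded reading.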

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  3. #328
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by ***Deimos*** View Post
    Don't think that's the problem. The X2900XT was 80nm and something like 740MHz. The mid-range X2600XT was a blazing 800MHz... a milestone nVidia has yet to conquer 3 years later.

    AMD moved to 55nm early on, at the same time as nVidia launched the 8800GT on 65nm. Likewise, AMD was first to 40nm. So you'd think AMD would be the one with yield problems, right?

    Let's compare the last 3 generations of product launches.
    X2900XT/PRO - full chip, 600-740 clocks
    HD3850/HD3870 - full chip, 668-825 clocks
    HD4850/HD4870 - full chip, 625-850 clocks
    HD5xxx... the only one where they're not selling all full chips at launch.

    nVidia? Crippled chips galore - 8800GTS, 8800GT/9800GT, GTX260... even their mid-range, where you'd think yields wouldn't be an issue.

    Clearly, nVidia designs are either more susceptible to defects, or tight schedules are pressuring them to cut corners... literally.
    No, using partially functioning chips in low/mid-range parts is done to save money. They have also been tenacious about moving to new processes for the past few years. Why are you so focused on frequency? G80 had 24 ROPs and GT200 had 32, and they can do 4 multisamples per clock.
    Quote Originally Posted by ***Deimos*** View Post
    And power budget is clearly no problem for a 32SP 40nm nVidia chip... so how does AMD manage 850MHz on 2B transistors while nVidia manages only 600MHz, with shaders running barely as high as the original 90nm GTX?
    1. They do it on purpose, so Fermi looks good compared to the GT220/GT240.
    2. A Fermi delay will be announced and they'll ship a 40nm G92 in the meantime.
    3. The design isn't scalable. Could be a problem if even a crippled, cut-down Fermi can only run (extrapolating)... 400MHz.
    4. Simply inexperienced and incompetent engineers. It can't be the "process", since the 4770 was getting great clocks early on, before things were ironed out.
    5. Management.

    Hate to repeat it so many times, but nVidia's Fermi is way late, their DX10.1 cards are a joke, huge-die GT200s are sinking profits, and they don't even have the Intel/AMD chipset business to fall back on. If they don't get a PERFECT Fermi out, with a whole lineup down to the bottom, it could be not just NV30, but more like 3dfx time.
    I would take this as sensationalism. Half of the "facts" you have stated are wrong or just made up. They are in better shape than AMD right now, too. It's a 384-bit bus, so my estimate for the die size is less than 500mm². That would put clocks at a G80 level. GF100 is simply a monster of a chip, and there is no denying that.
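    (Illustrative aside: the kind of back-of-envelope arithmetic behind a bus-width die-size guess. The pad-ring edge length per 64-bit channel below is an assumed figure for illustration, not a published spec.)

    Code:
    # Rough die-size guess from memory bus width (illustration only).
    BUS_WIDTH_BITS = 384
    EDGE_MM_PER_64BIT_CHANNEL = 7.0  # assumed pad-ring edge length per channel

    channels = BUS_WIDTH_BITS // 64           # 384-bit bus -> 6 channels
    pad_edge_mm = channels * EDGE_MM_PER_64BIT_CHANNEL

    # If the memory interface occupies roughly half the perimeter of a
    # square die, the side length s must satisfy 2*s >= pad_edge_mm.
    side_mm = pad_edge_mm / 2.0
    print("{} channels -> ~{:.0f}mm side -> ~{:.0f}mm^2".format(
        channels, side_mm, side_mm ** 2))
    # -> 6 channels -> ~21mm side -> ~441mm^2, i.e. under 500mm^2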

  4. #329
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by ***Deimos*** View Post
    I wrote Physics.
    Does it matter if I wrote Physics or PhysX? It's OBVIOUS we're talking about one setting.

    The point is, thousands of games (minus the dozen that use PhysX) didn't need nVidia's fancy PhysX to do rigid bodies, cloth, and blowing papers.
    In 2002, Hitman 2 had cloth simulation. Half-Life 2's floating barrels and colliding objects didn't require an nVidia card. And of course Crysis runs just as well on a Radeon as on a GeForce.

    But Batman is a TWIMTBP game. nVidia wants only their loyal gamers to have access to all the features. I'm all for acceleration of existing content, but requiring PhysX for additional game content (cloth, debris, papers, etc.) is just wrong. And if Crysis managed to model village houses collapsing from exploding barrels on a CPU, why, other than "cheating", would some crumpled paper in Batman always drag it down to 15fps with a Radeon?

    FYI: in an older post I have a link to Tom's Hardware showing roughly 30% CPU usage (i7) both WITH AND WITHOUT PhysX, with most cores idle... clearly it's artificially generating low results.
    PhysX is marketing.... nothing more.

  5. #330
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    752
    It's not a PhysX game then...

    and a lot more than 14 do it

  6. #331
    Registered User
    Join Date
    Aug 2009
    Posts
    32
    Quote Originally Posted by orangekiwii View Post
    It's not a PhysX game then...

    and a lot more than 14 do it
    Eeeeeh, no. There are only 14 GPU PhysX-accelerated titles; the rest of the titles are CPU-accelerated.

  7. #332
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    752
    I'm sorry. I thought there were a LOT more than that. My mistake.

    While I do think there's something wrong with Radeons and their performance with PhysX, I'm not sure what it is. The code itself is being run on the CPU, which has done this sort of stuff for years. It's like it's purposely coded horribly for CPUs, and then told not to work in the presence of a Radeon GPU (hence why hacks can get PhysX working with great performance alongside Radeons). It just seems anti-competitive. I don't have anything against PhysX, but it seems it's being used for 'silly' things and not to its full potential, especially since it's proprietary.
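    (Illustrative aside: a toy sketch, not PhysX's actual code, of how the same physics math can be written to use the hardware well or badly: one object at a time in a scalar loop versus one vectorized pass over all objects. It assumes NumPy.)

    Code:
    import numpy as np

    N = 100_000
    DT = 1.0 / 60.0
    GRAVITY = np.array([0.0, -9.81, 0.0], dtype=np.float32)

    pos = np.zeros((N, 3), dtype=np.float32)  # object positions
    vel = np.zeros((N, 3), dtype=np.float32)  # object velocities

    def step_scalar(pos, vel):
        """Integrate one object at a time -- the slow, per-element style."""
        for i in range(len(pos)):
            vel[i] += GRAVITY * DT
            pos[i] += vel[i] * DT

    def step_vectorized(pos, vel):
        """Integrate every object in one SIMD-friendly pass."""
        vel += GRAVITY * DT
        pos += vel * DT

    Timing the two on the same data shows a huge gap from code style alone, before any multithreading; the complaint here is that the CPU path appears to be stuck in the slow style on purpose.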

  8. #333
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by Chumbucket843 View Post
    No, using partially functioning chips in low/mid-range parts is done to save money. They have also been tenacious about moving to new processes for the past few years. Why are you so focused on frequency? G80 had 24 ROPs and GT200 had 32, and they can do 4 multisamples per clock.

    I would take this as sensationalism. Half of the "facts" you have stated are wrong or just made up. They are in better shape than AMD right now, too. It's a 384-bit bus, so my estimate for the die size is less than 500mm². That would put clocks at a G80 level. GF100 is simply a monster of a chip, and there is no denying that.
    So are you saying nVidia will be fine if Fermi isn't launched in Q1 and perfect?

    or

    Are you giving nVidia the benefit of the doubt and reassuring me that a non-perfect Fermi is impossible?

    I'm not particularly concerned about the ultra-high-end Fermi products; I'm more worried about the lack of anything other than Fermi on the roadmaps. What's gonna replace the 8800GT -> GTS250?

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  9. #334
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    640
    Quote Originally Posted by ***Deimos*** View Post
    What's gonna replace the 8800GT -> GTS250?
    Why, the 8800GT -> GTS250 -> G350 is what.

    Rebadging is your friend.

  10. #335
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    I wouldn't be surprised in the least. G92 is probably cheap as dirt now and still matches up with the HD 57xx performance-wise. It'll be a tough sell without even DX10.1 support, but since there's no word of anything faster than GT216 on the horizon, they probably have no choice but to rebrand (or just keep selling) old reliable.

  11. #336
    Xtreme Enthusiast
    Join Date
    Jun 2005
    Posts
    960
    Quote Originally Posted by Humminn55 View Post
    Why, the 8800GT -> GTS250 -> G350 is what.

    Rebadging is your friend.

    Coming soon to a store near you!
    Geforce G350!
    Features:
    CUDA!
    PhysX!

    DirectX 10.1
    TWIMTBP!
    Did we mention CUDA??!!

  12. #337
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    752
    Uh, trinibwoy, G92 isn't even really close to the 57xx series.

    That's like saying the 8800 Ultra = 4870, which is just a blatant lie.

  13. #338
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by trinibwoy View Post
    I wouldn't be surprised in the least. G92 is probably cheap as dirt now and still matches up with the HD 57xx performance-wise.
    Maybe with a 256-bit GDDR5 bus... and LN2.

    All along the watchtower the watchmen watch the eternal return.

  14. #339
    Xtreme Member
    Join Date
    Oct 2008
    Posts
    263
    Fermi better be the second coming of the 8800 or I will be giving up on nVidia.
    Whats up?

  15. #340
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by shoopdawoopa View Post
    Fermi better be the second coming of the 8800 or I will be giving up on nVidia.
    Think they'll miss you? (No offense.)
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  16. #341
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Even if Fermi sucks for gaming, so what? It'll only take nVidia 6-12 months to get back on its feet, and they have a big enough gobbly fat belly to make it through that time easily without going bankrupt.

    I really hope Fermi will deliver... I don't even want to think about what will happen if they don't...
    ATI prices will get out of control, nVidia will go crazy PR-wise, trying to make up for bad performance with more PR; they'll go for lots of perf tweaks again and be even more aggressive in trying to lock ATI out of the games they support during development... it would get real ugly...

  17. #342
    Xtreme Mentor
    Join Date
    Jul 2008
    Location
    Shimla , India
    Posts
    2,631
    Quote Originally Posted by saaya View Post
    Even if Fermi sucks for gaming, so what? It'll only take nVidia 6-12 months to get back on its feet, and they have a big enough gobbly fat belly to make it through that time easily without going bankrupt.

    I really hope Fermi will deliver... I don't even want to think about what will happen if they don't...
    ATI prices will get out of control, nVidia will go crazy PR-wise, trying to make up for bad performance with more PR; they'll go for lots of perf tweaks again and be even more aggressive in trying to lock ATI out of the games they support during development... it would get real ugly...
    If they're too aggressive in locking ATI out, AMD may file charges (a.k.a. lawsuits), and nVidia would be in trouble, since it's not exactly trustworthy given its past ("3dfx").

    The PR department at nVidia is as good as the legal department at Apple.
    Coming Soon

  18. #343
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Quote Originally Posted by saaya View Post
    Even if Fermi sucks for gaming, so what? It'll only take nVidia 6-12 months to get back on its feet, and they have a big enough gobbly fat belly to make it through that time easily without going bankrupt.

    I really hope Fermi will deliver... I don't even want to think about what will happen if they don't...
    ATI prices will get out of control, nVidia will go crazy PR-wise, trying to make up for bad performance with more PR; they'll go for lots of perf tweaks again and be even more aggressive in trying to lock ATI out of the games they support during development... it would get real ugly...
    QFT
    ░█▀▀ ░█▀█ ░█ ░█▀▀ ░░█▀▀ ░█▀█ ░█ ░█ ░░░
    ░█▀▀ ░█▀▀ ░█ ░█ ░░░░█▀▀ ░█▀█ ░█ ░█ ░░░
    ░▀▀▀ ░▀ ░░░▀ ░▀▀▀ ░░▀ ░░░▀░▀ ░▀ ░▀▀▀ ░

  19. #344
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by orangekiwii View Post
    Uh, trinibwoy, G92 isn't even really close to the 57xx series. That's like saying the 8800 Ultra = 4870, which is just a blatant lie.
    I think you need to read some 5750 reviews.

  20. #345
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    Quote Originally Posted by saaya View Post
    Even if Fermi sucks for gaming, so what? It'll only take nVidia 6-12 months to get back on its feet, and they have a big enough gobbly fat belly to make it through that time easily without going bankrupt.
    Exactly. Maybe big green has to learn the hard way that bigger is not always better.
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  21. #346
    Xtreme Member
    Join Date
    Oct 2009
    Location
    Bucharest, Romania
    Posts
    381
    It matches the 5750, but it can't match the 5770, and it can't match the 5750's and 5770's power consumption, OC potential, features, etc...

  22. #347
    Xtreme Addict
    Join Date
    Jun 2005
    Posts
    1,095
    Quote Originally Posted by ***Deimos*** View Post
    And if Crysis managed to model village houses collapsing from exploding barrels on a CPU, why, other than "cheating", would some crumpled paper in Batman always drag it down to 15fps with a Radeon?
    Maybe that's why even the most expensive PCs at the time couldn't run Crysis at a stable 60+ FPS on enthusiast settings. Have you heard the phrase "Can this run Crysis?" Wouldn't you like to be able to run this game at 100FPS with all the eye candy enabled just by adding a $60 secondary card?

    Look, I totally understand all this "nVidia is evil" stuff and the frustration caused by their business practices, but the bottom line is: they invented the technology, and they can use it however they want. You don't like how they market their technology? The solution is simple: just don't purchase their cards or the games that only support their technology. Let your wallet do the talking. Even this thread, 6 pages of nonsense speculation over a stupid picture on Facebook, is nothing but fuel for the nVidia hype machine. Besides, AMD has DirectX 11 cards, and some games will expose more features to ATI cards than to nVidia cards (like tessellation). Just buy a 5870 and enjoy your games.

  23. #348
    Banned
    Join Date
    Jul 2008
    Posts
    162
    nVidia invented PhysX?

  24. #349
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by postumus View Post
    nVidia invented PhysX?
    nVidia bought Ageia, which invented PhysX and the first PPU card.

    But you already knew that.
    Are we there yet?

  25. #350
    Banned
    Join Date
    Jul 2008
    Posts
    162
    Well, I think the PPU cards floating around out there are part of the gripe: people purchased them and used them alongside their ATI GPUs for years, and now they can't. The ATI GPU + Ageia PPU combination must have been the QA and support nightmare that drove Ageia to auction itself off.

