Page 18 of 20 (Results 426 to 450 of 478)

Thread: Nvidia Fermi GF100 working desktop card spotted on Facebook

  1. #426
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    San Diego, CA
    Posts
    1,062
    Quote Originally Posted by highoctane View Post
    If any of that proves to be true, things are definitely going to get interesting. While this piques my interest, I'm still waiting for some solid numbers.
    I seriously doubt it though, not the performance part but the price part. But with all eyes on Fermi now, they can pretty much make up anything.

    CPU: Core i7-2600K@4.8Ghz Mobo: Asus Sabertooth P67 Case: Corsair 700D w/ 800D window
    CPU Cooler:
    Corsair H70 w/ 2 GTs AP-15 GPU: 2xGigabyte GTX 670 WindForce OC SLI
    RAM: 2x8GB G.Skill Ripjaws PSU: Corsair AX850W Sound card: Asus Xonar DX + Fiio E9
    HDD:
    Crucial M4 128GB + 4TB HDD Display: 3x30" Dell UltraSharp 3007WFP-HC
    Speakers: Logitech Z-5500 Headphone: Sennheiser HD650

  2. #427
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by DarthBeavis View Post
    I am sorry but I disagree. When I was shown the demo I did not accept the proposal at face value (I am a former software engineer). The first thing I did was check the CPU utilization and monitor it during the demo... and it verified the claims being made. There is no reason applications cannot be coded to take advantage of this; of course, that is up to the application developers. I do understand your skepticism though.
    Unfortunately, there are reasons. First, there's the question of the hardware itself. I assume the demo you're talking about was running on CUDA, so the real problem is not the CPU but being able to run the physics on top of the graphical load on the GPU. Demos tend to be graphically simpler than games, and even if not, you will always be cutting into your graphical budget to include physics. And graphics are what show up in screenshots...

    Then there's the question of how much of the target audience can run that code. Probably only people with a high-end graphics card (a much smaller audience among gamers than you might think). Then, if it's done in CUDA, cut that in half (no ATi compatibility). That gives numbers low enough that it isn't profitable to invest resources in it. Honestly, if you were a developer, would you invest your resources in something so few people could see, or in something more widely usable?

    Someone could argue that some graphical settings are only usable with high-end hardware. Well, graphics are special in one way: games are sold on screenshots, and the graphics are the screenshots. So even if not everyone can take advantage of the graphics, the developers get a return on their effort from everyone...

    Notice that even though PhysX is the most widely used physics library, only a few games have any kind of CUDA-accelerated effects (basically Batman and Mirror's Edge, if we don't count the laughable falling leaves in Sacred 2 or the extra out-of-main-game levels in UT3). There's a reason for that.

    Maybe you're right that it's the future of videogames, maybe not (I'm somewhat skeptical that it's a good idea to transfer workload from the CPU to the GPU in a field of software that has been GPU-bottlenecked for years now, but I don't think it's impossible). Either way, I don't think it will happen anytime soon (not within GF100's lifetime anyway, assuming it finally ships in 2010 or so). And when it does, it will be via some widely supported GPGPU standard (be it OpenCL, DirectCompute, or whatever), though I think most of those are more immature than CUDA right now.
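    For what it's worth, the kind of CPU-utilization sanity check DarthBeavis describes in the quote can be sketched with nothing but the standard library (a rough, hypothetical sketch; the actual demo tooling is unknown). The idea: if the physics really runs on the GPU, a CPU-side measurement of the process should show it mostly idle.

    ```python
    import time

    def cpu_fraction(work, *args):
        """Run `work` and return the fraction of wall-clock time
        this process spent on the CPU while it ran."""
        wall0 = time.perf_counter()
        cpu0 = time.process_time()
        work(*args)
        wall = time.perf_counter() - wall0
        cpu = time.process_time() - cpu0
        return cpu / wall if wall > 0 else 0.0

    def busy_loop(n):
        # Stand-in for a CPU-bound physics step.
        total = 0
        for i in range(n):
            total += i * i
        return total

    def idle_wait(seconds):
        # Stand-in for work that has been offloaded elsewhere (e.g. the GPU).
        time.sleep(seconds)

    # A CPU-bound loop should keep the CPU busy; waiting should not.
    print(cpu_fraction(busy_loop, 2_000_000))  # close to 1.0
    print(cpu_fraction(idle_wait, 0.2))        # close to 0.0
    ```

    The same comparison is what you would expect to see in a task monitor: high CPU with software physics, low CPU when the GPU carries the load.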

  3. #428
    Xtreme Enthusiast
    Join Date
    Jun 2005
    Posts
    960
    "GeForce GTX 380 will be 15% faster than dual GPU Radeon HD 5970."
    Haha, yeah.... they wish.

    Quote Originally Posted by shoopdawoopa View Post
    Source?
    Your nearest Nvidia PR guy.

    Quote Originally Posted by Teemax View Post
    I call it BS.
    It very likely is.

  4. #429
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
    Quote Originally Posted by iTravis View Post
    "GeForce GTX 380 will be 15% faster than dual GPU Radeon HD 5970."
    "In terms of performance GTX 360 will sit between HD 5870 and dual GPU HD 5970."

    Word on MSRP for GTX 380 is $499 and GTX 360 is $379; if that's true then count me in.
    I just got new info: Nvidia decided to release a GTX 370, which will also be faster than the 5970 (they didn't say by how much). MSRP will be $459, and the great thing about this new one is that it will have only a single-slot cooler.

  5. #430
    Xtreme Member
    Join Date
    Oct 2004
    Posts
    171
    My source (Santa) told me that everybody should write his wishes to this thread.

    Edit: The GTX385 will cost $99, beat a 5970 by 27.25% and have passive single-slot cooling.
    Last edited by mibo; 12-09-2009 at 04:24 AM.

  6. #431
    Xtreme Mentor
    Join Date
    Jul 2008
    Location
    Shimla , India
    Posts
    2,631
    You're all making fun of Fermi, but come on guys, it has the potential to be either a series 5 or a series 6....

    Series 5 was beaten black and blue by ATi, and Nvidia took its revenge with series 6, which was quite a different architecture and included hyped games like Doom 3 in the bundle....

    If the GTX 360 is on par with the 5870 it will be a huge accomplishment, but (and it's a huge but) if Nvidia does not fund DX11 games it may well end up with a weak DX11 track record, and we will again see something like the 5870's performance coming close to the GTX 380...
    Coming Soon

  7. #432
    Xtreme Member
    Join Date
    Apr 2008
    Posts
    239
    Quote Originally Posted by ajaidev View Post
    If GTX360 is on par with 5870 it will be a huge accomplishment
    Wouldn't that put nVidia in the same situation as last gen, where in terms of performance the GTX 260 and 4870 were basically equal?

  8. #433
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by AKM View Post
    Wouldn't that put nVidia in the same situation as last gen, where in terms of performance the GTX 260 and 4870 were basically equal?
    Well no, you said it yourself: the 260 was Nvidia's mainstream part, the 4870 was ATI's fastest.

    If the 360 is as fast as the 5870, it means Nvidia's fastest is as fast as ATI's fastest.

  9. #434
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by iTravis View Post
    Why would Nvidia make one chip have different L1 and L2 caches than the other? What happens to the 380s with bad yields? Will they just trash them because they can't down-bin them into 360s when the two chips are completely different?

  10. #435
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by Manicdan View Post
    Why would Nvidia make one chip have different L1 and L2 caches than the other? What happens to the 380s with bad yields? Will they just trash them because they can't down-bin them into 360s when the two chips are completely different?
    Considering how common harvesting due to cache defects has been with CPUs, I would imagine they need some way to downgrade otherwise-good dies with bad cache; not to mention the available cache could also scale with the number of shaders.
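    The down-binning idea above can be illustrated with a toy classifier (purely hypothetical thresholds and bin names; real harvesting rules are not public):

    ```python
    # Toy model of die harvesting: a die with too many defective shader
    # clusters or cache blocks is sold as a lower bin instead of scrapped.
    def bin_die(good_shaders, good_cache_kb):
        if good_shaders >= 512 and good_cache_kb >= 768:
            return "GTX 380"   # fully working die
        if good_shaders >= 448 and good_cache_kb >= 640:
            return "GTX 360"   # partially disabled die, sold cheaper
        return "scrap"         # too many defects to salvage

    print(bin_die(512, 768))  # "GTX 380"
    print(bin_die(480, 704))  # "GTX 360"
    print(bin_die(400, 512))  # "scrap"
    ```

    The point of the post stands either way: if the 360 and 380 really used different cache configurations rather than a fused-off version of one die, this salvage path would disappear.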
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  11. #436
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    Quote Originally Posted by Manicdan View Post
    Why would Nvidia make one chip have different L1 and L2 caches than the other? What happens to the 380s with bad yields? Will they just trash them because they can't down-bin them into 360s when the two chips are completely different?
    FFS, why on earth has nVidia pulled some really random RAM sizes out of Paul Daniels' magic hat? I was hoping for the 360 to have 1GB and the 380 to have 2GB!

    I really do need a 2GB+ Video card
    Stop looking at the walls, look out the window

  12. #437
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    Quote Originally Posted by highoctane View Post
    If any of that proves to be true, things are definitely going to get interesting. While this piques my interest, I'm still waiting for some solid numbers.
    There are no numbers because no cards exist....


    there is always next year...
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  13. #438
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    449
    Quote Originally Posted by iTravis View Post
    "GeForce GTX 380 will be 15% faster than dual GPU Radeon HD 5970."
    "In terms of performance GTX 360 will sit between HD 5870 and dual GPU HD 5970."

    Word on MSRP for GTX 380 is $499 and GTX 360 is $379; if that's true then count me in.
    It's over!

    ATI is finished!
    --lapped Q9650 #L828A446 @ 4.608, 1.45V bios, 1.425V load.
    -- NH-D14 2x Delta AFB1212SHE push/pull and 110 cfm fan -- Coollaboratory Liquid PRO
    -- Gigabyte EP45-UD3P ( F10 ) - G.Skill 4x2Gb 9600 PI @ 1221 5-5-5-15, PL8, 2.1V
    - GTX 480 ( 875/1750/928)
    - HAF 932 - Antec TPQ 1200 -- Crucial C300 128Gb boot --
    Primary Monitor - Samsung T260

  14. #439
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    I was thinking about posting this earlier, but I guess now is the time

    I love how you all make jokes and laugh about Fermi, the delay and its performance (yeah, I've seen lots of people saying it's going to stink, it won't even come close to a 5870, etc.) and still cry like kids asking nVIDIA to release it ASAP...
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  15. #440
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    Quote Originally Posted by LiquidReactor View Post
    It's over!

    ATI is finished!
    How so?
    ATi still produces leading-edge, cutting-edge GPUs... albeit availability is awful and the launch here in the UK has been all but a paper launch... shipments ARE coming in.

    You can now get a Radeon 5870... for £400-odd, and one e-tailer even has TWO Radeon 5970s in stock for £525!!!!

    After Christmas the prices will come down to common-sense levels (if the cards are stocked properly); by that I mean the Radeon 5870 at £250 and the 5970 at £350.

    John

  16. #441
    Xtreme Member
    Join Date
    Jan 2008
    Location
    Shin Osaka, Japan
    Posts
    152
    Quote Originally Posted by saaya View Post
    Well no, you said it yourself: the 260 was Nvidia's mainstream part, the 4870 was ATI's fastest.

    If the 360 is as fast as the 5870, it means Nvidia's fastest is as fast as ATI's fastest.
    I disagree. Nvidia will be in trouble again.

    The issue of differing die sizes, memory bus widths and PCB costs would favour AMD/ATI again.

    Plus, the 5970 is already here, which puts quite a lot of pressure on the "GTX 380". Also, AMD/ATI may pull out a tweaked Cypress as a "5890" or something like that again...
    Quote Originally Posted by flippin_waffles on Intel's 32nm process and new process nodes
    1 or 2 percent of total volume like intel likes to do. And with the trouble intel seems to be having with they're attempt, it [32nm] doesn't look like a very mature process.
    AMD has always been quicker to a mature process and crossover point, so by the time intel gets their issues and volume sorted out, AMD won't be very far behind at all.

  17. #442
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    The 5870, already here?! Only just, highly overpriced, and with NO 2GB model... hardly what I would call here, but here nonetheless.
    What do you mean nVidia would be in trouble again?!?
    I agree that they are in trouble NOW (by not having a new DirectX 11 card on the table), but the last time they were in trouble was the notorious GeForce FX series.
    John
    Last edited by JohnZS; 12-09-2009 at 08:19 AM.

  18. #443
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    I was expecting Nvidia to aim a little smaller this time around, so they could have a single card that uses 180W and a dual card right at 300W, but instead they went massive again. And have we even heard of a DX11 card from Nvidia that will cost less than $200? Those are the ones that will sell the most, and given how games are still being built for consoles, one would be strong enough to really enjoy a game on any $150 monitor.

  19. #444
    Xtreme Member
    Join Date
    Jan 2008
    Location
    Shin Osaka, Japan
    Posts
    152
    Quote Originally Posted by JohnZS View Post
    The 5870, already here?! Only just, highly overpriced, and with NO 2GB model... hardly what I would call here, but here nonetheless.
    What do you mean nVidia would be in trouble again?!?
    I agree that they are in trouble NOW (by not having a new DirectX 11 card on the table), but the last time they were in trouble was the notorious GeForce FX series.
    John
    G200 (or rather, RV770) was trouble for Nvidia. Call a spade a spade.

    Massive price cuts after the first week, subsidizing AIB partners, and damaged profit margins, while AMD/ATI's market share increased in both the desktop and notebook sectors.

    The high cost of producing G200 compared to RV770 was what made Nvidia's pricing and product-placement strategy inflexible.

    Also, the Radeon 5xxx series supply shortages are blown way out of proportion. I remember the G80 facing similar problems years ago. It's just pent-up demand in the face of a new OS and a new DirectX revision.

  20. #445
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Quote Originally Posted by Wesker View Post
    I disagree. Nvidia will be in trouble again.

    The issue of differing die sizes, memory bus widths and PCB costs would favour AMD/ATI again.

    Plus, the 5970 is already here, which puts quite a lot of pressure on the "GTX 380". Also, AMD/ATI may pull out a tweaked Cypress as a "5890" or something like that again...
    I wouldn't consider the 5970 to be putting "a lot" of pressure on it if prices keep going up. Prices have reached $700+ (I haven't seen such a price tag since the G80 Ultra), and without knowing how the "GTX 380" will perform, it's hard to find many people willing to pay $700 for a video card these days (since it gets replaced in a short time compared to other PC hardware). Nvidia is probably working on a GX2 version as well, which could be its trump card...
    ░█▀▀ ░█▀█ ░█ ░█▀▀ ░░█▀▀ ░█▀█ ░█ ░█ ░░░
    ░█▀▀ ░█▀▀ ░█ ░█ ░░░░█▀▀ ░█▀█ ░█ ░█ ░░░
    ░▀▀▀ ░▀ ░░░▀ ░▀▀▀ ░░▀ ░░░▀░▀ ░▀ ░▀▀▀ ░

  21. #446
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by Wesker View Post
    Massive price cuts after the first week, subsidizing AIB partners, and damaged profit margins, while AMD/ATI's market share increased in both the desktop and notebook sectors.

    The high cost of producing G200 compared to RV770 was what made Nvidia's pricing and product-placement strategy inflexible.
    Inflexible? That's just pulling things out of your crack. They managed to shrink GT200 to 55nm, drop prices to competitive levels, and still maintain margins and performance up until now.

    Just a refresher on Q3 results:

  22. #447
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
    Quote Originally Posted by JohnZS View Post
    The 5870, already here?! Only just, highly overpriced, and with NO 2GB model... hardly what I would call here, but here nonetheless.
    What do you mean nVidia would be in trouble again?!?
    I agree that they are in trouble NOW (by not having a new DirectX 11 card on the table), but the last time they were in trouble was the notorious GeForce FX series.
    John
    Yeah, overpriced, sure, when the GTX 285 is selling for nearly the same price. And what are you going to do with a 2GB model? How many games demand that much?

  23. #448
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by highoctane View Post
    Inflexible? That's just pulling things out of your crack. They managed to shrink GT200 to 55nm, drop prices to competitive levels, and still maintain margins and performance up until now.

    Just a refresher on Q3 results:
    Nvidia does more than sell GT280s. Just because they were making $5 on every 280 and 260 sold does not mean they ever paid off the cost of designing them. If they had been able to keep prices only a few bucks higher, they could have made a much higher margin: if they are working at a 5% margin on a $200 chip, then to get roughly 10% margins (double the profit) they only need to sell it for $210. Again, that's an example scenario, and I have no idea how much they either hate or love how the GT200 series went, but I think we all know ATI loved how well the 4000 series worked out for them.
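    The back-of-the-envelope margin math above works out like this (illustrative numbers only, as in the post; nobody outside Nvidia knows the real figures):

    ```python
    # Hypothetical GT200-style economics: $200 selling price at a 5% margin.
    price = 200.0
    margin = 0.05                  # assumed margin, per the post
    profit = price * margin        # $10 per chip
    cost = price - profit          # $190 to produce

    # Raise the price by just $10: profit doubles, margin nearly doubles.
    new_price = 210.0
    new_profit = new_price - cost  # $20 per chip
    new_margin = new_profit / new_price
    print(new_profit)              # 20.0
    print(round(new_margin, 3))    # 0.095, i.e. roughly 10%
    ```

    The asymmetry is the point: when margins are this thin, a small price bump has an outsized effect on profit, which is why being forced to cut prices against RV770 hurt.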

  24. #449
    Xtreme Enthusiast
    Join Date
    Jun 2006
    Location
    Space
    Posts
    769
    Quote Originally Posted by highoctane View Post
    Inflexible? That's just pulling things out of your crack. They managed to shrink GT200 to 55nm, drop prices to competitive levels, and still maintain margins and performance up until now.

    Just a refresher on Q3 results:
    If that trend continues, going off that table, ATI will overtake them in the fourth quarter. Nvidia needs to do something to stop the decline, because at this rate they will be the next Matrox.

  25. #450
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by highoctane View Post
    Inflexible? That's just pulling things out of your crack. They managed to shrink GT200 to 55nm, drop prices to competitive levels, and still maintain margins and performance up until now.
    Ummm... he was specifically talking about the gaming card market, not the professional/workstation market, which is what was keeping Nvidia afloat for most of the G200/RV770/RV870 time period.

    G200's BOM was around 2x RV770's. The gap between GF100 and RV870 is going to be even larger. I will be very surprised if we don't see 3-4 cut-down variants of GF100, not at launch but eventually.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.
