Page 11 of 42
Results 251 to 275 of 1035

Thread: The official GT300/Fermi Thread

  1. #251
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    http://www.youtube.com/watch?v=r17UOMZJbGs

    Fermi shows off physx... That's INSANE.

    128 thousand particles simulated and rendered in real-time on the GPU. SPH fluid simulation with surface tension effects.

    That'd make a cpu choke instantly...
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty
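The demo's actual solver isn't public, but a minimal sketch of the density estimate an SPH fluid simulation like that is built on (the poly6 kernel is the common Müller-style choice; function names are illustrative, and a real GPU version would use a spatial hash rather than this brute-force neighbour loop):

```python
import math

def poly6_kernel(r, h):
    """Poly6 smoothing kernel, a standard SPH density kernel.
    r: distance between particles, h: smoothing radius."""
    if r >= h:
        return 0.0
    return (315.0 / (64.0 * math.pi * h ** 9)) * (h * h - r * r) ** 3

def density_at(p, particles, mass, h):
    """Estimate fluid density at point p by summing
    kernel-weighted contributions from all particles."""
    rho = 0.0
    for q in particles:
        rho += mass * poly6_kernel(math.dist(p, q), h)
    return rho
```

Doing this (plus pressure, viscosity, and surface-tension forces) for 128k particles every frame is exactly the kind of embarrassingly parallel work a GPU eats and a CPU chokes on.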

  2. #252
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    Quote Originally Posted by AVB View Post
    pic for ya




    http://rs648.rapidshare.com/files/28...Key_Visual.jpg ( res. 6316 x 3240)

    Fermi at 1.4-1.6x of the GTX 295.

    (1.6-1.8x of the GTX 285 is not too much.)
    So a bit better than the 5870? (As in quite a bit better single-GPU-wise, but it'll get owned by the 5870 X2, and unless NVIDIA shrinks to 32nm we're not going to see a dual-GF100 card.)

    Looks like it's going to be GT200/HD4800 all over again.
    Last edited by Helloworld_98; 10-02-2009 at 09:13 AM.

  3. #253
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Karachi, Pakistan
    Posts
    389
    Quote Originally Posted by DilTech View Post
    http://www.youtube.com/watch?v=r17UOMZJbGs

    Fermi shows off physx... That's INSANE.

    128 thousand particles simulated and rendered in real-time on the GPU. SPH fluid simulation with surface tension effects.

    That'd make a cpu choke instantly...
    It wasn't shown on Fermi; it was shown on G200.

  4. #254
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Ah, I see... read it on another forum and saw it wasn't in this thread... got a link to show it was on G200, considering it's listed as Next Generation GPU Fluids? Not saying you're wrong (in fact, I hope you're not, because if that was on the G200, then that means Fermi is capable of even more than that!)
    Last edited by DilTech; 10-02-2009 at 09:15 AM.

  5. #255
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by DilTech View Post
    http://www.youtube.com/watch?v=r17UOMZJbGs

    Fermi shows off physx... That's INSANE.

    128 thousand particles simulated and rendered in real-time on the GPU. SPH fluid simulation with surface tension effects.

    That'd make a cpu choke instantly...
    That is pretty sick. I somehow think ray tracing is overrated, given the lighting effects that has.

    Reminds me of the nice 50-page PDF on how to render perfect-looking hair. If this is just about raw processing power, imagine if they found an efficient way to do this. (I sure don't care about water, but imagine what kind of blood and gore we can have with GPU physics.)

  6. #256
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Karachi, Pakistan
    Posts
    389
    Quote Originally Posted by DilTech View Post
    Also, Helloworld, if 1.4-1.6 of the GTX-295 is true, that's quite a bit higher than the 5870, not a little notch.
    Yup, it puts it in the 5870 X2's category, only a bit slower, but that doesn't matter because it's a single GPU.

  7. #257
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    Quote Originally Posted by DilTech View Post
    Also, Helloworld, if 1.4-1.6 of the GTX-295 is true, that's quite a bit higher than the 5870, not a little notch.
    I edited my post, and they need to add a bit more leeway to the GTX 285 and GTX 295 performance estimates versus GF100.

    Also, by the time the 5800 drivers have matured, I expect 5870 performance to be about 1.2x the 295, but we'll have to see.

  8. #258
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Couldn't resist, guys! Whether the Fermi card is real or fake, I thought I would break out the old 3dfx V6K pics. *teehee*

    (info from dodge garage)




    The VSA-100 family was introduced at the Comdex '99 computer hardware show on November 17, 1999. Shown at the show were three boards: a 4500, a 5500, and the 6000 shown above. None of the boards were functional items; they were just static displays for the press.



    Only two Comdex V6Ks were known to exist, for "the dog and pony show" as one engineer put it. They were strictly for photo ops till the real thing could be produced. Shown above is a rare picture from 3dfx showing a Comdex '99 V6K with what look like uncut V3-3000 heatsinks.



    The Comdex '99 board was a dummy PCB populated with components but with no traces or connections; even the AGP connector contacts just stop.

    The Comdex '99 boards had 4 Voodoo 3-3500 chips! Hence the VSA-100 decals covering the chips during the show.

    Since nVidia bought 3dfx, I wouldn't be surprised if they dug up the November 1999 "how to produce a mockup board for a GPU conference" archive.

    All jokes aside, I really don't care if it's fake or not. As long as GT300 is very fast and arrives around Xmas time, I'm happy.
    Last edited by Tim; 10-02-2009 at 09:23 AM.

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  9. #259
    Xtreme Member
    Join Date
    Jun 2008
    Posts
    208
    Quote Originally Posted by DilTech View Post
    Charlie made one big error, his statement that A1 is the first revision...

    Anyone who knows a thing or two about chips knows A0 is the first revision. Anyone remember the Q6600 update, the G0(that's g zero)? If Charlie was correct in that assumption then that famous stepping would have been G1, not G0. These aren't counting the prototype samples which are just to test the features without making the full blown chip.

    As such, kind of blows his argument clean out of the water in that regard, doesn't it?
    DilTech, try to be less of an Nvidia fanboi, please... Pretty much everything you post is negative on ATI in one way or another, or is defending Nvidia, regardless of what you currently own.

    I'm certainly not loyal to either camp. I've used cards from both manufacturers over the years. Currently sporting ATI 4850s and a 4870X2. The main reason for that is Nvidia's fault: the 6XX and 7XX mobos suck, and I like dual-card solutions. Previous cards were an 8800 GTX and a 7800 GTX (IIRC).

    I want Nvidia to succeed! The technology they're talking about is great on paper. The ideas they are showing look really solid. Whether it can make the real-world jump to a GPU, and still make money, remains to be seen. When I see sketchy photos of COMPLETELY fake hardware, mock-up or not, I start to have doubts. When you don't even care enough to make the power plugs go in the right place, lol. FAIL...

    Quote Originally Posted by Farinorco View Post


    The first points about the serial numbers and dates on the IHS seem good (excepting the absolutely idiotic one about the "7") about defending their own previous writings.

    But even if absolutely pointless (well, they have shown a "fake" card to decorate the presentation, who cares? It's not like this would mean anything) what I have enjoyed more of the article is the part about the "fake" card. Hey, the way it's written it is hilarious... "Those lead to... well, not the power connector", "The 6-pin connector, on the other hand, lines up with, umm, nothing", "Except glue. Notice the connector is black and the hole below it shows white. The only real question now is, Elmers or glue stick"... by here I was
    I felt the same way. Nvidia doesn't even take the time to make a real fake. That's what is so funny.

  10. #260
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    You edited after my reply, and as such I removed it from my reply.

    As for how drivers will improve things, both brands should see big improvements as the life cycle runs its course. No one can ever be sure how that will pan out for either company, which is why I'm not making predictions on where the performance difference will end up at the end of this generation.

    Kaldor, I'm not saying the mock-up wasn't fake. On the contrary, I agree with that part. Mock-ups are very common (look at the Hori Tekken wireless stick images all over the net; it's a 3D-rendered mock-up), but NVidia saying it was the real deal IS wrong, and I DO agree with that statement. What I'm saying is that Charlie was outright wrong about A1 being the first silicon revision when it's A0. I fail to see how that's being a fanboy.

    Also, when 75% of what needs to be corrected in this section is pro-ATi or anti-NVidia, anyone knowledgeable is going to look like a fanboy... no?
    Last edited by DilTech; 10-02-2009 at 09:27 AM.

  11. #261
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    Quote Originally Posted by ubuntu83 View Post
    Yup it puts it in the 5870X2's category only a bit slower but that doesn't matter because it's a single GPU.
    However, they won't/can't make a dual-GPU card due to the ridiculously high power usage of GF100. A dual-GPU card would be 400W at least; for a GTX 295 equivalent we'd be talking 450W+, and Nvidia aren't stupid enough to make a card which requires 3x 8-pin connectors.
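For context, the connector math here follows from the PCIe power budget: 75 W from the slot and 150 W per 8-pin auxiliary connector. A quick sketch (the function name is mine, and a real board would also leave thermal and electrical headroom):

```python
SLOT_W = 75        # a PCIe x16 slot supplies up to 75 W
EIGHT_PIN_W = 150  # each 8-pin PCIe connector is rated for 150 W

def connectors_needed(board_watts):
    """Smallest number of 8-pin connectors covering the
    draw beyond what the slot itself can supply."""
    remaining = board_watts - SLOT_W
    count = 0
    while remaining > 0:
        remaining -= EIGHT_PIN_W
        count += 1
    return count
```

So a hypothetical 450 W dual-GF100 card would indeed need three 8-pin plugs, since two only cover 75 + 2x150 = 375 W.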

  12. #262
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    http://www.fudzilla.com/content/view/15798/1/
    For those who missed it, yes it WAS FAKE.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  13. #263
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by zalbard View Post
    http://www.fudzilla.com/content/view/15798/1/
    For those who missed it, yes it WAS FAKE.
    From the article:

    Charlie was surely right about this one here.
    And they left out that nubs like us knew this 24 hours ago.

  14. #264
    Xtreme Member
    Join Date
    Jun 2008
    Posts
    208
    Quote Originally Posted by DilTech View Post
    /snip

    Kaldor, I'm not saying the mock-up wasn't fake. On the contrary I agree with that part. Mock-ups are very common(look at the hori tekken wireless stick images all over the net, it's a 3d rendered mock-up), but NVidia saying it was the real deal IS wrong, and I DO agree with that statement. What I'm saying is that charlie was out-right wrong on A1 being the first silicon revision when it's A0. I fail to see how that's being a fanboy.

    Also, when 75% of what needs to be corrected in this section is Pro-ATi or Anti-NVidia anyone knowledgeable is going to look like a fanboy...no?
    My issue is Nvidia passing a fake off as real; dead wrong, as we both agree. As far as the silicon goes, you could conceivably put any stinking number on it that you want, hehe. Hmm: piece of silicon, IHS, laser-etched whatever you want it to say. The fact of the matter is a lot of enthusiasts, myself included, do not trust Nvidia after all the bull they have pulled in the last couple of years. Renames, meltdowns, bad chips, poor mobos, shady crap in general. They should at least respect the community's ability to sniff them out when they are faking it. The entire thing smacks of Nvidia's "we're better than you" attitude, and people are getting sick of it.

    Even so, I still want Nvidia to do a good job this time. They started from scratch, as the whitepapers show, and I want this thing to be the part that blows everything they have done previously out of the water, i.e. the next G80. I loved my 8800 GTX but hated Nvidia's mobos, so I went to ATI so I could run dual cards on a good mobo.

  15. #265
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by zalbard View Post
    http://www.fudzilla.com/content/view/15798/1/
    For those who missed it, yes it WAS FAKE.
    If that article (Fudzilla's) is accurate, then the video I posted WAS done on the GT300... Either way, their next-gen PhysX demo was extremely impressive.

    Of course, it's not like I've ever been one to believe Charlie or Fuad in the first place. That much salt is extremely bad for you.

  16. #266
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Karachi, Pakistan
    Posts
    389
    Quote Originally Posted by DilTech View Post
    If that article (Fudzilla's) is accurate, then the video I posted WAS done on the GT300... Either way, their next-gen PhysX demo was extremely impressive.
    http://www.youtube.com/watch?v=iyg9HgiD8X0&feature=sub

    PCgameshardware people say it's G200.

  17. #267
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    I hope Nvidia can sell the mess out of Tesla supercomputers based on Fermi; we really need this so that revenue increases will either allow a cheaper GeForce part, or a remade Fermi-based GeForce with a smaller die size and better profitability. Looking at that die, I don't know how they are going to release a cut-down model or make it scalable to all market segments... I also hope that SemiAccurate is inaccurate about the NV Q4 / Q1 2010 roadmaps, because all he is talking about is a high-clocked 2GB G200b and a slew of 40nm G92s - which doesn't sound appealing. They need to bring out some consumer GF100 parts quickly; otherwise their competing parts are GT200b and G92.

    It seems like they're taking the high-risk long road, though, with a big bad-ass die. GF100 "GTX380 or 395" should be an awesome part. I hope it really is 1.5x a GTX 295 / 1.8x a GTX 285.
    Bring... bring the amber lamps.

  18. #268
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    Quote Originally Posted by jaredpace View Post
    I hope Nvidia can sell the mess out of Tesla supercomputers based on Fermi; we really need this so that revenue increases will either allow a cheaper GeForce part, or a remade Fermi-based GeForce with a smaller die size and better profitability. Looking at that die, I don't know how they are going to release a cut-down model or make it scalable to all market segments... I also hope that SemiAccurate is inaccurate about the NV Q4 / Q1 2010 roadmaps, because all he is talking about is a high-clocked 2GB G200b and a slew of 40nm G92s - which doesn't sound appealing. They need to bring out some consumer GF100 parts quickly; otherwise their competing parts are GT200b and G92.

    It seems like they're taking the high-risk long road, though, with a big bad-ass die. GF100 "GTX380 or 395" should be an awesome part. I hope it really is 1.5x a GTX 295 / 1.8x a GTX 285.
    Well, what you're asking for depends on Larrabee, since businesses will wait it out. If Larrabee beats the Teslas, what you're asking for probably won't happen; if the Teslas beat Larrabee, then it has a higher chance of happening.

  19. #269
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by Helloworld_98 View Post
    Well, what you're asking for depends on Larrabee, since businesses will wait it out. If Larrabee beats the Teslas, what you're asking for probably won't happen; if the Teslas beat Larrabee, then it has a higher chance of happening.
    Given how good GF100 is at DP float, I think LRB should be scared.

  20. #270
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Karachi, Pakistan
    Posts
    389
    Hmm, so Fermi is riding two boats: it's gonna compete with Evergreen in games and Larrabee in HPC.

  21. #271
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by DilTech View Post
    Charlie made one big error, his statement that A1 is the first revision...

    Anyone who knows a thing or two about chips knows A0 is the first revision. Anyone remember the Q6600 update, the G0(that's g zero)? If Charlie was correct in that assumption then that famous stepping would have been G1, not G0. These aren't counting the prototype samples which are just to test the features without making the full blown chip.

    As such, kind of blows his argument clean out of the water in that regard, doesn't it?
    That's not true... Neither ATi nor Nvidia name first silicon A0...
    This is a discussion we have had before.

    Quote Originally Posted by DilTech View Post
    I can tell you out-right that's not the case... The original NV15 chip(GF2 GTS IIRC) for OEM's was Rev A0. If you don't believe me just run a quick search thru this webpage and you'll find the info listed for you...

    http://forums.gentoo.org/viewtopic-t-819-start-125.html

    Same here, the gainward driver for that specific chip
    http://www.givemefile.net/drivers/video/gainward.html

    The reason so many think NV start at A1 is because almost never is the A0 perfect, and as such there's usually 1 to 2 revisions before it's ready for the public. As such most people never know they exist.
    Wait... you think Nvidia got 2 different batches back in less than 2 months? WTF.
    Tape-out was back in June; they received first silicon back in August (see the IHS), and that was the hot lot.
    You think they somehow got first silicon before they taped out, or you think they taped out in April?
    If you really think they start with A0, all that means is that Nvidia is in a MUCH bigger mess than even Charlie is reporting, which seems doubtful, since Charlie wouldn't miss an opportunity to bash Nvidia.
    Last edited by LordEC911; 10-02-2009 at 11:22 AM.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  22. #272
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    Quote Originally Posted by Manicdan View Post
    given how good GF100 is on DP float, i think LRB should be scared.
    I wouldn't draw conclusions yet; we haven't seen any GPGPU results for Larrabee, or pricing.

    However, even if Larrabee is slightly less powerful, I could still see businesses opting for it due to lower power usage, and it will probably be cheaper.

  23. #273
    Xtreme Member
    Join Date
    Dec 2008
    Posts
    161
    Quote Originally Posted by ubuntu83 View Post
    http://www.youtube.com/watch?v=iyg9HgiD8X0&feature=sub

    PCgameshardware people say it's G200.
    --> http://www.nvidia.com/object/gpu_tec...onference.html

    Watch the opening keynote with Jen-Hsun Huang. He says it runs on Fermi. A lot better quality than YouTube too.

    And it's really interesting.

  24. #274
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by LordEC911 View Post
    That's not true... Neither ATi nor Nvidia name first silicon A0...
    This is a discussion we have had before.


    Wait... you think Nvidia got 2 different batches back in less than 2 months? WTF.
    Tape-out was back in June; they received first silicon back in August (see the IHS), and that was the hot lot.
    You think they somehow got first silicon before they taped out, or you think they taped out in April?
    If you really think they start with A0, all that means is that Nvidia is in a MUCH bigger mess than even Charlie is reporting, which seems doubtful, since Charlie wouldn't miss an opportunity to bash Nvidia.
    http://www.xbitlabs.com/news/video/d...703090833.html

    The first working sample of a chip carries A0 revision, while companies usually launch A2 revisions commercially. It usually takes several – up to 10 – weeks to build a new chip revision, which means that it is unlikely that the G80 would be production-ready by September.
    Original tape-out in April, then A0... 10 weeks later they respin to A1, which is what they have now...

    Also, I'm positive the OEM NV15 was definitely revision A0.
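The timeline argument amounts to simple date arithmetic; a sketch of it, where the April 2009 tape-out date and the flat 10-week-per-spin cadence are the post's assumptions (xbitlabs says "up to 10 weeks"), not confirmed facts:

```python
from datetime import date, timedelta

def respin_schedule(tapeout, weeks_per_spin=10, revisions=("A0", "A1", "A2")):
    """Rough silicon timeline: each revision lands roughly
    weeks_per_spin after the previous one, starting from tape-out."""
    return {rev: tapeout + timedelta(weeks=weeks_per_spin * (i + 1))
            for i, rev in enumerate(revisions)}

schedule = respin_schedule(date(2009, 4, 1))  # hypothetical April tape-out
```

On those assumptions, A0 silicon comes back in June and the A1 respin lands around mid-August, which is at least consistent with "which is what they have now" in early October.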
    Last edited by DilTech; 10-02-2009 at 11:42 AM.

  25. #275
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by Farinorco View Post
    Yeah, but... why do you expect it to have 2x the performance (I suppose you're talking about real-world performance) if it's going to have +113% more CPs but only +50% more ROPs and +60% more memory bandwidth?

    Consider that the HD5870 is exactly double the HD4890 (+100% everything at the same clocks) except bandwidth (approx. +30%), and it's far from double the real-world performance (that's one of the most recent proofs that doubling everything doesn't mean doubling real-world performance), and NVIDIA is not even doubling processing units.

    Can they improve the performance per unit and per clock? Sure. Maybe. But by how much and why? I think it's way too soon, with the info we have, to say it's going to be 2x the real-world performance of a GTX285. I'd even say I hugely doubt it, given that they are more focused on getting the new (future?) HPC market before Intel has their Larrabee working (if that happens to be in this century).
    Explain to me how 512 shaders is not over double 240 shaders. The bandwidth increased by 50% too. The theoretical numbers are not that impressive, but you completely missed a lot of factors and posted wrong information. Nvidia also said 1.5GHz is a conservative estimate for clock speed.
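The shader-count argument can be made concrete with back-of-envelope peak-throughput math. This sketch assumes 2 flops (one fused multiply-add) per shader per clock and the rumored 1.5 GHz figure; GT200 could also dual-issue a MUL, so its real paper peak was somewhat higher, and none of this is a benchmark:

```python
def peak_gflops(shaders, shader_clock_ghz, flops_per_clock=2):
    """Theoretical peak GFLOPS assuming one fused multiply-add
    (2 flops) per shader per clock; real throughput is lower."""
    return shaders * shader_clock_ghz * flops_per_clock

gtx285_peak = peak_gflops(240, 1.476)  # GTX 285's shipping shader clock
gf100_peak = peak_gflops(512, 1.5)     # rumored "conservative" GF100 clock
```

On those assumptions GF100 lands at 1536 vs roughly 708 GFLOPS, i.e. over 2x on paper, which is exactly the point about 512 not being merely "a bit more" than 240 shaders.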

