Page 9 of 42
Results 201 to 225 of 1035

Thread: The official GT300/Fermi Thread

  1. #201
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by 003 View Post
    True. I'm referring to the people who run around like a chicken with its head cut off screaming that the GT300 is going to suck for games and it will be beaten by RV870. Honestly, in a WORST case scenario, it will be roughly twice as fast as the GTX285, which will trump a 5870 easily.
    The worst-case scenario could be way worse than that. Roughly twice as fast as a GTX285 is probably the REALISTIC scenario.

    IMO, it sounds like this will be a repeat of the last round. NV will have the fastest single chip and ATI will have a somewhat slower, but price/performance competitive offering. I don't know about the dual cards. Obviously ATI will have an x2. I imagine NV will want to release a dual GPU card to counter. But how do you cool 6B full speed transistors in 2 slots of space? They'll have to cut down and/or reduce the speed of the chips or wait for a shrink. The fastest single card halo could go either way, imo.

  2. #202
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by Solus Corvus View Post
    Roughly twice as fast as a GTX285 is probably the REALISTIC scenario.
    I'm not convinced. We'll all find out when it's released though.

    I imagine NV will want to release a dual GPU card to counter.
    Nvidia has already indicated that there will be a dual GPU version of the GT300.

    But how do you cool 6B full speed transistors in 2 slots of space?
    Transistor count is not what determines heat output; that would be the TDP, which will be similar to the GTX285's, so it really won't be much of an issue (see the power sketch below).
    The fastest single card halo could go either way, imo.
    Nvidia won't let that happen. Based on the performance of the 5870, and knowing the specs of the GT300, I believe it is pretty clear the 380 will be faster, which Nvidia has already confirmed.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.
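
    A quick aside on the transistors-versus-TDP point above: heat output follows the usual dynamic-power relation, not raw transistor count. A minimal sketch, where every number is an illustrative assumption rather than a GT300 figure:

```python
# Dynamic switching power: P ~= C_eff * V^2 * f, where C_eff is the effective
# switched capacitance (it grows with active transistor count, activity factor
# folded in), V is core voltage and f is clock frequency.
def dynamic_power_watts(c_eff_farads: float, voltage: float, freq_hz: float) -> float:
    return c_eff_farads * voltage ** 2 * freq_hz

# Hypothetical chip: ~185 W at 1.15 V and 1.4 GHz.
base = dynamic_power_watts(100e-9, 1.15, 1.4e9)

# Double the transistors (so roughly double C_eff), drop voltage and clock
# modestly, and the power budget barely moves -- which is why TDP, not
# transistor count, is what sets the cooling requirement.
doubled = dynamic_power_watts(200e-9, 0.95, 1.0e9)

print(f"base: {base:.0f} W, doubled transistors at lower V/f: {doubled:.0f} W")
```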

  3. #203
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by 003 View Post
    Nvidia has already indicated that there will be a dual GPU version of the GT300.
    Eventually, yes.
    Full spec'ed and full speed? No.
    6 months after release? Maybe, most likely longer.
    32/28nm shrink needed? Possibly.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  4. #204
    Xtreme Addict
    Join Date
    Feb 2007
    Location
    Italy
    Posts
    1,331
    Quote Originally Posted by Solus Corvus View Post
    The worst-case scenario could be way worse than that. Roughly twice as fast as a GTX285 is probably the REALISTIC scenario.
    I expect the performance to be MORE than two 285s at high res with 8xAA.

    SB Rig:
    | CPU: 2600K (L040B313T) | Cooling: H100 with 2x AP29 | Motherboard: Asrock P67 Extreme4 Gen3
    | RAM: 8GB Corsair Vengeance 1866 | Video: MSI gtx570 TF III
    | SSD: Crucial M4 128GB fw009 | HDDs: 2x GP 2TB, 2x Samsung F4 2TB
    | Audio: Cantatis Overture & Denon D7000 headphones | Case: Lian-Li T60 bench table
    | PSU: Seasonic X650 | Display: Samsung 2693HM 25,5"
    | OS: Windows7 Ultimate x64 SP1

    +Fanless Music Rig: | E5200 @0.9V

    +General surfing PC on sale | E8400 @4Ghz

  5. #205
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by LordEC911 View Post
    Eventually, yes.
    Full spec'ed and full speed? No.
    6 months after release? Maybe, most likely longer.
    32/28nm shrink needed? Possibly.
    Those are all guesses. Full spec and speed should very well be possible if TDP is similar to the GTX285 (which it should be).

    6+ months is not going to be the case IMO. There will be no need for a shrink if TDP is similar to the 285.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  6. #206
    Xtreme Addict
    Join Date
    May 2003
    Location
    Peoples Republic of Kalifornia
    Posts
    1,541
    Well... I for one am quite happy that I held onto my money and waited for the GT300 specs to come to light. I'll more than likely buy a second GTX285 if they drop in price, or just hold onto it until prices on the initial GT300 cards settle after release.

    Quote Originally Posted by Ursus View Post
    I used to be quite curious about nvidia's new products, but now that they have disabled PhysX on systems with an AMD GPU and a dedicated nvidia PhysX GPU, I will be boycotting them in whatever way I can.
    Why? Do you honestly believe that it is in nVidia's best business interests to spend millions of dollars on driver development so that their cards work smoothly as PhysX processors in conjunction with an ATi main graphics card?

    yeah, that's a great idea.

    ... spend tons of money to help out your competitor....

    "If the representatives of the people betray their constituents, there is then no resource left but in the exertion of that original right of self-defense which is paramount to all positive forms of government"
    -- Alexander Hamilton

  7. #207
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by 003 View Post
    Those are all guesses. Full spec and speed should very well be possible if TDP is similar to the GTX285 (which it should be).

    6+ months is not going to be the case IMO. There will be no need for a shrink if TDP is similar to the 285.
    Ummm... so GTX295 is two full spec'ed GTX285s? That is news to me...
    Yes, my post was guesses and speculation but no less than your posts.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  8. #208
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by 003 View Post
    I'm not convinced. We'll all find out when it's released though.
    If it were just a doubling of resources without major architecture changes, then I would say that the average performance would be LESS than 2x the previous gen. You can't expect linear scaling.

    But I'm taking into account that there are architecture changes. Most of the changes don't sound like they would make much, if any, performance difference in games. Perhaps the efficiency improvements will help make the most of the resources available. But without major changes to the shaders and arch, I don't see how you'd get more than 2.13x scaling (the 512:240 shader-count ratio), except in corner cases.

    Nvidia has already indicated that there will be a dual GPU version of the GT300.
    Any timeframe? Will it launch with/near the other cards, or will we have to wait for a shrink like the 295?

    Transistor count is not what determines heat output; that would be the TDP, which will be similar to the GTX285's, so it really won't be much of an issue. Nvidia won't let that happen. Based on the performance of the 5870, and knowing the specs of the GT300, I believe it is pretty clear the 380 will be faster, which Nvidia has already confirmed.
    You know what I mean; of course 10 billion 1 MHz transistors wouldn't be very hard to cool.

    A TDP near the GTX285's would very much be an issue. How many NV cards are on the market with two full-speed, not cut-down, GTX285 cores, even now? 1000? To make the standard 295 they had to cut the number of shaders and the speed. To make the 395 they will probably have to make even deeper cuts to fit it in the power/TDP envelope.

    Nvidia didn't confirm anything; they said they "believe" it will be faster in games. And I'm not going to take their word for it; benchmarks will tell us the truth.

    Quote Originally Posted by RealTelstar View Post
    I expect the performance to be MORE than two 285s at high res with 8xAA.
    Why?

  9. #209
    Xtreme Addict
    Join Date
    Aug 2008
    Location
    Hollywierd, CA
    Posts
    1,284
    Quote Originally Posted by LordEC911 View Post
    ~2x as fast is the BEST-case scenario. Shaders are being doubled but completely overhauled, which should bring IPC gains, but not necessarily.
    Also, there are many other parts of GF100 that could possibly bottleneck the architecture.
    i would like to know which of the listed features will bottleneck this gpu.

    Quote Originally Posted by LordEC911 View Post
    Eventually, yes.
    Full spec'ed and full speed? No.
    6 months after release? Maybe, most likely longer.
    32/28nm shrink needed? Possibly.

    http://www.fudzilla.com/content/view/15758/34/


    i know it's fud, and i'm looking for corroborating sources, but they're stating a full range release (even the 2 gpu version). when/if i find anyone else with the same info, i'll share it.

    I am an artist (EDM producer/DJ), pls check out mah stuff.

  10. #210
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by Solus Corvus View Post
    You know what I mean; of course 10 billion 1 MHz transistors wouldn't be very hard to cool.

    A TDP near the GTX285's would very much be an issue. How many NV cards are on the market with two full-speed, not cut-down, GTX285 cores, even now? 1000? To make the standard 295 they had to cut the number of shaders and the speed. To make the 395 they will probably have to make even deeper cuts to fit it in the power/TDP envelope.
    Whackypedia has you not far off on the transistor count...

    The initial model in the series to be released will use the GT300 chip, a very large chip that is a heavily modified G92 GPU (up to nine billion transistors in the quad-core version), manufactured by TSMC on a 55-nanometer process. Versions will be available with 1.5GB, 3GB or 6GB of memory, attached to six separate 64-bit GDDR4 memory controllers on the chip.


    Also, the GTX295 only had lowered clocks and a 448-bit memory bus instead of 512-bit. It has the full 240 shaders on each chip.

  11. #211
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by DeathReborn View Post
    Also, the GTX295 only had lowered clocks and a 448-bit memory bus instead of 512-bit. It has the full 240 shaders on each chip.
    Yeah, actually that's correct. Though they did have to wait for the shrink before they could do it. And even now only the MARS has full-speed cores.

  12. #212
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by LordEC911 View Post
    Ummm... so GTX295 is two full spec'ed GTX285s? That is news to me...
    Yes, my post was guesses and speculation but no less than your posts.
    1. The GTX295 is not two full spec 285s, but it still is the fastest single card.

    2. Full spec dual 285 on a single card is perfectly possible. Look at the Asus Mars. Not only is it full spec, but it actually has 2GB of memory per GPU for a total of 4GB of DRAM chips. Evga was also working on a dual 285 card, but I'm not sure if they will release it with GT300 right around the corner.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  13. #213
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by 570091D View Post
    i would like to know which of the listed features will bottleneck this gpu.
    ROP throughput?
    Also, performance is dependent on clock speeds, which are very underwhelming at the moment.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  14. #214
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by 003 View Post
    1. The GTX295 is not two full spec 285s, but it still is the fastest single card.

    2. Full spec dual 285 on a single card is perfectly possible. Look at the Asus Mars. Not only is it full spec, but it actually has 2GB of memory per GPU for a total of 4GB of DRAM chips. Evga was also working on a dual 285 card, but I'm not sure if they will release it with GT300 right around the corner.
    "Look at the Asus MARS and Evga's possibly aborted project" aren't really good arguments for the likelihood of a full speed GT300, lol.

  15. #215
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Andrew LB View Post
    Why? Do you honestly believe that it is in nVidia's best business interests to spend millions of dollars on driver development so that their cards work smoothly as PhysX processors in conjunction with an ATi main graphics card?

    yeah, that's a great idea.

    ... spend tons of money to help out your competitor....
    what? again they did not, NOT fix something for ati, they BROKE it on purpose...
    it worked fine before, then they released a new driver that blocks it...

    not supporting it, well, that's their decision; if you want to spread PhysX as a standard you need to support as many customer configs as possible... but whatever...

    but BLOCKING it on purpose... that's lame...
    but hey, we know this one from SLI
    i guess now that nvidia was forced to unblock SLI, they are probably looking for other things to block instead ^^

    or maybe they actually think they can do the same as with SLI and ask for license fees or money from ati and intel to allow PhysX on their systems

    solus corvus, yes, and gt212 seems to be dead too, otherwise there wouldn't have been the mars or matrix or evga beefed-up cards...
    so gt212 (40nm gt200) is definitely cancelled and gt300 delayed, meh :/

  16. #216
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Saaya, are we sure it worked fine before? Multiple vendors' drivers generally don't play nicely together, which usually causes all kinds of stability issues. Could you imagine the sheer number of non-computer-knowledgeable people calling NVidia to complain when the two drivers together cause graphical corruption, and blaming NVidia because the system worked fine before they added the GeForce card for PhysX? You worked in CS before; you know it would happen.

    If we had a completely impartial judge to test this out and see if it worked without issue before in all PhysX-enabled titles, then we could make that call. I'd actually be interested in seeing those numbers myself!
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  17. #217
    Xtreme Member
    Join Date
    Aug 2006
    Posts
    215
    Quote Originally Posted by DilTech View Post
    Saaya, are we sure it worked fine before? Multiple vendors' drivers generally don't play nicely together, which usually causes all kinds of stability issues. Could you imagine the sheer number of non-computer-knowledgeable people calling NVidia to complain when the two drivers together cause graphical corruption, and blaming NVidia because the system worked fine before they added the GeForce card for PhysX? You worked in CS before; you know it would happen.

    If we had a completely impartial judge to test this out and see if it worked without issue before in all PhysX-enabled titles, then we could make that call. I'd actually be interested in seeing those numbers myself!
    Yes, it clearly did work before. Nvidia just blocked it with updated drivers.

    http://www.rage3d.com/board/showpost...&postcount=628

    I have just got a GeForce GTS 250 as a dedicated PhysX card and my 4870X2 for rendering. I ran the game benchmark with these results:

    - Drivers: Catalyst 9.9 & ForceWare 185.68
    - Resolution: 1920x1200
    - Vsync: On
    - AntiAliasing: 4x with AAA forced via CCC
    - Anisotropic: 16x
    - PhysX: High
    - Minimum: 30
    - Average: 58
    - Maximum: 60

    It's really cool to see the ATI and nVidia cards working together. PhysX is cool in this game; I have seen banners, smoke and flying papers. It adds a bit of atmosphere to the game.

  18. #218
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by LordEC911 View Post
    ~2x as fast is the BEST-case scenario. Shaders are being doubled but completely overhauled, which should bring IPC gains, but not necessarily.
    Also, there are many other parts of GF100 that could possibly bottleneck the architecture.
    I'm with LordEC911 on that one. Looking at the new arch, very little makes me think 3D rendering performance will increase beyond what the increases in processing-unit count and/or clocks already provide.

    With a 113% CP (shader processor) increase, a 50% ROP increase, an unknown TMU increase, and a 60% memory bandwidth increase, it seems highly unlikely that a 100% real-world performance increase is a worst-case scenario. It's more like a best-case scenario, one that assumes a completely shader-bottlenecked situation (see the sketch at the end of this post).

    I think we will see something similar to last generation, maybe slightly better for NVIDIA performance-wise (I would say a 20-30% performance advantage for the higher-end GTX380, and a very small advantage for the GTX360), but even worse price-wise: it will be competing with a product that is 3-4 months old and can cut prices after months on sale, and remember that GT300 is a much more expensive chip than RV870.
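
    A back-of-the-envelope way to read those numbers (a minimal sketch; the "a frame can only scale as fast as the subsystem it is bound by" rule is a deliberate oversimplification, not a real GPU model, and the ratios are just the percentage increases quoted above):

```python
# Per-subsystem scaling factors quoted above for GT200 -> GT300
# (the TMU increase is unknown, so it is omitted).
scaling = {
    "shader/CP (+113%)": 2.13,
    "ROP (+50%)": 1.50,
    "bandwidth (+60%)": 1.60,
}

# A workload bound purely by one subsystem can scale at most by that
# subsystem's ratio; a real frame mixes all of them, so the blended
# speedup lands somewhere inside this range.
for name, ratio in scaling.items():
    print(f"{name:18} ceiling: {ratio:.2f}x")

lo, hi = min(scaling.values()), max(scaling.values())
print(f"plausible range: {lo:.2f}x .. {hi:.2f}x (2x only if purely shader-bound)")
```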

  19. #219
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by tdream View Post
    Yes, it clearly did work before. Nvidia just blocked it with updated drivers.

    http://www.rage3d.com/board/showpost...&postcount=628
    That's why I asked, rather than saying it didn't.

    I'm not on Windows 7 and I don't have a dual PCI-E slot mobo.

    If that truly is the case, next time I talk to our NV rep I'll see if he can give me an answer for what their reasoning is with this one. They usually don't talk to us in PR speak.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  20. #220
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    115
    Quote Originally Posted by LOE View Post
    It is VERY strange to see all those people who fiercely DEFEND a greedy, multi-billion-dollar corporation that IS OBVIOUSLY PLAYING DIRTY TRICKS not only on its competitor, but on YOU, THE CUSTOMERS, as well.

    Think, god damn it!

    Is it fanboyism? Or stupidity? Or blind patriotism or something? Damn...
    Naivete, I say. Wait, isn't this the GT300 thread?

  21. #221
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by 003 View Post
    True. I'm referring to the people who run around like a chicken with its head cut off screaming that the GT300 is going to suck for games and it will be beaten by RV870. Honestly, in a WORST case scenario, it will be roughly twice as fast as the GTX285, which will trump a 5870 easily.
    That^^ is still to be determined! Most of what's new architecturally is indeed for the CUDA end of Nvidia's expansion into the scientific community. So, given Anand's comments, the GT300 could certainly debut at only 2x the 285's performance!


    Meaning it would be on par with an HD5870 1GB, and then, think price?

  22. #222
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Xoulz View Post
    That^^ is still to be determined! Most of what's new architecturally is indeed for the CUDA end of Nvidia's expansion into the scientific community. So, given Anand's comments, the GT300 could certainly debut at only 2x the 285's performance!


    Meaning it would be on par with an HD5870 1GB, and then, think price?
    2x GTX 285 performance (not SLI, but DOUBLE) would be better than "par" with the 5870...

    Of course, no one knows the full performance of these cards, as has been stated, but I heavily doubt it'll be less than double the GTX285 (and again, NOT SLI, but a total double).
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  23. #223
    I am Xtreme
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by DilTech View Post
    2x GTX 285 performance (not SLI, but DOUBLE) would be better than "par" with the 5870...

    Of course, no one knows the full performance of these cards, as has been stated, but I heavily doubt it'll be less than double the GTX285 (and again, NOT SLI, but a total double).
    it all depends on where it's clocked. the 5870 is about 2x the 280, and we haven't seen a large review of the 5870 overclocked; if it clocks and scales like the 4890, it will have a huge gain compared to the 300, if that clocks and scales like the 285. and price/watt will also be interesting; it looks like 4 OCed 5870s will go up against 2-3 300s in wattage and price.

    it's going to be interesting to see performance and OpenCL; now that Khronos is finally validating drivers we should finally get some pro-grade software (although i'm not sure what it will do for consumers)
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  24. #224
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by DilTech View Post
    That's why I asked, rather than saying it didn't.

    I'm not on Windows 7 and I don't have a dual PCI-E slot mobo.

    If that truly is the case, next time I talk to our NV rep I'll see if he can give me an answer for what their reasoning is with this one. They usually don't talk to us in PR speak.
    this whole thing started getting attention when a customer actually contacted nv tech support about PhysX no longer working in his system equipped with an ati vga.

    he got an official nvidia reply after a week that said it was a corporate business strategy decision or something along those lines...

    you'll probably get the same reply... :/

    i don't think this is a big deal 'cause i don't expect a lot of people to actually use an ati vga and an nvidia vga for PhysX... but even then it clearly shows what kind of business practices nvidia follows... still nothing compared to what apple does, but not exactly fair play and customer-oriented...

    oh and guys, i think it makes no sense to argue over gt300 perf right now...
    we have no idea what clockspeed range it'll be able to hit... i'd say sit back and wait for some game devs and gpu gurus to read their way through the whitepapers and info from nvidia, and we will have some pretty good guesses within a few weeks

  25. #225
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by DilTech View Post
    2x GTX 285 performance (not SLI, but DOUBLE) would be better than "par" with the 5870...

    Of course, no one knows the full performance of these cards, as has been stated, but I heavily doubt it'll be less than double the GTX285 (and again, NOT SLI, but a total double).
    Yeah, but... why do you expect it to have 2x the performance (I suppose you're talking about real-world performance) if it's going to have +113% CPs but only +50% more ROPs and +60% more memory bandwidth...

    Consider that the HD5870 is exactly double the HD4890 (+100% everything at the same clocks) except bandwidth (approx. +30%), and it's far from double the real-world performance. That's one of the most recent proofs that doubling everything doesn't mean doubling real-world performance, and NVIDIA isn't even doubling everything.

    Can they improve performance per unit and per clock? Sure. Maybe. But by how much and why, I think it is way too soon, with the info we have, to say it's going to be 2x the real-world performance of a GTX285 (a rough sketch follows below). I would even say I hugely doubt it, given that they are more focused on getting the new (future?) HPC market before Intel has Larrabee working (if that happens this century).
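
    To put rough numbers on that HD4890-to-HD5870 data point, here is a minimal Amdahl-style sketch; the frame-time splits are invented for illustration, not measured:

```python
# Weighted-time blend: a fraction of frame time scales with memory bandwidth,
# the rest with compute (processing-unit count at equal clocks).
def overall_speedup(compute_scale: float, bw_scale: float, bw_fraction: float) -> float:
    return 1.0 / ((1.0 - bw_fraction) / compute_scale + bw_fraction / bw_scale)

# HD4890 -> HD5870: units doubled (+100%), bandwidth roughly +30%.
# Hypothetical splits show how fast bandwidth drags the total under 2x:
for frac in (0.0, 0.25, 0.5):
    s = overall_speedup(2.0, 1.3, frac)
    print(f"{frac:.0%} of frame time bandwidth-bound -> {s:.2f}x overall")
# Prints 2.00x, 1.76x and 1.58x: even a modest bandwidth-bound share keeps
# a "doubled" GPU well under 2x real-world performance.
```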
