
Thread: The official GT300/Fermi Thread

  1. #651
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by Xoulz View Post
    They need to stick with their CORE BUSINESS, CUDA doesn't matter to 99.9% of the populace!
So you're saying 99.9% of the populace with a computer is, what, gaming? I would say you're every bit as far out in left field with that perception.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  2. #652
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by 003 View Post
    I'm not even going to bother explaining what is wrong with this. If somebody else wants to have a go, be my guest.

    OCN is full of blind AMD/ATI fanboys who have no idea what they are talking about. XS was more immune for a while, but the cancer has started to rapidly spread here as well.


    No, please do explain... I'm interested in what you have to say. I'm no fan of anything, don't paint me as such to feel better about your comments, just back them up!

  3. #653
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by Xoulz View Post
    No, please do explain... I'm interested in what you have to say. I'm no fan of anything, don't paint me as such to feel better about your comments, just back them up!
    ATI now has AMD. What does that mean? x86 license. Intel will soon have Larrabee, and obviously x86.

    Nvidia? No x86.

    Intel and AMD now both have plans to integrate GPUs on the CPU die.

    See a problem here? Nvidia has to get a foothold in the CPU market somehow, and the only way that will happen is with GPGPU. And since they will need to rely on it much more heavily than ATI does, theirs is a lot better.

    Many developers want to use CUDA except their applications require ECC, which GT200 lacks. Fermi fixes that and many other problems, and dramatically increases double precision performance.
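    For concreteness, here is a minimal host-side sketch of how an application can check a card for the double precision and ECC support mentioned above. It assumes only the standard CUDA runtime device-query API (and a toolkit recent enough to expose the ECCEnabled field); nothing in it is specific to any one card.

```cpp
// Illustrative sketch only (not from the thread): query each CUDA device for
// the capabilities discussed above. Double precision needs compute capability
// 1.3 or higher; the ECCEnabled flag reports whether ECC is active, which
// GT200-class boards cannot offer but Fermi-class Tesla boards are meant to.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable device found.\n");
        return 1;
    }
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        bool hasDouble = (prop.major > 1) || (prop.major == 1 && prop.minor >= 3);
        std::printf("Device %d: %s (compute capability %d.%d)\n",
                    dev, prop.name, prop.major, prop.minor);
        std::printf("  double precision: %s\n", hasDouble ? "yes" : "no");
        std::printf("  ECC enabled:      %s\n", prop.ECCEnabled ? "yes" : "no");
    }
    return 0;
}
```

    On a GT200-class GeForce this would typically report compute capability 1.3 with ECC off; Fermi-class parts are expected to report 2.0, with ECC available on the Tesla versions.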
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  4. #654
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
    Quote Originally Posted by Chumbucket843 View Post
    That's just pure denial. You can get an approximation of performance based on all of the whitepapers and specs that have been released. No one said anything about "mind-blowing performance". It's fairly obvious that it will beat the 5870 even if it's only 60% faster. That's based on the increase in bandwidth.
    Mate, it won't beat it by 60%, I promise you.

    I know you want them to do well for whatever reasons, but seriously, don't over-hype it or you'll end up disappointed. There's a reason Nvidia isn't going on about game performance, and that's got me worried (and it should worry you too).


    (next bit not aimed at you)

    It's just that they keep going on about all this stupid precision stuff they've improved.. it's a graphics card.. people want it for games.. I'm sorry, that's the market they cater for.. going off trying to do a different thing is a bit stupid.. sure, by all means go for PhysX.. that's fine.. I just believe they are forgetting the core market... which is lame....




    Quote Originally Posted by Xoulz View Post
    I wish Nvidia would stop trying to be Intel or AMD and just concentrate on gaming... Jen-Hsun Huang is delusional in thinking Nvidia has that much clout. They need to stick with their CORE BUSINESS, CUDA doesn't matter to 99.9% of the populace!

    But Nvidia has hung their entire business hat on its acceptance, dumb!
    Completely agree, seeing as their market is gamers.. I'm not sure people are like WOW I WANT THAT ... just to encode a movie in 10 mins as opposed to 20 .. I'm sorry, people may use it and be like oh, this is cool... but it's hardly a defining feature. Come on Nvidia.. let's see something good from you again like the 8 series, and cut the crap with this double precision floating point performance poop..






    Quote Originally Posted by 003 View Post
    ATI now has AMD. What does that mean? x86 license. Intel will soon have Larrabee, and obviously x86.

    Nvidia? No x86.

    Intel and AMD now both have plans to integrate GPUs on the CPU die.

    See a problem here? Nvidia has to get a foothold in the CPU market somehow, and the only way that will happen is with GPGPU. And since they will need to rely on it much more heavily than ATI does, theirs is a lot better.

    Many developers want to use CUDA except their applications require ECC, which GT200 lacks. Fermi fixes that and many other problems, and dramatically increases double precision performance.
    I completely agree.. but at the end of the day.. it CAN'T be a replacement for a CPU.. and I feel they're going to fall flat on their face in the long run :/ You say all this stuff, but think about when it's ACTUALLY useful, say a few years' time.. AMD will supposedly have an "APU" (GPU) on their CPU die, and I'm sure Intel will have come up with something, otherwise they will lose a lot of the laptop/nettop market imo (if it's implemented well). I can just see the performance Nvidia COULD bring to the table... but I just think it will be a hassle unless they put a lot of money in, and I think it'll end up being a money sink...

    At the end of the day we WON'T have decent high-end graphics solutions on a CPU unless something new comes along. I mean, we're close to 300W on GPUs now; slap that in a CPU and people will just call you a tard etc...

    I dunno, I think Nvidia will always have an awesome share of the graphics market.. but not if they go venturing on this little crusade, cos imo until they get an x86 licence.. it's just going to waste money :/

    (If you believe differently please say so, you probably noticed.. or not... I'll listen to your opinion and happily correct myself, or you, if there's proof or you convince me etc.. )
    Last edited by Jamesrt2004; 10-08-2009 at 04:39 PM.
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  5. #655
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by Jamesrt2004 View Post
    It's just that they keep going on about all this stupid precision stuff they've improved.. it's a graphics card.. people want it for games.. I'm sorry, that's the market they cater for.. going off trying to do a different thing is a bit stupid.. sure, by all means go for PhysX.. that's fine.. I just believe they are forgetting the core market... which is lame....
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  6. #656
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by Chumbucket843 View Post
    That's just pure denial. You can get an approximation of performance based on all of the whitepapers and specs that have been released. No one said anything about "mind-blowing performance". It's fairly obvious that it will beat the 5870 even if it's only 60% faster. That's based on the increase in bandwidth.
    I recall many people being in 'pure denial' of the 58xx theoretical performance figures too.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  7. #657
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
    Quote Originally Posted by 003 View Post
    *snip*
    lol ...

    Read my edited bit, cos I missed your post.. I wanna see what you think of my ACTUAL views about them going more and more general processing and less gaming?



    -edit again... I think I put my points across very badly haha-.... I get what you're saying, I just think it'll be a money sink, cos at the end of the day it isn't a CPU; maybe it'll be able to help out in a few odd bits here and there.. but yeah, with small GPUs going onto the CPU, AMD/Intel could then do the same thing Nvidia are doing, and then all they will have is their graphics cards etc. again.. :/ I think to save time and money they should just focus on graphics/PhysX and the like.
    Last edited by Jamesrt2004; 10-08-2009 at 04:45 PM.
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  8. #658
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by Jamesrt2004 View Post
    I just believe they are forgetting the core market... which is lame....
    How are they "forgetting the core market"? What have they done to make you think gaming & discrete video in general isn't an important segment?

    They are expanding and working to create new markets that have not yet matured. If every inventor stopped every time somebody said "that won't work" or "nobody's going to need that", we wouldn't be where we are today.

    Luckily the world isn't full of pessimism, and there are always people who can see the light at the end of the tunnel, looking for new ways to do the same things we do today better, faster, cheaper.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  9. #659
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
    Quote Originally Posted by highoctane View Post
    How are they "forgetting the core market"? What have they done to make you think gaming & discrete video in general isn't an important segment?

    They are expanding and working to create new markets that have not yet matured. If every inventor stopped every time somebody said "that won't work" or "nobody's going to need that", we wouldn't be where we are today.

    Luckily the world isn't full of pessimism, and there are always people who can see the light at the end of the tunnel, looking for new ways to do the same things we do today better, faster, cheaper.
    It's just that EVERY bit of info I've seen so far doesn't say anything about GAMING.. that's all. Maybe it's just poor marketing or they're holding it back, but meh... seems silly, as you and I know the people buying it are going to be using it for games, not to accelerate Flash or whatever :/... make sense??

    I agree, but in my post one above yours (another edited bit) you'll see my views on that, i.e. it's a good idea, but in the long run AMD and Intel can kinda just block them out by doing it with the APU/GPU-on-the-CPU sort of thing.. and tbh there's nothing stopping them. I just think that although it's a good thing to "adventure" like this, it's a bad-looking one for the future. I may be wrong, but meh??? I think you might see where I'm coming from though.

    I'm not being pessimistic at all imo, I like change.. I loved the 8 series + the 4 series, they were both awesome, same with, say, the AMD 64 owning the Pentium 4.. I just believe this whole general processing stuff will fail.. for the pure fact that nothing's stopping AMD and Intel from doing it, just in a few years' time after they have it on die, cos imo once it's on there and, say, accelerating the program, it's still going to give a massive improvement over normal CPUs, but I don't think people will purchase a card to do something the CPU can do pretty much just as well (in the future, I have to add).
    Last edited by Jamesrt2004; 10-08-2009 at 04:53 PM.
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  10. #660
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by Jamesrt2004 View Post
    Mate, it won't beat it by 60%, I promise you.

    I know you want them to do well for whatever reasons, but seriously, don't over-hype it or you'll end up disappointed. There's a reason Nvidia isn't going on about game performance, and that's got me worried (and it should worry you too).
    60% was relative to a GTX 285, and it was to get zalbard to understand my point about theoretical performance.

    Review websites skew their results by adding a lot of AA and AF. It's not an apples-to-apples comparison if you ask me. We need a GPU benchmark similar to Cinebench. All a 5870 can do is apply more AA to current games, and that doesn't show the true power of the card, just the ROPs. I don't think games will benefit that much from new cards until Crysis 2 or the next consoles. A 5870 makes an Xbox 360 look like a Wii. You basically have to put up with crappy textures and high fps for a while.
    Quote Originally Posted by Jamesrt2004 View Post
    It's just that they keep going on about all this stupid precision stuff they've improved.. it's a graphics card.. people want it for games.. I'm sorry, that's the market they cater for.. going off trying to do a different thing is a bit stupid.. sure, by all means go for PhysX.. that's fine.. I just believe they are forgetting the core market... which is lame....

    Completely agree, seeing as their market is gamers.. I'm not sure people are like WOW I WANT THAT ... just to encode a movie in 10 mins as opposed to 20 .. I'm sorry, people may use it and be like oh, this is cool... but it's hardly a defining feature. Come on Nvidia.. let's see something good from you again like the 8 series, and cut the crap with this double precision floating point performance poop..
    The market for GPU computing is estimated to be 50% of the desktop market. Don't ask why, ask why not. This is a link to the CUDA home page and all of its applications. It's more than just encoding. There are a lot of real-world uses for CUDA, just not at the desktop yet. In the future when we are ray tracing, maybe. Fermi is a GPGPU with fixed-function graphics abilities, just like all DX10 GPUs. I don't see why this would affect D3D performance.

    http://www.nvidia.com/object/cuda_home.html#

  11. #661
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by highoctane View Post
    So you're saying 99.9% of the populace with a computer is, what, gaming? I would say you're every bit as far out in left field with that perception.


    Are you saying that more than 1% of the population is running Folding@home..? Or having sleepless nights waiting for the next greatest compute GPU..? Or has $499 to spend on a video card...? That's why many of us are writing off Nvidia: their drastic move toward non-gaming leaves many within the industry wondering about the COMPANY...


    Secondly, almost all the people who fold (or will use these added Fermi "features") are right here on XS... everyone else doesn't give a rat's ass about CUDA. <-- 'tis da truth


    For a small example demographic: there are some 12 million people who play World of Warcraft, and none of them know about CUDA... nor do they care or even have reason to care. All that matters to people who buy video cards is how well they run their computer's graphics & games.

    Understand, the CPU is for calculation and the video card is for graphics... Tesla only speeds up compute calculations, nothing more. People don't buy video cards for that...

    Finite element analysis, high-precision scientific computing, sparse linear algebra, sorting, etc. may drive Fermi sales, but how does that benefit the end user here? Especially if, in the near future, people can just slap a Larrabee on a Hydra board. Then you have what Nvidia is offering.. if you need it.
    Last edited by Xoulz; 10-09-2009 at 01:50 AM.

  12. #662
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
    Quote Originally Posted by Chumbucket843 View Post
    60% was relative to a GTX 285, and it was to get zalbard to understand my point about theoretical performance.

    Review websites skew their results by adding a lot of AA and AF. It's not an apples-to-apples comparison if you ask me. We need a GPU benchmark similar to Cinebench. All a 5870 can do is apply more AA to current games, and that doesn't show the true power of the card, just the ROPs. I don't think games will benefit that much from new cards until Crysis 2 or the next consoles. A 5870 makes an Xbox 360 look like a Wii. You basically have to put up with crappy textures and high fps for a while.

    The market for GPU computing is estimated to be 50% of the desktop market. Don't ask why, ask why not. This is a link to the CUDA home page and all of its applications. It's more than just encoding. There are a lot of real-world uses for CUDA, just not at the desktop yet. In the future when we are ray tracing, maybe. Fermi is a GPGPU with fixed-function graphics abilities, just like all DX10 GPUs. I don't see why this would affect D3D performance.

    http://www.nvidia.com/object/cuda_home.html#
    Agree with the first bit, I'm hoping for a bit more.. imagination from their next architecture next year..

    And I agree, I think it can have its uses ... I think, like... say how enabling PhysX makes the CPU score amazing on Vantage... you could make the GPU kick in and give a nice boost to various apps... all I'm trying to say is I think when AMD/Intel get it on their CPU they will just end up blocking Nvidia out of THAT potential market :/ of course they wouldn't disappear, cos it's always nice to get a lil bit more of a boost.
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  13. #663
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by 003 View Post
    ATI now has AMD. What does that mean? x86 license. Intel will soon have Larrabee, and obviously x86.

    Nvidia? No x86.

    Intel and AMD now both have plans to integrate GPUs on the CPU die.

    See a problem here? Nvidia has to get a foothold in the CPU market somehow, and the only way that will happen is with GPGPU. And since they will need to rely on it much more heavily than ATI does, theirs is a lot better.

    Many developers want to use CUDA except their applications require ECC, which GT200 lacks. Fermi fixes that and many other problems, and dramatically increases double precision performance.

    And..?

    I'm still correct, CUDA (i.e. Fermi's architecture) doesn't matter to 99.9% of end users... just to Nvidia, who doesn't have an x86 processor!

    3 billion transistors... for Nvidia's business model, not for our gaming pleasure!

  14. #664
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by Jamesrt2004 View Post
    I'm not being pessimistic at all imo.. I just believe this whole general processing stuff will fail.. for the pure fact that nothing's stopping AMD and Intel from doing it, just in a few years' time
    I would say they are trying to get a leg up on the competition with their heavy Tesla push. If they can actually get their GPGPU initiative to take root with developers first, they can very well be a market leader until the other players catch up.

    There are definitely no guarantees of landslide success, but the potential market is there if they can get enough developers aware of the technology & on board to utilize it. It may not necessarily be huge with the general user, but in, say, the medical & research fields this could be a huge chance for them to make inroads into untouched markets with potentially much higher margins.

    Nvidia is essentially getting squeezed out of mainstream markets, and with chipsets pretty much dead in the water they need to find other ways to sustain their growth or else go the way of VIA.

    Their best chance is to move quickly now while they have the resources and influence to still make big moves.

    I don't think they're forgetting their roots at all, but they are trying hard to make more noise about their GPGPU initiatives to bring about awareness and spark curiosity & imagination with potential developers & customers, which seems to be working on the awareness side at least.

    We'll have to wait and see, which is the hard part in the "I want it yesterday" technology world.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  15. #665
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
    Quote Originally Posted by highoctane View Post
    I would say they are trying to get a leg up on the competition with their heavy Tesla push. If they can actually get their GPGPU initiative to take root with developers first, they can very well be a market leader until the other players catch up.

    There are definitely no guarantees of landslide success, but the potential market is there if they can get enough developers aware of the technology & on board to utilize it. It may not necessarily be huge with the general user, but in, say, the medical & research fields this could be a huge chance for them to make inroads into untouched markets with potentially much higher margins.

    Nvidia is essentially getting squeezed out of mainstream markets, and with chipsets pretty much dead in the water they need to find other ways to sustain their growth or else go the way of VIA.

    Their best chance is to move quickly now while they have the resources and influence to still make big moves.

    I don't think they're forgetting their roots at all, but they are trying hard to make more noise about their GPGPU initiatives to bring about awareness and spark curiosity & imagination with potential developers & customers, which seems to be working on the awareness side at least.

    We'll have to wait and see, which is the hard part in the "I want it yesterday" technology world.
    OK, I understand what you're saying.. pretty much people KNOW they do graphics cards etc.. and that they will be good for gaming etc.. but they're trying to push the new stuff now so they have a bigger/better chance in the long run!

    It does make sense to me personally, but I'm still worried, like you said, that they are being squeezed out of the other markets (which is one reason I'm peeved about PhysX.. sure, people keep arguing we could do a lot of the stuff on the CPU, but it's only just come into the spotlight; give it time and I'm sure we will need something decent to run physics etc..). And as I said, they may be able to get a big push forward, but I can't help thinking people will opt for OpenCL, I mean CUDA cards run it, ATI runs it, processors run it, so I believe it will be easier for developers to go that route, which is why I think Nvidia will have to splash some cash...
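    To illustrate that portability point, here is a minimal sketch of the kind of host code the argument is about. It assumes only the standard OpenCL platform/device enumeration API, with nothing vendor-specific: the same code lists whatever an Nvidia, ATI, or CPU OpenCL driver exposes.

```cpp
// Illustrative sketch only: enumerate every OpenCL platform and device the
// installed drivers expose. The point is that identical host code runs against
// Nvidia, ATI, or CPU implementations of the same API.
#include <cstdio>
#include <CL/cl.h>

int main() {
    cl_uint numPlatforms = 0;
    if (clGetPlatformIDs(0, NULL, &numPlatforms) != CL_SUCCESS || numPlatforms == 0) {
        std::printf("No OpenCL platforms installed.\n");
        return 1;
    }
    cl_platform_id platforms[8];
    if (numPlatforms > 8) numPlatforms = 8;   // keep the sketch simple
    clGetPlatformIDs(numPlatforms, platforms, NULL);

    for (cl_uint p = 0; p < numPlatforms; ++p) {
        char pname[256] = {0};
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof(pname), pname, NULL);
        std::printf("Platform: %s\n", pname);

        cl_uint numDevices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 0, NULL, &numDevices) != CL_SUCCESS)
            continue;                         // platform with no usable devices
        cl_device_id devices[8];
        if (numDevices > 8) numDevices = 8;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, numDevices, devices, NULL);
        for (cl_uint d = 0; d < numDevices; ++d) {
            char dname[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(dname), dname, NULL);
            std::printf("  Device: %s\n", dname);
        }
    }
    return 0;
}
```

    Whether that cross-vendor reach ends up mattering more than CUDA's head start is exactly the question being argued here.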


    I do hope they made the right decision... if it works it will be a huge relief; if not.. then I guess I'll start worrying when the time comes.
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  16. #666
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Jowy Atreides, the renaming was only a side effect of what really went on... they had problems with several product refreshes and new chips; since those were delayed or failed, they had to rename the same old parts and shrink them... the real problem is that their refresh and new-gen roadmap was too ambitious and way out of touch with reality... that, plus they didn't have proper plan B options many times, which is something you should have in a competitive market where projects DO fail every now and then...

    With 40nm, ATI played their cards very, very well... first they stuck their little toe in the water to check the temp (RV740), then decided it's too cold and prepared to stand on two legs: a rather small Juniper, same speed as the previous gen but cheaper and cooler, and a bigger and beefier Cypress... even if Cypress had failed, they could have come up with a dual-Juniper card to tackle the 295, and Juniper is fast enough to force 285 prices lower as well, even if there hadn't been a Cypress to outperform it...

    Compare that to Nvidia's GT200 strategy: there's GT200... and that's it...
    And GT300 so far looks to be the same thing: there will be GT300... and then that's it... they said they will cut it down, but when?
    Even if GT300 arrives in small numbers in December, then what? G92 and GT200 are both outperformed by Juniper and Cypress respectively, and what's the point of having a halo product if there is nothing else to sell but an expensive halo product?

    It's a shame; we are already seeing higher prices for Juniper and Cypress than we would have if Nvidia had a 40nm GT200 or something else to compete with ATI's new chips... :/
    Last edited by saaya; 10-08-2009 at 06:14 PM.

  17. #667
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by Xoulz View Post
    And..?

    I'm still correct, CUDA (i.e. Fermi's architecture) doesn't matter to 99.9% of end users... just to Nvidia, who doesn't have an x86 processor!

    3 billion transistors... for Nvidia's business model, not for our gaming pleasure!
    So you want NV to just die a slow death? Well of course you do, Xoulz, you hate Nvidia.

    NV is simply trying to build new outlets for cash, since it's going to be left out in the cold a lot more when everyone integrates GPUs onto the CPU, which will slow its sales down.

    Adding to the equation is Larrabee, which is invading NV's retail space, and NV has to do something just in case Larrabee does destroy everything out there.
    By your logic, AMD, although a CPU company, had no damn right to even challenge Intel as a CPU company, based on the difference in size.

    Do you think Intel has no right to enter the GPU market because they are a CPU company too?

    Man, people really hate NV. But when you look back at it, Intel has done much shadier stuff than both NV and AMD. Yet people love them. Bribery and using clout to make sure your CPUs have an unfair advantage? Intel wrote the book on this.

    Shady marketing? AMD and Intel have both done this. AMD's most recent slides were taking shots at NV like no tomorrow. They have essentially bought Charlie (his website contains only AMD ads) and are using him to push their propaganda. Don't you think it is dirty to start rumors to lower your competitor's stock price? Even NV hasn't sunk that low yet.

    The whole renaming thing got a little carried away but really ripped no one off, because most people didn't rebuy the same card, contrary to what the haters think, and they typically dropped the price of the card and put it in a lower target segment when they did this.

    Sure, it drew a bit of focus away from the fact that the card is not a DirectX 10.1 card, but really, did people miss that much from not getting DirectX 10.1? If anything this hurt NV more, because it made the transition to DirectX 11 a bit more difficult.

  18. #668
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    See, the problem with all of this is actually very simple.

    Most of us buy video cards to play games and that's it. I don't give a rat's butt if it can do anything else.

    So forgive me if I'm less than impressed when Nvidia comes out and starts trying to sell me their products based on non-gaming related features. I just don't want or need them.

  19. #669
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by saaya View Post
    Compare that to Nvidia's GT200 strategy: there's GT200... and that's it...
    And GT300 so far looks to be the same thing: there will be GT300... and then that's it... they said they will cut it down, but when?
    Well, there is this: new cards from Nvidia. Yeah, that will do.

  20. #670
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by Xoulz View Post
    And..?

    I'm still correct, CUDA (i.e. Fermi's architecture) doesn't matter to 99.9% of end users... just to Nvidia, who doesn't have an x86 processor!

    3 billion transistors... for Nvidia's business model, not for our gaming pleasure!
    You talk as if it will be slow or something. MOST of the extra transistors come from more than doubling the shaders from 240 to 512, and that alone will bring the biggest performance gains in games.

    GT200 had 240 SPs and 1.4b transistors.

    Fermi has 512 SPs and 3.0b transistors.

    Guess what? The ratio of transistors to SPs on Fermi is virtually the same as on GT200. What does that mean?

    All the people saying "omg all those transistors are wasted on GPGPU" are freaking morons!!! The GPGPU features on Fermi are at no extra cost to gamers, and game performance is not sacrificed.
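    Working the quoted figures through, just to make the arithmetic explicit:

    \[
    \frac{1.4\times10^{9}}{240} \approx 5.83\times10^{6} \ \text{transistors per SP (GT200)}, \qquad
    \frac{3.0\times10^{9}}{512} \approx 5.86\times10^{6} \ \text{transistors per SP (Fermi)},
    \]

    a difference of well under one percent, which is all that "virtually the same" is claiming here.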
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  21. #671
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by 003 View Post
    All the people saying "omg all those transistors are wasted on GPGPU" are freaking morons!!! The GPGPU features on Fermi are at no extra cost to gamers, and game performance is not sacrificed.
    So the extra transistors consume no power, and are free as well?

  22. #672
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by eleeter View Post
    So the extra transistors consume no power, and are free as well?
    Read (or reread) my previous post until it sticks.

    Or just repeat the following sentence over and over until it sinks in:

    The ratio of transistors to SPs on Fermi is exactly the same as it was on GT200.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  23. #673
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    244
    Quote Originally Posted by 003 View Post
    You talk as if it will be slow or something. MOST of the transistors come from more than doubling the shaders to 512 from 240, and that alone will bring the biggest performance in games.

    GT200 had 240 SPs and 1.4b transistors.

    Fermi has 512 SPs and 3.0b transistors.

    Guess what? The ratio transistors to SPs on Fermi is virtually the same as GT200. What does that mean?

    All the people saying "omg all those transistors are wasted on GPGPU" are freaking morons!!! The GPGPU features on Fermi are at no extra cost to gamers, and game performance is not sacrificed.
    GF100 has 48 ROPs and a 384-bit MC

    GT200 had 32 ROPs and a 512-bit MC

    The ratio of transistors to SPs on Fermi is actually higher than on GT200
    Last edited by mindfury; 10-08-2009 at 08:23 PM.

  24. #674
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by mindfury View Post
    GF100 has 48 ROPs and a 384-bit MC

    GT200 had 32 ROPs and a 512-bit MC

    The ratio of transistors to SPs on Fermi is actually higher than on GT200
    The ratio of transistors to SPs is just transistors divided by SPs. That figure is virtually the same on Fermi as it was on GT200. Other factors don't change that.

    Now, when you do factor in ROP counts and bus width, among other things, it is higher on Fermi, but not by any significant amount.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  25. #675
    Xtreme Member
    Join Date
    Mar 2009
    Location
    Miltown, Wisconsin
    Posts
    353
    Nvidia needs to double down and split their hand: make a dedicated PPU and make a separate GPU. They can still support CUDA on the GPU, just not make it the one and only thing it's made for. That would free up silicon and make them competitive again. There is no point going in the direction they're going right now, because it's a dead end. You can't be competitive in 3D graphics if you're not making GPUs.
