Page 5 of 7
Results 101 to 125 of 173

Thread: Nvidia's Next Gen Speculation

  1. #101
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
the raytracing thing is still kinda bugging me. for shadows you see that grainy look because, i guess, it didn't finish certain spots. i wonder why they don't just throw in an assumption first and then fix it later; it could make something look "good enough" near instantly, which is what we need for games if they get this stuff anytime soon
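What the poster describes (start from a cheap guess, then refine it as more rays finish) is essentially progressive refinement in Monte Carlo ray tracing. A minimal sketch of the idea in Python — the "renderer" here is a fake noisy shade function and all the numbers are made up, it just shows how a running average turns a grainy first frame into a converged pixel:

```python
import random

def shade_pixel(true_value=0.6, noise=0.5):
    # One noisy ray-traced sample: the true radiance plus Monte Carlo noise.
    return true_value + random.uniform(-noise, noise)

def progressive_render(frames):
    # Keep a running average: frame 1 is the cheap "good enough" guess,
    # and later frames blend in new samples so the grain fades over time.
    accumulated = 0.0
    history = []
    for n in range(1, frames + 1):
        sample = shade_pixel()
        accumulated += (sample - accumulated) / n  # incremental mean
        history.append(accumulated)
    return history

random.seed(42)
estimates = progressive_render(256)
print(f"after 1 frame:    {estimates[0]:.3f}")
print(f"after 256 frames: {estimates[-1]:.3f}")  # converges toward 0.6
```

Real engines add a denoising filter on top of this so even the first few frames look plausible, which is the "assumption first, fix later" part.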

  2. #102
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by Baron_Davis View Post
    You know why Nvidia is throwing sand in your guys' eyes with all this iray, Matlab, CUDA crap?

    Cause they know they can't release a GPU that isn't gonna get beasted free by ATI's next gen lineup.

Notice how their "focus" on CUDA, PhysX, rendering, and the other garbage technology that's supposedly targeting the professional market started when ATI started beating their ass?
If you are suggesting that the things they are talking about are not important or significant, you are being extremely short-sighted and naive.

    ATI will definitely release HD6000 before Nvidia can get out a refresh (and that's all HD6000 really is), and they will no doubt outperform Nvidia's current offerings (at least for PC gaming). Nvidia will no doubt counter at a later point with a refresh of their own.

    But gaming performance is far from everything.
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  3. #103
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Utah, USA
    Posts
    301
    Quote Originally Posted by Baron_Davis View Post
    You know why Nvidia is throwing sand in your guys' eyes with all this iray, Matlab, CUDA crap?

    Cause they know they can't release a GPU that isn't gonna get beasted free by ATI's next gen lineup.

Notice how their "focus" on CUDA, PhysX, rendering, and the other garbage technology that's supposedly targeting the professional market started when ATI started beating their ass?

    It's Nvidia's way of changing the conversation.

    Jen-Hsun Huang is a pathetic douche. It's OK though, no amount of talking will help heal the pain once ATI unleashes in a month.
    Wow, biased much? I personally like both sides of the GPU market. They have very different uses. I think that the HPC side of what nVidia is doing is amazing.
    EVGA Classified 759|Lian Li A20B|i7 920 at 4.0GHz 1.29V|HIS 5970 2GB|12GB Mushkin PC12800 |6 Raid 6 WD RE-3 250's, Adaptec 5805 SAS Controller| Megahelms + Panaflow Low speed

  4. #104
    Banned
    Join Date
    Mar 2008
    Posts
    583
    Lol I'm not biased at ALL. I buy whatever is best for me. But Nvidia has sickened me over the past couple of years. They release the same over and over with a different name, and worst of all, they can't even release top of the line but expensive cards anymore.

    They are talking about everything EXCEPT gaming now because even they are smart enough to realize how terrible their recent GPU's have been.

    If you really think all the mumbo jumbo they're trying to feed you now actually matters, you must not know how businesses work.

    Nvidia's business plan: If you're failing at making the best gaming GPU's, start talking about other useless technologies that look pretty on a Power Point presentation.

    Nvidia's gaming market business plan: If you're failing at making the best gaming GPU's, start focusing on some specific advantage of your GPU's, and release benchmarks in games that take advantage of said technology.


    I'll bet you guys Nvidia will NOT have a response to ATI's October lineup. If they have the audacity to release a dual GF104 card, I will laugh, cause it will be such an obvious and pathetic attempt at keeping up. It will prolly consume 400W, and run at 100C.
    Last edited by Baron_Davis; 09-21-2010 at 09:41 AM.

  5. #105
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
this conference is not just for gamers, so why talk forever about that stuff?
there's a lot of good technology being discussed and the uses for GPUs are growing; gaming won't be the only thing you buy them for

  6. #106
    Xtreme Enthusiast
    Join Date
    Mar 2010
    Location
    Istanbul
    Posts
    606


    28nm Kepler

Focus is apparently on performance per watt, with Kepler aimed at being 3-4 times more efficient.

  7. #107
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Utah, USA
    Posts
    301
    Quote Originally Posted by Baron_Davis View Post
    Lol I'm not biased at ALL. I buy whatever is best for me. But Nvidia has sickened me over the past couple of years. They release the same over and over with a different name, and worst of all, they can't even release top of the line but expensive cards anymore.

    They are talking about everything EXCEPT gaming now because even they are smart enough to realize how terrible their recent GPU's have been.

    If you really think all the mumbo jumbo they're trying to feed you now actually matters, you must not know how businesses work.

    Nvidia's business plan: If you're failing at making the best gaming GPU's, start talking about other useless technologies that look pretty on a Power Point presentation.

    Nvidia's gaming market business plan: If you're failing at making the best gaming GPU's, start focusing on some specific advantage of your GPU's, and release benchmarks in games that take advantage of said technology.


    I'll bet you guys Nvidia will NOT have a response to ATI's October lineup. If they have the audacity to release a dual GF104 card, I will laugh, cause it will be such an obvious and pathetic attempt at keeping up. It will prolly consume 400W, and run at 100C.
    Sorry, I can't take anything you say seriously anymore.

    EVGA Classified 759|Lian Li A20B|i7 920 at 4.0GHz 1.29V|HIS 5970 2GB|12GB Mushkin PC12800 |6 Raid 6 WD RE-3 250's, Adaptec 5805 SAS Controller| Megahelms + Panaflow Low speed

  8. #108
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601


    Edit : Nice Man from Atlantis

  9. #109
    Banned
    Join Date
    Mar 2008
    Posts
    583
    Quote Originally Posted by Manicdan View Post
this conference is not just for gamers, so why talk forever about that stuff?
there's a lot of good technology being discussed and the uses for GPUs are growing; gaming won't be the only thing you buy them for
    I agree, but as I said, this is like some douche bag having a stroke and then becoming nice and passive and promoting world peace and kindness. He didn't do it on his own, he did it cause of fear.

    Nvidia is doing this cause they're scared of ATI, and just straight up haven't had and will not have an answer to them for a while.

    EDIT:

    Quote Originally Posted by Olivon View Post
    LOL MY POINT EXACTLY. "Look ma! Them Nvidia sure is strong!!"
    Last edited by Baron_Davis; 09-21-2010 at 09:47 AM.

  10. #110
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Karachi, Pakistan
    Posts
    389
Nvidia's version of tick-tock.

  11. #111
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Location
    Canada
    Posts
    763
    Quote Originally Posted by Baron_Davis View Post
    Lol I'm not biased at ALL. I buy whatever is best for me. But Nvidia has sickened me over the past couple of years. They release the same over and over with a different name, and worst of all, they can't even release top of the line but expensive cards anymore.

    They are talking about everything EXCEPT gaming now because even they are smart enough to realize how terrible their recent GPU's have been.

    If you really think all the mumbo jumbo they're trying to feed you now actually matters, you must not know how businesses work.

    Nvidia's business plan: If you're failing at making the best gaming GPU's, start talking about other useless technologies that look pretty on a Power Point presentation.

    Nvidia's gaming market business plan: If you're failing at making the best gaming GPU's, start focusing on some specific advantage of your GPU's, and release benchmarks in games that take advantage of said technology.


    I'll bet you guys Nvidia will NOT have a response to ATI's October lineup. If they have the audacity to release a dual GF104 card, I will laugh, cause it will be such an obvious and pathetic attempt at keeping up. It will prolly consume 400W, and run at 100C.
I'm sure Nvidia won't have a response ready at the same time ATI launches (if they launch on time).

But getting so emotional about a video card company is hilarious. Yes, Nvidia dicked around getting the latest version out, but now that it's out, it's not bad; it at least gives some competition again on the market, which is what we need. I don't see how they are failing at making gaming GPUs: with the currently available graphics cards we can play most if not all games at high res with very good framerates, with either company's top-end cards or with a couple of lower-end cards. To say they failed, one only has to look at how many people have picked up the GTX 4xx series in recent months. In the end we just want good cards from both companies to compete with each other. Don't get too crazy about it
    Lian Dream: i7 2600k @ 4.7Ghz, Asus Maximus IV Gene-z, MSI N680GTX Twin Frozr III, 8GB 2x4GB Mushkin Ridgeback, Crucial M4 128GB x2, Plextor PX-755SA, Seasonic 750 X, Lian-li
    HTPC: E5300@3.8, Asus P5Q Pro Turbo, Gigabyte 5750 Silentcell, Mushkin 2GBx2, 2x 500gb Maxtor Raid 1, 300gb Seagate, 74gb Raptor, Seasonic G series 550 Gold, Silverstone LC16m

    Laptop: XPS15z Crucial M4
    Nikon D700 ~ Nikkor 17-35 F2.8 ~ Nikkor 50mm F1.8
    Lian Dream Work Log
    my smugmug
    Heatware

  12. #112
    Banned
    Join Date
    Mar 2008
    Posts
    583
    Quote Originally Posted by ubuntu83 View Post
    nvidia's version of ticktock.
    http://en.wikipedia.org/wiki/Intel_Tick-Tock

Nvidia is neither ticking nor tocking... and with that I'm done. As I said, I'm not biased; I thought the 8800GT was pretty GDLK. It will be remembered along with the Celeron 300A and other classics. But lately... my God, lately they are 100% trash.

    I hope that changes, but I doubt it.

  13. #113
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Karachi, Pakistan
    Posts
    389
So we have a name, let's start speculating. Kepler will surely be on a 28nm process if they want that much of a computational power upgrade.

  14. #114
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by Baron_Davis View Post
    I agree, but as I said, this is like some douche bag having a stroke and then becoming nice and passive and promoting world peace and kindness. He didn't do it on his own, he did it cause of fear.

    Nvidia is doing this cause they're scared of ATI, and just straight up haven't had and will not have an answer to them for a while.
i think they are trying to show how far CUDA has come, and how much farther it will go. its a market they are doing a very good job of dominating. instead of fighting over how much of the pie they can get in just gaming, they are going after pies that x86 has been eating all by itself for decades now.

they didnt suck at gaming back when cuda was invented, but its clear that it became a more important part of their design. it seems to be doing ok, and it looks like they expect it to become the source of most of their revenue.

  15. #115
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by kaktus1907 View Post

    28nm Kepler

Focus is apparently on performance per watt, with Kepler aimed at being 3-4 times more efficient.
    That chart proves they were late releasing Fermi as it was a 2010 release not 2009. What do you think that means for Kepler...
    Last edited by Eastcoasthandle; 09-21-2010 at 10:01 AM.

  16. #116
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    Quote Originally Posted by Baron_Davis View Post
    Nvidia is gonna fail once again. Whatever BS card they release, you can bet it's either gonna be:

    a) a sandwich (two slapped together GF104/6's with a sweet new name )
    b) a mobile heater.
    Hmmm Sorry Baron but I would just like to address the two points you have highlighted here.

    Point a)

    A sandwich?

No, I have a dual-GPU, single-PCB nVidia solution in my PC at the moment and I can confirm to you it is not a sandwich; in fact it is two GPUs on one PCB, elegantly linked together by a 10GB/s NV chip. It is a dual-slot card with a nice back plate and a single fan in the centre of the cooler. The days of the nVidia sandwich dual-GPU, dual-PCB hackjobs are over and went out with the GTX 295.

    b) A mobile heater?

I am afraid you might be correct on this one; the GTX 480 does run awfully hot, in fact hotter than my dual-GPU single-PCB nVidia solution. (Mine maxes out in the mid-80s after 15 minutes of FurMark 1.8.2 @ 1920*1200 with 16x FSAA.) I have seen the GTX 480 reach around 100C at this setting.

However, from my limited understanding, nVidia have somewhat counteracted this with the GF104-series chip by stripping out a lot of the "Tesla" computing components and creating a more efficient architecture for gaming.

    I am going to assume that nVidia will do the following (and yes this is an assumption).

1) Gamer-based GPUs will have limited DP capability and will boast moderate computational features and performance, but should do well at PhysX.

2) Tesla-based GPUs will rock the socks off for DP performance and computational features and put all other GPUs (including ATi's) to shame.

3) Quadro-based GPUs will have a mixture of both worlds. They will boast increased computational horsepower over the "Gamer GPUs"; I have this opinion as I feel nVidia would want to drive the ray-tracing bandwagon in Quadro land.

    What this boils down to is that nVidia will and can charge the earth for Tesla and Quadro based GPU's (because of the niche market they serve), but will also hopefully reduce costs to the gamer.

ATi do have a very impressive GPU lineup; it's just that I feel they lack the diverse feature set nVidia have with their cards. And ATi have a habit of notoriously cutting corners.

    (In the past we had no real SM 3.0 feature set, limited instruction length on R300 (fixed in refresh) and more recently the tessellation bottleneck).

Don't get me wrong, I am a big fan of the leaps and strides ATi have made over the years (ex-Radeon R300 GPU owner here) and I truly believe that competition is great for the consumer... it's just that I do not like the way ATi have been assimilated by the AMD collective

    John
    Stop looking at the walls, look out the window

  17. #117
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Karachi, Pakistan
    Posts
    389
    Quote Originally Posted by Eastcoasthandle View Post
    That chart proves they were late releasing Fermi as it was a 2010 release not 2009. What do you think that means for Kepler...
It depends on the readiness of TSMC's 28nm process and their architecture design. It will likely be a late-2011 release, probably Q4.

  18. #118
    Xtreme Member
    Join Date
    Dec 2009
    Posts
    435
    Quote Originally Posted by Eastcoasthandle View Post
    That chart proves they were late releasing Fermi as it was a 2010 release not 2009. What do you think that means for Kepler...
Fermi was a significantly redesigned architecture on a new process (a flaky one at that), all in one step. That is why it was late. Presumably Kepler will be based on Fermi, just scaled up and tweaked; Nvidia's experience with Fermi, plus less significant architectural changes, should mean a much more timely release.

Remember how late ATI's new architecture (R600) was? And after that, how smoothly everything went? As you can see, ATI's new Northern Islands architecture has been significantly delayed because process roadmaps did not hit their targets, so instead we get Southern Islands, which will be very similar to HD5000 and on the same process.
    i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit

  19. #119
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by Baron_Davis View Post
    Nvidia is gonna fail once again.
    that's obvious because everyone makes mistakes. nvidia is no exception.

  20. #120
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601


    JHH defeated the kraken with 4 bladez of fury combo !!!

    Last edited by Olivon; 09-21-2010 at 10:14 AM.

  21. #121
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by Chumbucket843 View Post
i think they know that. designing a chip is a bit more complex than just putting stuff on there. keep in mind they had around four years to design fermi. 6 months is only a 10% delay. there is almost no room to screw up with a schedule that tight.

    you can. g80 had more gpgpu features than r600. shaders and gpgpu are becoming more and more similar which is why dx11 features directxcompute. features like caches will help graphics more and more in the future.
    A g80 having more GPGPU features doesn't mean you can have a coexistence. An equally valid counterexample is Fermi.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  22. #122
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Karachi, Pakistan
    Posts
    389
    Quote Originally Posted by Olivon View Post


    JHH defeated the kraken with 4 bladez of fury combo !!!

    Quote Originally Posted by Mechanical Man View Post
    Quote Originally Posted by ubuntu83
    Maybe CudaFoo
    Fixed

  23. #123
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by cegras View Post
    A g80 having more GPGPU features doesn't mean you can have a coexistence. An equally valid counterexample is Fermi.
    well, then what about cypress having more gpgpu features than rv770? namely LDS and some instructions for dot products. or how about increasing the register file size in gt200 relative to g80? an equally interesting point about fermi is that it ray traces MUCH faster than previous gpu's. the caches help out a lot for ray tracing.

    although i will say there is a significant difference in efficiency from a specialized processor to a general purpose processor, graphics code itself is becoming more generic because there is enough computational power to do more. many of these things i listed will actually help graphics.

  24. #124
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by kaktus1907 View Post


    28nm Kepler

    Focus is apparently on performance/watt with Kepler aimed to be 3-4 times more efficient..
    Exactly what they need if they want to be successful again... looks like the point finally went home with Fermi.
Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  25. #125
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by RPGWiZaRD View Post
    Exactly what they need if they want to be successful again... looks like the point finally went home with Fermi.
    agreed
we can expect about 1.5-2x perf per watt just from going to 28nm, which means they need roughly another 2x on top of that from their own improvements to hit 3-4x. a 28nm midrange chip should offer some incredible bonuses; imagine a physics card for <$100 that performs the same as a GTX 480 but uses only 60-80W
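The arithmetic above is easy to check. In this sketch the GTX 480's roughly 250W TDP is the only outside figure; every other number is the poster's estimate, not an official one:

```python
# Rough perf/watt arithmetic for the "3-4x more efficient Kepler" claim.
process_gain = (1.5, 2.0)   # assumed gain just from the 40nm -> 28nm shrink
target_total = (3.0, 4.0)   # claimed total perf/watt gain vs Fermi

# The architecture must supply whatever the process shrink does not.
arch_low = target_total[0] / process_gain[1]    # best shrink, low target
arch_high = target_total[1] / process_gain[0]   # worst shrink, high target
print(f"architectural gain needed: {arch_low:.1f}x - {arch_high:.1f}x")

# The hypothetical "GTX 480 performance in a small power budget" card:
gtx480_power = 250.0  # approximate GTX 480 TDP in watts
for total in target_total:
    print(f"{total:.0f}x perf/watt -> GTX 480 speed at ~{gtx480_power / total:.0f} W")
```

At 4x, GTX 480 performance lands around 62W; at 3x it is about 83W, so the 60-80W guess only holds toward the upper end of the claimed range.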
