Page 1 of 4
Results 1 to 25 of 85

Thread: Does Nvidia have a future?

  1. #1
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870

    Does Nvidia have a future?

    Does Nvidia have a future?

    The guys at bit-tech chronicle Nvidia's pitfalls and potential. Personally, the most intriguing part for me was the suggestion that ARM could become a more direct competitor to Intel as smartphones and mini-notebooks gain traction (helped along by stuff like Android and Chrome OS).

  2. #2
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Very interesting article.

  3. #3
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601
    Right, good article

  4. #4
    Xtreme Enthusiast
    Join Date
    Jun 2007
    Posts
    681
If GPUs and CPUs are converging technologies, doesn't that mean all GPU companies are doomed? There is no way the GPU can win that battle.

  5. #5
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
Yep, that article is spot-on! Thanks.

  6. #6
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
I read the article as "Nvidia vs x86, a fight to the death". It does look like, in the next five years, only one of them will exist.

  7. #7
    Xtreme Enthusiast
    Join Date
    Feb 2007
    Location
    So near, yet so far.
    Posts
    737
I read the article yesterday. The title is very catchy IMO, but the content is fairly well summed up.
    [[Daily R!G]]
    Core i7 920 D0 @ 4.0GHz w/ 1.325 vcore.
    Rampage II Gene||CM HAF 932||HX850||MSI GTX 660ti PE OC||Corsair H50||G.Skill Phoenix 3 240GB||G.Skill NQ 6x2GB||Samsung 2333SW

    flickr

  8. #8
    Xtreme Member JaD's Avatar
    Join Date
    Oct 2002
    Posts
    257
Not even Nvidia itself is claiming its GPGPUs will compete with CPUs. They just want the GPU to become an essential co-processor dedicated to extensively multithreaded applications; and guess what? Intel does too, although with a different approach: it's clear that different kinds of applications require more specialized architectures, that CPUs will never work well enough for rendering, transcoding, physics, or scientific simulations, and that a more classic CPU architecture will always be required for a variety of more serial applications. All this "we want our GPUs to become the main computer chip" is just a rumor started by lousy journalists, nothing more.
Whether the x86 or "GPU" approach to chips with many simple cores will prevail remains to be seen. And I think nobody can claim to foretell it.
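The workload split JaD describes (data-parallel work suiting the GPU, serial chains staying on the CPU) can be sketched with two toy functions. This is purely illustrative, not from the article; the function names are made up:

```python
def brighten(pixels, amount):
    """Data-parallel: every pixel is independent of the others, so
    thousands of GPU threads could each handle one element."""
    return [min(p + amount, 255) for p in pixels]

def compound(balance, rate, years):
    """Serial: each year's balance depends on the previous year's,
    so the steps cannot be split across threads."""
    for _ in range(years):
        balance *= (1 + rate)
    return balance

print(brighten([10, 250, 100], 10))        # -> [20, 255, 110]
print(round(compound(100.0, 0.05, 2), 2))  # -> 110.25
```

The first kind of loop is what CUDA-style co-processors are built for; the second is why a classic CPU core isn't going away.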

  9. #9
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
You come away feeling like, for all Nvidia has done in the last few years, they still haven't been given full credit...

(Leading GPU computing, CUDA, CUDA compilers, PhysX, ray tracing, Folding, native drag-and-drop video transcoding in Windows 7, and heading up the OpenCL board...)

Forget x86. Let the CPU do it. It probably does it better anyway.

Keep the GPU on apps that shine with lots of parallel processors.

I don't like the title of the article... I hope they do one on ATI too. I wonder if it will be written in the same manner?

    "GT200 was the faster chip we’d all been waiting for, but AMD had trumped Nvidia with its RV770 chip – the Radeon HD 4870 delivered about 80 per cent of the performance for almost half the price of a GTX 280 at launch. Soon after, AMD followed up with the Radeon HD 4870 X2, which held the undisputed performance crown for over five months and, even when it did get competition, Nvidia’s GeForce GTX 295 wasn’t hugely faster and suffered with high levels of AA applied so it could be argued that the 4870 X2 was still the superior part in some respects."

How does a chip that is about 20% slower trump Nvidia? Cost, yes, at first... but that was adjusted. They should put more importance on performance.
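For what it's worth, the "80 per cent of the performance for almost half the price" claim from the quoted article can be checked with quick performance-per-dollar arithmetic. The launch prices below are approximate, illustrative figures:

```python
# Approximate launch MSRPs and the article's normalized performance figure.
gtx280_price, gtx280_perf = 649.0, 1.00   # GTX 280 at launch
hd4870_price, hd4870_perf = 299.0, 0.80   # HD 4870: ~80% of GTX 280 performance

value_280 = gtx280_perf / gtx280_price    # performance per dollar
value_4870 = hd4870_perf / hd4870_price

# Roughly 1.7x the performance per dollar: that is the sense in which
# the article says the HD 4870 "trumped" the GTX 280 at launch.
print(round(value_4870 / value_280, 2))   # -> 1.74
```

On raw performance the GTX 280 still wins, which is the other side of the argument; the two sites of the debate are just optimizing different ratios.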

You can tell where a site stands on Nvidia by how they select their words when putting a 295 up against a 4870 X2...

So how many months has the 295 held the performance crown, and why don't they mention that? It has been King all year...
    Last edited by Talonman; 08-21-2009 at 07:29 AM.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  10. #10
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
I am all for Nvidia trying to think outside the box instead of sticking to the keep-it-simple-stupid strong-GPU method. They have a lot of incredible ideas that just aren't sticking to the wall, but the moment one does we will have quite a change in products (the Tegra chip might be one).

  11. #11
    XS_THE_MACHINE
    Join Date
    Jun 2005
    Location
    Denver
    Posts
    932
    Quote Originally Posted by Talonman View Post
    It has been King all year...
    Umm... it's only a video card, not the messiah or some grand monarch.

But really, unless you work for Nvidia, you shouldn't take the article so personally.


    xtremespeakfreely.com

    Semper Fi

  12. #12
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
Personally... I don't. It's just my opinion.

I don't understand why people who like Nvidia must also work for them.
    Last edited by Talonman; 08-21-2009 at 07:49 AM.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  13. #13
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Manicdan View Post
I read the article as "Nvidia vs x86, a fight to the death". It does look like, in the next five years, only one of them will exist.
The day Microsoft develops an ARM-compatible OS, Nvidia will have a fighting chance.

    Quote Originally Posted by Manicdan View Post
I am all for Nvidia trying to think outside the box instead of sticking to the keep-it-simple-stupid strong-GPU method. They have a lot of incredible ideas that just aren't sticking to the wall, but the moment one does we will have quite a change in products (the Tegra chip might be one).
    Yeah these companies are in an unfortunate position where they have to spend money today hoping for it to bear fruit in 10 years. And while doing that they have to keep up with our short attention span and produce compelling products every 6 months too.

  14. #14
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Canada
    Posts
    1,397
    Quote Originally Posted by Talonman View Post
    How does a chip that is about 20% slower trump Nvidia? Cost, yes at first... That was adjusted. They should put more importance on performance.

    You can tell where a site stands on Nvidia, on how they select their words when putting a 295 up against a 4870 X2...

    So how many months has the 295 heald the performance crown, and why dont they mention that? It has been King all year...
    The title of the article is "Does Nvidia Have a Future?", not "Is Nvidia the King?". Ultimately, it doesn't matter how strong their strongest card is, if they can't sustain their business. And while having the strongest card may be a selling point that contributes to sales (Directly or via halo effect), it's obviously not a magic wand that somehow skews the laws of profit and loss in business. So no, they shouldn't put more importance on performance - they should put the importance on Nvidia's health and future as a business, which is exactly what they've done in this article.

    Really, there's lots of bias to be found out there - this article is barely, if at all, a blip on the radar.
    i7 2600K | ASUS Maximus IV GENE-Z | GTX Titan | Corsair DDR3-2133

  15. #15
    Xtreme Member
    Join Date
    Aug 2008
    Location
    Rockville, MD
    Posts
    426
Excellent article, and I really didn't think it was too biased against Nvidia; I actually thought it forecast a pretty bright future for them. The negative aspects mentioned are true: Nvidia's chipsets are basically nonexistent now, and the Radeon 48xx series did trounce the GTX 260 and 280 on price/performance when they were released, until the price adjustments.
    Main PC
    i7 3770k
    Asus P8Z77-Deluxe
    4x4 GB Gskill Sniper
    Sandisk Extreme 240 GB
    Gigabyte GTX 670
    Coolermaster ATCS 840
    MCP35X - Apogee Drive II - MCR320


    HTPC
    i7 920
    Gigabyte EX58 UD5
    Sapphire 5670
    3x2 GB OCZ Platinum @ 7-7-7-20
    Corsair HX-650
    Silverstone LC10
    Intel X25-M G2

  16. #16
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by trinibwoy View Post
The day Microsoft develops an ARM-compatible OS, Nvidia will have a fighting chance.
We all know Microsoft is trying to fight Apple, and it's clear Nvidia is great at making niche products that do one task very well. If Microsoft were to work together with the hardware manufacturers to build some really sick products, maybe Apple won't get away with their overpriced white gizmos. We are not far from seeing a laptop that lasts 15 hours on battery for basic usage, while still having the power to do everything but intense gaming, for under $500. What's stopping that is the OS not utilizing all these things the hardware manufacturers can do.

  17. #17
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    It was a good and interesting read indeed

    Quote Originally Posted by MpG View Post
    And while having the strongest card may be a selling point that contributes to sales (Directly or via halo effect), it's obviously not a magic wand that somehow skews the laws of profit and loss in business. So no, they shouldn't put more importance on performance - they should put the importance on Nvidia's health and future as a business, which is exactly what they've done in this article.
    True that!
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  18. #18
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    278
    Quote Originally Posted by Manicdan View Post
I read the article as "Nvidia vs x86, a fight to the death". It does look like, in the next five years, only one of them will exist.
That means ATI won't exist if Nvidia goes under, which I don't think is going to happen.

  19. #19
    Xtreme Addict
    Join Date
    Jul 2005
    Posts
    1,646
    Quote Originally Posted by rogueagent6 View Post
    Umm... it's only a video card, not the messiah or some grand monarch.

But really, unless you work for Nvidia, you shouldn't take the article so personally.
    Some people can't handle it when their pet brands get put under the spotlight. As long as the article isn't making things up to tell a story then only fanboys are going to get offended. This article doesn't show any obvious bias for instance.

  20. #20
    Xtreme Addict
    Join Date
    Jul 2005
    Posts
    1,646
    Quote Originally Posted by Sh1tyMcGee View Post
That means ATI won't exist if Nvidia goes under, which I don't think is going to happen.
Uh, ATI = AMD, which = x86, so Nvidia dying to x86 has nothing to do with ATI.

  21. #21
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    278
    Quote Originally Posted by MrMojoZ View Post
Uh, ATI = AMD, which = x86, so Nvidia dying to x86 has nothing to do with ATI.
As a matter of fact, I know that Nvidia is not going to go under, and I think most people will be surprised by the new cards' performance and pricing.

  22. #22
    Xtremely High Voltage Sparky's Avatar
    Join Date
    Mar 2006
    Location
    Ohio, USA
    Posts
    16,040
I think GPU should no longer mean Graphics Processing Unit, but rather Global Processing Unit. It seems to be doing more and more than just graphics.

    </random thought>
    The Cardboard Master
    Crunch with us, the XS WCG team
    Intel Core i7 2600k @ 4.5GHz, 16GB DDR3-1600, Radeon 7950 @ 1000/1250, Win 10 Pro x64

  23. #23
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Texas
    Posts
    1,663
    If Nvidia gets in too much danger, I hope they merge with AMD and make a company that can destroy Intel and Larrabee. I wish Intel would allow Nvidia to have an x86 license, but then again, it is probably too late for that anyway. Nvidia has to stick around for ATI's future too. Competition is what makes everything worthwhile.
    Core i7 2600K@4.6Ghz| 16GB G.Skill@2133Mhz 9-11-10-28-38 1.65v| ASUS P8Z77-V PRO | Corsair 750i PSU | ASUS GTX 980 OC | Xonar DSX | Samsung 840 Pro 128GB |A bunch of HDDs and terabytes | Oculus Rift w/ touch | ASUS 24" 144Hz G-sync monitor

    Quote Originally Posted by phelan1777 View Post
    Hail fellow warrior albeit a surat Mercenary. I Hail to you from the Clans, Ghost Bear that is (Yes freebirth we still do and shall always view mercenaries with great disdain!) I have long been an honorable warrior of the mighty Warden Clan Ghost Bear the honorable Bekker surname. I salute your tenacity to show your freebirth sibkin their ignorance!

  24. #24
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Nah we don't want that either, having multiple players in the graphics space is good for everybody. Nvidia is getting squeezed right now between AMD's graphics focus and Intel's compute focus. They're trying to fight on two fronts and time will tell if they persevere.

  25. #25
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,412
The last line in that article is very important. Tegra and its successors will save NVIDIA. Mobile phones are dirt cheap, but there are millions of them and lots of money to be made there. NVIDIA will continue to develop discrete GPUs for graphics and CUDA usage, but I suspect mobile applications will become their new cash cow.
