
Thread: No more discrete chipsets from Nvidia

  1. #1
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Vancouver, BC
    Posts
    2,061

    No more discrete chipsets from Nvidia

    Wednesday, March 26, 2008 16:37
    Santa Clara (CA) - Current and future owners of Nvidia’s nForce 790i Ultra SLI and SLI chipsets own a piece of history: Nvidia confirmed to TG Daily that it is now focusing on delivering chipsets with integrated graphics.

Bryan Del Rizzo, Nvidia's PR manager for platform products, told us that the company will begin integrating graphics into its chipsets by default, starting with the nForce 700-series for AMD. The manufacturer will use the mGPU for HybridPower operation. “For example, the nForce 780a is a high-end motherboard with an integrated [graphics] core,” Del Rizzo said.

“Graphics chipset” has always had a low-end tone to it, but Nvidia’s announcement does not mean that the company will stop creating high-end chipsets; rather, it will work on expanding the power-saving features of upcoming desktop and notebook parts.

Another goal of this move is to connect a display directly to the motherboard through DVI, HDMI, DisplayPort or a similar cable. One, two, three or four discrete GPUs would provide 3D performance when required.

This approach could also address the rising issue of latency: the fact that the frame buffer will now be located in the same place for all GPUs should actually benefit the consumer.
    Source: TG Daily

    Good/Bad/Ugly?
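
    To make the idea concrete, here is a rough sketch (in Python, with made-up names and a made-up load threshold; not Nvidia's actual driver logic, which isn't public) of what HybridPower-style switching boils down to:

    Code:
    from dataclasses import dataclass

    @dataclass
    class Gpu:
        name: str
        powered: bool = True

    def hybridpower_step(discrete_gpus, load_3d):
        # Power discrete GPUs only under real 3D load. The integrated mGPU
        # always drives the display, so the frame buffer that gets scanned
        # out stays in one place no matter which GPUs are rendering.
        demanding = load_3d > 0.2           # made-up threshold, illustration only
        for gpu in discrete_gpus:
            gpu.powered = demanding

    cards = [Gpu("discrete-0"), Gpu("discrete-1")]
    hybridpower_step(cards, load_3d=0.05)   # idle desktop: discrete cards power off
    print([(g.name, g.powered) for g in cards])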

  2. #2
    Xtreme X.I.P.
    Join Date
    Dec 2005
    Posts
    1,472
Sounds like an Nvidia approach to make sure they keep gaining market share.
CPU: Intel Core 2 Duo E6550 @ 3.6GHz w/ 1.29vcore (517*7)
Motherboard: Gigabyte P35-DQ6
Memory: Crucial 8500's
Video: Nvidia 8800GTX
PSU: Zippy 700W (fan modded of course)

  3. #3
    Xtreme Addict
    Join Date
    Feb 2005
    Location
    Maine, USA
    Posts
    1,029
    So does this spell the end of SLI?

  4. #4
    Xtreme Addict
    Join Date
    Dec 2004
    Location
    Flying through Space, with armoire, Armoire of INVINCIBILATAAAAY!
    Posts
    1,939
    Quote Originally Posted by xenolith View Post
    So does this spell the end of SLI?
    What does one thing have to do with another?
    Sigs are obnoxious.

  5. #5
    Xtreme Addict
    Join Date
    Feb 2005
    Location
    Maine, USA
    Posts
    1,029
    Quote Originally Posted by iddqd View Post
    What does one thing have to do with another?
I guess I was reading it wrong. For a second there I thought they were going to focus only on delivering chipsets with integrated graphics.

  6. #6
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
If they can do it well, it's good. We'll need to wait and see.


When I'm being paid, I always see my job through.

  7. #7
    Xtreme Member
    Join Date
    Jun 2005
    Location
    Bulgaria, Varna
    Posts
    447
Looks like display technology is following the audio path -- you can hardly find a motherboard without integrated AC'97/HDA output, so... you guessed it.

There was a topic about the [possible] return of quasi-software 3D engines with the next generation of sufficiently powerful CPU architectures with fused GPU capabilities, and so on.

  8. #8
    Xtreme Enthusiast
    Join Date
    Jul 2003
    Posts
    895
    this will limit overclocks quite a bit
    Quote Originally Posted by charlie View Post
    honestly there are some really stupid people here. I mean stupid as in low IQ and fantastic imaginations with little deductive reasoning.

  9. #9
    Xtreme Enthusiast
    Join Date
    Jan 2006
    Posts
    559
Well, considering the recent 780G from AMD can destroy Nvidia in that segment, they need to do something.
    x6.wickeD

  10. #10
    I am Xtreme
    Join Date
    Oct 2004
    Location
    U.S.A.
    Posts
    4,743
    so when is the nVIA cpu coming out?


Asus Z9PE-D8 WS with 64GB of registered ECC RAM | Dell 30" LCD 3008wfp | 7970 video card

LSI series RAID controller
SSDs: Crucial C300 256GB
Standard drives: Seagate ST32000641AS & WD 1TB Black
OSes: Linux and Windows x64

  11. #11
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    I like the part that says "the company will (...) work on expanding the power saving features of upcoming desktop parts".
    You were not supposed to see this.

  12. #12
    Xtreme Mentor
    Join Date
    Mar 2007
    Posts
    2,588
Nvidia doesn't care enough to label anything good/bad/etc.... they just want to turn a profit from anything and everything they get their hands on...

This is where other brands need to band together just to stop Nvidia, and then resume normal market practice.

  13. #13
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by safan80 View Post
    so when is the nVIA cpu coming out?
Intel?

Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  14. #14
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Quote Originally Posted by hecktic View Post
Nvidia doesn't care enough to label anything good/bad/etc.... they just want to turn a profit from anything and everything they get their hands on...

This is where other brands need to band together just to stop Nvidia, and then resume normal market practice.

Sorry to burst your bubble, but nVIDIA is a profit-seeking corporation - just like AMD, Google, etc. Their only purpose is to turn a profit from anything and everything they get their hands on.
    You were not supposed to see this.

  15. #15
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    617
    Quote Originally Posted by virtualrain View Post
    Source: TG Daily

    Good/Bad/Ugly?
    FUD
    No more discrete chipsets from Nvidia
    incredibly misleading title
    Santa Clara (CA) - Current and future owners of Nvidia’s nForce 790i Ultra SLI and SLI chipsets own a piece of history: Nvidia confirmed to TG Daily that it is now focusing on delivering chipsets with integrated graphics.
We already knew Nvidia was planning to include a GPU on all their future chipsets.
“Graphics chipset” has always had a low-end tone to it, but Nvidia’s announcement does not mean that the company will stop creating high-end chipsets; rather, it will work on expanding the power-saving features of upcoming desktop and notebook parts.
Aside from hybrid-SLI, which is a given, it implies no such thing.
Another goal of this move is to connect a display directly to the motherboard through DVI, HDMI, DisplayPort or a similar cable.
    well no
This approach could also address the rising issue of latency: the fact that the frame buffer will now be located in the same place for all GPUs should actually benefit the consumer.
this guy should be working at McDonald's
    Last edited by hollo; 03-27-2008 at 06:37 AM.

  16. #16
    Banned
    Join Date
    Feb 2008
    Posts
    213
Ooh yeah, we're back in the 3D accelerator days. This is BS. I hope that the other manufacturers won't follow. I just can't stand integrated GPUs.

  17. #17
    I am Xtreme
    Join Date
    Jul 2004
    Location
    Little Rock
    Posts
    7,204
    Quote Originally Posted by BullGod View Post
Ooh yeah, we're back in the 3D accelerator days. This is BS. I hope that the other manufacturers won't follow. I just can't stand integrated GPUs.
    QFT! Like built-in sound, it's more $hit we have to unnecessarily pay for. Well, not me because I don't buy any more nVidia based boards LOL!

To the other guy: all businesses are in business to make money. Businesses have been known to tank or go out of business when trying to make too much too fast. Folks will stay loyal to them for only so long.
    Quote Originally Posted by Movieman
    With the two approaches to "how" to design a processor WE are the lucky ones as we get to choose what is important to us as individuals.
    For that we should thank BOTH (AMD and Intel) companies!


    Posted by duploxxx
    I am sure JF is relaxed and smiling these days with there intended launch schedule. SNB Xeon servers on the other hand....
    Posted by gallag
    there yo go bringing intel into a amd thread again lol, if that was someone droping a dig at amd you would be crying like a girl.
    qft!

  18. #18
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,838
My only complaint is that it makes the boards cost extra, but if they have some kind of good power-saving feature that uses the integrated GPU while idle, that money will quickly be recouped.
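For a rough sense of the payback (every number below is an assumption for illustration, not a measurement):

Code:
# Hypothetical idle-power savings from switching to an integrated GPU.
idle_discrete_w = 60        # assumed idle draw of a discrete card (W)
idle_igpu_w     = 10        # assumed idle draw of the integrated GPU (W)
idle_hours_day  = 8         # assumed hours per day at idle
usd_per_kwh     = 0.10      # assumed electricity price

kwh_saved = (idle_discrete_w - idle_igpu_w) / 1000 * idle_hours_day * 365
print(f"~{kwh_saved:.0f} kWh/year, ~${kwh_saved * usd_per_kwh:.0f}/year saved")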
    DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis

  19. #19
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    617
Yeah, the change itself is pretty trivial unless you have a tri/quad-SLI setup as your everyday computer. The article was terribly written.

  20. #20
    Xtreme Addict
    Join Date
    Jan 2004
    Location
    somewhere in Massachusetts
    Posts
    1,009
    Umm, why is this a big deal?

Since their hybrid graphics (whatever it's called), which switches to the integrated GPU to save power, NEEDS that integrated bit, hasn't it been their intention for a while to put a GPU in the chipset, meaning that NONE of their chipsets ought to be without graphics?

  21. #21
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
Copy AMD/ATI much? When will they come up with their own ideas? For a company that has the market share it does, it amazes me that the only ideas they come up with are those from its competitor.

  22. #22
    Banned
    Join Date
    Feb 2008
    Posts
    213
    Quote Originally Posted by DerekFSE View Post
    Why?
Because a better approach would have been to make the cards themselves enter an idle mode. That would have been great. But this, my friend, equals lag. If there are more components working to get the same thing done, there will always be lag involved. That's why I don't use SLI: the micro-stuttering issue. Now they are practically forcing everyone to use SLI. Fluck it, I won't buy an Nvidia chipset in the future and that's that. They've already lost a customer with this great idea of theirs...

  23. #23
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Eastcoasthandle,
    Huh? As if iGPUs and power saving modes were invented by AMD/ATi...
    You were not supposed to see this.

  24. #24
    Xtreme Addict
    Join Date
    Aug 2006
    Location
    The Netherlands, Friesland
    Posts
    2,244
    Quote Originally Posted by Eastcoasthandle View Post
Copy AMD/ATI much? When will they come up with their own ideas? For a company that has the market share it does, it amazes me that the only ideas they come up with are those from its competitor.
Yes, I was thinking exactly the same.

I wonder if their decision has anything to do with the QPI licenses from Intel. There are some rumors indicating nVidia won't get a new QPI license, and some rumors say they will get a QPI license for upcoming hardware. Maybe nVidia is trying to gain more market share with other solutions to play it safe with Intel?
    Last edited by ownage; 03-27-2008 at 07:53 AM.
    >i5-3570K
    >Asrock Z77E-ITX Wifi
    >Asus GTX 670 Mini
    >Cooltek Coolcube Black
    >CM Silent Pro M700
    >Crucial M4 128Gb Msata
    >Cooler Master Seidon 120M
    Hell yes its a mini-ITX gaming rig!

  25. #25
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,412
I just hope this doesn't kill overclocking on Nvidia chipsets. Otherwise it's a great idea; I look forward to the day when a 10W integrated GPU will take over for my multiple GPUs. I would much rather it be a universal thing than a strictly platform-based concept, though. I can already see it: users running an AMD card on an Nvidia board won't be able to use the power-saving features due to driver conflicts.
