
Thread: SLI on X58 without nF200 Chip

  1. #76
    Never go full retard
    Join Date: Feb 2008 | Location: Vegas | Posts: 3,984
    SWEET!!!

  2. #77
    Xtreme Member
    Join Date: May 2008 | Posts: 462
    So what does this mean for me:

    I am planning on using two GTX 280s in SLI, but I also have a SAS controller that is PCIe x8. Will I need the nF200 chip so that I can have two slots at x16 and one at x8?
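    (Rough lane math, for reference: a minimal sketch below, assuming the X58 IOH exposes 36 PCIe 2.0 lanes in total and that, without a switch like the NF200, the slots simply have to fit inside that budget. The constant and function names are only illustrative.)

    ```python
    # Rough PCIe lane-budget check for an X58 build (illustrative sketch only).
    # Assumption: the X58 IOH exposes 36 PCIe 2.0 lanes in total; without a
    # switch such as the NF200, the slots have to share that raw budget.

    IOH_LANES = 36  # assumed total lane count from the X58 IOH

    def fits_without_switch(requested_widths):
        """True if the requested link widths fit within the raw lane budget."""
        return sum(requested_widths) <= IOH_LANES

    build = [16, 16, 8]  # two GPUs at x16 plus a SAS controller at x8
    print(sum(build), "lanes requested;", IOH_LANES, "available")
    print("Fits without a switch:", fits_without_switch(build))  # False (40 > 36)
    ```

    With two x16 GPUs plus an x8 SAS card you are asking for 40 lanes against 36, so without a switch you would more likely end up with something like x16/x16/x4 or x16/x8/x8, depending on how the board wires its slots.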

  3. #78
    Evil Kitty
    Join Date: Jun 2002 | Location: Detroit, MI | Posts: 3,305
    SLI on an Intel-based mobo... finally. Honestly, I'm pretty done with these Nvidia-based boards. The 680i and now the 790i have worked well, but they never delivered the kind of stability I cherish.

    This also means I can go back to offering SLI on my client builds again (which I have not been able to do, since I will not warranty an Nvidia-based mobo for OCing builds).

    Good news indeed.
    9900k @ 5.1Ghz
    Asus Maximus Hero XI
    32GB (8 x 4) Gskill @4000
    Strix 2080 Ti OC
    OS & Apps: Samsung 970 Pro 512GB
    Games: Samsung 970 Pro 1TB
    Storage: Crucial M500 2TB
    Seasonic Platinum 1000W
    Phanteks Evolv X

  4. #79
    Banned
    Join Date: Jun 2008 | Location: Mi | Posts: 1,063
    Quote Originally Posted by gojirasan:
    Is that supposed to be funny? I sure as hell wouldn't pay it. It's a different world now with AMD becoming so badass, born-again-hard. Nvidia now has some real competition in GPU-land. So they just can't pull that kind of crap anymore. As far as I'm concerned, SLI is already dead in the water with the availability of cards like the 4870X2. Maybe this move will help revive it. Why would I pay even an extra $20 for SLI support, whether there is an extra silicon space heater onboard or not? To hell with that. F U Nvidia for ever having limited SLI to your own chipsets in the first place. Jerks.
    Wait... weren't you one of the biggest Nvidia fanbois..? What happened?

  5. #80
    iadstudio (Guest)
    Quote Originally Posted by gojirasan:
    Is that supposed to be funny? I sure as hell wouldn't pay it. It's a different world now with AMD becoming so badass, born-again-hard. Nvidia now has some real competition in GPU-land. So they just can't pull that kind of crap anymore. As far as I'm concerned, SLI is already dead in the water with the availability of cards like the 4870X2. Maybe this move will help revive it. Why would I pay even an extra $20 for SLI support, whether there is an extra silicon space heater onboard or not? To hell with that. F U Nvidia for ever having limited SLI to your own chipsets in the first place. Jerks.
    Ha. Well said.

    I know SLI would be possible on the X38, but it's not readily enabled by nVidia and won't be easily hacked anytime soon. Apple and nVidia are both jacka$$es for pulling the crap they do. The only reason I have nVidia right now is because it was the best quick fix at the time, but I can promise you that I will NEVER use nVidia for a multi-GPU solution unless it's on an Intel platform.

    I see where the brand equity BS comes into play with nVidia, but as long as they have AMD going head-to-head w/ them on performance, their brand isn't worth crap unless they are the performance king.

    This could be the only reason I end up with nVidia. Also the reason I puked a little bit in my mouth when they bought Ageia.

    http://www.tgdaily.com/content/view/37611/140/

  6. #81
    Xtreme Addict
    Join Date: Dec 2003 | Location: At work | Posts: 1,369
    Nvidia,

    How's it feel to be pushed around by the big dog?? Kudos to Intel for standing firm against Nvidia and telling them the way things will go, instead of the other way around (with Nvidia calling the shots). Intel FINALLY decided to play hardball with NVidia... and predictably, NVidia lost. Now we'll see how many NVidia chipsets sell... personally, ever since I had one on my dual Opteron system, I've wanted nothing further to do with NVidia core logic.

    NVidia finally gets its comeuppance....today is a sweet day indeed...
    Server: HP Proliant ML370 G6, 2x Xeon X5690, 144GB ECC Registered, 8x OCZ Vertex 3 MAX IOPS 240GB on LSi 9265-8i (RAID 0), 12x Seagate Constellation ES.2 3TB SAS on LSi 9280-24i4e (RAID 6) and dual 1200W redundant power supplies.
    Gamer: Intel Core i7 6950X@4.2GHz, Rampage Edition 10, 128GB (8x16GB) Corsair Dominator Platinum 2800MHz, 2x NVidia Titan X (Pascal), Corsair H110i, Vengeance C70 w/Corsair AX1500i, Intel P3700 2TB (boot), Samsung SM961 1TB (Games), 2x Samsung PM1725 6.4TB (11.64TB usable) Windows Software RAID 0 (local storage).
    Beater: Xeon E5-1680 V3, NCase M1, ASRock X99-iTX/ac, 2x32GB Crucial 2400MHz RDIMMs, eVGA Titan X (Maxwell), Samsung 950 Pro 512GB, Corsair SF600, Asetek 92mm AIO water cooler.
    Server/workstation: 2x Xeon E5-2687W V2, Asus Z9PE-D8, 256GB 1866MHz Samsung LRDIMMs (8x32GB), eVGA Titan X (Maxwell), 2x Intel S3610 1.6TB SSD, Corsair AX1500i, Chenbro SR10769, Intel P3700 2TB.

    Thanks for the help (or lack thereof) in resolving my P3700 issue, FUGGER...

  7. #82
    Xtreme Addict
    Join Date: Jun 2007 | Location: Melbourne, Australia | Posts: 1,478
    This might have already been posted, but bit-tech ran an article here:
    http://www.bit-tech.net/news/2008/08...-sli-support/1

    I found this bit interesting (about hacking the BIOS/drivers, or whatever nVidia are gonna use):

    So what about if a budding enthusiast manages to extract the key from one or more boards? Nvidia said it wouldn’t do anything to stop enthusiasts enabling SLI support on non-certified motherboards themselves. Tom Petersen, Technical Marketing Director in Nvidia’s chipset business unit, said that he’d be quite happy if enthusiasts did that because it’d mean they’d be using two (or more) Nvidia graphics cards in their system.
    But also:

    He added that the certification process is in place to ensure a great out-of-the-box experience – boards that aren’t certified by Nvidia may encounter problems and it’ll require some BIOS modification on the user’s part. I’m not quite sure how Nvidia will react to custom BIOS files enabling SLI support on non-certified boards being hosted on the ‘net, as the company’s legal team has had a fairly rocky relationship with modified driver developers in the past – things could play out either way here.
    Bring it boys, we got the best right here at XS! I'm sure we can figure something out...
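    (For anyone who wants to start poking at dumps: a minimal sketch of a first step, under the assumption, and it is only an assumption, that any certification blob would show up as readable ASCII somewhere in the SBIOS image. The script and the "SLI" marker search are illustrative only, not anything Nvidia has documented.)

    ```python
    # Illustrative sketch: scan a dumped BIOS image for printable ASCII strings
    # mentioning "SLI", as a first step toward locating any certification blob.
    # The assumption that such a blob is stored as readable ASCII is just a guess.
    import re
    import sys

    def find_sli_strings(path, min_len=4):
        data = open(path, "rb").read()
        # Pull out runs of printable ASCII and keep the ones mentioning SLI.
        for match in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data):
            text = match.group().decode("ascii")
            if "SLI" in text.upper():
                yield match.start(), text

    if __name__ == "__main__":
        for offset, text in find_sli_strings(sys.argv[1]):  # e.g. a dumped .ROM file
            print(f"0x{offset:06x}: {text}")
    ```

    Run it against a saved BIOS dump and eyeball whatever it prints; it proves nothing by itself, but it is an obvious place to start comparing a certified board against a non-certified one.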

  8. #83
    Xtreme Addict
    Join Date: Jun 2004 | Location: near Boston, MA, USA | Posts: 1,955
    Good news, because it means that you can buy a mobo and then choose the best GPU option down the road for a given time period. If they had not done this, I think a lot of folks would have just opted for CF only.

    Bad news is that nearly all X58 boards (the enthusiast models, which will be almost all of them) will probably have paid for this, so the costs will go up even for those who will never run SLI. But that's acceptable IMO, because the whole NF200 thing (add layers to the mobo, add heat to the setup) was just totally wrong.

  9. #84
    Xtreme Addict
    Join Date: Apr 2008 | Location: Lansing, MI / London / Stinkaypore | Posts: 1,788
    Quote Originally Posted by iadstudio:
    Ha. Well said.

    This could be the only reason I end up with nVidia. Also the reason I puked a little bit in my mouth when they bought Ageia.

    http://www.tgdaily.com/content/view/37611/140/
    It uses OpenGL (it has nothing to do with GPGPU); AMD demoed it on the Radeons (http://techreport.com/articles.x/14990/16). I'm gonna correct everyone until the record's right.
    Last edited by Macadamia; 08-29-2008 at 04:22 AM.
    Quote Originally Posted by radaja:
    so are they launching BD soon or a comic book?

  10. #85
    Xtreme Addict
    Join Date: Sep 2005 | Location: UK | Posts: 1,696
    Quote Originally Posted by ShArKo:
    Yes, it's logical FOR US... but not for nV. Because if they allow SLI on Intel chipsets, people will just buy the Intel chipset and no longer the nV chipset... that's why.
    The true implication of this would be that they would sell many more GPUs and slightly fewer motherboards; people would still go to lengths to buy Nvidia over Intel. If they did sell absolutely no motherboards, then that would be a benefit to us and to them. They wouldn't have to waste time supporting and building a crud chipset while all the OEMs like eVGA take the stick for everything, and we, the users, wouldn't have to use the crap in the first place.

    But look at it now: nV are no longer ABLE to make a chipset and Intel is the only option (how it should be).
    Workstation:
    3960X | 32GB G.Skill 2133 | Asus Rampage IV Extreme
    3*EVGA GTX580 HC2 3GB | 3*Dell U3011
    4*Crucial M4 256GB R0 | 6*3TB WD Green R6
    Areca 1680ix-24 + 4GB | 2*Pioneer BDR-205 | Enermax Plat 1500W
    Internal W/C | PC-P80 | G19 | G700 | G27
    Desktop Audio:
    Squeezebox Duet | Beresford TC-7520 Caiman modded | NAD M3 | MA RX8 | HD650 | ATH-ES7
    Man Cave:
    PT-AT5000E | TXP65VT30 | PR-SC5509 | PA-MC5500 | MA GX300*2, GXFX*4, GXC350 | 2*BK Monolith+
    Gaming on the go:
    Alienware M18x
    i7 2920XM | 16GB DDR3 1600
    2*6990 | WLED 1080P
    2*Crucial M4 256GB | BD-RW
    BT 375 | Intel 6300 | 330W PSU

    2011 Audi R8 V10 Ibis White ABT Tuned - 600HP

  11. #86
    Xtreme Mentor
    Join Date: Feb 2007 | Location: Oxford, England | Posts: 3,433
    Quote Originally Posted by Xoulz:
    Wait... weren't you one of the biggest Nvidia fanbois..? What happened?
    Good contribution to the thread right there

    [ot] I thought Nvidia had pulled out of the chipset business now, but yeah, am I wrong or right?
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  12. #87
    Xtreme Member
    Join Date: Feb 2008 | Location: Buenos Aires - Argentina | Posts: 438
    Quote Originally Posted by 3NZ0:
    The true implication of this would be that they would sell many more GPUs and slightly fewer motherboards; people would still go to lengths to buy Nvidia over Intel. If they did sell absolutely no motherboards, then that would be a benefit to us and to them. They wouldn't have to waste time supporting and building a crud chipset while all the OEMs like eVGA take the stick for everything, and we, the users, wouldn't have to use the crap in the first place.

    But look at it now: nV are no longer ABLE to make a chipset and Intel is the only option (how it should be).
    Yes, that is true... but nV is trying to expand the market... so they will try to get more and more market share... selling chipsets, VGAs, etc.
    Intel C2D E8400 @ 4Ghz // Asus Maximus Formula x38 @ FSB 500Mhz// G.Skill Pi 2x2GB 1000Mhz // 9800GTX+ SSC eVGA // WD Raptor X 150GB + Samsung Spinpoint 2TB // DVD-RW DL Asus // Powercooler 850W // CM Cosmos S // Samsung T260N / Edifier R251 // Logitech G9 // Logitech G15 Rev2

    Cooling System: Swiftech Apex Ultra H2O Kit + MCW60

  13. #88
    Xtreme Addict
    Join Date: Dec 2005 | Posts: 1,035
    Wow, tri-SLI + 1 card for PhysX is supported.

    Even though I used to think that Nvidia and Intel were just playing games until a deal was signed, I was not so sure lately, seeing how they were constantly exchanging words and stuff...

  14. #89
    Xtreme Member
    Join Date: Dec 2006 | Posts: 352
    Quote Originally Posted by 3NZ0:
    But look at it now: nV are no longer ABLE to make a chipset and Intel is the only option (how it should be).
    I read an article which states that Nvidia does have a QPI license for Nehalem, but they will not make any boards soon because it is not cost-effective for them; in other words, they would not sell many boards. It is very tough beating Intel at their own game.

  15. #90
    Xtreme Addict
    Join Date: Apr 2006 | Location: Barack Hussein Obama-Biden's Nation | Posts: 1,084
    I wonder if it would be possible to use that NF200 chip for Xfire solutions? Perhaps an ATI engineer could teach a hacker how to "reverse-engineer" the NF200 so that it allows Xfire to take advantage of the extra PCIe bandwidth... ooooh *puts pinkie finger in the mouth*!

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz! (from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!
