
Thread: GTX 560 Coming soon!

  1. #201
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    399
    News-flash: the HD6970 isn't the high-end card in AMD's line-up. There is an HD6990 coming, and nVidia doesn't have a counter, no matter how much you'd want a GTX570x2 to appear.

    You have no idea what the price of either the GTX560 or the HD6950 1GB is going to be, but you speculate wildly and without constraint. They might end up above what is called "mainstream".

    You argue 2GB card versions are pointless, but then say there will be a vacuum for nVidia to fill? 2GB makes a lot of sense from an OpenCL/CUDA perspective and for Eyefinity. Oh wait, nVidia doesn't do Eyefinity with one card. So who's got the bigger potential market for 2GB cards? I won't even mention the obvious fact that 2GB is more future-proof.
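    For the OpenCL/CUDA angle, the capacity argument is simple: a compute job can only hold as big a dataset as the card's global memory, so the extra 1GB directly doubles the working set a single GPU can keep resident. A minimal host-side sketch of the check (plain OpenCL C, assuming any standard OpenCL runtime is installed):

        /* Sketch: ask the first GPU how much global memory compute kernels can use. */
        #include <stdio.h>
        #include <CL/cl.h>

        int main(void)
        {
            cl_platform_id platform;
            cl_device_id device;
            cl_ulong mem = 0; /* global memory size in bytes */

            if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
                clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS)
                return 1;

            /* CL_DEVICE_GLOBAL_MEM_SIZE: the VRAM ceiling for buffers on this card */
            clGetDeviceInfo(device, CL_DEVICE_GLOBAL_MEM_SIZE, sizeof(mem), &mem, NULL);
            printf("Global memory: %llu MB\n", (unsigned long long)(mem / (1024 * 1024)));
            return 0;
        }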

  2. #202
    Xtreme Mentor
    Join Date
    Jan 2009
    Location
    Oslo - Norway
    Posts
    2,879
    Quote Originally Posted by Final8ty View Post
    Reasoned debate?!

    NV are going to kill off the 2GB version because they have a 1GB version out as well, & there can't be both, right? By your logic.
    Yep, this is very logical, if you can follow the logic. Maybe I have to repeat it for you.


    AMD should somehow "upgrade" the 6970-2GB to put up a good fight against the 580. But they chose to "downgrade" it to 1GB to make it cheaper and fight in the mainstream.

    If nVidia gets aggressive, and I believe they will (with several flavors of the 560, a price-cut on the 570, maybe a new GPU, etc.), then AMD has to sell the 1GB mainstream card really cheap to compete. This will create a BIG price-gap between 1GB and 2GB. AMD won't be able to justify that price-gap with "2GB gains", and will have to reduce the price on the 2GB, but this move will affect the price on the 1GB too. This will go in a circle.

    What makes this an even bigger problem: the 1GB and 2GB would probably perform almost the same for most gamers on 24" monitors. So the "cheap" 1GB eats most of the market for the 2GB too, on top of what nVidia eats.

    You got the logic? I can't get into more detail without a long post.

    ASUS P8P67 Deluxe (BIOS 1305)
    2600K @4.5GHz 1.27v , 1 hour Prime
    Silver Arrow , push/pull
    2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
    GTX560 GB OC @910/2400 0.987v
    Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
    CM Storm Scout + Corsair HX 1000W
    +
    EVGA SR-2 , A50
    2 x Xeon X5650 @3.86GHz(203x19) 1.20v
    Megahalem + Silver Arrow , push/pull
    3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
    XFX GTX 295 @650/1200/1402
    Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
    SilverStone Fortress FT01 + Corsair AX 1200W

  3. #203
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Sam_oslo View Post
    Yep, this is very logical, if you can follow the logic. Maybe I have to repeat it for you.


    AMD should somehow "upgrade" the 6970-2GB to put up a good fight against the 580. But they chose to "downgrade" it to 1GB to make it cheaper and fight in the mainstream.

    If nVidia gets aggressive, and I believe they will (with several flavors of the 560, a price-cut on the 570, maybe a new GPU, etc.), then AMD has to sell the 1GB mainstream card really cheap to compete. This will create a BIG price-gap with the 2GB, and AMD won't be able to justify the price with "2GB gains". AMD will have to reduce the price on the 2GB to sell it, but this move will affect the price on the 1GB too. This will go in a circle.

    What makes this an even bigger problem: the 1GB and 2GB would probably perform almost the same for most gamers on 24" monitors. So the "cheap" 1GB eats most of the market for the 2GB too, on top of what nVidia eats.

    You got the logic? I can't get into more detail without a long post.
    So far the only person who sees your logic is you, because you ignore the facts.
    Last edited by Final8ty; 01-18-2011 at 07:37 AM.

  4. #204
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    I can sort of see what Sam is getting at: if AMD lowers the price of their cards enough, sales of the 2GB versions are going to plummet, depending on how well the GTX 560 Ti performs and how the 1GB cards are priced. If the GTX 560 is priced around $199-229, AMD, whose cards are usually priced better, will have to price the 6950 at $229. At that price, I can see the 2GB cards being a tough sale unless AMD lowers their price. I personally would rather have two 1GB 6950s at $229 each than one 2GB 6970 at $369; 90 bucks more for that much more performance would be totally worth it.

    This next set of 1GB cards from AMD and NV is going to make the GTX 580, Antilles and the 2GB 6970 a tough sell. The market is just so hyper-competitive now, and the value is just too good in the under-$250 segment.

    There is a small market of people that can take advantage of 2GB cards, but it is a seriously small market (a percent of a percent). I think someone had stats showing that somewhere around 0.12%, or 0.0012, of the market uses 30" monitors, and Eyefinity can't be that much bigger.

    It won't make the 2GB cards obsolete; however, for most people the 1GB cards will be the better value, and sales of the 2GB are going to plummet unless AMD does a price drop.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  5. #205
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by tajoh111 View Post
    I can sort of see what Sam is getting at: if AMD lowers the price of their cards enough, sales of the 2GB versions are going to plummet, depending on how well the GTX 560 Ti performs and how the 1GB cards are priced. If the GTX 560 is priced around $199-229, AMD, whose cards are usually priced better, will have to price the 6950 at $229. At that price, I can see the 2GB cards being a tough sale unless AMD lowers their price. I personally would rather have two 1GB 6950s at $229 each than one 2GB 6970 at $369; 90 bucks more for that much more performance would be totally worth it.

    This next set of 1GB cards from AMD and NV is going to make the GTX 580, Antilles and the 2GB 6970 a tough sell. The market is just so hyper-competitive now, and the value is just too good in the under-$250 segment.

    There is a small market of people that can take advantage of 2GB cards, but it is a seriously small market (a percent of a percent). I think someone had stats showing that somewhere around 0.12%, or 0.0012, of the market uses 30" monitors, and Eyefinity can't be that much bigger.

    It won't make the 2GB cards obsolete; however, for most people the 1GB cards will be the better value, and sales of the 2GB are going to plummet unless AMD does a price drop.
    Yes, it will affect sales, but that happens in all segments for the most part anyway & has always been the case, so that's nothing new.

    NV don't even need to be around to affect the sales of the 2GB versions; just the release of a 1GB version will have an effect on sales of the 2GB version, & any product line with more than one variant will see each affect the other.

    If only the 1GB were available then everyone would buy it, but as soon as a 2GB version comes out it will have some effect on the sales of the 1GB version; far less of an effect, but still.

    I saw a lot of people jump from 512MB 4xxx cards to 1GB 4xxx cards when the 1GB versions came out.
    Last edited by Final8ty; 01-18-2011 at 07:56 AM.

  6. #206
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by Final8ty View Post
    Yes it will effects sales but that happens in all segments for the most part anyway & has always been the case so that's nothing new.
    I think the difference this time around, though, is that the performance of the under-$250 cards is going to be so high that justifying paying more than $300 is going to be insanely tough until better-looking games come out with higher requirements. In earlier generations, the gains from going up to the $300+ cards were tangible in games, because cards were slower and games actually tested them.

    However, since games are being held back graphically by consoles, people have to use unnecessary amounts of AA to really push their cards and notice an appreciable performance gain their eyes can see.

    The $230 card will be faster than ever, and its appeal will start attracting a larger portion of the enthusiast market that used to spend $350+, since the gains from spending that much are getting smaller and smaller. Especially when you take into account the unnecessarily high frame rates of these cards and the high turnover of a new generation of cards coming out every year.

    The value at 28nm is going to be ridiculous next generation. If they fix the naming next generation and shrinks go as planned, they are going to be able to make 69xx-class chips at around 180mm², which they can fit into the $159 market as the 77xx series. Getting that much performance when games are still not graphically demanding is great and sad at the same time.

    The gains of 1GB vs 512MB were noticeable; the gains of 2GB vs 1GB are only noticeable in rare cases and on certain monitors. The gains from 1GB to 2GB are not as appreciable as they should be because performance requirements haven't gone up that much; Crysis is still one of the most demanding games out there.
    Last edited by tajoh111; 01-18-2011 at 08:06 AM.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  7. #207
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by tajoh111 View Post
    I think the difference this time around, though, is that the performance of the under-$250 cards is going to be so high that justifying paying more than $300 is going to be insanely tough until better-looking games come out with higher requirements. In earlier generations, the gains from going up to the $300+ cards were tangible in games, because cards were slower and games actually tested them.

    However, since games are being held back graphically by consoles, people have to use unnecessary amounts of AA to really push their cards and notice an appreciable performance gain their eyes can see.

    The $230 card will be faster than ever, and its appeal will start attracting a larger portion of the enthusiast market that used to spend $350+, since the gains from spending that much are getting smaller and smaller. Especially when you take into account the unnecessarily high frame rates of these cards and the high turnover of a new generation of cards coming out every year.

    The value at 28nm is going to be ridiculous next generation. If they fix the naming next generation and shrinks go as planned, they are going to be able to make 69xx-class chips at around 180mm², which they can fit into the $159 market as the 77xx series. Getting that much performance when games are still not graphically demanding is great and sad at the same time.
    I have no argument with that.
    My point is that it does not add up to the 2GB versions being dropped by ATI without a replacement.

  8. #208
    Xtreme Enthusiast
    Join Date
    Jan 2003
    Location
    MA
    Posts
    916
    This is great news for me. I'm building a new rig and was just about to buy a GTX460; now I'll wait it out and just use my old 8800GT. This card is going to sell very well: mainstream gamers don't have the money to spend on the high-end stuff, and if the 1GB version is under $230 it's going to sell out fast.

  9. #209
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    399
    Quote Originally Posted by Sam_oslo View Post
    AMD should somehow "upgrade" the 6970-2GB to take a good fight with 580. But they chose to "downgrade" it to 1GB to make it cheaper and fight in mainstream.
    You got it all wrong...

    The 6970 was not meant to fight the 580; the 6990 is.

    The rest of your conclusions are wrong too, since you started out with a wrong assumption.

    And in case nobody noticed, there were already 2GB versions of the 5870 sold, with quite a big price gap, and somehow nobody complained.

  10. #210
    Xtreme Member
    Join Date
    Aug 2010
    Location
    Athens, Greece
    Posts
    116
    For a single monitor up to 1080p resolution there is no real need for 2GB of memory. 1GB is, and will remain, sufficient for this year.

    For Eyefinity and 2560x1600 monitors, 2GB of memory is a must, and I expect 2GB to become more mainstream next year, but not today.

    AMD will not EOL the 2GB HD6900 cards, but the introduction of the 1GB HD6900 cards will kill the sales of the 2GB HD6900 series.
    I expect the 1GB cards to be very close to the 2GB cards up to 1920x1080 (97-98%?), so most users will go for the 1GB cards.
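    Rough numbers behind that (a back-of-envelope sketch only; these figures cover raw render targets, while real games add textures, geometry and driver overhead on top):

        /* Back-of-envelope: why 2560x1600 and Eyefinity lean on VRAM harder than 1080p.
           Assumes 4 bytes color + 4 bytes depth/stencil per MSAA sample plus a resolved
           4-byte back buffer - a simplification, not a real driver's allocation scheme. */
        #include <stdio.h>

        static double buffer_mb(double w, double h, int msaa)
        {
            double bytes = w * h * msaa * 8.0  /* MSAA color + depth/stencil samples */
                         + w * h * 4.0;        /* resolved back buffer */
            return bytes / (1024.0 * 1024.0);
        }

        int main(void)
        {
            printf("1920x1080, 4xMSAA: ~%.0f MB\n", buffer_mb(1920, 1080, 4)); /* ~71  */
            printf("2560x1600, 4xMSAA: ~%.0f MB\n", buffer_mb(2560, 1600, 4)); /* ~141 */
            printf("5760x1080, 4xMSAA: ~%.0f MB\n", buffer_mb(5760, 1080, 4)); /* ~214 */
            return 0;
        }

    The buffers alone roughly triple going from 1080p to Eyefinity widths, and the rest of the 1GB or 2GB goes to texture and geometry data.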
    Last edited by Aten-Ra; 01-18-2011 at 08:21 AM.
    Intel Core i7 920@4GHz, ASUS GENE II, 3 x 4GB DDR-3 1333MHz Kingston, 2x ASUS HD6950 1G CU II, Intel SSD 320 120GB, Windows 7 Ultimate 64bit, DELL 2311HM

    AMD FX8150 vs Intel 2500K, 1080p DX-11 gaming evaluation.

  11. #211
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by DarthShader View Post
    The 6970 was not meant to fight the with 580, the 6990 is.
    No. That is incorrect.

    The HD 6970 was meant to compete with the GTX 480. NVIDIA effectively one-upped AMD by releasing the GTX 580 and GTX 570, which meant that AMD's new card suddenly found itself competing with a $350 GTX 570 instead of a $500 GTX 480.

    Personally, I hope that NVIDIA never releases a dual-GPU card. It's just a waste of time and resources in an era where every motherboard in the $200+ range sports dual-card functionality. Back when P35 was around, it made sense. Now it's just a play for bragging rights, regardless of how buggy and problem-prone the card will become. And whether or not anyone will admit it, there is no magical mojo that can be added to a dual-GPU product which will make it any more appealing than two separate dedicated cards.



    Back on track:

    Like it or not, there is a reason why AMD is going to allow their board partners to release 1GB versions of the higher-end SKUs and will be releasing a "performance" 11.1a "hotfix" right before NVIDIA introduces new products. At this point in time they are worried about the implications of a potential sub-$300 card which could put a good portion of their current lineup to shame. NVIDIA did the same with the GTX 460 OC versions when the HD 6800-series was introduced.

    Unfortunately, for all intents and purposes it looks like the mid-Feb release of 1GB AMD SKUs could be a pipe dream. I have talked to a number of board partners and quite a few don't even have these lower-priced cards on their roadmaps yet. Hopefully these won't be vaporware like the GTX 460 SE ended up being.

  12. #212
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    Quote Originally Posted by SKYMTL View Post
    No. That is incorrect.
    Unfortunately, for all intents and purposes it looks like the mid-Feb release of 1GB AMD SKUs could be a pipe dream. I have talked to a number of board partners and quite a few don't even have these lower-priced cards on their roadmaps yet. Hopefully these won't be vaporware like the GTX 460 SE ended up being.
    The Sapphire 1GB 6950 is already listed, as is the HIS. Of course it's up to the AIBs to make the 1GB cards; probably not all of them are rushing it, and some will probably never offer such a SKU.

    IIRC some AIBs were going to release higher-clocked 6870s, so the 1GB 6950 and the 6870 OC models should be enough of a response to the GTX 560. This most certainly won't kill the 2GB models, but competition with a 1GB 6950 could drag 2GB model prices down along with it.
    "No, you'll warrant no villain's exposition from me."

  13. #213
    Xtreme Addict
    Join Date
    Nov 2005
    Location
    PHX
    Posts
    1,494
    460 SEs all over Fry's wall... do you mean something else?

  14. #214
    Xtreme Enthusiast
    Join Date
    Mar 2010
    Location
    Istanbul
    Posts
    606
    Update (01/18): Gigabyte commented on this article. The company outright denied having anything to do with whatever is in those pictures, and alleged it to be some kind of "malicious attack" on it. In a statement, it said: "the information is false and the data is simulated from our old card. The picture is incorrect and was obviously photoshopped from our previous GTX460 model. The GTX560 card looks nothing like pictured on the article. We have good reason to believe this is a malicious attack."
    http://www.techpowerup.com/138621/Gi...ured.html?cp=1

  15. #215
    Xtreme Cruncher
    Join Date
    Apr 2006
    Posts
    3,012
    Quote Originally Posted by Jodiuh View Post
    460 SEs all over Fry's wall... do you mean something else?
    This is possibly the LARGEST Canadian e-tailer...

    http://www.ncix.com/search/?categoryid=0&q=gtx+460+se

    stocking a whole two of them, at the same prices you can get a real 1GB GTX 460 for...

    The only real place to get an SE in Canada:

    http://www.newegg.ca/Product/Product...460+se&x=0&y=0

    Personally I think these will be great cards for the 1680x1050 crowd, along with the GTX 560 Ti, but 1GB is just not enough for any higher resolution... If they happened to discontinue the 2GB 6950 because of these cards, that would be the worst move they could make, because for all of us on XS the 6970 is a joke considering it is more expensive than a GTX 570 and the 6950 can be unlocked so easily... The 2GB 6950 is AMD's best-looking card at the moment, and even once the 1GB cards come out it will still be the best-looking card...

    edit: whoops, totally thought this was the HD 69xx 1GB thread; pretty much the same discussion going on anyway.
    Last edited by [XC] hipno650; 01-18-2011 at 09:58 AM.
    CPU: Intel Core i7 3930K @ 4.5GHz
    Mobo: Asus Rampage IV Extreme
    RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
    GPU: EVGA GTX Titan (1087Boost/6700Mem)
    Physx: Evga GTX 560 2GB
    Sound: Creative XFI Titanium
    Case: Modded 700D
    PSU: Corsair 1200AX (Fully Sleeved)
    Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
    Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's

  16. #216
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    399
    Quote Originally Posted by SKYMTL View Post
    The HD 6970 was meant to compete with the GTX 480. NVIDIA effectively one-upped AMD by releasing the GTX 580 and GTX 570 which meant that AMD's new card suddenly found itself competing with a $350 GTX 570 instead of a $500 GTX 480.
    Thank you for confirming that the 6970 was not meant to fight the 580.

    Personally, I hope that NVIDIA never releases a dual GPU card. It's just a waste of time and resources in an era where every motherboard in the $200+ range sports dual card functionality. Back when P35 was around, it made sense. Now it's just a play for bragging rights regardless of how buggy and problem-prone the card will become. And whether or not any will admit it, there is no magical mojo that can be added to a dual GPU product which will make it any more appealing than two dedicated separate cards.
    You should say that to EVGA. They listened to you on the guarantees; maybe they will agree with you here too and drop their dual card.

    Joking aside, I am not sure I can agree with you. Part of what makes us currently perceive dual cards as useless is the 32nm cancellation and the 28nm slips, which have pushed single cards close to the 300W wall. At that point SLI/CrossFire makes for both better performance and easier cooling. If nVidia were able to, I can guarantee they'd have a dual card already, 100%.

    A dual card can still have its uses. In the ultra high-end, it's easier to do quadfire with two dual cards. In the lower regions, depending on pricing, it might be cheaper to get a cheap P67 board that doesn't have two x8-capable slots, a 2500K and a dual card than an expensive board and two regular cards. There's also the mATX and ITX form factor, where CrossFire/SLI is troublesome or impossible; see the Sugo SG07 for example. Finally, there are OpenCL/CUDA applications, where packing twice as many chips in the same space is going to give more performance, etc.

  17. #217
    Xtreme Cruncher
    Join Date
    Apr 2006
    Posts
    3,012
    Quote Originally Posted by DarthShader View Post
    Thank you for confirming that the 6970 was not meant to fight the 580.


    You should say that to EVGA. They listened to you on the guarantees; maybe they will agree with you here too and drop their dual card.

    Joking aside, I am not sure I can agree with you. Part of what makes us currently perceive dual cards as useless is the 32nm cancellation and the 28nm slips, which have pushed single cards close to the 300W wall. At that point SLI/CrossFire makes for both better performance and easier cooling. If nVidia were able to, I can guarantee they'd have a dual card already, 100%.

    A dual card can still have its uses. In the ultra high-end, it's easier to do quadfire with two dual cards. In the lower regions, depending on pricing, it might be cheaper to get a cheap P67 board that doesn't have two x8-capable slots, a 2500K and a dual card than an expensive board and two regular cards. There's also the mATX and ITX form factor, where CrossFire/SLI is troublesome or impossible; see the Sugo SG07 for example. Finally, there are OpenCL/CUDA applications, where packing twice as many chips in the same space is going to give more performance, etc.

    I'm not going to say dual cards are not cool and don't have some place, but I dunno about that part. I have not seen any P67 boards without dual x8 slots, and if you're buying a 600-700 dollar video card you're not buying a $125 mobo... and if you are, well, then you have a thing or two to learn...

    Now, the argument for mATX and mini-ITX is decent, but something like a 5970 can't fit in half the mid-towers on the market without modding, let alone most things smaller. Maybe mATX, but then again most good mATX mobos can run a pair of dual-slot video cards, so normal CFX or SLI is still possible.

    The only real use I can see for dual-GPU cards is for n00bs with too much money who want 4 GPUs in their system. Good for them, but I would take tri-SLI 580s any day over a quad 5970 setup, as scaling beyond 3 GPUs is questionable at best.

    I do think it would be cool to see the GTX 460s offer tri-SLI support, but I highly doubt that will happen.
    CPU: Intel Core i7 3930K @ 4.5GHz
    Mobo: Asus Rampage IV Extreme
    RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
    GPU: EVGA GTX Titan (1087Boost/6700Mem)
    Physx: Evga GTX 560 2GB
    Sound: Creative XFI Titanium
    Case: Modded 700D
    PSU: Corsair 1200AX (Fully Sleeved)
    Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
    Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's

  18. #218
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by Sam_oslo View Post
    You nailed the problem: yep, the 6970 is AMD's high-end card, but tell me, why is this card fighting in the mainstream now?

    The way I look at it, AMD has chosen to "degrade" it to 1GB to make it cheaper, to fight in the mainstream.
    Using one core to fill several market segments makes sense economically.
    And 2GB cards aren't going anywhere... They didn't "degrade" the 6970; they are just releasing another card, and the two are not supposed to compete with each other... They are just giving people more options, so they can spend only as much as they need for their resolution while still keeping a very fast core.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  19. #219
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Pantsu View Post
    Sapphire 1GB 6950 is already listed, as is HIS. Of course it's up to the AIBs to make the 1GB cards and probably not all of them are rushing it and some probably will never offer such SKU.

    IIRC some AIBs were going to release higher OC 6870's so the 1GB 6950 and 6870 OC models should be enough of a response for GTX 560. This most certainly won't be killing 2GB models, but competition with a 1GB 6950 could drag the 2GB model prices down along with it.
    As we have seen in the past, listing cards does not mean sufficient (or any) stock.

    The GTX 460 SE was picked up by a few retailers but never reached widespread availability. Again, I am not saying this will happen with the lower-end SKUs of the HD 6950, since I am hoping to see at least SOME competition.
    Last edited by SKYMTL; 01-18-2011 at 10:42 AM.

  20. #220
    Xtreme Enthusiast
    Join Date
    Jan 2008
    Posts
    743
    Quote Originally Posted by Sam_oslo View Post
    Yep, this is very logical. So the "cheap" 1GB eats most of the market for the 2GB too, on top of what nVidia eats.

    You got the logic? I can't get into more detail without a long post.
    The 1GB 6950 is only going to be $20 cheaper.
    As an enthusiast, what would you choose: a 2GB 6950 that flashes to a 6970, or a 1GB 6950 for $20 less that is most likely unflashable?

    As a complete non-enthusiast unaware of the above: 1GB vs 2GB, omg, only $20 more. A tempting toss-up.

    I hope there is a price war between the GTX 560 and the 1GB 6950.

  21. #221
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by DarthShader View Post
    You should say that to EVGA. They listened to you on the guarantees; maybe they will agree with you here too and drop their dual card.
    Remember that the dual-chip card you are talking about was simply a proof of concept for the time being. It was never planned for release.


    Joking aside, I am not sure I can agree with you. Part of what makes us currently perceive dual cards as useless is the 32nm cancellation and the 28nm slips, which have pushed single cards close to the 300W wall. At that point SLI/CrossFire makes for both better performance and easier cooling. If nVidia were able to, I can guarantee they'd have a dual card already, 100%.
    Not necessarily. If NVIDIA were willing to push past the 300W envelope, they would do so by simply handing off a basic reference design to their board partners and letting them handle the fallout. Personally, I think they see it the way I do: why bother with a 300W dual-GPU card when thermal and efficiency limitations mean it will offer relatively minimal performance increases over the flagship single-chip products while retailing for significantly more?

    The questions above lead me directly to another point: the issues AMD is having with Antilles. They are likely grappling with the fact that Cayman runs hot and can consume significantly more power than the previous HD 5800 series. This means scaling back the architecture in order to meet expectations regarding power consumption, etc., but performance will also suffer. Considering the GTX 580's performance against a single fully-endowed HD 6970, AMD has to strike a very delicate balance with Antilles in order to make it a worthwhile purchase.

    A dual card can still have its uses. In the ultra high-end, it's easier to do quadfire with two dual cards. In the lower regions, depending on pricing, it might be cheaper to get a cheap P67 board that doesn't have two x8-capable slots, a 2500K and a dual card than an expensive board and two regular cards. There's also the mATX and ITX form factor, where CrossFire/SLI is troublesome or impossible; see the Sugo SG07 for example. Finally, there are OpenCL/CUDA applications, where packing twice as many chips in the same space is going to give more performance, etc.
    Unfortunately, I don't agree with any of these points. I have yet to see a game that scales well with more than two GPUs. Tri-SLI and quad CrossFire have never performed up to expectations.

    You are also talking about price here. Every P67 board I have seen, and will likely see, has at least two x16 PCI-E slots that operate at x8/x8 when dual GPUs are detected.

    Going further down the list, we get into H67 territory, which opens up a whole new can of worms. Most H67 boards lack the typical dual-GPU capability of their P67 siblings and come without the overclocking features people are looking for, though some will still feature dual-GPU support. In addition, why would someone cheap out on a motherboard and then spend megabucks on a $600+ dual-GPU card, a suitably high-end monitor for it to drive, AND a bleeding-edge CPU?

    You and I both know that dual-GPU cards are loud, hot and power-hungry, so why would anyone want one in an HTPC or SFF case? As it stands, there isn't a single ITX-sized power supply with enough capacity to power a GTX 580, let alone a dual-GPU monster. mATX brings us into another area altogether, since there will be plenty of P67 mATX boards introduced in the coming months.

    So I will repeat what I said: dual-GPU cards may look great and allow a company to plant the flag in a dramatic way, but past that, I fail to see much use for them in most cases.

  22. #222
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by SKYMTL View Post
    Unfortunately, I don't agree with any of these points. I have yet to see a game that scales well with more than two GPUs. Tri-SLI and quad CrossFire have never performed up to expectations.
    Multi-GPU has been around long enough for people to be aware of the scaling issues, so expectations should have come into line with the facts a long time ago, & what people expect from their multi-GPU setups varies depending on their own needs.

    My need is to stay above 60fps, & if that 3rd or 4th GPU only added 10fps that kept me at 60fps instead of the 50fps I'd get if I dropped a GPU, then tri- or quadfire is doing its job based on my needs & wishes.

  23. #223
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    399
    Quote Originally Posted by [XC] hipno650 View Post
    I have not seen any P67 boards without dual x8 slots
    http://www.msi.com/product/mb/P67A-C45.html
    Or check out the Gigabyte UD3 series... the second slot is x4.

    and if you're buying a 600-700 dollar video card you're not buying a $125 mobo... and if you are, well, then you have a thing or two to learn...
    There's no problem putting a high-end video card in a cheap motherboard, just as there is no problem putting a $125 video card in a 600-700 dollar mobo.

    And when somebody has saved up a given budget to build a gaming rig and won't be able to invest in it further, he is going to try to maximize performance for the given cost. Since most games are GPU-limited, he's going to buy the best GPU he can get and a decent CPU. The mobo won't be high on that list, since it hardly influences gaming performance. If the cost structure is right, i.e. the dual card is cheaper or only a bit more expensive than the equivalent two cards, he's going to choose the dual card.

    Yes, it's a specific hypothetical scenario, but a valid one.

    Quote Originally Posted by SKYMTL View Post
    The questions above lead me directly to another point: the issues AMD is having with Antilles. They are likely grappling with the fact that Cayman runs hot and can consume significantly more power than the previous HD 5800 series. This means scaling back the architecture in order to meet expectations regarding power consumption, etc., but performance will also suffer. Considering the GTX 580's performance against a single fully-endowed HD 6970, AMD has to strike a very delicate balance with Antilles in order to make it a worthwhile purchase.
    They don't have to; they have PowerTune to handle this. They just need to pick low-leakage dies, so that PowerTune has to intervene as little as possible. nVidia doesn't have this; what's more, their flagship chip eats way too much power on its own, even after the "fix". In their case they would indeed have to give up too much performance to save power, making the product not worthwhile as a result. But that's the fault of the architecture, not a conscious strategy decision, IMO. The GTX460 already showed, and the GTX560 will further prove, how inefficient the GF100(b) chip is.

    I have yet to see a game that scales well with more than two GPUs. Tri-SLI and quad CrossFire have never performed up to expectations.
    True. 3DMark players will care, though. And scaling improves:
    http://lab501.ro/placi-video/his-hd-...tii-multi-card

    Three cards start to make sense - and you most likely can pair a 6990 with a 6970, needing only two slots.

    Every P67 board I have seen, and will likely see, has at least two x16 PCI-E slots that operate at x8/x8 when dual GPUs are detected.
    Then look above.

    You and I both know that dual-GPU cards are loud, hot and power-hungry, so why would anyone want one in an HTPC or SFF case? As it stands, there isn't a single ITX-sized power supply with enough capacity to power a GTX 580, let alone a dual-GPU monster.
    Hmmm... LAN party rigs?

    http://www.silverstonetek.com/produc...pno=SG07&area=

    It has a PSU inside... don't tell me you wouldn't like to have such a gaming rig.

    Bottom line: it sure isn't a huge market, but if AMD decided to do this, then they must see it being profitable. For me personally, I can hardly imagine paying more than about $280 for a graphics card, so the dual-card market is not for me.

  24. #224
    Xtreme Addict
    Join Date
    Nov 2005
    Location
    PHX
    Posts
    1,494
    Quote Originally Posted by [XC] hipno650 View Post
    ...SE... Personally I think these will be great cards for the 1680x1050 crowd, along with the GTX 560 Ti, but 1GB is just not enough for any higher resolution...
    I've not run an SE, but a 1GB 460 @ 850MHz is "just" enough to get me by in BC2 on a 4GHz 760 w/ minimum framerates of 60 and an avg of 90. That's @ 1680, 4x MSAA, high details, HBAO off, Nvidia optimizations off, in a jam-packed server on the map Cold War.

    A 470 clocked the same provides quite a bit more performance and allows me to use HBAO. So IMO, a vanilla 560 will be the bare minimum to enjoy that game @ that res for the discerning user.

    Quote Originally Posted by SKYMTL View Post
    The GTX 460 SE was picked up by a few retailers but never reached widespread availability.
    Probably because I live in Phoenix and have a pair of Fry's around, but it has definitely reached widespread availability for me. I routinely get SD alerts for e-tailers w/ SE models in stock, including Newegg, Amazon, and TigerDirect.

    However, they're ALWAYS more expensive than the 768MB models and rarely less than the 1GB ones as far as "deals" go. The EVGAs also come in a very sad little box.

    I want to hide them every time I see them in the store... almost like I'm ashamed to be seen next to them.

    EDIT: In regard to that little SilverStone case... and especially the FT03... DO WANT!! I'm no he-man, and carrying this 800 lb Stacker to LANs isn't the most exciting thing in the world to do.
    Last edited by Jodiuh; 01-18-2011 at 12:51 PM.

  25. #225
    Xtreme Mentor
    Join Date
    Jan 2009
    Location
    Oslo - Norway
    Posts
    2,879
    Good one. It was a lazy Photoshop job, and I said it was fake yesterday:

    Quote Originally Posted by Sam_oslo View Post
    The 1GHz sounds good, but the picture seems to be fake.
    It looks like a photoshopped GTX460 to me; they forgot that OpenGL 3.2 should be 4.1.



    The 1GHz may still be true tho, hopefully.

    ASUS P8P67 Deluxe (BIOS 1305)
    2600K @4.5GHz 1.27v , 1 hour Prime
    Silver Arrow , push/pull
    2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
    GTX560 GB OC @910/2400 0.987v
    Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
    CM Storm Scout + Corsair HX 1000W
    +
    EVGA SR-2 , A50
    2 x Xeon X5650 @3.86GHz(203x19) 1.20v
    Megahalem + Silver Arrow , push/pull
    3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
    XFX GTX 295 @650/1200/1402
    Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
    SilverStone Fortress FT01 + Corsair AX 1200W
