Page 10 of 42
Results 226 to 250 of 1028

Thread: NVIDIA GTX 595 (picture+Details)

  1. #226
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    Quote Originally Posted by MaddMuppet View Post
To be honest the 580 doesn't run that hot. Mine are quite a bit cooler than the 480s I was using. Their new heatsink seems to be doing quite well.

But my question was about two 580s on the same board... how much cooling and downclocking can we expect out of those things?
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  2. #227
    Xtreme Member
    Join Date
    Jan 2009
    Location
    Central PA/Southern NH
    Posts
    177
Why does anyone care about the power, heat, or stock clock speed? Only enthusiasts would purchase this card; dollar-for-performance would likely make the GTX 580 the nVidia card for mainstream high-performance buyers. The additional cost of a 1 kW+ power supply to feed the card would also limit the number of people who would seriously consider such a card.

    Who is going to purchase a dual GF110 and leave it stock?
    [Intel core i7 4820K..........Asus Rampage IV Black Edition] LAN Parties attended:
    [512GB Samsung 840 PRO 512GB......2xWD Black 7200 2TB RAID0] FITES [fites.net] 2012, 2011, 2010, 2009, 2008, 2007
    [32GB G.Skill DDR3 2400 4x8GB....2xEVGA GTX780ti Classified] L'Pane NorEaster [lpane.net] 2010, 2009, 2008
    [Corsair 900D.......Primochill CTR 250......Corsair AX1200i] Quakecon [quakecon.org] 2010, 2009
    [MCP35x2.........Swiftech Apogee HD......Swiftech MCR420-XP] PAX East [east.paxsite.com] 2012

  3. #228
    Xtreme Member
    Join Date
    Oct 2008
    Location
    Colorado
    Posts
    312
Exactly how much PCB space is left? Just wanted to point out that that is one full PCB. But yes, it's probably a dual 570 unless you clock the chips down severely.
    My rig the Kill-Jacker

    CPU: AMD Phenom II 1055T 3.82GHz
    Mobo: ASUS Crosshair IV Extreme
    Game GPU: EVGA GTX580
    Secondary GPU 2: EVGA GTX470
    Memory: Mushkin DDR3 1600 Ridgeback 8GB
    PSU: Silverstone SST-ST1000-P
    HDD: WD 250GB Blue 7200RPM
    HDD2: WD 1TB Blue 7200RPM
    CPU Cooler: TRUE120 Rev. B Pull
    Case: Antec 1200


    FAH Tracker V2 Project Site

  4. #229
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
Probably everyone, if it ends up with VRMs like the 570's.

    All along the watchtower the watchmen watch the eternal return.

  5. #230
    Xtreme Member
    Join Date
    Jan 2011
    Location
    New Zealand
    Posts
    441
    Quote Originally Posted by Sn0wm@n View Post
But my question was about two 580s on the same board... how much cooling and downclocking can we expect out of those things?
That depends on what chips and cooling solution Nvidia has planned. That is, if this card is real and not a fake.

    Quote Originally Posted by ZX2Slow View Post
Why does anyone care about the power, heat, or stock clock speed? Only enthusiasts would purchase this card; dollar-for-performance would likely make the GTX 580 the nVidia card for mainstream high-performance buyers. The additional cost of a 1 kW+ power supply to feed the card would also limit the number of people who would seriously consider such a card.

    Who is going to purchase a dual GF110 and leave it stock?
Exactly. If you are the kind of person who looks at power usage when buying a video card, then neither the GTX 590 nor the ATI 6990 is for you. Both, I would think, will be at or over the 300 W limit.
    Last edited by MaddMuppet; 01-31-2011 at 08:39 PM.
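As an aside, the 300 W (and 375 W) ceilings being thrown around fall straight out of the PCI Express power budgets: 75 W from the x16 slot, 75 W per 6-pin plug, 150 W per 8-pin plug. A quick sketch (the connector loadouts below are illustrative examples, not the GTX 590's confirmed layout):

```python
# PCIe power budgets: 75 W from the x16 slot, plus each auxiliary connector.
SLOT_W = 75
CONNECTOR_W = {"6pin": 75, "8pin": 150}

def board_power_budget(connectors):
    """Total board power allowed by the PCIe spec for a given plug loadout."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_budget(["6pin", "8pin"]))   # 300 -> the in-spec ceiling
print(board_power_budget(["8pin", "8pin"]))   # 375 -> what dual 8-pin allows
```

So a card that sticks to one 6-pin plus one 8-pin is capped at exactly the 300 W figure, and dual 8-pin gets you the 375 W number mentioned later in the thread.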

  6. #231
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by MaddMuppet View Post
That depends on what chips and cooling solution Nvidia has planned. That is, if this card is real and not a fake.



Exactly. If you are the kind of person who looks at power usage when buying a video card, then neither the GTX 590 nor the ATI 6990 is for you. Both, I would think, will be at or over the 300 W limit.
If I were buying these cards, then if anything, the more power they consume the better, within reason. Basically, the more power these things consume, the closer they will be to the original spec. A 300 watt limit is going to castrate AMD just as much as Nvidia, since the GTX 570 and the 6970 have similar performance and power usage.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  7. #232
    Xtreme Member
    Join Date
    Jan 2011
    Location
    New Zealand
    Posts
    441
    Quote Originally Posted by tajoh111 View Post
If I were buying these cards, then if anything, the more power they consume the better, within reason. Basically, the more power these things consume, the closer they will be to the original spec. A 300 watt limit is going to castrate AMD just as much as Nvidia, since the GTX 570 and the 6970 have similar performance and power usage.
I agree. I would like to see both using fully clocked high-end chips. That would really heat things up in more ways than one.
If they did, then these cards should be superb performers. Here's hoping.

  8. #233
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by ZX2Slow View Post
Why does anyone care about the power, heat, or stock clock speed? Only enthusiasts would purchase this card; dollar-for-performance would likely make the GTX 580 the nVidia card for mainstream high-performance buyers. The additional cost of a 1 kW+ power supply to feed the card would also limit the number of people who would seriously consider such a card.

    Who is going to purchase a dual GF110 and leave it stock?

The question is not who will buy them, but who will review and produce them. If you end up at 375 W with chips that average 100 °C while gaming, you are in trouble... Not to mention the additional cost of a beastly cooler, all the power phases, the PCB layers needed to support those loads, etc. All of this has a cost, a cost that has to go into the final price... I have seen a lot of 295 or 5970 owners who don't know a **** about hardware buying those cards; they wanted the fastest and the highest price... they have the money, so they buy it (plenty of people on "classic" hardware forums are like that).

If they use GF114, they can maintain fast clocks and a high OC margin... if they use GF110, they will need to reduce the core and memory speeds (less than 700 MHz for the core...). This is what Zalbard was asking about performance, and it's where the real fight between the 6990 and this card will be. If the card performs around or under 570 SLI level, 2x 6970 at 6950 clock speeds (which get close to 2x 580 SLI without problems) will surely be enough... All in all, this will be a nice fight between the two cards... I wouldn't be surprised to see EVGA quickly release a Superclocked 590, and why not Sapphire / Asus releasing a faster 6990 in response (it could be fun to watch, as this is not really a mainstream market...).

AIBs like EVGA and Asus can pass the limit, no problem (Ares etc.); they provide better cooling (including watercooling, as on the GTX 295). But Nvidia and AMD can't... they need to comply with certain rules (and one of them is the PCI Express 300 W limit).
    Last edited by Lanek; 02-01-2011 at 02:28 AM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  9. #234
    Xtreme Member
    Join Date
    Sep 2008
    Location
    Argentina
    Posts
    223
    Quote Originally Posted by Lanek View Post
The question is not who will buy them, but who will review and produce them. If you end up at 375 W with chips that average 100 °C while gaming, you are in trouble... Not to mention the additional cost of a beastly cooler, all the power phases, the PCB layers needed to support those loads, etc. All of this has a cost, a cost that has to go into the final price... I have seen a lot of 295 or 5970 owners who don't know a **** about hardware buying those cards; they wanted the fastest and the highest price... they have the money, so they buy it (plenty of people on "classic" hardware forums are like that).

If they use GF114, they can maintain fast clocks and a high OC margin... if they use GF110, they will need to reduce the core and memory speeds (less than 700 MHz for the core...). This is what Zalbard was asking about performance, and it's where the real fight between the 6990 and this card will be. If the card performs around or under 570 SLI level, 2x 6970 at 6950 clock speeds (which get close to 2x 580 SLI without problems) will surely be enough... All in all, this will be a nice fight between the two cards... I wouldn't be surprised to see EVGA quickly release a Superclocked 590, and why not Sapphire / Asus releasing a faster 6990 in response (it could be fun to watch, as this is not really a mainstream market...).

AIBs like EVGA and Asus can pass the limit, no problem (Ares etc.); they provide better cooling (including watercooling, as on the GTX 295). But Nvidia and AMD can't... they need to comply with certain rules (and one of them is the PCI Express 300 W limit).
That's where good people like EK come in, creating some superb blocks for them.

  10. #235
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by Pekalion View Post
That's where good people like EK come in, creating some superb blocks for them.
Lol, I can only agree with this; see my sig.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  11. #236
    Xtreme Member
    Join Date
    Sep 2008
    Location
    Argentina
    Posts
    223
Hey guys, in my community they are saying that the GTX 590 will be really limited, because Nvidia is running a cherry-picking process to use only the best GF110 chips, and because of this the cost will blow up any pocket.

Can someone confirm this, please, or is it just a stupid rumour?

  12. #237
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by Pekalion View Post
Can someone confirm this, please, or is it just a stupid rumour?
    Nobody knows, that's why it's just a rumour. Whether it's true or not, we'll soon find out, I guess.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  13. #238
    Xtreme Member
    Join Date
    Sep 2008
    Location
    Argentina
    Posts
    223
    Quote Originally Posted by zalbard View Post
    Nobody knows, that's why it's just a rumour. Whether it's true or not, we'll soon find out, I guess.
Seems like it will end up like that. I was trying to find a decent source saying that very same thing, but nope, only an underground third-hand blog from my country, and someone posted it in my community. I believe I should warn them to take it with a grain of salt.

Besides that, have you heard the same rumour?

  14. #239
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by Pekalion View Post
Besides that, have you heard the same rumour?
    I believe it originates from here, but I wouldn't exactly trust the source.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  15. #240
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Can't believe how many people believe NVIDIA will launch a dual gf110 card... wake up guys! Dual 560 is plenty so what's the point? NVIDIA doesn't do monster cards, Asus and sapphire etc do, and asus is kinda hesitating cause they don't make that much money on those cards... the volume is tiny...

  16. #241
    -100c Club
    Join Date
    Jun 2005
    Location
    Slovenia, Europe
    Posts
    2,283
It's hard to keep quiet and watch you people argue about it when you know which GPUs actually power these cards.

  17. #242
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by saaya View Post
    Can't believe how many people believe NVIDIA will launch a dual gf110 card... wake up guys! Dual 560 is plenty so what's the point? NVIDIA doesn't do monster cards, Asus and sapphire etc do, and asus is kinda hesitating cause they don't make that much money on those cards... the volume is tiny...
You should know Nvidia's mentality: winning comes first; power consumption, heat & price come second.

If a 560 were plenty for them, then they would have launched a dual 460-based card, since they had ample time to bring such a card to market.

As you mention, volume is tiny at the ultra high end, so why pull your punches with a dual 560? The people buying these cards want monster cards and are willing to pay the monster price. You don't win this tiny segment by offering products that are merely enough; you win it by offering products that are so over the top people say "OMG, I can't believe they did it", and charging for it.

This segment is about bragging rights, and about marketing for the manufacturer: delivering the single fastest graphics adapter should help sell the brand in all segments.
    Last edited by highoctane; 02-03-2011 at 12:30 PM.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  18. #243
    Xtreme Member
    Join Date
    Jan 2011
    Location
    New Zealand
    Posts
    441
Last week we reported on graphics circuit maker NVIDIA working on the launch of its new flagship, GeForce GTX 590. With 1024 CUDA cores and dual GF110 graphics circuits, it is an extreme graphics card that demands a lot from the other components, where NVIDIA only uses the finest of its GPU samples and availability will be limited.

NVIDIA had a successful launch of the GeForce GTX 580, where it knocked down AMD with the new GF110 GPU, which, with high performance and relatively reasonable power consumption, took over as the king of the market. AMD couldn't counter with a single Cayman GPU, but it still has the fastest card around with its dual-GPU Radeon HD 5970.

AMD's plan was to maintain this trump card, mostly for PR, with the launch of the Radeon HD 6990 "Antilles" sporting dual Cayman circuits, but NVIDIA's plan is to crash the party through an unexpected launch of a Fermi-based graphics card with dual GPUs.

As we revealed, we will be dealing with fully featured GF110 circuits with 512 active CUDA cores each. To make sure the graphics card won't go above any power consumption specifications, NVIDIA has had to turn down both voltages and clock frequencies, but it has also started sorting the circuits at TSMC, where the finest GF110 samples are removed from the belt and put in line for the flagship GeForce GTX 590.

NVIDIA wants the GF110 circuits with the least leakage, to maximize the clock frequencies without power consumption skyrocketing. It comes as no surprise that the GeForce GTX 590 will be in limited supply. NVIDIA has no hopes of making any real money from this monster card, but it sees an opening to whip AMD and launch what it can call the fastest card in the world, bar none.

    NVIDIA's reasoning is most likely that the positive PR is going to boost sales overall.

    http://www.nordichardware.com/news/7...e-gtx-590.html

  19. #244
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by highoctane View Post
You should know Nvidia's mentality: winning comes first; power consumption, heat & price come second.

If a 560 were plenty for them, then they would have launched a dual 460-based card, since they had ample time to bring such a card to market.

As you mention, volume is tiny at the ultra high end, so why pull your punches with a dual 560? The people buying these cards want monster cards and are willing to pay the monster price. You don't win this tiny segment by offering products that are merely enough; you win it by offering products that are so over the top people say "OMG, I can't believe they did it", and charging for it.

This segment is about bragging rights, and about marketing for the manufacturer: delivering the single fastest graphics adapter should help sell the brand in all segments.
No dual 460 card because it didn't make sense compared to a 480 and a 5970.
In the past NVIDIA never used their top-of-the-line GPUs on dual-GPU cards... not a single time...
If they use GF110 I'd be surprised, but it might actually work. A dual 580, though... that would be a first for NVIDIA and I just don't see it happening, even with superbinned parts.

  20. #245
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    161
    Quote Originally Posted by zalbard View Post
    I believe it originates from here, but I wouldn't exactly trust the source.
You are joking, right?
Nordichardware is a very respected site, and it all makes perfect sense; it is even possible to use the 580 core and not break 400 W. I would agree if it were SA, Fud, The Inq or another known crap site whose author happens to be the familiar Charlie we love to hate, the one who has made it his mission to take a crap on everything Nvidia does.
    EVGA Classified E762- i7-980X cooled by EK Supreeme HF- 12GB Corsair Dominator GT 2000- 3x 100GB OCZ vertex2 SSD@raid0- 3x Gainward GTX580 Phantom 3GB (soon)ooled by EK)- Silverstone strider ST1500 1500W- Win7 Ultimate X64- LG W3000H- X-Fi Titanium Pro -Logitech Z5500
    Custom WC-cooling with Thermochill PA120.3, PA140.3, 2x Feser 480 Quadrad, Scythe GT1850rpm, Noiseblocker PK3 and Swiftech MCP355 with XSPC-tops

  21. #246
    Xtreme Cruncher
    Join Date
    Apr 2006
    Posts
    3,012
    Quote Originally Posted by saaya View Post
    Can't believe how many people believe NVIDIA will launch a dual gf110 card... wake up guys! Dual 560 is plenty so what's the point? NVIDIA doesn't do monster cards, Asus and sapphire etc do, and asus is kinda hesitating cause they don't make that much money on those cards... the volume is tiny...
    uh wut?

    8800gtx, 9800GX2, GTX 280, GTX 295, GTX 480.... Nvidia IS monster cards... sure board partners do bigger ones like the Asus MARS or ARES but still the GTX 295 when it came out made the 4870X2 look like a small little thing in every way, performance, power consumption, heat output, noise...
    CPU: Intel Core i7 3930K @ 4.5GHz
    Mobo: Asus Rampage IV Extreme
    RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
    GPU: EVGA GTX Titan (1087Boost/6700Mem)
    Physx: Evga GTX 560 2GB
    Sound: Creative XFI Titanium
    Case: Modded 700D
    PSU: Corsair 1200AX (Fully Sleeved)
    Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
    Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's

  22. #247
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by [XC] hipno650 View Post
    uh wut?

    8800gtx, 9800GX2, GTX 280, GTX 295, GTX 480.... Nvidia IS monster cards... sure board partners do bigger ones like the Asus MARS or ARES but still the GTX 295 when it came out made the 4870X2 look like a small little thing in every way, performance, power consumption, heat output, noise...
You forgot RMA in your list (it's a joke, don't take it badly, and it's surely far from true). But that's what I see coming with two GF110s unless they are clocked at 400 MHz each.

Seriously, can you imagine the size of this monster with two 550 mm² chips, the heat, the power consumption, the massive cooling system... and the price to produce and sell it? I can't imagine Nvidia wants to sell it at production cost and make no money on it, or their PR is going crazy. Let's imagine it beats the 6990... at what cost? Even if it beats the 6990, it's a win-win for AMD, because the 6990 costs nearly the same to produce as a dual 560, far less than what a dual GF110 can cost. With both delivering high performance, AMD adjusts its price and wins on every front.
    Last edited by Lanek; 02-04-2011 at 09:51 AM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  23. #248
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
Triple-slot cooling fan from Nvidia... and quad-slot for whatever special omgwtfbbq special-edition OC'd dual 580, aka 590?


I can see it now... it costs over 900, takes up so much space, gives out so much heat, and needs a nuclear reactor to power up.
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  24. #249
    Xtreme Member
    Join Date
    Jun 2008
    Location
    Vilnius, Lithuania
    Posts
    130
    Quote Originally Posted by Sn0wm@n View Post
Triple-slot cooling fan from Nvidia... and quad-slot for whatever special omgwtfbbq special-edition OC'd dual 580, aka 590?


I can see it now... it costs over 9000, takes up so much space, gives out so much heat, and needs a nuclear reactor to power up.
    Fixed

  25. #250
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by E.R View Post
You are joking, right?
Nordichardware is a very respected site, and it all makes perfect sense; it is even possible to use the 580 core and not break 400 W. I would agree if it were SA, Fud, The Inq or another known crap site whose author happens to be the familiar Charlie we love to hate, the one who has made it his mission to take a crap on everything Nvidia does.
i really like nordichardware, but... they have, a few times, been way off in the past when it came to vga stuff...
wouldn't even blame them; probably just bad sources, or sources seeding wrong info on purpose...

    Quote Originally Posted by [XC] hipno650 View Post
    uh wut?

    8800gtx, 9800GX2, GTX 280, GTX 295, GTX 480.... Nvidia IS monster cards... sure board partners do bigger ones like the Asus MARS or ARES but still the GTX 295 when it came out made the 4870X2 look like a small little thing in every way, performance, power consumption, heat output, noise...
    well, let me do some googling...

all these percentage numbers are theoretical specs, not actual performance!
but it should be around that range in actual perf as well, on average...

    7950gx2 was based on the 7950, not the 7900GTX 512, the fastest card at that time. compared to the latter it came downclocked by 150mhz on the core and 400mhz on the memory or 23% and 25%
    a 7950gx2 consumed only 25W more than a 7900GTX
that's only slightly above 100W, but nvidia didn't go higher than that since heatsinks back then weren't that great (no heatpipes) and seeded psus weren't that beefy...

    9800gx2 was based on the 9800gt, not the 8800GTX/8800Ultra.
    compared to those it came with 300mhz/666mhz lower shader clocks and 200mhz faster/160mhz slower memory clocks or 20%/44% and +11%/8% AND the 9800gx2 only had 16rops per gpu, while a 8800GTX/Ultra had 24, so thats 50% less rops per gpu for the gx2.
    the 9800gx2 wasnt even based on the fastest G92 chip, the 9800gtx and 9800gtx+ were notably faster than the 9800gt bin nvidia used.
    the 9800gx2 was more of a modern dual gpu card hitting TDPs of over 200W, but it was still not limited by this... it ran very hot but thats cause nvidia didnt spend as much on the heatsink as it could have.

    gtx295 was based on the same chip as the 280/285 but came with a bin/gpu config of a 275 and clocked at the same speed as a 260. it came with only 4 rops less than the highest gpu config, the 280/285, only a 15% cut down, but it came clocked 25mhz/75mhz lower on the core and shader and 200mhz/500mhz lower on the memory or 5%/11% and 10%/25% respectively.
    this made it the fastest dual gpu card so far, compared to the fastest single gpu card, as it wasnt as cut down as previous dual gpu cards. its bin/config was still around 20% slower than that of the highest end gpu bin/config at its time though.

    lets wrap it up:
    7950gx2 - based on mainstream/highend bin which was around 40% slower than nvidias highest end sku at the time
    9800gx2 - based on mainstream/highend gpu+bin which was around 30% slower than nvidias highest end sku at the time
    gtx295 - based on mainstream/highend gpu+bin which was around 20% slower than nvidias highest end sku at the time
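For what it's worth, the clock-cut percentages quoted above (e.g. the 7950GX2's 23% core / 25% memory cuts versus the 7900GTX) are easy to re-derive. The deltas (150 MHz core, 400 MHz memory) are the ones stated in the post; the absolute reference clocks are my reconstruction from those deltas, so treat them as an assumption:

```python
# Re-deriving the clock-cut percentages quoted for the 7950GX2.
def cut_pct(dual_mhz, single_mhz):
    """Percent reduction of the dual-GPU card's clock vs the top single-GPU card."""
    return round(100 * (single_mhz - dual_mhz) / single_mhz)

# 7900GTX reference clocks assumed: 650 MHz core, 1600 MHz memory.
print(cut_pct(650 - 150, 650))     # core cut: 23 (%)
print(cut_pct(1600 - 400, 1600))   # memory cut: 25 (%)
```

The same one-liner reproduces the 9800GX2 and GTX 295 figures from their respective deltas.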

    so THERE!
    NO! :P
    nvidia NEVER did a monster card using their best and fastest gpus on a dualgpu card. not a single time!
    and now nvidia has the hottest gpu on their hands in their company history that is close to the tdp spec for an add in card all on its own, and you want to tell me they will do a dual gpu card based on that, with all bells and whistles.... come on man

    dual gf110 is definitely possible! but id be surprised if they did that... dual gf114 costs them less, works on simpler pcbs and can be tweaked to around the same performance within a 300W envelope as a dual gf110 card can (im guessing, but look at the perf/watt numbers) so whats the point in using a dual gf110...?
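That perf/watt point can be made concrete with a back-of-the-envelope model. The TDPs are the official single-card figures (GTX 560 Ti ~170 W, GTX 580 ~244 W); the 30 W shared-board overhead and the linear power-vs-clock scaling are simplifying assumptions of mine, so take the outputs as illustration only:

```python
# Rough sketch: how far each GPU must be scaled back so that two of them fit
# inside the 300 W PCIe envelope, assuming power scales roughly linearly with
# clock (a simplification; real scaling is worse than linear).
CAP_W = 300                 # in-spec board power limit
BOARD_OVERHEAD_W = 30       # assumed shared PCB/memory/fan overhead

def clock_fraction(single_gpu_tdp_w):
    """Fraction of stock clock each of the two GPUs can keep under the cap."""
    per_gpu_budget_w = (CAP_W - BOARD_OVERHEAD_W) / 2
    return min(1.0, per_gpu_budget_w / single_gpu_tdp_w)

print(round(clock_fraction(170), 2))  # GF114 (GTX 560 Ti, 170 W): 0.79
print(round(clock_fraction(244), 2))  # GF110 (GTX 580, 244 W):    0.55
```

Even with these crude numbers the shape of the argument shows: GF110 has to be cut back far more aggressively than GF114 to fit two on one board, which is exactly why the two options could end up at similar stock performance.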

    the only point would be for enthusiasts...

    what id LOVE nvidia to do, is do something like ati...
    make it a dual gf110 card, use crippled 110 chips and clock them down, BUT allow us to unlock those gpus to full 512sps and overclock it...
    now that would be an EPIC card!

    but i highly doubt nvidia would do that...
    they want to have the fastest stock card... and for that gf114 x 2 is just as good as dual gf110... actually worse, cause gf114 is cheaper

    they could still go for dual gf110 because while at stock it would probably be as fast as dual gf114 when crippled to fit in 300W, when overclocked, a dual gf110 would probably be a tad faster than a dual gf114 card, even if the gf110 is crippled gpu wise and it cant be unlocked...
    Last edited by saaya; 02-04-2011 at 11:18 PM.

