
Thread: Nvidia 270, 290 and GX2 roll out in November

  1. #76
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by fragmasterMax View Post
    Unless you show me some data sheets, I am going to take that with a mound of salt.
    According to this, a 4870 with 1GB of GDDR5 uses only ~20 more watts than a 4850. The 512MB 4870 uses 36 more watts. I think most of the difference in power consumption is due to the 125MHz core clock gain, which bofox already mentioned. Show me a 4850 at 4870 clocks and then let's do a comparison.

    http://www.techreport.com/r.x/gtx260...power-idle.gif
    I can show you a 4870 512 clocked to 900MHz and 300MHz:



    That is power consumed from the PSU. From the wall, assuming 80% efficiency of my Odin 800W, that makes a 31.25W difference. I was a bit wrong before. Also, if you want to see how the core clock affects it, see below:



    As you see, core clock means close to nothing: 300MHz less means 3W less. GDDR5 speed matters A LOT. Also see the voltage regulator current sensor in all cases.
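    For anyone who wants to redo that conversion, here is a minimal sketch of the arithmetic (the ~25W PSU-side delta is simply what the 31.25W and 80% figures above imply; the function name is illustrative, not from any tool):
    Code:
    def wall_power(psu_output_watts, efficiency=0.80):
        """Power drawn at the wall for a given PSU-side load."""
        return psu_output_watts / efficiency

    # ~25W difference on the PSU side between 900MHz and 300MHz GDDR5
    delta_at_psu = 25.0
    print(wall_power(delta_at_psu))  # 31.25 -> the wall-side difference quoted above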
    Last edited by STaRGaZeR; 10-09-2008 at 08:23 AM.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  2. #77
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by v_rr View Post
    Taken directly from the review.
    It's unfair to compare a 17% overclocked GTX 260 Core 216 to a stock-clocked 4870 1GB.
    I should have read the review better,
    but why is it that its power draw at idle is 114W (when it's overclocked) compared to 166W for the (stock?) 4870?
    (That's about a 50% increase, btw)
    Last edited by fragmasterMax; 10-09-2008 at 08:24 AM.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  3. #78
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Toronto, Canada
    Posts
    1,491
    Quote Originally Posted by fragmasterMax View Post
    I should have read the review better,
    but why is it that its power draw at idle is 114W compared to 166W for the 4870?
    (That's about a 50% increase, btw)
    46%
    RIG 1 (in progress):
    Core i7 920 @ 3GHz 1.17v (WIP) / EVGA X58 Classified 3X SLI / Crucial D9JNL 3x2GB @ 1430 7-7-7-20 1T 1.65v
    Corsair HX1000 / EVGA GTX 295 SLI / X-FI Titanium FATAL1TY Pro / Samsung SyncMaster 245b 24" / MM H2GO
    2x X25-M 80GB (RAID0) + Caviar 500 GB / Windows 7 Ultimate x64 RC1 Build 7100

    RIG 2:
    E4500 @ 3.0 / Asus P5Q / 4x1 GB DDR2-667
    CoolerMaster Extreme Power / BFG 9800 GT OC / LG 22"
    Antec Ninehundred / Onboard Sound / TRUE / Vista 32

  4. #79
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by Clairvoyant129 View Post
    ... Where is the 50% more power at idle in that link? I love how Nv fanboys like to pull numbers out of nowhere.

    Someone wise, in response to a fanboy once said:

    Quote Originally Posted by fragmasterMax View Post
    dude
    wow
    This is a throwback to third grade, when everybody thought they had the best everything, and couldn't do simple math.
    GTX 260: 114W
    4870: 166W
    Hmmm, 114/2 = 57, and 114 + 57 = 171 watts (what the power consumption would have to be to be 50% more).
    The power consumption of a 4870 is 166W, a bit less than 50% more than the GTX 260.
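    For reference, the arithmetic both sides are doing, as a quick sketch (the 114W and 166W idle figures are the ones from the review quoted above):
    Code:
    gtx260_idle = 114.0  # W, idle, overclocked GTX 260 Core 216
    hd4870_idle = 166.0  # W, idle, HD 4870

    increase = (hd4870_idle - gtx260_idle) / gtx260_idle
    print(f"{increase:.0%}")   # 46% -> the actual increase
    print(gtx260_idle * 1.5)   # 171.0 -> what a true 50% increase would be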
    Last edited by fragmasterMax; 10-09-2008 at 12:14 PM.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  5. #80
    Xtreme Mentor
    Join Date
    May 2006
    Location
    Croatia
    Posts
    2,542
    Why is all this important again...
    It's not like there is anything new or anything that can't be played on current hardware anyway...
    Let them rename their stuff and release some silly spinoffs if they think the public needs it...
    Quote Originally Posted by LexDiamonds View Post
    Anti-Virus software is for n00bs.

  6. #81
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Macadamia View Post
    The source is right about the die size at least. 10/100!

    P.S.: There IS a reason why even the simpler G80 never had a 2x PCB.


    Plus, think of it. We're now 3.5 months after the launch of the current gen. The next gen should come in 6 months. Why get these (which will launch in winter) when the next batch of chips is immensely better? (At least the RV870. Near GT200GX2 performance for cheap. And yes, that's by common guesstimates.)
    " ding, ding, ding, ding, ding..." That's it's right there. Why would the majority worry about buying a card within a month if for example:
    -just upgraded
    and
    -new skus are coming out by mid winter

    The timing is off IMO.

  7. #82
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    Quote Originally Posted by Epsilon84 View Post
    Yeah, my mistake, but if it's system power consumption then that makes a 280GX2 even more plausible, doesn't it? If a GTX280 SLI setup only draws 55W more than a 4870X2, then surely a 55nm 280GX2 should get the power consumption down to 4870X2 levels at the very least, even if nVidia uses the 'double cheeseburger' approach for the GX2.
    Yeah, I think it's doable... very complex, but doable... maybe on a 55nm process and undervolted/underclocked. But it would still draw a lot of power, since the GTX280 uses something like ~230W.

    With that said, they could do an X2 with GTX260 cores instead. It would be a lot easier... but at the same time it would most likely not take the performance crown from the 4870X2 in most games. Maybe if it had high clocks and better memory it would, though...

    To fully beat the 4870X2 in the short term, they would probably need a dual-GPU card based on the GTX280. An improved, faster 280 or a new GPU would take too long. They need an answer soon.

    EDIT: some power consumption tests should be taken with a pinch of salt. If the bottleneck is somewhere other than the video cards, the power consumption doesn't reach its maximum. Especially with multiple cards, since a CPU bottleneck plays a bigger role. I don't think GTX280 SLI draws only 55W more than a 4870X2. From what I've seen, if the cards were fully loaded the difference would be larger. Not by much, maybe 100W or so, but you get my point.
    Last edited by Tonucci; 10-09-2008 at 08:59 AM.

  8. #83
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by STaRGaZeR View Post
    I can show you a 4870 512 clocked to 900MHz and 300MHz:



    That is power consumed from the PSU. From the wall, assuming 80% efficiency of my Odin 800W, that makes a 31.25W difference. I was a bit wrong before. Also, if you want to see how the core clock affects it, see below:



    As you see, core clock means close to nothing: 300MHz less means 3W less. GDDR5 speed matters A LOT. Also see the voltage regulator current sensor in all cases.
    That's actually a really cool piece of software. I think I know what my next PSU will be..

    You have a 4870X2 with 2GB of the stuff; who knows if Nvidia will implement that much, or if their frequency-changing software will work correctly.

    While the higher memory bandwidth drives the performance in discrete graphics systems higher by up to 40 percent, something the user can see on his computer, computer manufacturers benefit from high signal integrity and low power consumption. "GDDR5 lowers manufacturing costs since it is possible to use simple 4-layer boards for memory subsystems", explained Feurle. Typically, graphics memory PCBs currently require six layers.

    In addition, the chips consume 30 percent less power than their GDDR3 counterparts. This redundantizes separate voltage control circuits required for graphic memories, explains Feurle. With mass production for buried-wordline chips scheduled for coming autumn, Qimonda developers will be able to reduce the I/O supply voltage to 1.2V and then to 1V which again will reduce power consumption. "We believe that in the future, the innovation in memory chip development increasingly will focus on power aspects," Feurle said in an interview with EE Times Europe. In order to reduce the power consumption, chip designers will optimize I/O circuits, do away with external power regulator ICs and, as mentioned, switch to buried wordline technology.


    I see your point, but heat from GDDR5 shouldn't be that big of a problem when Nvidia finally adopts it.
    Last edited by fragmasterMax; 10-09-2008 at 08:50 AM.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  9. #84
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
    Quote Originally Posted by fragmasterMax View Post
    This place? Like XS forums? ROFL. XS forums are unbiased. It's a cop-out to say everyone is biased against you when you are wrong.
    Dude, you need to read what you're replying to before you click send. I did not say this place was biased, but rather the BS that was written over at that fanboi site "The Inquirer". Obviously this place is unbiased, which is why we don't need crap like that posted here.

    It's a refresh. Nothing more. They make it sound like it's some kind of conspiracy, in true ATi fanboi fashion. Go back and read through this thread. They are well known for it.
    Last edited by T_Flight; 10-09-2008 at 08:54 AM.

  10. #85
    Xtreme Member
    Join Date
    Jan 2005
    Location
    No(r)way
    Posts
    452
    Quote Originally Posted by STaRGaZeR View Post
    Do you know why? In the 4870, the GDDR5 doesn't reduce speed at idle; it remains at 900MHz. If you had ever had a card with GDDR5 you'd know that it consumes a buttload of power at idle if the frequencies are not lowered. Example: at idle, 300MHz vs 900MHz does almost nothing if you're talking about GDDR3, but with GDDR5 you're talking about 40+ watts. You know why the 4870 doesn't clock the RAM to 500MHz at idle? The screen flickers when the change is made, and unfortunately 2D/3D detection on the 4870 sucks ass, so they can't lower it. The 2D/3D detection is changed in the 4870X2, so they can clock the GDDR5 to 500MHz without flickering at undesired moments. That's why you see the 4870X2 consuming only a little bit more than a single 4870 at idle, even though it has twice the RAM chips and two RV770s. That's also why you see the tremendous difference between the 4870 and 4850 at idle.

    Be careful with what you say about GDDR5: it's fast, but it sucks power like there's no tomorrow. And that's precisely what you don't want in a GX2 card.
    That completely contradicts everything I have ever read about GDDR5, like this: "In terms of energy consumption, GDDR5 wins over GDDR3 by about 30%. By autumn, Qimonda engineers will reduce the working GDDR5 voltage from 1.2V to 1.0V, which will also favorably affect energy efficiency."
    This: "In addition, the chips consume 30 percent less power than their GDDR3 counterparts."
    Or this: "The new Samsung graphics memory operates at 1.5 volts, representing an approximate 20 percent improvement in power consumption over today's most popular graphics chip - the GDDR3."
    The 4870 may draw quite a bit of power, but I'm quite sure that is not due to the use of GDDR5.
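    Those 20-30% figures are at least plausible from voltage scaling alone. A very rough sketch, assuming dynamic power scales with the square of the voltage and ignoring frequency and design differences (the 1.8V GDDR3 value is a typical figure assumed here, not taken from the quotes):
    Code:
    # Rough approximation: relative dynamic power ~ V^2
    gddr3_voltage = 1.8  # V, typical GDDR3 I/O voltage (assumption)
    gddr5_voltage = 1.5  # V, Samsung GDDR5 figure quoted above

    savings = 1 - (gddr5_voltage / gddr3_voltage) ** 2
    print(f"{savings:.0%}")  # ~31% lower power from the voltage drop alone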
    Obsolescence be thy name

  11. #86
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by Tonucci View Post
    Yeah, I think it's doable... very complex, but doable... maybe on a 55nm process and undervolted/underclocked. But it would still draw a lot of power, since the GTX280 uses something like ~230W.

    With that said, they could do an X2 with GTX260 cores instead. It would be a lot easier... but at the same time it would most likely not take the performance crown from the 4870X2 in most games. Maybe if it had high clocks and better memory it would, though...

    To fully beat the 4870X2 in the short term, they would probably need a dual-GPU card based on the GTX280. An improved, faster 280 or a new GPU would take too long. They need an answer soon.
    An OC'd 9800GTX+ at 825/2000 (GPU/shader) with GDDR5 in a dual configuration would beat the 4870X2.
    It's all about the memory, which is the only thing separating the 4850 from the 4870 besides a 125MHz bump on the core.
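    To put a rough number on how much of that gap is memory, a quick sketch of peak bandwidth (the effective data rates, ~1986 MT/s GDDR3 on the 4850 and 3600 MT/s GDDR5 on the 4870, are reference figures assumed here, not taken from this thread; both cards use a 256-bit bus):
    Code:
    # Peak bandwidth = bus width in bytes * effective transfer rate
    hd4850_bw = 256 / 8 * 1986 / 1000   # GDDR3 -> ~63.6 GB/s
    hd4870_bw = 256 / 8 * 3600 / 1000   # GDDR5 -> ~115.2 GB/s
    print(hd4850_bw, hd4870_bw, hd4870_bw / hd4850_bw)  # roughly 1.8x the bandwidth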
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  12. #87
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by T_Flight View Post
    Dude, you need to read what you're replying to before you click send. I did not say this place was biased, but rather the BS that was written over at that fanboi site "The Inquirer". Obviously this place is unbiased, which is why we don't need crap like that posted here.

    It's a refresh. Nothing more. They make it sound like it's some kind of conspiracy, in true ATi fanboi fashion. Go back and read through this thread. They are well known for it.

    We were in agreement the whole time.
    My bad.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  13. #88
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Toronto, Canada
    Posts
    1,491
    Quote Originally Posted by fragmasterMax View Post
    An OC'd 9800GTX+ at 825/2000 (GPU/shader) with GDDR5 in a dual configuration would beat the 4870X2.
    It's all about the memory, which is the only thing separating the 4850 from the 4870 besides a 125MHz bump on the core.


    You've gone from trying to prove a point about power consumption to pure fantastical speculation now.
    RIG 1 (in progress):
    Core i7 920 @ 3GHz 1.17v (WIP) / EVGA X58 Classified 3X SLI / Crucial D9JNL 3x2GB @ 1430 7-7-7-20 1T 1.65v
    Corsair HX1000 / EVGA GTX 295 SLI / X-FI Titanium FATAL1TY Pro / Samsung SyncMaster 245b 24" / MM H2GO
    2x X25-M 80GB (RAID0) + Caviar 500 GB / Windows 7 Ultimate x64 RC1 Build 7100

    RIG 2:
    E4500 @ 3.0 / Asus P5Q / 4x1 GB DDR2-667
    CoolerMaster Extreme Power / BFG 9800 GT OC / LG 22"
    Antec Ninehundred / Onboard Sound / TRUE / Vista 32

  14. #89
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    http://www.qimonda.com/static/downlo...Flyer_0806.pdf

    One of the features is "frequency related power scaling" (on the right-hand side), as STaRGaZeR just demonstrated. They use a lot of power at higher frequencies, apparently, but that stuff is probably straight from the fabs, operating at >1.3V.
    It might use less power when downclocked as far as it will go. When those conditions are not met, it apparently uses significantly more power.
    Last edited by fragmasterMax; 10-09-2008 at 09:07 AM.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  15. #90
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by zlojack View Post


    You've gone from trying to prove a point about power consumption to pure fantastical speculation now.
    It would never happen, but it's fair to say that since an overclocked 55nm 9800GTX+ will beat/match/get beaten slightly by a 4850 at ~700MHz, the addition of GDDR5 to both would put the 9800GTX+ ahead or on the same level. Then again, I could be (and probably was) wrong.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  16. #91
    Xtreme Member
    Join Date
    Apr 2008
    Location
    Stockholm, Sweden
    Posts
    324
    Do you even know the main advantage GDDR5 has over GDDR3? BANDWIDTH. And if you didn't know, the GTX280 already has more bandwidth than the HD4870. So if the GTX280 and HD4870 suddenly swapped memory and memory bus width, NOTHING would change.
    Last edited by Eson; 10-09-2008 at 09:10 AM.

  17. #92
    Xtreme Cruncher
    Join Date
    Feb 2007
    Posts
    594
    Quote Originally Posted by STaRGaZeR View Post

    As you see, core clock means close to nothing: 300MHz less means 3W less. GDDR5 speed matters A LOT. Also see the voltage regulator current sensor in all cases.
    I can confirm the core frequency means almost nothing, only a couple of watts, and I have a meter between the wall and the PSU.

    But going from 900MHz on the memory to 200MHz brings power consumption down by ~33W.

  18. #93
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    Quote Originally Posted by fragmasterMax View Post
    An OC'd 9800GTX+ at 825/2000 (GPU/shader) with GDDR5 in a dual configuration would beat the 4870X2.
    It's all about the memory, which is the only thing separating the 4850 from the 4870 besides a 125MHz bump on the core.
    It has been proven that the memory made a huge difference for the 4870, but that's because it has a 256-bit bus and a lot of shading power to make use of the high bandwidth.

    We don't know if GDDR5 would make such a difference for the 9800GTX+, since it lacks in other areas, like shader power, to make good use of more bandwidth. The GTX260 and 280 cards have a lot of shading power, so bandwidth is more useful for them, hence the wide bus (instead of achieving higher bandwidth via faster memory like GDDR5).

    We can only speculate, but I don't think a dual 9800GTX+ would beat the 4870X2.

  19. #94
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
    Quote Originally Posted by fragmasterMax View Post

    We were in agreement the whole time.
    My bad.
    Sorry man. I just didn't want people to think I was slamming XS. This place is a great site, and everywhere I go on the Internet it is widely known as being the best source for high-end computing on the net, bar none.

    People recommend it on the flight sim forums I go to, and even on my high-power rocketry hobby sites when people ask about computers. This really is the Xtreme.

  20. #95
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Nah, once you lower the frequency everything is all right, and heat is vastly reduced along with power consumption. Problems start to appear when you don't, or can't (the case with the 4870), do it.

    While I agree with you on the theoretical part, the fact is that when you have the card in your hands and you start doing things here and there, you discover that either that's a load of BS or ATI's implementation is xtremely poor and badly planned. I'd pick the latter, seeing the results and knowing the theory. As I say, once you lower the clocks everything is all right; that's why they changed it in the 4870X2.

    If anyone is interested in this, there's a nice analysis done by a user on the TechPowerUp forums.
    Last edited by STaRGaZeR; 10-09-2008 at 09:23 AM.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  21. #96
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by T_Flight View Post
    Dude, you need to read what you're replying to before you click send. I did not say this place was biased, but rather the BS that was written over at that fanboi site "The Inquirer". Obviously this place is unbiased, which is why we don't need crap like that posted here.

    It's a refresh. Nothing more. They make it sound like it's some kind of conspiracy, in true ATi fanboi fashion. Go back and read through this thread. They are well known for it.
    Regardless, many of us LOVE the INQ for its humor and for bringing so many half-assed or half-erect rumors, since the INQ is sometimes much closer to the source than the rest of us are.

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  22. #97
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    Quote Originally Posted by v_rr View Post
    Taken directly from the review.
    It's unfair to compare a 17% overclocked GTX 260 Core 216 to a stock-clocked 4870 1GB.
    Tech Report should have done benchmarks with a 260 Core 216 card at stock to clarify the confusion.

    But let's get real.

    Even if overclocked, the HD 4870 1GB couldn't touch an overclocked GTX 260 216.

    This was the primary reason I bought my GTX 260. It's a helluva lot more overclockable than the HD 4870.

    If you're going to overclock, the Nvidia card is definitely the better solution, hands down.
    Intel Core i7 6900K
    Noctua NH-D15
    Asus X99A II
    32 GB G.Skill TridentZ @ 3400 CL15 CR1
    NVidia Titan Xp
    Creative Sound BlasterX AE-5
    Sennheiser HD-598
    Samsung 960 Pro 1TB
    Western Digital Raptor 600GB
    Asus 12x Blu-Ray Burner
    Sony Optiarc 24x DVD Burner with NEC chipset
    Antec HCP-1200w Power Supply
    Viewsonic XG2703-GS
    Thermaltake Level 10 GT Snow Edition
    Logitech G502 gaming mouse w/Razer Exact Mat
    Logitech G910 mechanical gaming keyboard
    Windows 8 x64 Pro

  23. #98
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by Eson View Post
    Do you even know the main advantage GDDR5 has over GDDR3? BANDWIDTH. And if you didn't know, the GTX280 already has more bandwidth than the HD4870. So if the GTX280 and HD4870 suddenly swapped memory and memory bus width, NOTHING would change.
    You're confused. I already made the point that the GTX 280 has more bandwidth than the 4870, due to its 512-bit bus.
    If you put GDDR5 on the 512-bit bus of the GTX280, the bandwidth would be much higher than it would be on a 256-bit bus like the 4870X2's. Adopting GDDR5 won't be too hard for Nvidia to do, though it would be impossible for AMD to make a 512-bit 4870; the GPU is designed around the memory bus width. Remember AMD's last 512-bit GPU, the 2900XT? Well, the wider bus was one of the reasons for its higher power consumption.
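    As a rough illustration of that bandwidth argument, a small sketch computing peak bandwidth from bus width and effective data rate (the ~2214 MT/s GDDR3 and 3600 MT/s GDDR5 rates are reference figures assumed here, not taken from the thread):
    Code:
    def bandwidth_gb_s(bus_bits, effective_mt_s):
        """Peak memory bandwidth in GB/s from bus width and effective transfer rate."""
        return bus_bits / 8 * effective_mt_s / 1000

    print(bandwidth_gb_s(512, 2214))  # GTX 280, GDDR3:             ~141.7 GB/s
    print(bandwidth_gb_s(256, 3600))  # HD 4870, GDDR5:             ~115.2 GB/s
    print(bandwidth_gb_s(512, 3600))  # hypothetical 512-bit GDDR5: ~230.4 GB/s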
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  24. #99
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Carfax View Post
    Tech Report should have done benchmarks with a 260 Core 216 card at stock to clarify the confusion.

    But let's get real.

    Even if overclocked, the HD 4870 1GB couldn't touch an overclocked GTX 260 216.

    This was the primary reason I bought my GTX 260. It's a helluva lot more overclockable than the HD 4870.

    If you're going to overclock, the Nvidia card is definitely the better solution, hands down.
    Really? So it's the overclock potential that's more important to you than the actual performance. I see now.

  25. #100
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    Quote Originally Posted by Eastcoasthandle View Post
    Really? So it's the overclock potential that's more important to you than the actual performance. I see now.
    I wouldn't say it's more important, just equally so.

    The GTX 260 216 and the HD 4870 1GB are practically neck and neck at stock clocks, but the GTX 260 overclocks like a bat out of hell.

    What's not to like about free performance?

    Another factor in why I chose Nvidia again is the drivers. Some people may think that ATI has better drivers, and they're free to believe that.

    However, I don't think many would doubt that Nvidia's ability to squeeze performance out of their cards through driver optimizations is much more impressive than ATI's.

    Nvidia has a track record of this, going back to the old Detonator days. Hopefully the 180xx drivers continue this trend.
    Last edited by Carfax; 10-09-2008 at 11:16 AM.
    Intel Core i7 6900K
    Noctua NH-D15
    Asus X99A II
    32 GB G.Skill TridentZ @ 3400 CL15 CR1
    NVidia Titan Xp
    Creative Sound BlasterX AE-5
    Sennheiser HD-598
    Samsung 960 Pro 1TB
    Western Digital Raptor 600GB
    Asus 12x Blu-Ray Burner
    Sony Optiarc 24x DVD Burner with NEC chipset
    Antec HCP-1200w Power Supply
    Viewsonic XG2703-GS
    Thermaltake Level 10 GT Snow Edition
    Logitech G502 gaming mouse w/Razer Exact Mat
    Logitech G910 mechanical gaming keyboard
    Windows 8 x64 Pro

