
Thread: Nvidia GTX 580 Reviews

  1. #76
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Posts
    535
    Quote Originally Posted by SKYMTL View Post
However, many seem to forget that in order to add high-level geometry processing horsepower, a wicked increase in die size and power consumption is a must.
I don't see where you managed that leap of logic; it seems like something you could not possibly know. I'll give you the benefit of the doubt, though, assuming you explain your reasoning.

  2. #77
    Xtreme Addict
    Join Date
    Sep 2010
    Location
    Australia / Europe
    Posts
    1,310
    Quote Originally Posted by hurleybird View Post
I don't see where you managed that leap of logic; it seems like something you could not possibly know. I'll give you the benefit of the doubt, though, assuming you explain your reasoning.
    lmao

  3. #78
    I am Xtreme
    Join Date
    Oct 2004
    Location
    U.S.A.
    Posts
    4,743
SKYMTL: I find it funny how you backtrack on the 5970 in your review, but I understand why you did it. All cards suck at performance at launch compared to after being out a year; Nvidia always boosts its cards with driver updates. I would have liked to see you guys do an SLI review in a few days, since other sites are showing almost a 2x boost over the standard 580. Back to the 5970: most users who have these are not going to be attracted to the 580 GTX in single-GPU form, and I wish you had talked about that. I do commend you on using the latest drivers for the ATI products. All in all, I like your review more than any of the others, and that includes Anandtech's and Guru3d's. I also like the folding comparison you did between the 480 and 580.

I think people should take into account that all the dual-GPU cards, minus the Asus one, use an 8-pin and a 6-pin connector just like the single-GPU cards do. According to the benchmarks, the 5970 is still the single-card king.


    Asus Z9PE-D8 WS with 64GB of registered ECC ram.|Dell 30" LCD 3008wfp:7970 video card

    LSI series raid controller
    SSDs: Crucial C300 256GB
    Standard drives: Seagate ST32000641AS & WD 1TB black
    OSes: Linux and Windows x64

  4. #79
    Xtreme Addict
    Join Date
    Sep 2010
    Location
    Australia / Europe
    Posts
    1,310
And it will be until the arrival of the 69XX series.
The GTX 580 is the tessellation king;
the 5970 is the graphics card king.

  5. #80
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    Quote Originally Posted by Sam_oslo View Post
Yeah, AMD has started the fake-new-generation number game this round. nVidia is following the same path, and soon they will both run out of numbers, as I've said before.
    You sound like a broken record...
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  6. #81
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by ElSel10 View Post
AMD has done the same thing with power monitoring since either the HD 3000 or HD 4000 series. Do you have a problem with that?
    While I actually think this sort of thing is great, the reasons are different:

    ATI: Anand has stated this is implemented to save the digital VRM modules.

    That problem reared its head a lot for the RV770 in particular, with the rise in popularity of stress testing programs like FurMark and OCCT. Although stress testers on the CPU side are nothing new, FurMark and OCCT heralded a new generation of GPU stress testers that were extremely effective in generating a maximum load. Unfortunately for RV770, the maximum possible load and the TDP are pretty far apart, which becomes a problem since the VRMs used in a card only need to be spec’d to meet the TDP of a card plus some safety room. They don’t need to be able to meet whatever the true maximum load of a card can be, as it should never happen.
    Nvidia: You can actually make the card exceed 300W.
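
To put rough numbers on that TDP-vs-max-load gap, here's a quick sketch. Every figure below is an assumption for illustration - not an actual RV770 spec:

```python
# Back-of-the-envelope sketch of the VRM-protection argument above.
# Every number is an assumption for illustration, not a real spec
# for the RV770 or any other GPU.

TDP_W = 160.0          # assumed board TDP
VRM_MARGIN = 1.15      # assumed safety headroom in the VRM spec
VRM_LIMIT_W = TDP_W * VRM_MARGIN

workloads = {
    "typical game": 140.0,        # assumed: below TDP
    "worst-case game": 158.0,     # assumed: right at TDP
    "FurMark-style load": 210.0,  # assumed: synthetic maximum load
}

for name, draw_w in workloads.items():
    if draw_w > VRM_LIMIT_W:
        # Protection kicks in: clocks/voltage get pulled down until
        # the measured draw fits back inside the VRM spec.
        print(f"{name}: {draw_w:.0f} W > {VRM_LIMIT_W:.0f} W limit -> throttle")
    else:
        print(f"{name}: {draw_w:.0f} W within limit -> full clocks")
```

The point is just that ordinary games never trip the limit; only a synthetic load that blows past TDP-plus-margin does.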
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  7. #82
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Location
    Montreal
    Posts
    791
    All I see is "5970 > gtx580".

Don't use that "single GPU vs. dual GPU" garbage. When I look at a graphics card, I look at the performance in games I am likely to play, at 2560x1600. I want the fastest card on the market? I buy a 5970. Well, let's hope for a "Lightning" edition, pre-overclocked lol

  8. #83
    Xtreme Member
    Join Date
    Oct 2010
    Location
    192.168.1.1
    Posts
    221
    Quote Originally Posted by antiacid View Post
    All I see is "5970 > gtx580".

Don't use that "single GPU vs. dual GPU" garbage. When I look at a graphics card, I look at the performance in games I am likely to play, at 2560x1600
    Do you also look at whether those frames per second are rendered in a synchronized manner? You know, a card could render 60 frames in 0.001 seconds, and nothing else for the remaining 0.999 seconds, then theoretically it would be giving you 60 FPS, whereas it would feel absolutely no different from 1 FPS. Obviously I'm referring to microstuttering.

    Now, I don't make the claim that 5970 has tons of microstuttering - I am merely stressing that FPS is, just by itself, not the single deciding factor about how good and smooth a gaming experience will be.

I had to say this since you said you would just look at the FPS numbers, which would on average be a fine thing to do if you were comparing single-GPU setups; but when it's single GPU vs. dual GPU, different things are factored into the equation, and it gets murkier.

I am certain that, given the small (~0-5%) average FPS difference between the 580 and the 5970, the 580 would give you a better gaming experience overall than the HD 5970 - and this is not even counting the usual SLI/CF problems like driver dependence and games that don't scale.
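
To make that concrete, here's a tiny sketch; the timestamps are made up for illustration, not measurements from either card:

```python
# Frame-pacing sketch. The timestamp lists are fabricated to
# illustrate the point above; they are not captures from a
# GTX 580 or an HD 5970.

def frame_stats(timestamps):
    """Average FPS plus the worst frame-to-frame gap, in ms."""
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg_fps = len(deltas) / (timestamps[-1] - timestamps[0])
    worst_gap_ms = max(deltas) * 1000.0
    return avg_fps, worst_gap_ms

# Evenly paced: 60 frames spread across one second.
even = [i / 60.0 for i in range(61)]

# Burst: 60 frames inside the first millisecond, then nothing
# until the second ends -- the extreme case described above.
burst = [i * (0.001 / 60.0) for i in range(61)]
burst[-1] = 1.0

for name, ts in (("even", even), ("burst", burst)):
    fps, worst = frame_stats(ts)
    print(f"{name}: {fps:.0f} FPS average, worst gap {worst:.0f} ms")
```

Both captures report ~60 FPS average, but the burst one has a ~999 ms gap - it plays like 1 FPS. Average FPS alone hides frame pacing.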

  9. #84
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by hurrdurr View Post
    Do you also look at whether those frames per second are rendered in a synchronized manner? You know, a card could render 60 frames in 0.001 seconds, and nothing else for the remaining 0.999 seconds, then theoretically it would be giving you 60 FPS, whereas it would feel absolutely no different from 1 FPS. Obviously I'm referring to microstuttering.

    Now, I don't make the claim that 5970 has tons of microstuttering - I am merely stressing that FPS is, just by itself, not the single deciding factor about how good and smooth a gaming experience will be.

I had to say this since you said you would just look at the FPS numbers, which would on average be a fine thing to do if you were comparing single-GPU setups; but when it's single GPU vs. dual GPU, different things are factored into the equation, and it gets murkier.

I am certain that, given the small (~0-5%) average FPS difference between the 580 and the 5970, the 580 would give you a better gaming experience overall than the HD 5970 - and this is not even counting the usual SLI/CF problems like driver dependence and games that don't scale.
Microstuttering is the oldest bunk around

    It's been thoroughly discredited as being way misunderstood & overblown and people STILL keep harping on it, esp. whenever one side doesn't have a dual GPU option or whatever

Even the people who discovered it say that most people who talk about it don't have a clue what they're actually looking at

  10. #85
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by ElSel10
AMD has done the same thing with power monitoring since either the HD 3000 or HD 4000 series. Do you have a problem with that?
    I wasn't aware of that. Yes, I have a problem with that. I am beginning to believe that all corporations are inherently evil entities and should be abolished. They are a perversion of capitalism. A corporation is like an individual who doesn't care about anything except money. Someone with no ethics or empathy at all. Someone who would sell their own mother into slavery, prostitution, or for body parts for a dollar and do so without the slightest hesitation. I don't think that human beings are 100% pure evil with no redeeming qualities. But by hiding behind the corporate banner people feel free to encourage all of their worst, most short-sighted, selfish, every-man-for-himself impulses. Capitalism is the most efficient way we have come up with to produce goods and services, but when you take away personal responsibility you end up with something that can be just as bad as when the government runs everything.

Technology only moves ahead as much as it has to for the companies to be profitable. If AMD and Nvidia could come up with some kind of pact where neither advances, where neither puts a single dollar into R&D ever again, I have no doubt that they would do so. Would we ever have moved past the 8800GTX if AMD hadn't become competitive again? The way that these companies think is just so wrong. If I were the CEO of either company, technological innovation would come first. Any concern for profits would just be a means to have money to move the technology forward. Making the fastest video card would not be just a means to the end of making as much money as possible; it would be the end itself, the whole reason that I was in business. A small business with the owner at the helm could very well have concern for making the best product possible or providing the best service, but a corporation doesn't care at all about such things. Of course, my stay as CEO would be very short, maybe only a few days. As soon as the board of directors realized that I put something ahead of their profits I would be out on my ass. And that is the problem with corporations.

  11. #86
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by zerazax
Microstuttering is the oldest bunk around

    It's been thoroughly discredited as being way misunderstood & overblown and people STILL keep harping on it, esp. whenever one side doesn't have a dual GPU option or whatever

Even the people who discovered it say that most people who talk about it don't have a clue what they're actually looking at
Really? Is it just the "micro" part that has been discredited? There seemed to be a hell of a lot of anecdotal evidence that multi-GPU setups (especially SLI) could lead to less consistent frame rates the last time I looked into it (around the time the 4870x2 was released). Do you have any links to a microstutter debunking? Although I've never owned a multi-GPU setup, the idea still seems very plausible to me.

  12. #87
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by hurleybird View Post
I don't see where you managed that leap of logic; it seems like something you could not possibly know. I'll give you the benefit of the doubt, though, assuming you explain your reasoning.
    Sure.

In order to add additional geometry processing horsepower, several things need to be looked at. First of all, you need an expanded fixed-function stage containing additional tessellation horsepower, which is necessary for much of that increased geometry performance. To match the increased tessellation power, an expanded shader processor array needs to be created in order to actually PROCESS the large increase in draw calls, etc. Finally, beefing up the cache structure is also a necessity in order to move information quickly and efficiently along the rendering pipeline.

Unfortunately, all of those items take up space on a GPU die, which in turn raises the transistor count. NVIDIA did all of that and ended up with a GPU sporting three billion transistors. AMD will need to do the same thing with their upcoming 40nm-based cards, since if anything they will likely expand the SP count significantly (why wouldn't they?). None of this comes "free".


    Quote Originally Posted by safan80 View Post
    SKYMTL: Back to the 5970, most users that have these are not going to be attracted to the 580GTX in single gpu form and I wish you would have talked about that.
I think I understand what you are saying. However, I don't think the possibility of going from an HD 5970 to a GTX 580 is even being discussed. What I did mention, though, is the inherent performance instability that dual-card or dual-GPU solutions are known for. This isn't a case of "the drivers don't work" but simply an observation that single-GPU cards will have consistent performance (most of the time), while dual solutions are usually waiting for driver updates, etc., in order to live up to their absolute potential in the newest games.

    I don't know about you but I am sure most gamers would want near-full performance from day one rather than waiting for drivers / app profiles weeks after a game is released in order to achieve full performance from their $500 purchase.

  13. #88
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
One thing's for sure: in Canada, the GTX 580 is the better deal. Newegg.ca is still reflecting the $600+ pricing, and so is pretty much everywhere else in Canada, while a GTX 580 can be found for around the $500 range.

    I would still wait, but I have a feeling that the 6970 is going to perform a bit worse than the gtx 580.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  14. #89
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    Quote Originally Posted by tajoh111 View Post
    I would still wait, but I have a feeling that the 6970 is going to perform a bit worse than the gtx 580.
Remember, AMD has a much larger power envelope to play with than Nvidia did. Based on that fact alone, I think AMD has the potential to put out an overall more powerful card, even if it's not the tessellation monster that the GTX 480/580 is.

    We'll wait and see. I'm glad Nvidia has already launched the GTX 580, but honestly, there's no reason to buy it yet. We still don't know what AMD has cooking with the 6970.
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  15. #90
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Location
    Old Vizima
    Posts
    952
The 5970 is a great card, and it is two GPUs on a single PCB using Crossfire. I've seen some who seem to think that because the two GPUs are on a single PCB, the card is somehow like a single-GPU, single-PCB product. It is not. It requires proper driver profiles to function correctly in games; a single GPU does not. If it doesn't have said profiles, you may get worse-than-single-GPU performance because of it. You can go back to getting single-GPU performance in those cases by disabling CAT AI.

If folks like MGPU they ought to wait and look at the 6950 or 6970 in Crossfire. If they scale like 6870 Crossfire currently does, they're going to be darn impressive. Of course there's always the 6990 as well.
    Last edited by Blacklash; 11-09-2010 at 09:53 PM.

  16. #91
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by Mad Pistol View Post
Remember, AMD has a much larger power envelope to play with than Nvidia did. Based on that fact alone, I think AMD has the potential to put out an overall more powerful card, even if it's not the tessellation monster that the GTX 480/580 is.

    We'll wait and see. I'm glad Nvidia has already launched the GTX 580, but honestly, there's no reason to buy it yet. We still don't know what AMD has cooking with the 6970.
One thing I have a sneaking suspicion about is that it will be priced appropriately: if it beats the GTX 580 it will cost 500 dollars or more; if it loses it will be priced between 400 and 500 dollars.

    Quote Originally Posted by Blacklash View Post
The 5970 is a great card, and it is two GPUs on a single PCB using Crossfire. I've seen some who seem to think that because the two GPUs are on a single PCB, the card is somehow like a single-GPU, single-PCB product. It is not. It requires proper driver profiles to function correctly in games; a single GPU does not. If it doesn't have said profiles, you may get worse-than-single-GPU performance because of it. You can go back to getting single-GPU performance in those cases by disabling CAT AI.

    If folks like MGPU they ought to wait and look at the 6950 or 6970 in Crossfire. If they scale like 6870 Crossfire currently is doing, they're going to be darn impressive. Of course there's always the 6990 as well.
It is a fine card, but it was too expensive when it came out, and it had wonky performance at times - Catalyst sets 10.5-10.7 come to mind, as was already shown in the TechPowerUp review. A single GPU is more stable in regards to framerates, easier to overclock, has more even minimum framerates, and doesn't live and die by its drivers. Single cards also don't get nearly as messy when you try to add another card to the mix. This is a problem with all multi-GPU cards.

Driver support for the 4870x2 has kind of died out, so driver support for multi-GPU cards really dies out once their successor comes out.
    Last edited by tajoh111; 11-09-2010 at 09:58 PM.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  17. #92
    Xtreme Addict Chrono Detector's Avatar
    Join Date
    May 2009
    Posts
    1,142
    I'll wait for the GX2 card from NVIDIA, as I believe that it isn't really much of an upgrade if you already have the GTX 480.
    AMD Threadripper 12 core 1920x CPU OC at 4Ghz | ASUS ROG Zenith Extreme X399 motherboard | 32GB G.Skill Trident RGB 3200Mhz DDR4 RAM | Gigabyte 11GB GTX 1080 Ti Aorus Xtreme GPU | SilverStone Strider Platinum 1000W Power Supply | Crucial 1050GB MX300 SSD | 4TB Western Digital HDD | 60" Samsung JU7000 4K UHD TV at 3840x2160

  18. #93
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
This card looks really good, much better than the GTX 480 was when comparing performance per watt, and it also has lower temps.

I wouldn't ever be interested in the reference-cooled designs, though; I'd like to see some custom-cooled ones soon, hopefully.

    It is far too expensive though.

  19. #94

  20. #95
    Xtreme Addict
    Join Date
    Feb 2008
    Location
    America's Finest City
    Posts
    2,078
    Quote Originally Posted by zalbard View Post
    Sure!
    Nice to see you guys included min, max and avg framerate data.

    Good idea!
Both should be added to the OP.
    OTH doesn't like BSN* so he'll never add any of our reviews to his lists. Class act.
    Quote Originally Posted by FUGGER View Post
    I am magical.

  21. #96
    I am Xtreme
    Join Date
    Oct 2004
    Location
    U.S.A.
    Posts
    4,743
    Quote Originally Posted by SKYMTL View Post
I think I understand what you are saying. However, I don't think the possibility of going from an HD 5970 to a GTX 580 is even being discussed. What I did mention, though, is the inherent performance instability that dual-card or dual-GPU solutions are known for. This isn't a case of "the drivers don't work" but simply an observation that single-GPU cards will have consistent performance (most of the time), while dual solutions are usually waiting for driver updates, etc., in order to live up to their absolute potential in the newest games.

    I don't know about you but I am sure most gamers would want near-full performance from day one rather than waiting for drivers / app profiles weeks after a game is released in order to achieve full performance from their $500 purchase.
People who run 2560x1600 need dual-GPU solutions at this moment in time because no single-GPU setup can handle it. Most gamers know and understand the risks of using dual GPUs. Yes, it would be nice if dual-GPU setups worked like the old 3dfx SLI, but that had to do with Glide, and it was the only reason it worked. People wait for profile updates to their SLI/Crossfire setups all the time. Frame rates are not stable in a lot of games regardless of dual/single GPU unless you're capping the frames with vsync; switching between different areas of a game causes dips in frame rates, and you can't tell me a single-GPU card can hold a certain fps throughout a game unless it is CPU-bound and running at a low resolution.

    If you want to talk about bad dual-GPU performance, look at the original 6800 SLI: you could not play many games at a good fps without a profile. Both Nvidia and ATI have come a long way since, and it is foolish to ignore this market segment of gamers (and lately crunchers) who want or need to use multi-GPU setups. It would be good to realize that all the people who buy dual video cards like the GTX 460 or 6870 along with X58 motherboards are going to run them in dual-GPU setups. Check the Tech Report and Guru3d reviews; they use SLI and Crossfire in both.


    Asus Z9PE-D8 WS with 64GB of registered ECC ram.|Dell 30" LCD 3008wfp:7970 video card

    LSI series raid controller
    SSDs: Crucial C300 256GB
    Standard drives: Seagate ST32000641AS & WD 1TB black
    OSes: Linux and Windows x64

  22. #97
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by safan80 View Post
People who run 2560x1600 need dual-GPU solutions at this moment in time because no single-GPU setup can handle it.
That's a total exaggeration. I've been gaming at 2560x1600 since before even the 8800 GTX, and I still do today on the same old monitor. If you want to game with everything on uber-max details and filtering, dual-GPU solutions are the best option, sure, but single GPUs will without a doubt handle gaming at 2560x1600 with reasonable settings fine.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  23. #98
    Xtreme Enthusiast
    Join Date
    Nov 2009
    Posts
    526
    Quote Originally Posted by ElSel10 View Post
AMD has done the same thing with power monitoring since either the HD 3000 or HD 4000 series. Do you have a problem with that?
AMD's implementation does not detect what software you run; nVidia's does. AMD's throttles whenever it needs to, as a protective measure. nVidia's throttles only when FurMark or OCCT is running.

    See the difference?
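
If I have that right, the two policies differ roughly like this. This is a sketch of the policy only - names like read_board_power() and the 300 W limit are hypothetical, not either vendor's actual code:

```python
# Sketch contrasting the two throttle policies described above.
# read_board_power, the process names, and the limit are all
# hypothetical; this is not vendor code.

POWER_LIMIT_W = 300.0
DETECTED_APPS = {"furmark.exe", "occt.exe"}

def amd_style(read_board_power):
    """Throttle on the measurement alone, whatever app is running."""
    return read_board_power() > POWER_LIMIT_W

def nvidia_style(read_board_power, process_name):
    """Only consult the power reading when a known stress tester
    is detected; anything else is never throttled."""
    if process_name.lower() not in DETECTED_APPS:
        return False
    return read_board_power() > POWER_LIMIT_W

over_limit = lambda: 340.0  # assumed reading, over the limit

print(amd_style(over_limit))                    # True  -> protected
print(nvidia_style(over_limit, "mystress.exe")) # False -> not protected
print(nvidia_style(over_limit, "furmark.exe"))  # True  -> protected
```

A power virus that isn't on the blocklist slips past the detection-based policy but not the measurement-based one.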

  24. #99
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    im surprised about reviews this early...
    Nvidia has also added temperature and power draw monitoring to the GeForce GTX 580 1.5GB via two additional chips on the card. This means that if the GPU or the card’s VRMs get too hot or try to draw more power than is safe, the GPU will clock down to avoid damage to the hardware.

There are three things to note about this power management technology, the first being that, as it's enabled by two separate chips on the card, board partners can choose to leave them off to lower the cost of their card. Secondly, the GPU won't increase in frequency if the power draw or temperature are lower than the maximums - the technology is more akin to Intel's SpeedStep than Turbo Boost.
    hope it doesnt cause issues when ocing, especially on ln2... sounds like a pita since the throttling cant be detected...
    i hope gigabyte, msi and asus offer jumpers to disable this throttling...

    Quote Originally Posted by Anandtech
    Quite frankly the GTX 580 should be the GTX 485
    agree...
    has anybody actually run gf100 and gf110 side by side at the same clocks and compared performance?
    im curious how much those tweaks boost performance and in what situation...

    funny, everybody complains about how hot the 480 was, and how the 580 is so much better.
and then they credit the gpu redesign and quote nvidia's "transistor tweaking" when power consumption is actually almost identical between both.
    i bet the only reason the 580 consumes 10% less is because it has a much better cooler.
    lower temps = lower power consumption, we all know that...
    so id love to see somebody switch heatsinks on a 480 and 580 or compare both cards with either one of the two heatsinks
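
heres a toy model to sanity-check that - the exponential form and every constant below are assumptions for illustration, not measured gf100/gf110 numbers:

```python
# Toy leakage-vs-temperature model for the argument above. The
# exponential form and all constants are assumptions for a sanity
# check, not measured GF100/GF110 data.
import math

DYNAMIC_W = 180.0  # assumed switching power, roughly temp-independent
LEAK_REF_W = 60.0  # assumed leakage at the reference temperature
T_REF_C = 90.0     # assumed reference temp (hot 480-style cooler)
T_SCALE_C = 40.0   # assumed e-folding scale for leakage vs. temp

def board_power(temp_c):
    leakage = LEAK_REF_W * math.exp((temp_c - T_REF_C) / T_SCALE_C)
    return DYNAMIC_W + leakage

hot, cool = board_power(90.0), board_power(70.0)
print(f"90C: {hot:.0f} W, 70C: {cool:.0f} W, saving {1 - cool / hot:.0%}")
```

with these made-up constants a 20C drop saves roughly 10% board power from leakage alone - the right order of magnitude for the better-cooler theory, nothing more.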

i dont think it deserves to be called the 580, but it looks like a very nice card actually!
    those min fps numbers look awesome...
if i had to choose between a 580, two 6870s or two 470s id get a 580...
    the other options are cheaper and/or deliver better performance, but they are xfire and sli...

    now if only there were 200hz+ 3d displays... then buying such a card would actually make sense :P

    i really like the bittech review!
    best vga review ive read in a long time!
a lot of focus on min fps, 5970 got disqualified in some tests cause of stuttering, perfect res and settings chosen to compare cards - not too high to be unplayable and not too low to be pointless... a lot of focus on the customer value of the cards... very nice!
    Last edited by saaya; 11-10-2010 at 01:12 AM.

  25. #100
    I am Xtreme
    Join Date
    Oct 2004
    Location
    U.S.A.
    Posts
    4,743
    Quote Originally Posted by highoctane View Post
That's a total exaggeration. I've been gaming at 2560x1600 since before even the 8800 GTX, and I still do today on the same old monitor. If you want to game with everything on uber-max details and filtering, dual-GPU solutions are the best option, sure, but single GPUs will without a doubt handle gaming at 2560x1600 with reasonable settings fine.
I only use max details and filtering; if I wanted to deal with anything less I would play games on consoles. I myself used an 8800GTX on my 3007wfp. As a single card, it would only give me 30-40 fps in games, which is OK for single-player games but not good at all for multi-player games. Later I switched to dual GTX 285s and they gave me 60-80 fps, which made multi-player games playable without AA, but I don't use AA for multi-player anyway. Nowadays I like to keep the frames around 70-100 fps for online play.
    Last edited by safan80; 11-10-2010 at 01:16 AM.


    Asus Z9PE-D8 WS with 64GB of registered ECC ram.|Dell 30" LCD 3008wfp:7970 video card

    LSI series raid controller
    SSDs: Crucial C300 256GB
    Standard drives: Seagate ST32000641AS & WD 1TB black
    OSes: Linux and Windows x64
