Page 29 of 42
Results 701 to 725 of 1028

Thread: NVIDIA GTX 595 (picture+Details)

  1. #701
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by Callsign_Vega View Post
    840 MHz? Not bad. Considering an overclocked 6990 doesn't match the performance of stock-clocked 580's in SLI in 95% of benchmarks, the GTX 590 might turn out to be faster after all.
    Nah, that's the overclocked figure reported by the third-party software Afterburner; the card itself seems to be only 607 MHz... I hope the card can OC to more than 840 MHz anyway.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  2. #702
    Xtreme Member
    Join Date
    Jul 2009
    Posts
    319
    Lanek's right: 840 MHz is MSI Afterburner's limit. I don't think you'll be seeing much over 700 MHz with the stock cooling.
    Quote Originally Posted by Cleatus View Post
    Just cause you pour syrup over crap dont make it pancakes

  3. #703
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    399
    Quote Originally Posted by Aten-Ra View Post
    You mean Desktop cards based on Fermi architecture, because Tesla cards based on Fermi architecture DO have full DP support.
    Of course I mean desktop cards. The 590 is a desktop card, after all. Me saying non-full support implies there is a card with full DP support. Otherwise I wouldn't have even mentioned it. Geez.

    Quote Originally Posted by halfwaythere View Post
    Lanek's right: 840 MHz is MSI Afterburner's limit.
    Good spot, the translated source says "38% overclocking potential". They should set the limit at 9001 MHz. Imagine the marketing possibilities.

  4. #704
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    Or do it like ASUS does when they claim a 50% OC, as in 50% better than the default-voltage OC: say 100 MHz + 50% = 150 MHz OC. You just gotta love marketing.
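    For what it's worth, the marketing math is easy to check. A quick sketch with made-up numbers (the 607 MHz stock clock is the rumoured 590 figure from earlier in the thread; the 100 MHz headroom is purely an example):

    ```python
    # Marketing says "50% better OC" -- but 50% of what?
    # Hypothetical numbers: 607 MHz stock clock, 100 MHz default OC headroom.
    stock_mhz = 607
    default_oc_mhz = 100

    marketing_oc_mhz = default_oc_mhz * 1.5          # "+50% OC!" = 150 MHz headroom
    real_gain_pct = marketing_oc_mhz / stock_mhz * 100

    # 150 MHz on a 607 MHz base is only a ~24.7% actual clock increase.
    print(f"{stock_mhz + marketing_oc_mhz:.0f} MHz total ({real_gain_pct:.1f}% over stock)")
    ```

    A "50% OC" headline quietly becomes a quarter of that when measured against the actual clock.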
    "No, you'll warrant no villain's exposition from me."

  5. #705
    Xtreme Member
    Join Date
    Aug 2010
    Location
    Athens, Greece
    Posts
    116
    Quote Originally Posted by DarthShader View Post
    Of course I mean desktop cards. The 590 is a desktop card, after all. Me saying non-full support implies there is a card with full DP support. Otherwise I wouldn't have even mentioned it. Geez.
    The way you wrote it implied that Fermi cards in general don’t have full DP support.

    Even without full DP support, the DP ratio for the GTX 590 will be higher than the GTX 295's, but I don't think he cares about DP with CUDA for Folding

    http://www.anandtech.com/show/2977/n...th-the-wait-/6

    Last edited by Aten-Ra; 03-20-2011 at 03:04 AM.
    Intel Core i7 920@4GHz, ASUS GENE II, 3 x 4GB DDR-3 1333MHz Kingston, 2x ASUS HD6950 1G CU II, Intel SSD 320 120GB, Windows 7 Ultimate 64bit, DELL 2311HM

    AMD FX8150 vs Intel 2500K, 1080p DX-11 gaming evaluation.

  6. #706
    Xtreme Member
    Join Date
    Dec 2010
    Location
    Nor*cal
    Posts
    351
    Quote Originally Posted by Aten-Ra View Post
    The way you wrote it implied that Fermi cards in general don’t have full DP support.

    Even without full DP support, the DP ratio for the GTX 590 will be higher than the GTX 295's, but I don't think he cares about DP with CUDA for Folding

    http://www.anandtech.com/show/2977/n...th-the-wait-/6

    Full DP is not utilized in Folding@Home, at least as of that review.

  7. #707
    Xtreme Member
    Join Date
    Oct 2006
    Posts
    311
    Quote Originally Posted by Callsign_Vega View Post
    840 MHz? Not bad. Considering an overclocked 6990 doesn't match the performance of stock-clocked 580's in SLI in 95% of benchmarks, the GTX 590 might turn out to be faster after all.
    Since all the reviews you are talking about were done with 11.1 and 11.2 drivers that were out BEFORE the 6990, you are so right.

    But the 11.4 BETA drivers are turning the tables around. Look at the slides. Here, in 3 of the most demanding games out there, the 6990 and 580 SLI are head-to-head.

    Blah blah blah about the limited VRAM on the Nvidia 580. But Nvidia are selling those POS cards at $500, not me. And the 6990+6970 also costs $1000. It's a real massacre. 580 SLI is totally obliterated.

    And this is with BETA 11.4 drivers! So why talk about ''95% of the benchmarks'' with drivers that were out before the 6990 even existed! So relevant! 11.4 is the first driver to support the 6990. Nah. Pesky little details. So easy to compare with old drivers not supporting the card.

    The WEAK point of the 590 is the limited amount of VRAM, not the clock speeds. Look at the slides! Look at what not enough VRAM gives you! The results with the 590 will be the same as, or lower than, 580 SLI, since it runs lower clock speeds. And the limited amount of VRAM will be the same: 1.5 GB only. Results will be the same with the 590 against the 6990.

    Please look at those slides. Where do you see 580 SLI ''dominating'' the 6990? I don't see that. Point me to it. Please, analyze those slides, and don't evade the question by posting irrelevant videos. And not some old benchmarks done with 11.1 or 11.2 drivers... There are plenty of those stupid reviews done with drivers that were out before the 6990 was even on the market. Too easy. But commenting on those slides with 11.4 is tougher. And shouting ''troll fanboy FACT FAIL AMD DRIVERS, blah blah blah'' is so much easier and more convenient.

    The 6990 on the OC BIOS is clearly head-to-head with 580 SLI 1.5GB. Please explain the ''domination''. I see 6990+6970 dominating 580 SLI at the same price point, but not the 6990 OC BIOS versus 580 SLI. And the 590 will also be limited to only 1.5 GB. Same thing. Look at the slide. You can call me a troll or fanboy all you want, but the topic is the 590, and that card will be underclocked compared to 580 SLI, and also have only 1.5 GB like them. So those slides are totally relevant to the topic.

    People really believe the 590 will BEAT 580 SLI? No. Or else Nvidia would sell the 590 at $1200. They won't cannibalize their 580 SLI market.

    And why not post your famous videos of 3X 580 1.5GB, but against 3X 6970 2GB this time, with 11.4 drivers? Where are they? No. Since it would be too logical to do it. And since the 3X 580 1.5GB are VRAM limited, 3X 6970 would be better. No. All happy to do 2X 6990 against 3X 580 1.5GB. But 3X 6970 should also be there to compare, and with 11.4 (not 11.1) to be relevant.

    Or why no videos of 6990+6970 against 580 SLi 1.5Gb?

    No. Too easy. 2X 6990 against 3X 580, while even my old mother knows that Quad-Fire on 2 cards doesn't scale well, just like Quad-SLI (2X 590) will totally lose against 3X 6970 2GB. SAME THING.

    Why not do a more logical video of 2X 6990 against 2X 590 next week? That's what I want to see.
    Last edited by Levesque; 03-20-2011 at 04:27 AM.
    i7 3930k EK-Supreme HF - Asus Rampage IV X79 - QUAD-Fire: 4X Asus 7970 EK waterblocks - 4X4GB=16GB RipjawsZ 2400 CL9 - Crucial C300 128Gb M4 128Gb 2x Intel X25-M 160GB 3X Seagate 2TB - Mountain Mods Extended Ascension + Pedestal 24 - Dual-PSU: Antec HCP-1200 + Corsair AX850 - EyeFinity 3X 30'' LCD - Windows 7 64

  8. #708
    Xtreme Member
    Join Date
    Aug 2010
    Location
    Athens, Greece
    Posts
    116
    Quote Originally Posted by jeremyshaw View Post
    fulldp is not utilized in folding @ home, at least at the time of that review.
    Does Folding@Home use DP at all ?? I had the impression it only uses FP32.
    Intel Core i7 920@4GHz, ASUS GENE II, 3 x 4GB DDR-3 1333MHz Kingston, 2x ASUS HD6950 1G CU II, Intel SSD 320 120GB, Windows 7 Ultimate 64bit, DELL 2311HM

    AMD FX8150 vs Intel 2500K, 1080p DX-11 gaming evaluation.

  9. #709
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    Quote Originally Posted by Levesque View Post

    The WEAK point of the 590 is the limited amount of VRAM, not the clock speeds. Look at the slides! Look at what not enough VRAM gives you! The results with the 590 will be the same as, or lower than, 580 SLI, since it runs lower clock speeds. And the limited amount of VRAM will be the same: 1.5 GB only.
    I have often expressed my concerns over the lack of VRAM on high-end GPUs. Not only for the frame buffer and >1920*1200 resolutions, but also because there are games which love the extra VRAM (especially when you start adding high-resolution texture packs to them).

    Unless nVidia significantly increases the efficiency of memory allocation on their GPUs, and/or special editions of the GTX 590 with 3GB per GPU are released, your analysis will be absolutely correct.

    Oh, and IF special-edition 590's with 3GB per GPU are released, they will be very, very, very expensive

    John
    Stop looking at the walls, look out the window

  10. #710
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Levesque View Post
    Same old, same old... I'm running a 6990+6970 in Tri-Fire with 11.4 preview BETA drivers and everything is fine for me.

    Played 4 hours of Dragon Age II yesterday, everything maxed out with the PC high-texture pack; Metro 2033, Crysis, Stalker COP, every setting maxed out with 4AA/8AA. Played all night long. Slept only 3 hours. Smooth fps all along.

    Not a single bug/freeze/crash/BSOD all night! And with BETA drivers!

    Mobo didn't melt, the noise was not bad, didn't wake my wife or my children, even my neighbor was snoring more loudly than my cards! And my room was not warmer after all those hours playing...

    AMD cleaned house recently, and 11.4 is a good sign of what's coming from them in the next months

    I know, I know. ''Yes, but I had a 3xxx-series ATI card 8 years ago and it was a POS, AMD drivers are so much FAIL''... blah blah blah.

    It's not true anymore.
    i never said ati drivers suck...
    i had just as many issues with ati drivers as i had with nvidia drivers...

    Quote Originally Posted by DarthShader View Post
    The excuses and spin are already starting, even before the card is officially out and no one really knows its real performance, temperatures and noise levels.
    yepp, a lot of people already made up their mind and are trying to justify their decision

    Quote Originally Posted by DarthShader View Post
    In regards to overclocking, while I can see that the potential is there with the cores downclocked so much, I would be worried about cooling them with a non-stock cooling solution, be it air, water or LN2, due to the close proximity of the power-supply plugs to the second GPU core.
    that's true... the position of the power plugs on the 590 is really bad for ln2 and not great for water either...
    i don't think it'll make people choose a 6990 over a 590 though... people who want a 590 will still go for it i think...

  11. #711
    Registered User
    Join Date
    Apr 2010
    Posts
    11
    Quote Originally Posted by Aten-Ra View Post
    Does Folding@Home use DP at all ?? I had the impression it only uses FP32.
    Correct, the GPU code only uses FP32 currently. And in fact a lot of scientific code uses DP only as a safety measure against cumulative precision loss.
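    As an aside, that "safety measure" is easy to demonstrate. A minimal sketch in plain Python (emulating FP32 with a 4-byte round-trip; an illustration of cumulative precision loss, not actual Folding@Home code): summing 0.1 a million times should give 100000, and the single-precision accumulator drifts visibly off target while the double-precision one stays put.

    ```python
    import struct

    def f32(x: float) -> float:
        """Round a Python double to IEEE-754 single precision via a 4-byte round-trip."""
        return struct.unpack('f', struct.pack('f', x))[0]

    tenth32 = f32(0.1)   # 0.1 as stored in FP32 (not exactly 0.1)
    total32 = 0.0        # accumulator rounded to FP32 after every add
    total64 = 0.0        # accumulator kept at FP64 (double) precision

    for _ in range(1_000_000):
        total32 = f32(total32 + tenth32)
        total64 += 0.1

    # FP64 stays essentially on 100000; FP32 accumulates rounding error.
    print(total32, total64)
    ```

    The per-add rounding error is tiny, but a million of them compound, which is exactly why DP is kept as a safety net even when SP throughput would be much higher.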

  12. #712
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Mad Pistol View Post
    So why is it that a $230 CPU can accept virtually any multi-GPU configuration and offer virtually no bottlenecks in gameplay? The Sandy Bridge i5/i7's are very powerful and offer up extremely high framerates if the GPU is fast enough to keep up.

    Sorry, but I think GPUs are holding back current CPUs, and those CPUs are on dinky coolers too. In essence, a 300+ watt GPU is holding back a 95-watt CPU from reaching its full potential... something is wrong with that picture.
    I really don't understand what you're asking. 3D rendering is an extremely compute-intensive process and the vast majority of that burden falls on the graphics card. What do you mean by the GPU is the bottleneck? The reason a cheap, dinky CPU is good enough is that 3D rendering is a mostly GPU-intensive task. Anybody who follows this scene should know that.

    Try rendering a game on your CPU and see how well your 95w CPU does at that task.

  13. #713
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    Quote Originally Posted by trinibwoy View Post
    I really don't understand what you're asking. 3D rendering is an extremely compute-intensive process and the vast majority of that burden falls on the graphics card. What do you mean by the GPU is the bottleneck? The reason a cheap, dinky CPU is good enough is that 3D rendering is a mostly GPU-intensive task. Anybody who follows this scene should know that.

    Try rendering a game on your CPU and see how well your 95w CPU does at that task.
    That's not what I'm talking about. There are plenty of tasks for a CPU to do. It is running the rest of the computer's programs as well. A single-core CPU doesn't cut it anymore; that's why we have 4- and 6-core CPUs. Also, GPUs will always be better at their given tasks. A GPU with hundreds of cores is far better at chucking out pixels very quickly than a few CPU cores. However, a CPU is much better at taking linear instructions.

    What I'm saying is that a very CPU-intensive game such as BC2 can be maxed out on a high-powered dual core or a mediocre quad core, and the framerate will remain above 60 FPS if the video cards are powerful enough. It calculates a lot of stuff: gun shots/hit detection, environmental interactions, physics calculations, plotting character movement and AI interactions, etc. On top of all that, it runs whatever programs the OS has in the background as well, including the sound driver, video driver, hardware controllers, etc. The CPU is the brain of the computer. Somehow, that dinky CPU can do a ton of stuff while the GPU is completely devoted to rendering in games.

    For reference, I bet we could have games that use twice as much CPU resources as BC2, and if you get a SB i5/i7 CPU (around $230), a $110 motherboard, and $50 worth of memory, it will chew it up and spit it right back in your face. Do the same thing for rendering (create a game that uses twice as much rendering power) and the best video cards out there will choke and sputter, even the uber-powerful $1000 video card setups. Hell, we still can't max out Crysis at 60 FPS without spending $1000+ on video hardware. Even a couple of 6990's can't keep Crysis above 60 FPS the entire time. (Start @ 17:49 for Crysis.) That game was released 4 years ago. Why is it still a benchmark that hardware companies are trying to overcome?


    GPUs are very complex pieces of hardware. No doubt. But a GPU cannot function on its own without a CPU to drive it. A GPU does not render that scene by itself unless it's programmed to do so; it still has to have the coordinates for rendering and other instructions given by the CPU. That's why a benchmark like Heaven requires about 5-20% of CPU power to run flawlessly, even though it has virtually no AI or interactions requiring CPU power.
    Last edited by Mad Pistol; 03-20-2011 at 07:17 AM.
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  14. #714
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Sorry but I'm really not getting your point here. You started off saying that GPUs are "bottlenecking CPUs". That's completely false. A cheap CPU is enough to run most games but we need expensive GPU hardware. That's cause games don't need much CPU power but a lot of GPU speed. That's not the fault of GPU manufacturers or GPU hardware, it's just a fact based on how the software is written.

    A bottleneck occurs when a component is unable to do more work because it's held up by another component. That's not the case here. CPUs don't do more work cause the software isn't giving them more work to do.
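    The disagreement here mostly dissolves under a simple frame-time model: CPU and GPU work per frame overlap, and whichever component finishes last sets the frame rate. A toy sketch with made-up millisecond figures (not measurements of any real game):

    ```python
    def fps(cpu_ms: float, gpu_ms: float) -> float:
        """Frame rate when CPU and GPU work overlap fully per frame:
        the slower component sets the frame time."""
        return 1000.0 / max(cpu_ms, gpu_ms)

    # Hypothetical per-frame costs: CPU 5 ms, GPU 16 ms -> GPU-bound.
    print(fps(5, 16))    # 62.5 FPS, limited by the GPU
    print(fps(2.5, 16))  # still 62.5 FPS: a 2x faster CPU buys nothing
    print(fps(5, 8))     # 125.0 FPS: a 2x faster GPU doubles the frame rate
    ```

    Whether you call the idle component "bottlenecked" or just "underutilized" is the semantic argument in this thread; the arithmetic is the same either way.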

  15. #715
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    Quote Originally Posted by trinibwoy View Post
    Sorry but I'm really not getting your point here. You started off saying that GPUs are "bottlenecking CPUs". That's completely false. A cheap CPU is enough to run most games but we need expensive GPU hardware. That's cause games don't need much CPU power but a lot of GPU speed. That's not the fault of GPU manufacturers or GPU hardware, it's just a fact based on how the software is written.

    A bottleneck occurs when a component is unable to do more work because it's held up by another component. That's not the case here. CPUs don't do more work cause the software isn't giving them more work to do.
    So you're saying that the first statement is false, and the second statement is true? A bottleneck happens when a piece of hardware isn't fast enough to keep up with the rest of the computer. Very rarely is the CPU the bottleneck; most of the time the GPU is the bottleneck in a game. I guess what I'm trying to say is I don't see how the first half and the second half of that statement don't go hand in hand. When a GPU bottlenecks a CPU, it means the GPU isn't fast enough to keep up. It also means that you can use lower-power CPUs to run games in most instances.

    You just reinforced my point about GPUs not being able to keep up with CPUs. Thanks.
    Last edited by Mad Pistol; 03-20-2011 at 08:51 AM.
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  16. #716
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Mad Pistol View Post
    So you're saying that the first statement is false, and the second statement is true? A bottleneck happens when a piece of hardware isn't fast enough to keep up with the rest of the computer. Very rarely is the CPU the bottleneck; most of the time the GPU is the bottleneck in a game. I guess what I'm trying to say is I don't see how the first half and the second half of that statement don't go hand in hand. When a GPU bottlenecks a CPU, it means the GPU isn't fast enough to keep up. It also means that you can use lower-power CPUs to run games in most instances.

    You just reinforced my point about GPUs not being able to keep up with CPUs. Thanks.
    I'm not sure what the performance will be on a 590, but from what I recall you can bottleneck a CPU with software audio requiring enough hardware threads and CPU cycles that an overclock may be needed. And most of the newer games coming out today use software audio. I'm not claiming those games' audio bottlenecks the CPU; I'm just throwing out a hypothetical situation where you now want the CPU's full attention to run a dual-GPU solution.

  17. #717
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    Too true. My fiancée's system with an E8400 idles at around 5-7% CPU usage because of that stupid audiodg process. Everything sound-wise is done via software now if you're running Vista or 7. This makes a lot of sound cards useless except for post-processing effects.

    I've never found the extra CPU overhead to be a problem though.
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  18. #718
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Mad Pistol View Post
    So you're saying that the first statement is false, and the second statement is true?
    Yep.

    Remember this all started with you complaining about the size of GPU coolers and their power consumption. I'm simply saying that GPUs have higher power consumption and bigger coolers cause they do far more of the work. Your complaints ignore the realities of the software being run on modern systems.

    If you want to argue that high-end CPUs are "bottlenecked" in games cause they hardly have any work to do during 3D rendering, I can get behind that. I would say CPUs are underutilized by game engines. There is a lot more work CPUs could be doing that has no dependency at all on the GPU - better sound, AI, animation, physics etc. The fact that your six-core CPU is idling is no fault of the graphics card.

  19. #719
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    I'm just going to say that with a single 6950 I saw some nice improvements in a few games with a conservative overclock on my i7 860, which is still a top-notch CPU.

  20. #720
    Xtreme Member
    Join Date
    Jan 2010
    Posts
    162
    I want this
    Internal Watercooling Antec 900 Build Log*

    WC loop: Black ice extreme 240 radiator, XSPC 120 radiator, HK 3.0, D5 pump with Bitspower top, DD fillport, Primochill tubing, distilled + ptnuke

    Pink Floyd is #1

  21. #721
    Xtreme Enthusiast
    Join Date
    Jan 2008
    Posts
    743
    There are more games out now and down the pipeline that require the latest SB CPUs. Battlefield is the perfect example of this: at 19xx res and up you almost double your fps going from a quad 775 to an overclocked 2500K/2600K, and that is with just single player. With multiplayer, the CPU comes more into play. MMOs and RTSes are also known CPU hogs.

  22. #722
    Xtreme Addict
    Join Date
    Mar 2010
    Posts
    1,079
    When I changed my E8500 for my current X3370 I did notice an increase in the frame rate, but I don't think that a much more powerful CPU will be as useful.
    Going from 30 fps to 60 fps is a nice performance increase, but going from 60 to 120 is absolutely useless (as long as you don't use 3D, that is)

  23. #723
    Xtreme Member
    Join Date
    Apr 2005
    Location
    Sweden
    Posts
    324
    Quote Originally Posted by kadozer View Post
    There are more games out now and down the pipeline that require the latest SB CPUs. Battlefield is the perfect example of this: at 19xx res and up you almost double your fps going from a quad 775 to an overclocked 2500K/2600K, and that is with just single player. With multiplayer, the CPU comes more into play. MMOs and RTSes are also known CPU hogs.
    can you please post your source on that, about SB getting double the fps in battlefield compared to a 775 quad? bad company 2 i presume.
    E6600"L630A446"? @3600@1.?v cooled by Tunic Tower sitting on Abit AB9 Quad GT played on ASUS 8800GTX opperated by a lazy slacker!

  24. #724
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    Quote Originally Posted by kadozer View Post
    There are more games out now and down the pipeline that require the latest SB CPUs. Battlefield is the perfect example of this: at 19xx res and up you almost double your fps going from a quad 775 to an overclocked 2500K/2600K, and that is with just single player. With multiplayer, the CPU comes more into play. MMOs and RTSes are also known CPU hogs.
    http://www.guru3d.com/article/core-i...600k-review/21 plus this: http://www.overclock.net/intel-gener...4-955-3-a.html

    Sorry to say, but that statement is false. The framerate is virtually identical; at that level, you're GPU limited. Even with an HD 6990 or GTX 590, you're still going to have some limitations. There's no doubt that an i7 2600K is faster than a Phenom II X4 @ 4GHz or a similar Core 2 Quad, but when you argue that a faster CPU will benefit the gaming experience, in this case I just don't see how that can be possible. I mean, even when the CPU is the limitation at 1024x768, BC2 is still chucking out 90+ FPS on a Phenom II quad core. As soon as you move up one level to 1280x1024 or 1600x900, the CPU is no longer an issue. Would anyone here seriously consider that a bad thing?

    Those tests were done with a GTX 580, too. No way that thing could be considered a bottleneck.
    Last edited by Mad Pistol; 03-21-2011 at 04:05 AM.
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  25. #725
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Mad Pistol View Post
    There's no doubt that an i7 2600K is faster than a Phenom II X4 @ 4GHz or a similar Core 2 Quad, but when you argue that a faster CPU will benefit the gaming experience, in this case I just don't see how that can be possible. I mean, even when the CPU is the limitation at 1024x768, BC2 is still chucking out 90+ FPS on a Phenom II quad core. As soon as you move up one level to 1280x1024 or 1600x900, the CPU is no longer an issue. Would anyone here seriously consider that a bad thing?
    Exactly. A CPU does the same amount of work to render a frame at 1024x768 as it does at 5760x1080. It contributes nothing to the increase in quality and IQ when you raise the resolution. That's why mediocre CPUs work just fine.

