The point is, it's a VRM design flaw/corner-cut in ATi reference cards ONLY; that's what's been said.
So any talk along the lines of 'Games don't crash / my Nvidia card pulls 10000 amps and doesn't crash, WTF' is irrelevant at this point in the thread. Maybe it wasn't irrelevant when he started it, but if you read the thread, it's irrelevant now.
If the card is incapable of limiting itself without a crash under high load, or of simply dealing with high load, AT STOCK CLOCKS AND VOLTAGES, it's a design flaw. And the reason I'm saying it's a design flaw is that OTHER manufacturers who built NON-REFERENCE cards spent the money, for their customers' sake, on power circuitry that DOES work in all situations.
Who cares if your "new test" causes issues with certain cards? Last I checked, it was the performance and stability of GAMES that's important, not "tests".
Core 2 Duo(Conroe) was based on the Intel Core Duo(Yonah) which was based on the Pentium M(Banias) which was based on the Pentium III(Coppermine).
Core 2 Duo is a Pentium III on meth.
Agreed, it's a test, but what if a game needs 82 A from the VRMs? There will be two choices:
- the end user will stare at a black screen and curse ATI
- the game developer will phone ATI, and ATI will tell him to cut something from the graphics, because they don't need end users cursing them.
Anyway, ATI never ceases to amaze me.
PS. How come the rest of ATI's models are fine?
If it ain't broke... fix it until it is.
Because they don't need more than 82 amps on a 3-phase budget VRM, and the OCP is set higher.
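To put that figure in perspective, here's a rough back-of-the-envelope sketch; the ~1.26 V load VDDC is an assumption for illustration, not a value from this thread:

```python
# Rough numbers for the 82 A OCP limit discussed above.
# ASSUMPTION: load VDDC of ~1.26 V; the 3 phases are from the post above.
current_total = 82.0                  # amps through the core VRM at the limit
vddc = 1.26                           # assumed core voltage under load (V)
phases = 3                            # budget 3-phase reference VRM

power_w = current_total * vddc        # core power the VRM must deliver
per_phase_a = current_total / phases  # current carried by each phase

print(f"Core power: ~{power_w:.0f} W")     # ~103 W
print(f"Per phase: ~{per_phase_a:.1f} A")  # ~27.3 A per phase
```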
Facepalm + Sigh....
No, it wouldn't be the same issue as the 3DMark (or any program, for that matter) "optimizations". As long as the output is not touched or changed, all they are doing is artificially limiting your program to prevent damage.
If ATi went through their drivers and modified the way it OUTPUTS, then yes, that's an issue; but limiting the performance without changing the output at all? Sure, having to limit performance in the name of stability is disappointing, but ultimately it's a moot point.
Wrong, there is a third choice: ATi implements a fix in the drivers and limits the performance slightly. Here's my issue with the whole thing: as long as the OUTPUT remains the same, then, performance limited or not, who gives a damn?
It's like the TLB bug: the issue affected 0.001% of people, and the fix caused an overall performance drop of around 25%. Luckily, with GPUs you can fix the way the GPU/card performs in the specific (0.001%) application instead of having to castrate the WHOLE thing.
Sure, it's a disappointment that the performance of that specific (0.001%) application is lowered, but how often do you think it will be detrimental to your enjoyment? Probably never. Again, the number of games/programs that *might* be affected by this is almost none (0.001%).
Again, luckily any "bug fixes" in the drivers would only affect that one specific program and not cause a huge drop across the board. As long as the output is not changed or degraded, who cares?
If the game you love to play is somehow affected by this, and the loss caused by the fix doesn't fit your *performance* requirements, then go buy a card that does. Last I checked, ATi never guaranteed that the performance of all games and programs would meet a certain level, and neither does nVidia, for that matter.
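For what it's worth, the per-application limiting described here could conceptually look like the sketch below. This is purely hypothetical; every name in it is made up for illustration, and none of it comes from ATi's actual drivers:

```python
# Hypothetical sketch of per-application clock limiting in a driver.
# All names and values here are invented for illustration only.
APP_PROFILES = {
    # executable name -> core clock cap in MHz (illustrative values)
    "occt.exe": 700,      # cap only the known VRM-heavy stress test
    "furmark.exe": 700,
}
DEFAULT_CLOCK = 750       # stock core clock, untouched for everything else

def core_clock_for(process_name: str) -> int:
    """Return the core clock to run at; only listed apps get limited."""
    return APP_PROFILES.get(process_name.lower(), DEFAULT_CLOCK)

print(core_clock_for("Crysis.exe"))  # 750 -> games keep full performance
print(core_clock_for("OCCT.exe"))    # 700 -> only the stress test is capped
```

The point being argued above is exactly this shape of fix: rendering output is untouched, and only the one offending application pays a performance cost.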
I'm having problems with my 4850; it is stock and has never been overclocked. Far Cry 2, Brothers in Arms: Hell's Highway and Call of Duty. I upgraded my system thinking it was anything but the video card. So if it were only this test, I wouldn't give a flying flickering #uck, but that's not the case. This test crashed on me even at lower settings. Maybe my card is bad.
I pulled the 3870 from my wife's rig. It ran the test and all three of the games perfectly. I have five computers in my house, they all have ATI video cards, and I'm pretty anti-nVidia, for the record: X800, X1800XT, 3650, 3870 and 4850.
Originally Posted by Movieman
Posted by duploxxx
I am sure JF is relaxed and smiling these days with there intended launch schedule. SNB Xeon servers on the other hand....
Posted by gallag
there yo go bringing intel into a amd thread again lol, if that was someone droping a dig at amd you would be crying like a girl. qft!
The Cardboard Master Crunch with us, the XS WCG team
Intel Core i7 2600k @ 4.5GHz, 16GB DDR3-1600, Radeon 7950 @ 1000/1250, Win 10 Pro x64
When playing @ 2560 x 1600 these games pull higher amps for me:
Crysis
Crysis Warhead
Mass Effect
Empire: Total War
GTAIV
1680 x 1050 = 1764000 pixels
2560 x 1600 = 4096000 pixels (more than double)
More than double the pixels seems to mean higher current draw (at least on my current hardware)....
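A quick sanity check of the pixel arithmetic (nothing assumed, just the two resolutions above):

```python
# Pixel counts for the two resolutions compared above.
low = 1680 * 1050     # 1,764,000 pixels
high = 2560 * 1600    # 4,096,000 pixels

print(low, high)
print(f"ratio: {high / low:.2f}x")   # 2.32x -> more than double the pixels
```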
The highest thus far is Empire: Total War, which draws a *lot* of current when displaying the campaign map at 1600p...
That misrepresents my post.
I have read this thread diligently from start to finish, and I am soooooo far from a fanboy that the concept is ridiculous.
All I was attempting to contribute was the fact that at very high resolutions the current draw through ALL brands of graphics cards' VRMs is increased.
I was just trying to find some games that, when played at this resolution (2560 x 1600), approach 80 amps in current draw...
I just wish I had an ATI card to test with atm, because when gaming with my Nvidia card at 2560 x 1600 the current levels are almost in the zone where OCCT helps some reference 4870/4890s fall over.
Now I know that Nvidia and ATI cards will have different current draws in the same situation due to different architectures and configurations etc., so I would be very interested if someone could test with ATI hardware @ 2560 x 1600 just to see if the current draw gets over 80 amps at this resolution when gaming...
Anyone out there with ATI hardware and a 30" screen?
X5670 B1 @175x24=4.2GHz @1.24v LLC on
Rampage III Extreme Bios 0003
G.skill Eco @1600 (7-7-7-20 1T) @1.4v
EVGA GTX 580 1.5GB
Auzen X-FI Prelude
Seasonic X-650 PSU
Intel X25-E SLC RAID 0
Samsung F3 1TB
Corsair H70 with dual 1600 rpm fan
Corsair 800D
3008WFP A00
Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)
i950 (3035B684)
Gigabyte EX58A-UD3R
3x4GB G.Skill PC3-12800 7-8-7-24
HIS Radeon HD 6970 2GB & Dell 3007WFP-HC
Asus Xonar DX
128GB C300, Velociraptor & Sammy F3's
Corsair AX850W
Windows 7 Ultimate x64 SP1
I would only add my opinion: "I don't play OCCT; neither Furmark nor these stupid tests mean anything to me. If it crashed after 8 hours of playing Crysis, then that would mean something."
It's the same stupidity as Linpack etc.: it's a load that can't be created any normal way, i.e. it's meaningless.
i7 930 D0 - 4,2 GHz + Megashadow
3x4GB Crucial 1600MHz CL8
Foxconn Bloodrage rev. 1.1 - P09
MSI HAWK N760
Crucial M500 240GB SSD
SeaGate ES.2 1TB + 1TB External SeaGate
Corsair HX 850W (its GOLD man!)
ASUS STX + Sennheiser HD 555 (tape mod)
Old-new camera so some new pics will be there.. My Flickr My 500px.com My Tumblr
Here is one piece of the answer, on the very same computer, with a GTX 280:
OCCT vrmA usage: 84.68 A
http://www2.noelshack.com/uploads/occtGTX280033111.jpg
Furmark vrmA usage (the donut is not visible, as the user had a beautiful black box showing): 79.84 A
http://www2.noelshack.com/uploads/furmarkGTX050607.jpg
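Putting the two readings side by side (simple arithmetic; the wattage line assumes a load VDDC of roughly 1.19 V for a GTX 280, which is my assumption, not a value taken from the screenshots):

```python
# Difference between the two vrmA readings quoted above.
occt_a = 84.68
furmark_a = 79.84
print(f"OCCT pulls {100 * (occt_a / furmark_a - 1):.1f}% more current")  # ~6.1%

# Implied core power, ASSUMING a load VDDC of ~1.19 V (not measured here).
vddc = 1.19
print(f"OCCT: ~{occt_a * vddc:.0f} W  Furmark: ~{furmark_a * vddc:.0f} W")
```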
I'm currently asking what Furmark configuration the user used, and especially whether he could retake the screenshot to include it in the pic (so that I don't get accusations like "you faked it", blablabla, even though it would be easy to fake; sigh). But no, I'm not being gentle on Nvidia cards. I asked him to use the same resolution as OCCT so the two can be compared.
I have not dropped that FPS matter yet, mind you.
No. The user told me that my test was biased because it was too gentle on Nvidia chips. As if it were castrated on purpose; as if I had put in a limiter that kicks in when I find the "Nvidia" string somewhere, because I worked for them. As if I were paid by Nvidia to kill AMD cards, something like that.
He didn't say it that "clearly", but that's what transpired from his posts.
I'm just saying that my test is more stressful on ATI cards than Furmark.
And that my test is more stressful on Nvidia cards than Furmark, too.
So he can't accuse me of being an Nvidia spy, or anything like that. Unless Furmark is also paid by Nvidia, of course.
Now, the readings being what they are, they are definitely NOT the same.
Especially since those were made in windowed mode, not fullscreen. That in itself makes a huge difference.
Well, clearly the GTX 295 should perform about the same as my two 4890s (which do not crash @ 900 MHz core)... but it doesn't at all; it's off by nearly 40% performance-wise.
So what's the problem? You're not putting the same performance load on the Nvidia cards.
Is it possible that the Nvidia card simply cannot keep up with these kinds of tasks? I thought ATI cards also killed Nvidias in Furmark?
Originally Posted by motown_steve
Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.
Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.
The load just proved that my test isn't gentle on Nvidia cards.
And yes, I'm starting to think the Nvidia cards cannot keep up at those kinds of tasks, but they are still giving all they've got, judging by their power consumption.
It seems the configuration he used for Furmark was as follows: 1680x1050, extreme burning mode, no AA. I don't know if it was fullscreen or not.
OCCT GPU was 1024x768, windowed mode. The OCCT configuration can be seen on the screen. I asked him to redo it at the same resolution as OCCT for a proper comparison.
The GTX 295 should perform on par with two 4890s, but it doesn't. The only thing that could be responsible for this is the code in the test (and no, I'm not implying it is a deliberate attempt by the author). It could be that the test really isn't stressing Nvidia cards as much as ATi's.
It is not the code; it is the architecture, which is not as effective as ATI's on those particular instructions.
I did contact Nvidia about the matter. Let's give it some time. Aside from the FPS value, the vrmA reading shows that the cards are put under heavy stress, which is what we want.
I remind you that OCCT is *NOT* a benchmark. For that, use 3DMark.
Then you must have a hard time understanding this graph:
Take a look at the 2560x1600 test and see how badly the Nvidia cards do. This Red Alert 3 benchmark is confirmed by several other hardware review sites in addition to Xbitlabs.
Do you think that the game is "really not stressing Nv cards as much as ATi's"?
It is a known fact that when it comes to FP16 texture filtering throughput, a 4870 has over 20% higher fill-rate than a GTX 280, as shown in a 3DMark Vantage test. Do the numbers 1.2 TFLOPS for a 4870 and 933 GFLOPS for a GTX 280 mean anything to you? Perhaps this OCCT benchmark is pushing somewhere close to 1.2 TFLOPS?
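Those headline figures follow directly from each chip's shader configuration; a quick check using the standard published specs:

```python
# Theoretical peak shader throughput for the two cards named above.
# HD 4870: 800 ALUs x 750 MHz core x 2 ops/clock (MADD = multiply-add)
hd4870_gflops = 800 * 0.750 * 2      # = 1200 GFLOPS = 1.2 TFLOPS

# GTX 280: 240 ALUs x 1296 MHz shader clock x 3 ops/clock (MADD + MUL)
gtx280_gflops = 240 * 1.296 * 3      # = 933.12 GFLOPS

print(hd4870_gflops, gtx280_gflops)  # 1200.0 933.12
```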
--two awesome rigs, wildly customized with
5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
--SONY GDM-FW900 24" widescreen CRT, overclocked to:
2560x1600 resolution at 68Hz! (from 2304x1440@80Hz)
Updated List of Video Card GPU Voodoopower Ratings!!!!!
I checked my cards and found one that had a Vitec 9853 regulator on it.
It shut down in fullscreen (1680x1050) and at 1024x768,
but I was able to run 20 instances of OCCT GPU at once.
Now that's a stress test
I have shown that those numbers don't make sense. Nvidia cards should still be performing better than they do. Going by the numbers from a GTX 285, they are not using the MUL to its full potential, one of the things that was supposedly "fixed" since G80. It is not performing at the 1.06 TFLOP (MADD+MUL) level, nor at the 708 GFLOP (MADD-only) level, but somewhere in between. Basically, the MUL is only being exploited ~45% of the time.
Sure, it is loading the cards, but it is not exploiting all the shaders that the G200 has to offer.
People are complaining that they cannot fully utilize their RV770 cards, since some have to be downclocked to be stable in OCCT.
The flip side is that you are also not getting the full advantage of everything the G200 has to offer with the results of OCCT.
When using a G200 you are only receiving ~82% of the peak performance.
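The ~45% MUL figure and the ~82% figure are consistent with each other, as a quick check shows (the GTX 285 peaks are the standard published ones: 240 ALUs at a 1476 MHz shader clock, 3 ops/clock with the MUL, 2 without):

```python
# GTX 285 theoretical peaks quoted above.
madd_mul = 240 * 1.476 * 3       # ~1063 GFLOPS (MADD + MUL)
madd_only = 240 * 1.476 * 2      # ~708 GFLOPS (MADD only)
mul_part = madd_mul - madd_only  # the MUL's contribution

# If the MUL is exploited only ~45% of the time:
effective = madd_only + 0.45 * mul_part
print(f"{effective:.0f} GFLOPS = {100 * effective / madd_mul:.0f}% of peak")
# -> ~868 GFLOPS, i.e. ~82% of the MADD+MUL peak
```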