I'm not in any way trying to be rude here, but until I see some kind of white paper, what that engineer said sounds like complete and utter crap.
I've worked at an electronics manufacturer for the last ten years, and I've taken multiple classes in AC and DC electrical theory and solid-state circuit design. I haven't gotten around to finishing a degree yet, but I'm not far off, trust me.
There is no logical way that a card would "sense" that its fan had been removed and then suddenly start sucking more power. It just doesn't add up to me. There is no mechanism I know of by which cooling a component makes it draw more current; if anything, a cooler chip leaks slightly less and should draw a bit less. Nor can I see any reason they would design the card to self-destruct with aftermarket cooling, which is basically what you have quoted from the AMD engineer.
My guess is that this guy does not quite know what he is talking about. Perhaps he means that when people put aftermarket cooling on and overclock the card, it can still overheat, but even that doesn't match what I've seen.
Remember, I've had two (now three) cards under test for going on two months now, running perfectly cool and without the slightest sign of stress or strain. The reported current draw in GPU-Z never increased after removing the stock cooler on any of the cards.
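
If anyone wants to double-check this kind of claim themselves, here is a minimal sketch of logging the same readings GPU-Z shows. It assumes a Linux box where the amdgpu driver exposes the card's sensors through hwmon (the card0 path and hwmon index vary per machine, so treat the paths as placeholders); GPU-Z itself is Windows-only, and this is just an equivalent way to watch power draw and temperature over time.

    import glob
    import time

    # Assumed path: first hwmon node under card0; adjust for your system.
    hwmon = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

    def read_sensor(name):
        # hwmon attributes are plain integers in fixed units
        with open(f"{hwmon}/{name}") as f:
            return int(f.read().strip())

    while True:
        watts = read_sensor("power1_average") / 1_000_000   # microwatts -> watts
        temp_c = read_sensor("temp1_input") / 1000           # millidegrees -> degrees C
        print(f"{time.strftime('%H:%M:%S')}  {watts:6.1f} W  {temp_c:5.1f} C")
        time.sleep(1)

Log it with the stock cooler, pull the cooler, log it again under the same load, and compare. If the numbers don't move, the "card senses its fan" story has nothing behind it.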



