Hello guys.
Between Western Digital and Seagate, which brand do you think has lower heat generation (or better heat dissipation) for hard drives with similar specs, such as the same number of platters, rotational speed and/or storage capacity?
We had an argument about the thermal efficiency of HDDs on our local tech forum, and most are of the opinion that Western Digital drives tend to stay cooler than Seagate drives.
I have both Seagate and WD HDDs in my rig, and in my experience the Western Digital drives do indeed run cooler than the Seagates. As you can see from the Everest sensor readings, both HDDs were under load, and the Seagate 320GB (2 platters, 7200RPM) runs hotter than the Western Digital 500GB (3 platters, 7200RPM).
What is your opinion on this, guys?
We are also going to try to get readings of the temperatures of various brands of HDDs using external thermal measuring devices.
1. We have an infrared thermometer at our disposal, so is it fine if we take readings with it? Will it be accurate enough to compare the temperatures?
2. Which places on the HDD should we take the readings from? Currently I am of the opinion that the temperature at the bottom is the one that matters, and though we would take a reading from the top too, we would not use it as a measure, since it would not give a correct picture of the thermal efficiency of the HDD. Is this strategy alright? Anything more you want to add here is welcome.
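Besides the infrared readings, it may be worth logging the drives' own on-board sensors for comparison, since that's what Everest reports. Here is a minimal sketch of how you could pull that value from smartctl (part of the smartmontools package) output in Python; the sample text below is an illustrative excerpt, not real measured data, and the exact attribute layout can vary by drive.

```python
import re

def parse_smart_temperature(smartctl_output):
    """Extract the Temperature_Celsius raw value (SMART attribute 194)
    from the text produced by `smartctl -A /dev/sdX`."""
    for line in smartctl_output.splitlines():
        if "Temperature_Celsius" in line:
            # The raw value follows the WHEN_FAILED column ("-");
            # grab the first integer after that dash.
            match = re.search(r"Temperature_Celsius.*?-\s+(\d+)", line)
            if match:
                return int(match.group(1))
    return None  # attribute not reported by this drive

# Illustrative excerpt of `smartctl -A` output (values made up):
sample = """
ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      UPDATED  WHEN_FAILED RAW_VALUE
194 Temperature_Celsius     0x0022   112   099   000    Old_age   Always       -       38
"""
print(parse_smart_temperature(sample))  # prints 38
```

Logging this alongside the infrared readings would let you see how far the external case temperature lags the internal sensor on each brand.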