unclewebb, so what about a Mac OS X version?
Intel Q9650 @ 500x9 MHz / 1.3 V
Asus Maximus II Formula @ Performance Level=7
OCZ OCZ2B1200LV4GK 4x2GB @ 1200 MHz / 5-5-5-15 / 1.8 V
OCZ Vertex 3 SSD 120 GB
Seagate RAID0 2x ST1000DM003
XFX HD7970 3GB @1111MHz
Thermaltake Xaser VI BWS
Seasonic Platinum SS-1000XP
M-Audio Audiophile 192
LG W2486L
Liquid Cooling System :
ThermoChill PA120.3 + Coolgate 4x120
Swiftech Apogee XT, Swiftech MCW-NBMAX Northbridge
Watercool HeatKiller GPU-X3 79X0 Ni-Bl + HeatKiller GPU Backplate 79X0
Laing 12V DDC-1Plus with XSPC Laing DDC Reservoir Top
3x Scythe S-FLEX "F", 4x Scythe Gentle Typhoon "15", Scythe Kaze Master Ace 5.25"
Apple MacBook Pro 17" Early 2011:
CPU: Sandy Bridge Intel Core i7 2720QM
RAM: Crucial 2x4GB DDR3 1333
SSD: Samsung 840 Pro 256 GB SSD
HDD: ADATA Nobility NH13 1TB White
OS: Mac OS X Mavericks
Uncle Webb, maybe you can make a version that supports an 8-core chip in your free time. Thanks...
Apologies if I am asking too much.
Last edited by maxxx; 10-29-2012 at 08:15 AM.
Hi maxxx, I am pretty sure I did make an 8 core version of RealTemp once upon a time, but the guy I wrote it for said it was too big and ugly, so I scrapped it. Unfortunately, I don't have enough time or hardware for an 8 core update now.
Lately I have been working on a special edition of RealTemp for my friends over at Tech|Inferno which is just about done.
It includes the return of VID and VID based power consumption monitoring for Intel's 3rd Generation Core i CPUs.
I also added a column to report the CPU Package temperature but in my testing, that just looks like the maximum of the 4 cores so that is not too useful.
There is also a new C-State reporting window. I like using C3/C6 to save some power but most enthusiasts at XS turn that junk off when overclocking.
WaterFlex: I don't know of any easy way to read CPU registers in OS X. In Windows I am using the WinRing0 library but I don't know of anything equivalent for Mac. I also don't own a Mac or any sort of development tools so you're out of luck unless you are running Boot Camp.
Hey, why do I have such a big difference between cores?
ASUS P5K-E // E8400 Q746A519
G.Skill F2-8000CL5D-4GBPQ
LC 550W GP// XPERTVISION 9600GT
Maybe you didn't install your CPU cooler well, or maybe you didn't apply the TIM properly?
The difference is around 5-7°C from the coolest core 0 to the warmest core 3... using an i5 2500K / ASRock Z68 E4G3 etc.
The difference is ok.
Last edited by Vladutz20; 11-04-2012 at 06:03 AM.
RealTemp seems to have a problem starting on Windows 8. It will run and work fine when you click on it, but it will not start with Windows 8. I tried putting a shortcut in the start-up folder, a reg entry in the current user Run key, and even Task Scheduler with elevated privileges; it just won't run at start-up. I even tried another PC with Windows 8, same thing. Any ideas Uncle Webb? Thanks.
edit: OK, got it to start with Task Scheduler. It only worked when I selected the "run when user is logged on" option; it won't work with "run whether user is logged on or not".
It seems Windows 8 is a bit more difficult to work with for start-up programs:
http://stackoverflow.com/questions/1...n-requirements
Last edited by aamar; 11-09-2012 at 08:00 AM.
If you run it before a user logs on, then you won't be able to see the system tray icon or open the window when you log on. This is part of how Win7 works, so I assume Win8 is the same.
Main: i7-930 @ 2.8GHz HT on; 1x GIGABYTE GTX 660 Ti OC 100% GPUGrid
2nd: i7-920 @ 2.66GHz HT off; 1x EVGA GTX 650 Ti SSC 100% GPUGrid
3rd: i7-3770k @ 3.6GHz HT on, 3 threads GPUGrid CPU; 2x GIGABYTE GTX 660 Ti OC 100% GPUGrid
Part-time: FX-4100 @ 3.6GHz, 2 threads GPUGrid CPU; 1x EVGA GTX 650 100% GPUGrid
Starting RealTemp with Windows 7.
http://forum.notebookreview.com/hard...ml#post6865107
http://www.xtremesystems.org/forums/...essors/page146 Post #3645
I think that works in Windows 8 but I haven't done any recent testing.
Maybe this weekend I will get around to finishing off RealTemp T|I.
http://img707.imageshack.us/img707/2...empaisuite.png
Last edited by unclewebb; 11-12-2012 at 10:20 PM.
Unclewebb: may we know the difference from the regular version of RealTemp? TIA...
Here's the README file for the new T|I version.
http://www.mediafire.com/?ouo9lzeuu9r0tkq
The program should be ready for download within the next day. I will post a link when it is up.
I am just beta testing a few minor features and updates, mostly for the newer Sandy and Ivy Bridge CPUs.
RealTemp Tech|Inferno Edition
http://www.techinferno.com/downloads/?did=53
My HeatWare: http://www.heatware.com/eval.php?id=70151
This looks great - thanks unclewebb! (unfortunately I have no 'thanks' option in the forum).
The power limits feature is working nicely on my ASRock mobo.
Going to swap to this version to get some good baseline temps before I de-lid my 3770k.
Intel 6700K @ 4.3 GHz
Gigabyte GA-Z170-HD3 DDR3
4x4GB Corsair DDR3-2800
GTX980Ti MSI Lightning
Samsung 950 Pro 512GB
4 x Intel 80GB x25-m SSD's RAID0
Enermax Revolution 85+ 1250W
Unclewebb, I notice there's a difference in temps between RT v3.70 and RT-TI (v3.75), roughly 5°C less per core for RT 3.70. I had also noticed the same difference between RT and other hardware monitoring apps (Aida64, HWiNFO64, etc.), but RT-TI now seems in line with those apps. I did not find any explanation for this in the readme. The CPU used is an i5-3570K.
aamar: Open up the RealTemp Options-Settings window and make sure both versions of RealTemp are using the same value for TJ Max. If both programs are not using the same value, you need to click on the Defaults button if you want RealTemp to read the correct value for your CPU.
The wrong value for TJ Max can get set in the INI file if you manually set this in RealTemp 3.70 or you might have swapped from a 2nd Generation Sandy Bridge CPU to a 3rd Generation Ivy Bridge CPU without updating RealTemp. When this happens, RealTemp will continue to use the Sandy Bridge TJ Max value found in the INI file when it should be using the 105C value that is correct for Ivy Bridge CPUs.
Adjustable TJ Max was a good idea at the time during the Core 2 era but it's not really necessary anymore so I have disabled this feature in the T|I Edition of RealTemp.
If TJ Max is set equally, both versions of RealTemp should be reporting the same temperatures. When lightly loaded and using some of the C sleep states like C3/C6, the core temperatures can fluctuate rapidly so put a load on your CPU for a fair comparison. Prime 95 Small FFTs is still my fav for equally loading the cores.
Thank you Unclewebb. That was it. TJ Max in RT v3.70 was set at 99 (for the previous CPU I used, an i5-760). RT-TI is at 105°C, which is correct for the 3570K. All is well, thanks.
Just wondering if someone can give me some guidance with my calibration? When I adjust the idle calibration, the TJ Max changes also. Am I missing something?
The core variance seems like a lot to me:
Last edited by one80; 04-28-2013 at 04:34 AM.
Asus Rampage IV Gene | Intel i7 3960X @ 4.8 | G.Skill RipjawsZ 16GB 2133 | EVGA GTX 680 Classified | Thermalright Venomous-X & Shaman | Seasonic X560 | Samsung 840 Pro | WD 4TB Reds
Anyone?
any updated version?
What version of RealTemp are you trying to use and where did you download it from?
Try one of these versions and let me know if either of them work.
RealTemp 3.70
http://www.techpowerup.com/downloads...eal-temp-3-70/
RealTemp T|I Edition
http://www.overclock.net/t/1330144/realtemp-t-i-edition
In theory, either of these versions should work with the new Haswell processors.
I plan to release an updated version once I get some feedback from someone that owns a Haswell CPU.