Max and min are simply the highest and lowest recorded temps during the time RealTemp is running. I think the 71C Intel limit is Tcase max, which is quite different from the actual core temperature.
loonym: You could try running CPU-Z and in the cpuz.ini file set
SENSOR=OFF
to see if CPU-Z reports the same VID as RealTemp.
During testing, RealTemp and CPU-Z always reported the same thing for VID. To be honest, the Intel docs I read were not clear about the correct way to read VID. I assumed that CPU-Z was getting it right and tried to follow along. You might also want to try CoreTemp, which I think recently changed its VID calculation for the 45nm chips.
I don't know if these CPUs have a VID number stored in the chip that is totally independent of the motherboard they're installed on. If anyone finds the correct way to read VID information in the Intel documentation, post it here and I will compare that to what I'm doing to see if I screwed up.
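If anyone wants to poke at this themselves, here's a rough sketch of the general approach most monitoring tools take: read the IA32_PERF_STATUS MSR (0x198) through the WinRing0 library and look at the raw bits. The helper names come straight from WinRing0, but the field comments and any VID-to-volts conversion are family specific, so treat the masking below as an illustration to compare against CPU-Z rather than a confirmed decode.

    #include <windows.h>
    #include <stdio.h>
    #include "OlsApi.h"   // WinRing0: InitializeOls, DeinitializeOls, Rdmsr

    // Dump the raw IA32_PERF_STATUS value so the VID/FID fields can be
    // compared against what CPU-Z and RealTemp report. No attempt is made
    // to convert the VID code to volts since that table is family specific.
    int main()
    {
        if (!InitializeOls())
        {
            printf("WinRing0 driver failed to load\n");
            return 1;
        }

        DWORD eax = 0, edx = 0;
        if (Rdmsr(0x198, &eax, &edx))                // IA32_PERF_STATUS
        {
            DWORD vid = eax & 0xFF;                  // low byte: current VID code on Core 2 (assumed)
            DWORD fid = (eax >> 8) & 0xFF;           // next byte: current bus ratio field (assumed)
            printf("PERF_STATUS = %08lX:%08lX  VID code = %lu  FID code = %lu\n",
                   edx, eax, vid, fid);
        }

        DeinitializeOls();
        return 0;
    }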
I know on my wife's Dell laptop, T7200, that RealTemp and CPU-Z report two different voltages for VID depending on whether it is in CxE power saving mode or not.
CERO: My choice of TjMax=95C for a Q6600 - G0 was based on testing of my Q6600 with an IR thermometer.
The previous assumption by every other program that TjMax=100C seems to be based on the fact that that's what Intel TAT uses, even though TAT was specifically designed for testing mobile Core processors and was never updated or designed for use with desktop Core processors.
If you trust TAT more than you trust my testing then it is easy enough to go into RealTemp and set TjMax=100C.
Edit: Here's some info from my original test:
http://www.xtremesystems.org/forums/...&postcount=568
It was a brand new processor that I was unfamiliar with so I didn't do a thorough test but hopefully I will have some time later this week to re-test my Q6600. After using this processor for a couple of months, the temperature sensors on core0 and core1 seem to be the most reliable and give consistent / balanced results.
@ loonym
@ Uncle
Here's a short description of a situation with different VID values within the SAME SYSTEM, same components, even the same run (boot).
Only a software-side interaction (Asus AI Suite, AI Gear 3 with the EPU driver) is causing this 0.1000 V delta in the VID value. Maximus Extreme / QX9650.
Just check both screens.
(Sorry, these notes and screens are about 1.5 months old, so RealTemp was not being used to monitor VID.)
http://img218.imageshack.us/img218/7...end3pf9.th.png
http://img169.imageshack.us/img169/9...wer3ku8.th.png
--
Clean XP setup / AI Gear 3 installed / fresh drivers from the Asus site.
More explanation coming; I just need to review my other saved bench screens and recall my actions from that period.
--------------
Now, I have already uninstalled all that Asus AI "crap" from the machine, so I currently can't replay the situation.
Anyway, those "green" features caused more trouble than they were worth, so I went back to my manual 24/7 settings @ 3600 with C1E / SpeedStep enabled.
I didn't mean to prove or complain about anything whatsoever, just to describe the situation I saw at the time.
-----------
loonym: I just remembered something. When VID got moved off of the main page of RealTemp, I also changed it so its value is no longer updated in real time. My thinking was that I didn't want users inputting new settings and having things get screwed up by constant updates to the Settings page, so the reported VID doesn't change. It's now read each time you open the Settings page, and if you were in C1E mode, the act of opening the page might or might not switch it to the higher value. The design of the motherboard could also affect what shows up.
It's possible that your chip has two VID settings built into it like the mobile chips. My E8400 only seems to have one VID but that might be because of the older Asus P5B I have it on.
You could try running v2.5 which is still available from the TechPowerUp site to see if that's the problem. I had planned to move VID back to the main page so it's updated along with temps on a regular basis. Thanks for reminding me of this bug / limitation and giving me some motivation to get VID back on the main page.
Thanks for the suggestions, I'll give it a go. Not really a huge deal anyway, just one of those things that makes you go hmmmm....
I really don't think it has anything to do with RealTemp or any other app. After trying the same CPU in a couple of different boards I'm quite sure it's the Gigabyte BIOS. Maybe I'll flash a different version and have a look at that.
Where are you Grandpa?
I need a beta tester that has problems with RealTemp reporting the wrong MHz when the CPU MHz option is enabled in RealTemp. That option is supposed to make RealTemp compatible with SetFSB but it has a problem on some motherboards where the MHz that RealTemp reports is too high.
I put some new code to calculate CPU Frequency into a separate utility for testing purposes. My board always reports the MHz the same as CPU-Z so I have a hard time seeing if I've made any progress.
http://www.fileden.com/files/2008/3/...07/MHzCalc.zip
MHzCalc should show your maximum MHz based on your present FSB times your maximum multiplier. When using the highest multi, if this app reports the CPU MHz higher than CPU-Z then do me a favor and post a screen shot of both programs so I can have a look. Thanks.
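For anyone wondering how a Windows program can measure MHz at all, the basic trick is to count time stamp counter (TSC) ticks over a short interval measured with the high resolution performance counter. On these Core chips the TSC runs at FSB times the maximum multiplier no matter what the current multiplier is, which is why MHzCalc shows your maximum MHz. The sketch below is just that bare technique, not the MHzCalc code, and it has none of the corrections for the odd board timers discussed later in the thread.

    #include <windows.h>
    #include <intrin.h>
    #include <stdio.h>

    // Count TSC ticks across an interval timed by QueryPerformanceCounter.
    // On Core 2 the TSC ticks at FSB x maximum multiplier, so the result is
    // the "maximum MHz" even when C1E / SpeedStep has dropped the current multi.
    double EstimateMaxMHz(double intervalSeconds)
    {
        LARGE_INTEGER freq, start, now;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&start);
        unsigned __int64 tscStart = __rdtsc();

        LONGLONG target = (LONGLONG)(intervalSeconds * freq.QuadPart);
        do {
            QueryPerformanceCounter(&now);
        } while (now.QuadPart - start.QuadPart < target);

        unsigned __int64 tscEnd = __rdtsc();
        double elapsed = (double)(now.QuadPart - start.QuadPart) / (double)freq.QuadPart;
        return (double)(tscEnd - tscStart) / elapsed / 1.0e6;
    }

    int main()
    {
        printf("Estimated maximum CPU speed: %.2f MHz\n", EstimateMaxMHz(0.25));
        return 0;
    }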
http://images30.fotosik.pl/224/62e66621d7f7d0f1.jpg
What's the real temp? ;) E2140 @ WC
As long as I check the CPU MHz option box in RealTemp it keeps up fine on the few setups I've tried it on. Neat little applet, that MHzCalc, though, unc.
Edit: MHzCalc shows, in real time, the large swing the board makes when using SetFSB. I like it.
Arni: TjMax=100C is a myth for the L2 processors and all desktop processors that I know of. Intel TAT assumes TjMax=100C because it is a mobile / laptop testing tool. It was never designed for desktop processors but most programmers accept anything that TAT says as true because it has that nice little Intel logo on it.
Read the RealTemp docs and try doing the calibration test and post your results.
http://www.techpowerup.com/realtemp/docs.php
loonym: I would need to add support for the multipliers to make it a useful stand alone app. I'm just hoping that it calculates the MHz properly on all boards when using SetFSB. You have to write a lot of ugly code and formulas to try and get some accurate timing out of a Windows based computer when using SetFSB. Other than CPU-Z, there isn't any choice that I know of for accurate MHz when overclocking from within Windows. If I get some positive feedback from Grandpa or another user that has a board with this problem then I will add this new MHz code to RealTemp.
If you are using RealTemp you shouldn't touch the default TjMax setting for your L2 processor.
85C is correct, 100C is wrong.
OK, thanks. No calibration needed either?
Many L2 processors need a 1.0 or 2.0 calibration adjustment but the sensors on every processor are different. It's best to read the documentation about Calibration. Just ask if you are not sure about anything.
Uncie...throw up a new beta, go on...you know you want to!
I have it set to 460 MHz.
OK, I set the cores to 3.0 and the idle temps are now 33° / 32°, which looks real (room at 20-23°). Nice software, really! One suggestion for the next version: please give an option to minimize to the tray and to "start with Windows". :D
See ya, and thanks again.
Grandpa: Looks like I'm making things worse! I might have to concede 100% compatible with SetFSB to CPU-Z but I really hate to give up.
Arni: Here's the easy way in XP to make RealTemp start with Windows.
http://img74.imageshack.us/img74/620...windowstz7.png
Click on the Start button, go to All Programs and drag RealTemp into the Startup folder.
I don't like any program that tries to add stuff into my Startup area so I like doing it manually. RealTemp already has a Start Minimized feature in the Settings window.
One user with Vista x64 SP1 says version 2.60 doesn't minimize to the System Tray anymore like version 2.5 did. I think he just had the Task Bar option selected. If there are any bugs like this then let me know.
Richard: You know me too well! I'd love to release a beta, like yesterday. I'm debating whether to scrap SetFSB compatibility. Most users run CPU-Z anyhow when posting screen shots so this would simplify things and help keep RealTemp nice and lean so it doesn't stress Vista too much!
Is there any way to make RealTemp show the highest CPU temp (on a quad core) when minimized to the tray?
@unclewebb: I know about that autostart option ;) but it would be nice to have it in the program itself (one checkbox and no more drag and drop).
About the tray: yeah, I didn't have the option to show the temps in the tray marked, so "Tray bar" was auto-disabled... now it's great :)
If you have all 4 cores being shown in the System Tray then you can go into the System Tray menu and select Maximum and it will show you the highest temps for each core.
Would you like to see the highest temp of all 4 cores in one system tray icon instead? I would need to add another option for that.
I kind of thought so. Thanks for the feedback. There haven't been too many complaints with the latest release but I try to track down and fix anything that is brought to my attention, either here or by direct e-mails. The timer on Grandpa's motherboard and many other boards is different than the one on my motherboard. It's simple to get accurate MHz out of my motherboard no matter what code I write but on other motherboards you have to write additional code to correct for errors in timing. CPU-Z is about the only program I know that gets this right on all motherboards. RealTemp still needs work.
Quote:
Version 2.60 does start minimized with the proper options set in both Vista x32 and x64.
I sent you a PM Grandpa so you can try and provide me with some additional info. Thanks.
Yes, it would be nice to see the highest temp of all 4 cores in the system tray, so we have a reliable reading of the processor temp.
Or is there any way I can make it show that with version 2.6? So far I only see an option to show a specific core's temp.
In 2.60 you can show the highest temp of each individual core in the System Tray but there is no way to show the temp of only the one highest core. I'll try to add a new feature for that.
Anyone that has problems with the CPU MHz option NOT showing the correct MHz can try running this testing app and post their results alongside what CPU-Z is saying.
http://www.fileden.com/files/2008/3/...z%20Tester.zip
I need some more data to try and get things sorted out.
http://img149.imageshack.us/img149/8...testingml1.png
My board is too consistent so it makes troubleshooting impossible.
I ran the RT MHz tester; does it support multipliers other than the default multi? I have a Q6600 with an 8x multi and C1E/SpeedStep enabled. It got the FSB of 450 correct, but it used a 9x instead of the 8x that I use. This is on a DFI P35-T2R. I can post a screenshot later; I'm using RDC to connect to my PC to test it.
Why the differences?
Attachment 79806
This is on the Gigabyte system.
WoZZeR999: Don't worry about the multis. They get taken care of separately. This testing program will provide me with some data that I need to help understand this problem better. It's making more sense already. Post a screen shot when you get a chance and you can include CPU-Z as well or just tell me the multis and processor you're using.
msgclb: The 14.318180 MHz number in the lower right is provided by a frequency generator chip. You can see that my board works completely differently. I've tried a couple of Dell laptops and they seem to use 3.579545 MHz which is exactly one quarter the speed of what your board uses.
http://img393.imageshack.us/img393/6...ester01ma0.png
If you use SetFSB you could try dropping your FSB to 300 MHz and post a screen shot of that. No need to overclock more. Just overclock a little less.
Quote (msgclb): I'd like to know what motherboard Grandpa is using!
ASUS Maximus Formula, 1201 BIOS
Q6700
4 x 2GB (8GB) Mushkin PC2-8500 @ 1105 MHz 5-5-5-12-2T-50
4 x Seagate 7200.11 500GB (2TB), 32MB cache, RAID 0
2 x MSI 3870X2
1300W CoolMax
Thermaltake Kandalf
Liquid cooled
Grandpa: A screen shot like above is just what I need so I have a chance of getting this figured out. There's nothing wrong with your board. It just works differently than the one I'm using.
I actually didn't have SetFSB installed on this system but I do now.
Attachment 79834
Grandpa, thanks.
unclewebb, is this a motherboard, chipset, or manufacturer thing, or do we not know?
Maybe it's the clock generator!
Here you go. I have changed my settings; I was playing Crysis and I need a little more horsepower when I'm playing it, so I crank her up a little so I can play on High settings. :D
You might want to check your e-mail I sent you a screen shot earlier.
Hope this helps. I use a Q6600 with an 8x multi and C1E enabled, so CPU-Z shows 6x since the screenshots are taken at idle. I included the standard 450, and used SetFSB to drop it down to 445 and 440. DFI P35-T2R.
Yahoo Mail was trying to help me out by throwing your e-mail into the spam folder but I found it. Thanks Grandpa. Your results make the problem pretty obvious, to me at least. Now I just need to find a solution that is compatible with the boards that RT is working properly on.
The column of numbers at the far right above the Go button represents the amount of time that the program asks for to count MHz. On my board and on msgclb's board, the actual time is almost exactly the same as the requested time. On your board, it is always about 0.014 to 0.015 seconds more so RT ends up counting more MHz than your computer is actually running at. Hope that makes sense.
The problem is that when overclocking with SetFSB, you get almost the same effect. I just need to come up with some formulas to figure out when the clock generator chip is being generous, so I can correct for that, versus when the user has his hand on SetFSB. I think with this new info, it's doable!
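To illustrate the kind of correction I'm talking about (not the actual RealTemp code): if a board's timer always comes back about 0.015 seconds late, the tick count covers more time than was asked for and the raw MHz comes out too high, so scaling by requested over actual elapsed time brings it back down. The hard part is deciding when to apply it, since a real FSB change from SetFSB looks similar.

    // Hypothetical helper: rescale a raw MHz reading when the measurement
    // interval ran longer than requested. Telling a consistently generous
    // timer apart from a genuine SetFSB change needs the extra logic
    // described above.
    double CorrectMHz(double rawMHz, double requestedSec, double actualSec)
    {
        if (requestedSec <= 0.0 || actualSec <= 0.0)
            return rawMHz;
        return rawMHz * (requestedSec / actualSec);
    }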
WoZZeR999: Perfect! I'm knee deep in data now so there's no reason why I shouldn't be able to come up with a solution. Your results show what I already sort of knew. The inconsistencies in timing are consistent. The timing numbers when not over or underclocked are almost exactly the same as what happens on Grandpa's board. My goal isn't to make RT as good as CPU-Z. I won't be happy until I can make it better! ;) CPU-Z floats around more than it should after the decimal point.
@unclewebb - great job with 2.60! You planning to add the following:
-ability to select where on screen the gamer mode projects the temps and allow custom coloring?
-ability to monitor additional temps such as NB/chipset, PWM, CPU, HDDs, etc. (have a look at what HWMonitor can do).
Really unneeded but would be cool: allow for conditional coloring of gamer mode and/or tray mode temps such that the user can define three different colors and temp ranges. For example:
20-45
46-56
57-67
Again, the key would be that users can define both the ranges and colors.
Keep up the great work!
It's weird. Yesterday, when I gave you the report of it getting the FSB correct, all of the numbers on the left were within 0.1 MHz of each other.
When all the numbers in the left column are very close together, then RT will probably be correct. When the numbers in the left column gradually decrease towards the correct MHz, then I will need to check for that and do a correction. The clock generator chip might be good one time and not so good the next time. It's up to software to figure out what it is saying and to interpret it accordingly. No worries now. The next version of RT should work correctly for all motherboards.
On my board when the numbers line up in the left column, I find that when I use SetFSB, the number at the bottom of the second column stays the same. On your board when using SetFSB, the number at the bottom of the second column was actually the correct MHz based on the maximum multi for your chip. It's easy enough to read the multis and come up with the correct total MHz from this.
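In other words, once the measured maximum MHz can be trusted, the number to display is just a couple of divisions and multiplications. A hypothetical helper (names are mine, not RealTemp's):

    // fsb = measured maximum MHz / maximum multiplier,
    // displayed speed = fsb x current multiplier.
    double DisplayMHz(double measuredMaxMHz, double maxMulti, double currentMulti)
    {
        double fsb = measuredMaxMHz / maxMulti;   // e.g. 3600 / 9 = 400 MHz
        return fsb * currentMulti;                // e.g. 400 x 6 = 2400 MHz at idle
    }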
http://img65.imageshack.us/img65/9465/setfsbdownqo8.png
I went from FSB 333 MHz to 300 MHz which screws up the timings in the secs column. There's a reason why only CPU-Z gets this right. It gets confusing when using SetFSB. Hopefully RealTemp will become the second program I know of to get the MHz right.
Anyone else have problems getting 2.60 to remember the settings specified? For example, I customized the core temp colors/font and enabled gamer mode and saved the settings. After closing/reloading the app, none of the settings were saved. This a bug?
Thanks graysky, I'll have a look at that. I know Gamer mode doesn't get saved but that was by design. It's not quite ready for prime time and wasn't designed for 24/7 use. I would like to do Gamer mode properly by feeding RivaTuner the info if I can get that figured out. The rest of the options are supposed to get saved.
Edit: If you click on the Save button in the Settings window it should work. If you click on the Use button it will only use your present settings but won't save them. The close gadget at the top right will dump any changes you make. That is the way Windows apps are supposed to work. The close gadget is equivalent to hitting the Cancel button if there is a Cancel button available.
I'll have a look at a few more combos to see if I can repeat your problem. Your custom coloring idea is interesting.
@uw - cool. BTW, gamer mode flickers when playing games. Here is a quick video of what I'm talking about with GTA:SA. click here.
That's the problem with Gamer mode at the moment. I did a half ass job on this feature and it shows. It only took about a dozen lines of code so I didn't invest too heavily into development. For me, it was usable enough to see some temps during some games which is better than nothing.
If I do this properly with RivaTuner, there should be zero flicker since it will update on screen at the exact same rate as whatever game you're playing. I think that would be a worthwhile feature. As always, RealTemp is a work in progress.
@unclewebb - cool man, just wanted to let you know about it... wasn't meant as criticism. Also, looks like coolest just raised the bar with the public beta of CTGrapher.
RealTemp on my Asus Striker II Extreme gives very strange readings...
40C on Core1 and 30C on Core2 ? @ E8500
How can it be possible?
Don't blame RealTemp for the sensors that Intel has chosen to use. These sensors were never designed for accurate reporting of idle temperatures and the 45nm sensors are even worse at this than the previous generation. You're lucky you don't have a Quad core because some of them look more like random number generators.
Read the RealTemp docs about Calibration and you might learn a little more about the problems and how RealTemp can be used to get some more realistic temperatures out of your sensors. No other software will let you get even close compared to what RealTemp can do for your processor so don't blow it off yet.
I'd start by running Prime small FFTs on both cores and see if the temps get a little better balanced. You either have bad sensors or possibly an IHS to core alignment / contact issue.
graysky: Competition has been good for RealTemp and CoreTemp. I've previously thought about adding graphing to RealTemp but so far it hasn't been too high of a priority.
Unclewebb,
Instead of re-inventing the wheel on graphing and logging, why not leave the graph/chart/logging stuff to the built-in Windows Performance Monitor? You would just need to provide the counters to perfmon. I just love a program that is lean and "mean".
Think of the possibility of graphing CPU load together with CPU temp in the same chart, or other system components as well.
Another suggestion: make it dual-mode so it can run as a normal program and also as a "SERVICE"; that would be truly awesome. ;)
Thanks for the suggestions bing. I'm not familiar with the capabilities of Windows Performance Monitor but now I've got something new to check out.
I'm a lean and mean fan as well and that's my top priority at the moment. I downloaded some free tools from Sysinternals and have already found some room for significant improvements. I'm taking it easy this weekend. A much more efficient RealTemp should be ready by early next week for some testing.
Great new features in RT... ;)
uncle, concerning bing's suggestion regarding Windows Performance Monitor:
What about creating a plugin for RivaTuner? I guess most people here are using RivaTuner for tweaking and monitoring their graphics cards anyway. I use some standard plugins and Everest to display CPU and memory utilization, all the temperatures available via motherboard sensors, and also fan speeds. It's also possible to log this information to a file and display the graphs later, which is very handy during an overnight run of stability tools. Thus RivaTuner has become my favorite tool for my monitoring tasks!
I have not yet looked into plugin creation myself but from what I have read it should not be too complicated. Maybe it's easier than integration into Windows Performance Monitor.
Could RealTemp have a hotkey option to save an image?
where can i find RT MHz tester?
Here ;)
After reading the documentation, I must say that I'm impressed with the flexibility RealTemp gives the user to set options.
It's surely comparable to CoreTemp, which I used before.
I'm still calibrating, though, to see the "real value" from RealTemp on my mobo and proc.
Don't forget to add a feature that automatically shows the highest temperature of all the cores in the system tray, so it reads all the core temps and shows whichever one is highest :). Actually, it's the only thing keeping me from completely switching to RealTemp; I'm using both CoreTemp and RealTemp right now to see which one shows the "real" temp in various configurations.
fgw: RivaTuner compatibility is my 'A' plan. It looks like this is the easiest way to properly get temps on screen during gaming and I'm assuming that I will also be able to use RivaTuner's graphing capabilities if I go this route.
As soon as you have a plugin to feed RealTemp's data into RivaTuner, you can certainly use RivaTuner's graphing capabilities! E.g., I'm using the Everest plugin and am thus able to display all the values found under Everest's system sensors in RivaTuner's hardware monitoring graphs.
@uncle: Is there a way to get RealTemp to start during Windows startup?
You only have to go back two pages to learn how to drag RealTemp into your start up folder.
http://www.xtremesystems.org/forums/...postcount=1318
That saves me from writing code that accesses your registry. I hate utilities that do that.
Thanks fgw for the info.
Unclewebb, when you test the CPUs with your IR thermometer, do you put the CPU into the motherboard and turn it on (without a heatsink) and point the thermometer at the center of the CPU case, or do you physically cut the CPU case off to expose the actual cores and read the temps from them?
martymonster: I have not yet cut an IHS / heat spreader off.
All temperature measurements have been made with CPU-Z reporting a core voltage of 1.08 volts while usually running at ~1600 MHz. The IR thermometer is pointed at the heat spreader which is covered by a thin piece of masking tape to reduce shine. This seems to improve accuracy and repeatability and after sitting for 8 hours with the power off, there is no difference between measurements of the CPU and the room temperature.
I'm also able to move the thermometer around in search of the maximum temperature so my readings are probably not coming from the geometric center where Intel recommends Tcase readings should be taken.
There are some users that still believe that there is a large temperature difference between where I am measuring and the temperature of the core. I've reduced this difference to a minimum by doing my testing with the computer as close to idle as possible. When a CPU is working, gradients constantly develop from one part of a core to another depending on what type of instructions are being executed but at idle, Intel's own testing has shown that the difference is only about 0.5C which is within the accuracy of my Fluke 62 Infrared thermometer.
No program, including RealTemp, is perfect but I still believe that when calibrated, RealTemp is getting closer to the real core temperature across the entire operating range than any other program is providing.
Here's a test of my E8400:
http://www.xtremesystems.org/forums/...&postcount=573
Thanks for that info.
I thought that's what you did.
On another note:
I notice that if you adjust the idle compensation temps, then Temp + Distance to TjMax no longer equals TjMax.
E.g.:
With default settings the current temp is, say, 27 and TjMax is 95, so Distance to TjMax is 68; all is OK.
BUT, if you adjust the idle calibration temps, you get a current temp of, say, 27 and a Distance to TjMax of 74; add them together and you get 101, not the TjMax of 95.
I have also noticed this in many posts in this forum, including your own.
It would be nice if the Distance to TjMax took the idle calibration adjustment into account :up:
It works well, but the core speed is wrong. :(
http://xtremesystems.3d-nexus.de/RealTemp_001.png
deepsilver: I'll admit that the core speed is a work in progress but in your screen shot it looks like you have C1E / SpeedStep enabled. If this feature is enabled in your bios then at idle the multiplier drops from x8.0 to x6.0 so your cpu speed will be reported at 400x6.0~2400MHz. Run Prime95 and put a load on your computer and see if things change. Did you set your bus speed to 400 MHz in the bios or did you use SetFSB? I promise the next version of RealTemp will be better. CPU-Z is still the king of MHz.
martymonster: As ChrisZ said, the Distance to TjMax is the raw data coming from the on chip DTS sensors and that needs to be reported as is. RealTemp takes the DTS data and simply interprets it differently than other programs when converting that number to an absolute core temperature. When using Calibration factors, if you work backwards and add the two values together then it will look like RealTemp is using a variable TjMax.
You can use this knowledge to figure out how much correction RealTemp is providing to your reported temps. If TjMax on the Settings page is fixed at 95C and Core Temperature + Distance to TjMax adds up to 100C then it is easy to calculate that the Calibration RealTemp is using is boosting your reported temps by 5C. If properly calibrated, this will hopefully better reflect your real core temperature.
I think RealTemp reads the MHz it detects when you start it. So if he started it at idle then it will read the "low multi MHz"... because I've had it show 3.2 GHz and 2.4 GHz on my E4400 @ 3.2 GHz with EIST/C1E enabled through CrystalCPUID.
Anyway... I also have an L2 stepping E4400 and the temps shown by Everest and CoreTemp are ~14° above what RealTemp shows. Well, I went through the 55 pages of this thread and read about the TjMax issue with those old C2Ds, but it still doesn't seem to have been finally concluded.
With Prime95 running, Everest shows 70-74° on each core! Room temperature at the time was ~25-28°. RealTemp shows about 59-60°, which is a decent value for 3.2 GHz @ ~1.344 VCore. Cooled with Xigmatek's HDT-S1283. Idle temps are around 32-35° with calibration set to 2.0 for each core.
But if it was really 70-75° that's kind of too hot in my opinion, so it would be great to finally get an answer to the question of which TjMax it actually has!
For the L2 processors the competition uses TjMax=100C while RealTemp uses TjMax=85C so when uncorrected, there will be a difference of 15C.
kaltblut: Have you ever tried running my test where you drop your processor down to about 6x266MHz or 6x200MHz and drop the core voltage to about 1.10 volts if possible. Show us your idle temps compared to your room temperature with zero calibration and I think it will be pretty obvious what your real TjMax is. Most L2 processors when using TjMax=85C will report an idle temperature during this test a couple of degrees below the room temperature.
This is impossible so CoreTemp and others decided a good fix would be to boost TjMax up to 100C. That covered up the inaccurate sensor issues at idle but now these programs report load temps that are 15C higher than the actual temperature. Read the docs to learn more about this issue:
http://www.techpowerup.com/realtemp/docs.php
I know about that issue, and also about the low-MHz test to see what the idle temp says... but it could still be a CPU that runs very hot, or the IHS is not well connected to the die, or the cooler doesn't fit the CPU surface well enough.
It would just be cool to know the definite answer to the question ;) Like measuring it with an infrared thermometer.
I wrote RealTemp so users could get some reasonably accurate temps without having to invest in an IR gun. Most of the processors that are supposedly "running hot", are being reported by temperature software that is using the wrong TjMax. I was hoping you could run my test and post your results. Someone with an L2 did this once and showed that TjMax=85C looks reasonable but that was quite a few pages ago. I'm waiting for some summer weather to see if the delta between core temp and room temp at idle remains about the same.
Given the cooler you're using, your room temperature and your core voltage, 60C while running Prime is reasonable while 75C is not likely.
Thanks unclewebb for this valuable utility. I'll post screenshots of my X3350 once I get back home.
I finally had some time to work over the MHz code that RealTemp uses so I created a separate little utility for beta testing purposes. It also uses the WinRing0 library to read the multipliers so just drag this utility into your main RealTemp directory and it should work OK.
http://img71.imageshack.us/img71/704...tempmhzjv6.png
http://www.fileden.com/files/2008/3/...alTemp_MHz.zip
Let me know how this compares to CPU-Z. I'm hoping that this new code finally handles SetFSB correctly on all motherboards that use any Intel Core processor. A small window like this might become a future option for a mini, extra lean RealTemp version.
This is my EVGA 780i and the Q6600. The difference is probably the most I could catch. It's usually the same.
Attachment 80177
My Gigabyte GA-EP35-DS3R and E6750 is below.
Attachment 80178
Thanks msgclb. On my board, the new code gives more stable MHz readings than CPU-Z. There's a lot less wander after the decimal point. Hopefully Grandpa, WoZZeR999, deepsilver and others that were having trouble with the MHz that RealTemp displays can try this utility. RealTemp 2.60 has some serious MHz issues on laptops so I'm hoping to test and integrate this new code into RealTemp as soon as possible.
Did you want another setFSB stepping from 440 to 450?
Thanks WoZZeR999. The data you and Grandpa previously provided me has really helped me get this right this time. How do you like the stability of the readings with this code? They're almost too stable now!
Awesome job!
no problems here...
http://i116.photobucket.com/albums/o...0_RealTemp.png
but there's a bit of a lag till RealTemp shows the right MHz
Thanks emoners. The new code is working 100% better on my wife's Dell T7200 laptop as well. I didn't realize that the timers in each core aren't in sync on the mobile chips like they are on my desktop core processor. RealTemp 2.60 was giving some ugly MHz results on her laptop.
I should be able to improve the initial lag time but it takes time to accurately calculate MHz so a little bit of lag is a fact of life. It's presently about 2 seconds on my Q6600. If it's a lot longer than that for you then let me know.
Try starting up CPU-Z with a stop watch handy for comparison.
With the later CPU-Z, it has that loading screen. It may do the initial calculation there, and it takes a while to load.
Love it unc... thanks for the great work :D
:up:
Here's a quick update that should cut the initial lag in half.
http://www.fileden.com/files/2008/3/...alTemp_MHz.zip
Thanks loonym. Your board looks similar to my board. CPU-Z usually reports from 0.0 to 0.5 MHz higher. Do you have SetFSB installed? I just want to be sure that this code is compatible with SetFSB on a variety of boards. No need to overclock any further, just drop it down a couple of hundred MHz.
Has anyone found the frameless option yet?
http://img228.imageshack.us/img228/5...amelessln8.png
Would something about that size with the temps inside be a handy option for RealTemp?
Yeah, I found it when I said to myself 'WTF, this is a button' and then clicked it.
I personally wouldn't use the feature to check what MHz it was running at on a regular basis. If you made it a pop-up around that size and put the FSB/multi under the speed, that might be helpful for screenshots where you don't want CPU-Z taking up half of the image.
Edit: I read your question wrong. It may be easier to just have in your settings what you want to show, or combine min/max into one box with min_temp/max_temp. That way you can personalize what you want it to look like.
Settings screens could have something like below for options on what to show.
WoZZeR999: I like the check mark idea. Customizable software is always a good thing but I'm not sure how to program that at the moment. More stuff to learn.
I would keep a visual of all of the options somewhere in the settings panel so that you don't have to keep the check boxes on the main window. Figuring out the window size shouldn't be a problem once you figure out how much space each box needs.
It's obviously harder than that, but it would be a nice feature to have in the future.
like this ?
http://i26.tinypic.com/2ywehi8.jpg
WoZZeR999: I did some reading and your check mark option might not be too hard at all. I like the idea of going into the Settings window and checking off what you want to see. I'll throw it on the things to do list.
mariosimas: I thought that would be a good size that users could add anywhere when posting a screen shot. I might put the MHz label on it so it is clear what that number represents.
I also suggest an "always on top" option.
@ unclewebb
Oh it's MHz? Thought it was the Star-Date :D
Cool, if I could have a compact window that had the correct temps, that would be awesome.
One other thing that would be interesting: have it add a * after the temp (in the case of PROCHOT# not being used) in the 'Core temp' area. So if the flag was thrown, have it say 40C*. You could shoot that down if you didn't think it would be helpful, just a thought.
My reason for suggesting this is:
You say that at low temps the DTS number is NOT accurate, so you provide a method to adjust the reading for low temps.
You also say that you adjust this based on some formula in your code.
So, if DTS at low temps is wrong, then the distance to TjMax is also wrong.
If your formula (after idle calibration) shows correct temps, then the distance to TjMax should reflect this.
Hope you understand my reason
martymonster: I understand your reasoning but I disagree with it.
The theory behind my program is not that the Distance to TjMax or reading from the Digital Thermal Sensors (DTS) is wrong. The problem is that other software has always assumed that this distance to TjMax moves linearly and that being a distance of 50 units away from TjMax is exactly the same as being 50 degrees away from TjMax. In my testing, that statement hasn't proven to be true.
That's why I use the term Distance. It more accurately reflects what the DTS is telling a user. A distance of 50 units away from TjMax on your processor might mean it is 45C or 50C or 55C or some other number away from TjMax. For this reason, DTS data is not comparable between two different processors.
The traditional formula has always been:
Reported Temps = TjMax - DTS
and when you rearrange the terms you found that:
TjMax = Reported Temps + DTS
This formula is wrong so RealTemp uses a modified version of it:
Reported Temps = TjMax - DTS + CorrectionFactor
With my formula, TjMax = Reported Temps + DTS
will only be true when you are not using a CorrectionFactor or at higher temperatures when the CorrectionFactor is equal to zero.
The bottom line is that what RealTemp reports as Distance to TjMax is simply the raw data being read from the processor which can be directly compared to other programs like CoreTemp that also report this same data. RealTemp's interpretation and conversion of this data into an absolute temperature may be different than other programs but there's no way I'll ever change the reporting of the raw data so the numbers add up nicely in a formula that isn't correct across the entire temperature range of these processors.
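If it helps to see the two formulas side by side as code, here's a minimal sketch. The names are mine and the correction is shown as a plain number; in RealTemp the calibration actually tapers off as the temperature rises, so this is only the idea, not the implementation.

    // dts is the raw Distance to TjMax read from the sensor.
    double TraditionalTemp(double tjMax, double dts)
    {
        return tjMax - dts;                  // assumes the DTS reading is linear everywhere
    }

    double CalibratedTemp(double tjMax, double dts, double correction)
    {
        return tjMax - dts + correction;     // correction shrinks to zero at higher temps
    }

    // Using the numbers from the example above: tjMax = 95, dts = 74 and
    // correction = +6 gives a reported 27C, and 27 + 74 = 101 rather than 95.
    // The extra 6C is the calibration, not a change to TjMax.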
Edit: The new MHz code has been added to RealTemp without any issues. I did some house cleaning and added the toggle button back to the upper right portion of the screen so VID will be updated in real time again and I also added a clock in there that keeps track of how long RealTemp has been running for. It should stop when you go into Stand By or Hibernate mode and resume running when you resume. This will make it much easier to determine how much time was spent on the 'puter, ahhh working. :D
RealTemp beta testing should resume tomorrow and hopefully it will be a little easier on Vista compared to 2.60.
http://img135.imageshack.us/img135/6233/rt261oc8.th.png
Hi all,
I just stumbled across this thread and need some help.
I've followed the calibration instructions and taken my Q9450 down to an FSB of 266 MHz and the core voltage to 1.104 (as reported by CPU-Z). The idle temps I see are basically the same as when I was using the stock FSB and voltage. For what it's worth, I'm using a Thermalright Ultra-120 Extreme on a Gigabyte GA-X48-DQ6 in an Antec P182 case that's well cooled. Ambient is 72 F (22.2 C).
Here's what I see at idle (the first two are from Everest, the rest from Real Temp):
- Motherboard: 30
- CPU: 20 (note that this is 2C below ambient)
- Cores 1-4: 36 / 32 / 40 / 40
Note that cores 3 and 4 never register below 40 C but do seem to work normally above that.
When I run 4 copies of Prime95 with small FFT's, each copy tied to its own core, all of the core temps go to 46ish except Core 3 which stays at 40.
When I do the same with normal FSB and core voltage, I see the following after a few mins:
- Motherboard: 31 (from Everest)
- CPU: 36 (from Everest)
- Cores 1-4: 53 / 49 / 49 / 55 (these are max values)
When I stop Prime95, cores 1-3 drop very quickly while core 4 takes a while to get back to a typical temp.
All of this raises a few questions:
1) What should I set the Idle Temp Calibration factor settings to?
2) Obviously it's impossible for the CPU to be 2 degrees below ambient, but assuming some reasonable amount of +/- in the sensor, is this value believable or is there something defective with the CPU temp sensor?
3) Why would the CPU and core temp values be so far apart?
4) Should I be concerned about core 3 and core 4 not registering below 40 C? They seem to be roughly in line with the other cores for temps above 40 C.
5) Can the motherboard BIOS revision play a role in this? I'm still using the original BIOS that came with my mobo. They've released a bunch of beta BIOS'es since then but no newer non-beta BIOS'es.
Bottom line: I've got 2 more days to return this CPU. Other than the temp readings I mentioned above, it seems to work normally. Do I return this one and try another or are these problems so minor and/or common that I'm *not* likely to do better with a replacement?
Many thanks in advance....
lunadesign: I'll try to answer your questions.
Your first reading of CPU at 20C shows a sensor that isn't 100% accurate at reporting low temperatures. This is a sensor on your motherboard so swapping CPUs won't change this reading. That inaccuracy is pretty typical for a motherboard sensor and is the original reason why Intel went to on chip sensors. If you know this sensor is not 100% accurate then making any further comparisons to what it's reporting is pointless.
In most documentation the cores in a Quad are numbered 0, 1, 2, 3. CPU-Z also uses this convention so I'll be doing the same when talking about your Quad.
Core2 and core3 sitting at 40C is typical for sensors that are getting stuck. Lots of sensors on the new 45nm chips do this at low temperatures but most work very well when the temperature goes above this point. Keep in mind that these sensors were neither designed nor calibrated by Intel to accurately report idle temperatures. If they work for that purpose, great but if not, Intel does not consider this a valid reason to return a processor.
The Calibration feature in RealTemp will only work on cores that are not getting stuck. With a stuck sensor, there's nothing you can do with it.
With two sensors that are likely stuck, you then have to look at the sensor in core0 and try to determine if it too is stuck. Sensors don't always stick at the same value for each core. Turn on the logging feature in RealTemp and set the interval to 1 second. What I look for is as your processor is cooling down after booting up, the temps should be dropping pretty much equally on core0 and core1. If you notice in the log that core0 is dropping at about the same pace as core1 and then it suddenly stops while core1 continues to drop further then that would be a good sign that core0 is stuck as well.
If they both move at about the same pace as your CPU cools but end up at slightly different values then something like that can be corrected by using slightly different calibration values for each core. For example, my Q6600 is pretty accurate when using these factors ( 1.0, 1.0, 2.2, 1.2 ) but every processor is unique.
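If you want to automate that log check, the stuck-sensor part boils down to something like this rough sketch (not part of RealTemp, and the thresholds are only illustrations): during a cool-down, a healthy core keeps dropping while a stuck one flat-lines.

    #include <vector>

    // Compare two cores' cool-down logs (one sample per second). If the
    // reference core is still falling over the second half of the log while
    // the suspect core sits at one value, the suspect sensor is probably
    // stuck. The 2C threshold is an arbitrary illustration.
    bool LooksStuck(const std::vector<int>& suspect, const std::vector<int>& reference)
    {
        size_t n = suspect.size();
        if (n < 4 || reference.size() != n)
            return false;

        bool suspectFlat = true;
        for (size_t i = n / 2; i < n; ++i)
            if (suspect[i] != suspect.back())
                suspectFlat = false;

        bool referenceStillFalling = (reference[n / 2] - reference.back()) >= 2;
        return suspectFlat && referenceStillFalling;
    }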
Core1 might be your only reasonably accurate sensor. I've done all of my testing with an open case and if possible you should do the same. Core1 at 32C during my low volt/MHz test in a closed case at a room temperature of 22C is probably pretty close to the actual temperature. See if opening the case changes anything and if any of your readings drop another degree or two. Do some more testing and I'll try to come up with some calibration factors for you but there isn't anything that I can do if 3 of your 4 sensors are getting stuck at low temps. At least they seem to work at full load. At 50+C these sensors are usually within a degree or two of the real temperature so you shouldn't worry too much.
A different BIOS might change your CPU reading but it will not change the core temp readings that RealTemp reports. This data is being read directly from the on chip sensors within the Intel CPU.
I wouldn't bother returning it if it runs decently for you. Sadly, your sensors seem to be pretty typical for a 45nm Quad. :(
unclewebb - thanks for your response (and your neat program)....I really appreciate it.
Regarding the CPU temp that's reading 20C (Tcase), I thought this sensor was located on the case of the processor but read by the motherboard?
I opened the case, turned up the exhaust fans to max (they're Antec Tri-Cool fans) and underclocked/undervolted my system and the min temps are:
- CPU: 16 C (definitely not believable)
- Cores 0-3: 31 / 28 / 40 / 40
- Ambient: 23 C
If I run your "Test Sensors" test, Core0 and Core1 both bump up by about 4 C and come back equally fast. So, it looks like Core0 and Core1 are decent although slightly offset from each other.
Since my CPU sensor seems to be inaccurate, I guess I can use Core0 and Core1 as my reliable measuring sticks, right? And since Core2 and Core3 appear to become unstuck and roughly track the others when > 40 C, I guess I can trust all 4 in those circumstances? What would you recommend for calibration factors?
Are the temp sensors any better in the Xeon equivalent 45nm Quads? I'm just curious because I have to build a few more boxes like this one....
I'll have to check up on this. I thought it was reading a sensor that was located in the CPU socket but don't quote me on that.
Looks like the sensors on core0 and core1 are both working fine. My best guess is that your actual core temperature is between 28C and 30C during your test. If you want to keep things simple you could assume 28C so you wouldn't need to do any calibration on core1 and then you could use a -1.0 calibration on core0 which should bring it down to 28C also. If you choose to go up a degree or two, that's fine. Just use calibration factors so core0 and core1 are balanced during this test. I like the simple approach where you only have to use one factor. When you return to your normal MHz, you might need to do a slight adjustment. Core0 and core1 should be pretty much equal at idle even when overclocked and overvolted.
I like using Prime95 v25.6 running small FFTs. Are you using a multi-core version of Prime or running 4 individual instances of Prime? I'm not sure if this would make any difference to your results. At full load I generally find that core0 and core1 are equal while core2 and core3 are equal but the two sets of cores might be off by a few degrees. One user I helped had load temps similar to yours where the two outside cores ran hotter than the two inside cores and I think when he re-did his thermal paste this changed. Anything like installation procedure or an IHS / heatsink that is not perfectly flat can cause slight differences at full load as well as less than perfect sensors. Probably not worth worrying too much about though.
The other test I use is to use Prime95 and to go into the task manager and set the affinity to limit it to run on 1, 2 or 3 of the 4 cores at a time and to switch the load around while Prime is running. If you get some interesting numbers then post them.
I don't believe the Xeon sensors are any different. The 45nm sensors just aren't always capable of giving us the information that we'd like to be able to get out of them. It's sort of random luck and I think you're doing better than most users have. At least you took the time and did your homework so you know exactly what your sensors are capable of and the temperature range where they give you accurate data.