Finally got RealTemp v2.61 ready for some beta testing. Pretty much the same old thing. Gamer mode has been temporarily retired. When I learn how to do this feature correctly it will return.
http://www.fileden.com/files/2008/3/...alTempBeta.zip
Why don't you put the "Reset" function on the menu that appears when we right-click on the tray icon?
I find that very useful.
Thanks for the info! I ended up having to use a -1.8 calibration on Core0 to get it roughly matched up with Core1.
To answer your question, I generally run 4 instances of Prime95 and use the Affinity (-A) option to tie each instance to its own core. Is there a multi-core version of Prime95 out there? That would make it much easier...
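For anyone scripting the multi-instance approach by hand, the per-core affinity masks are just powers of two. A small sketch of that arithmetic; the `start /affinity <hex>` usage mentioned in the comment is my assumption about how you'd apply these masks on Windows, not something from the post above:

```python
# Sketch: one affinity mask per core, for pinning one stress-test
# instance to each core. On Windows the hex mask can be passed to
# "start /affinity <hex>" (Vista and later) -- an assumption here.
def affinity_mask(core: int) -> int:
    # bit N set -> the instance may run only on logical core N
    return 1 << core

masks = [affinity_mask(c) for c in range(4)]
print([hex(m) for m in masks])  # ['0x1', '0x2', '0x4', '0x8']
```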
I think I'm going to take your advice and not return the CPU...too much hassle especially if I'm likely to get another one like this. I'll be building a few more systems with almost identical specs so it will be interesting to see how the next ones match up with this one.
Unclewebb & lunadesign,
Quote: Originally Posted by Lunadesign
Regarding the CPU temp that's reading 20C (Tcase), I thought this sensor was located on the case of the processor but read by the motherboard?
This is a very common misunderstanding, and is precisely why users hesitate to trust Intel's Tcase sensor, which is the processor-specific thermal specification shown in Intel's Processor Spec Finder - http://processorfinder.intel.com/det...px?sSpec=SLAWQ
The era of the thermocouple-in-the-center-of-the-CPU-socket has long since passed.
The following Intel document - http://arxiv.org/ftp/arxiv/papers/0709/0709.1861.pdf - clearly shows on Page 2, Figure 1, upper right hand corner, the "Analog Sensor" which is embedded within the substrate layers of the processor package. Excerpts from my Core 2 Quad and Duo Temperature Guide - http://www.tomshardware.com/forum/22...perature-guide - explain how this Tcase sensor has worked for the past several years, which actually applies to some previous generations of Intel, as well as AMD processors:
"Section 1: Introduction
Core 2 Quad and Duo processors have 2 different types of temperature sensors; a CPU Case (not computer case) Thermal Diode located within the CPU die between the Cores, and Digital Thermal Sensors located within each Core...
Section 3: Interpretation
The first part of the spec refers to a measuring point on the Integrated Heat Spreader (IHS). Since a thermocouple is embedded in the IHS for lab tests only, IHS temperature is replicated using a CPU Case Thermal Diode integrated between the Cores. Maximum Case Temperature is determined by Spec#. The CPU Case Thermal Diode is how Tcase is measured, and is the CPU temperature displayed in BIOS and the software utility SpeedFan...
Section 5: Findings
(A) Tcase is acquired on the CPU Die from the CPU Case Thermal Diode as an analog level, which is converted to a digital value by the super I/O chip on the motherboard. The digital value is BIOS Calibrated and displayed by temperature software. BIOS Calibrations affect the accuracy of Tcase, or CPU temperature.
(B) Tjunction is acquired within the Cores from Thermal Diodes as analog levels, which are converted to digital values by the Digital Thermal Sensors (DTS) within each Core. The digital values are Factory Calibrated and displayed by temperature software. Factory Calibrations affect the accuracy of Tjunction, or Core temperatures.
(C) Tcase and Tjunction are both acquired from Thermal Diodes. Tcase and Tjunction analog to digital (A to D) conversions are executed by separate devices in different locations. BIOS Calibrations from motherboard manufacturers, Factory Calibrations from Intel, and popular temperature utilities are frequently inaccurate.
(D) Intel shows Maximum Case Temperature (Tcase Max) in the Processor Spec Finder, which is the only temperature that Intel supports on Core 2 desktop processors. Ambient to Tcase Delta has known Offsets which vary with power dissipation and cooler efficiency, and can be Calibrated at Idle using a standardized Test Setup..."
Since the first release of Core Temp, the overclocking community has become so brainwashed on core temperatures that they now overlook CPU temperature as a reliable thermal measurement, let alone as a secondary reference. I have yet to see a Tcase sensor "stick", and since Intel designed this sensor specifically for temperature measurements and depends on its accuracy, my experience and observations are that the Tcase sensor scales with linear characteristics.
The only problem with the Tcase sensor is that BIOS programmers are confined to "canned" values, and sometimes incorrectly code just one of the many Socket 775 variants into BIOS. Regardless of whether BIOS is correctly offset for a particular processor, Tcase can still be calibrated in SpeedFan to an accuracy of within a degree or two. Use a standardized test setup at low Vcore and frequency, with the covers removed and fans at 100% RPM; if ambient is accurately measured, then idle power dissipation and CPU cooler thermal efficiency are easily calculated, which yields an accurate CPU temperature.
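The arithmetic behind that idle calibration can be sketched in a few lines. The wattage and cooler rating below are illustrative assumptions, not values from the guide:

```python
# Hedged sketch of the idle-calibration estimate: expected Tcase at
# idle is roughly ambient plus idle power times the cooler's thermal
# resistance (degrees C per watt). All numbers are made-up examples.
def expected_idle_tcase(ambient_c: float, idle_watts: float,
                        cooler_c_per_w: float) -> float:
    return ambient_c + idle_watts * cooler_c_per_w

# e.g. 22C ambient, ~12W idle dissipation, 0.25 C/W tower cooler
print(expected_idle_tcase(22.0, 12.0, 0.25))  # 25.0
```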
I hope this helps to clear things up.
Comp:cool:
Providing you have a modern motherboard, and most do, cpu temp is indeed read from a diode between the cores. Computronix, I agree with the above, except for the semantic issue of calling the cpu diode a Tcase sensor. Intel will tell you to use the cpu diode to monitor the Tcase spec, but it is not actually Tcase, and when you call them on it, they punt. Mathematically, there can be as much as ~15C difference between the cpu diode and a sensor placed on the IHS under certain full-load TDP test parameters, using intel's formulas. I went round and round with intel via email about cpu temp vs Tcase (actual Tcase is the IHS, where no sensor exists). I already posted most of the emails on anandtech; it is here. http://forums.anandtech.com/messagev...&enterthread=y
The gradient goes from 1) the core (~120 W/m·K thermal conductivity) to 2) between the cores (still in die material, still ~120 W/m·K), and thus the max gradient between a core and the cpu temp sensor is ~5C on load.
The gradient continues from between the cores to 3) the die-attach solder at ~10 W/m·K (from the companies that advertise it), then 4) through the IHS at ~120 W/m·K, hence most of the gradient occurs at the die-attach layer.
So although only a ~5C max gradient occurs from core to cpu sensor across 120 W/m·K material, a possible ~23C gradient exists under certain load, max TDP, intel cooling conditions from core to IHS center across the lower-conductivity solder attach at ~10 W/m·K (if measured with a sensor on the IHS). (The solder ball in the pic below is not the same as the solder attach, for those looking at that.)
see anandtech link for reasoning.
pic of common intel conductances.
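The gradient numbers above follow from one-dimensional conduction: the temperature drop across a layer is heat flux times thickness over conductivity. A sketch of that arithmetic; the thickness and area values are illustrative assumptions, not Intel figures:

```python
# delta_T across a layer = heat flux * thickness / conductivity,
# i.e. (P / A) * (t / k). All values below are illustrative only.
def layer_delta_t(power_w: float, area_m2: float,
                  thickness_m: float, k_w_per_mk: float) -> float:
    return (power_w / area_m2) * (thickness_m / k_w_per_mk)

# 100 W through 1 cm^2: a 0.1 mm layer at ~10 W/m*K drops ~10C,
# while the same layer at ~120 W/m*K would drop well under 1C.
print(layer_delta_t(100, 1e-4, 1e-4, 10))             # 10.0
print(round(layer_delta_t(100, 1e-4, 1e-4, 120), 2))  # 0.83
```

This is why the low-conductivity die-attach layer dominates the core-to-IHS gradient even though it is thin.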
rge, I quite clearly understand the definitions, and the major and minor variables, as well as the fine points of where the temperature is actually measured in the lab, as compared to how the end user's "CPU temperature" is "replicated".
Also, I've read that Anandtech thread, and as in so many other Forum threads, I'm continually amazed by how vague, evasive and ambiguous Intel's responses typically are. It's like asking the CIA to provide us with the First Lady's recipe for chocolate chip cookies obtained through the Secret Service.
Like unclewebb, I wonder sometimes if the guys in the blue clean room bunny suits with space helmets are going to show up in the wee hours of the night and abduct us for trying to expose the truth about Core 2 temps!
Comp:cool:
I figured you knew the difference, which is why I called it a semantic issue, my explanation was more for others trying to follow my logic.
As we both know, intel is using one term to describe two different points, in essence glossing over the error either from laziness or intentional vagueness. Or, giving intel the benefit of the doubt: if you are within spec via cpu temp, then by definition you are within the Tcase/IHS temp spec (since that would be even lower). If you rewrite your guide, could you make me happy and differentiate the two, since intel is too lazy or secretive to do so? :D
Thanks for the additional info. I thought there was a separate sensor but I also read that Intel had removed this additional temperature sensor on some of their newer CPUs. I'll see if I can find a link to that.
Here's where you can download the multicore version of Prime. Makes life much easier.
ftp://mersenne.org/gimps/p95v256.zip
The wife will thank me that I didn't go for the Small FFTs on her laptop this morning. :D
http://img233.imageshack.us/img233/3...ngprimeyx3.png
On a side note I did notice this about what RealTemp reports for VID.
When I pull the plug on the laptop the VID reported by CPU-Z and RealTemp drops to a lower value.
http://img236.imageshack.us/img236/2...120volteg6.png
http://img78.imageshack.us/img78/8642/rtbatterylz9.png
Some more of intel's conflicting documents, or in this case likely a poor editing job. The E8xxx series, I think, was slated to do away with the cpu diode, as evidenced by page 39 of the thermal spec sheet for the E8000/E7000 series, which states under a graph: "Note: The processor has only DTS and no thermal diode. The TCONTROL in the MSR is relevant only to the DTS."
http://download.intel.com/design/pro...nex/318734.pdf
However, that document conflicts with the spec sheet for the E8000 series. And when I emailed intel, they did say the E8400 has a cpu diode. Also, you can use speedfan: go to Configure >> Advanced >> click on the mobo chip (for mine it is the IT8718F), and the cpu temp value is set to diode. If it were reading the mobo socket sensor, that value would be thermistor. In fact, if you change the value to thermistor, it reads a bizarre number, showing that it is a diode.
I can only assume that intel was planning on not having a cpu diode on E8xxx series when the document graph was done, then changed their mind, and forgot to change the graph.
CompuTronix: I ran into an old copy of your Guide on another site but it's definitely not the latest, as I don't see any references to Real Temp in it. Where can I get the latest and greatest guide? I've Googled around and all I see are references to deleted items.
BTW, I ran through your calibration procedure and got the following (after calibration) with the case closed and fans running normally:
Tcase = 27-28C idle, 45C load
Tjunction avg = 31C idle (avg excludes 2 cores that don't register this low), 50C load (includes all 4 cores)
Ambient: 23C
Chipset: X48
CPU: Q9450 (stock, no OC)
Cooler: Thermalright Ultra 120 Extreme
Frequency: 2.66GHz
Load: Prime95 - Small FFT's for 10 mins
Motherboard: Gigabyte GA-X48-DQ6
Stepping: C1
Vcore Load = 1.168
Adjustments to SpeedFan:
- CPU: +9
- Cores: -8/-8/-6/-10
Do these values seem reasonable for my configuration?
hi. i have an Intel Core 2 Extreme QX9650 cpu and i have this problem:
core 1 always stays at 40c, and in real temp i see the temp on core 0 go up and down (min 17c, max 50c). also, if i run the sensor test a few times it always gives me different results. most times it says core 1 = 0 (stuck), but other times it flags core 0 and core 3; the others have value 1.
are all these readings right, or is something wrong with the cpu?
here's what it shows me the real temp:
http://img142.imageshack.us/img142/1743/53744814jo0.jpg
http://img263.imageshack.us/img263/2818/83743440ew4.jpg
thank you unclewebb for this great program. be well!!!
What happened with the VID? Until now both realtemp and coretemp said my cpu (8400 Q746A) has a 1.1125 vid. Now both programs say the vid is 1.2125. So what changed?
Please people, how can i get Real Temp 2.61??? and some instructions if it is not a usual installation
sorry guys, i did it, but it is the same sh%& !!!! TjMax 95 for the q9450, still the same as 2.60. any comments??
The VID that most programs are displaying seems to be motherboard dependent, and changes to your C1E / SpeedStep settings in the bios or in the Power Options settings in Windows will affect what programs report for VID.
During testing yesterday on a Dell Core2Duo laptop, RealTemp reported 4 different VID values as the mobile chip transitioned from idle to full load.
In my post above:
http://www.xtremesystems.org/forums/...postcount=1409
there are screen shots of two different values and here's a third screen shot during testing last night.
http://img111.imageshack.us/img111/8...arison3mj5.png
There were 4 different VID values displayed for the same processor in the same motherboard:
1.0625
1.1375
1.2125
1.2750
This motherboard / chip combo also displayed 4 different multiplier transitions.
x6.0 , x8.0, x10.0 , x12.0
Both CPU-Z, when set to report VID, and RealTemp were reporting the exact same numbers for MHz and VID during these transitions. CoreTemp 0.99 also reported these changes but it was reporting slightly different values that were a little lower. CoreTemp reported 0.9500 for the minimum VID and 1.1625 for the maximum.
In my opinion, the way users are comparing VID numbers is meaningless. Everyone assumes that a low VID means you have a "good", highly overclockable chip, but if VID changes depending on the motherboard you're using, then whatever all these programs, including RealTemp, are displaying may not mean a hell of a lot. The Intel documentation for reading VID is about as clear as it is for trying to figure out TjMax.
loonym: I think you noticed this VID problem with RealTemp when you ran the same CPU on two different boards. Did both boards have the same maximum VID at full load or were they both different?
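For what it's worth, the VID values quoted in this thread all land on 12.5 mV steps. A decode sketch for turning a raw code into volts; the 0.7125 V base and step size are my assumptions about the desktop VRM convention of the era, not something from Intel's documentation:

```python
# Hedged sketch: decode a raw VID code into volts. The base voltage
# and step size are assumptions (0.7125 V base, 12.5 mV per step);
# the real encoding varies by CPU family and VRM spec.
def vid_to_volts(vid_code: int, base: float = 0.7125,
                 step: float = 0.0125) -> float:
    return round(base + vid_code * step, 4)

# With these assumed constants, code 40 lands on 1.2125 V, one of the
# values reported in this thread.
print(vid_to_volts(40))  # 1.2125
```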
Nano74: RealTemp 2.61 is in beta testing right now, but I already prefer its smaller size and likely increased stability. Go to TechPowerUp and download version 2.60, then go to my beta section, download version 2.61, and copy the new RealTemp.exe file into your RealTemp folder to replace the version 2.60 exe file.
http://www.fileden.com/files/2008/3/...alTempBeta.zip
I don't like changing the main download until I get some feedback from XS users who have been a great source of ideas and the main reason why RealTemp has become a decent little temp monitoring program. :up:
Edit: Thanks Nano74: for calling my hard work s-h-i-t. What do you know about TjMax that I don't know? If you are convinced that it is some other value than enter that value into RealTemp and be happy. Please post your vast knowledge about TjMax so we can all be enlightened.
Yes, that is true. But I have the same settings as before (no C1E) and the reported vid is now 1.2125 instead of 1.1125. This has to do with the programs. CPU-Z reports 1.184-1.200 vdef and I thought maybe the board sets the wrong voltage by default. I guess the vid reading was wrong. :shrug:
If both programs are reporting a different value than before, then it was probably something at the hardware level.
Realtemp : Displays CPU voltage identification (VID).
Coretemp : Fix: Incorrect VID detection on 45nm desktop Intel parts.
I believe coretemp was reading wrong.
Realtemp 2.60
Q9450
core0 48
core1 39
core2 39
core3 44
is the above normal or should I re-seat the HSF?
What cooling do you have?
CoreTemp has a Vista sidebar gadget out now, you must be making them sweat unclewebb. If and when RealTemp has one available I'll ditch CoreTemp. Thanks for all your hard work.
http://www.overclock.net/software-ne...p-add-ons.html
http://i239.photobucket.com/albums/f.../MyDesktop.jpg
How do you calibrate real temp? I see it in setting but, how do you go about setting it?
This is a good place to start: http://www.techpowerup.com/realtemp/docs.php
Apology accepted Nano74.
You're upset with the TjMax that RealTemp uses for your CPU so can you explain to me what TjMax you have chosen to use and what that's based on? Do you have any new documentation from Intel that I don't know about or any testing that you've done or is it based on some other programs that have chosen to use a different TjMax than what RealTemp uses? The competition hasn't posted any real world testing that I'm aware of.
I think RealTemp, CoreTemp and Everest all let you select whatever TjMax you want these days so you can pick whatever program you like and adjust it accordingly.
If you're interested in some of the testing I've done then read through this 50+ page novel. This thread is an excellent source of temperature information from all users.
kpo6969: No need to ditch CoreTemp. As long as you've set it to use the correct TjMax, which you have, then its full load accuracy should be the same as RealTemp's. Idle temps might be off a couple of degrees but they're not that important anyhow.
I haven't jumped on the Vista bandwagon quite yet so a sidebar for RealTemp is still a little ways off in the future. This may be sour grapes but I find CoreTemp and Everest both have too much information in their sidebars which makes it hard to see. I prefer the size of font that the weather gadget uses so it's easy to see your core temps at a glance.
alex17 GTX: RealTemp has been using the same VID formula since it introduced this feature. I read in the E8400 thread earlier today that you can do a Register Dump in CPU-Z and it will show you the max VID in the file it creates near the top. So far I haven't seen any difference between what it reports and what RealTemp reports.
I was playing around with a slightly different user interface this evening.
Someone once asked if I could make the temps stand out a little better.
The important numbers are pretty easy to read from a distance with this version.
http://img147.imageshack.us/img147/2966/rt262cs6.png
If you want to take it out for a run then download it here and place the .exe in your RealTemp directory as usual.
http://www.fileden.com/files/2008/3/...alTempBeta.zip
Any feedback is always appreciated. After staring at the old version for the last few months I thought it was time for a change.
EXCELLENT!! unclewebb, that helps all us 1-eyed clockers hahahahaaa
finally i can use your program with my new E8400 hot rod ;)
Here are the settings that I used...
Attachment 80457
And here's RealTemp v2.62 Minimized...
Attachment 80458
And here's what I previously had with RealTemp v2.60 Minimized...
Attachment 80459
Did you intentionally leave out the degree symbols and Cs? So far that's all I see.
By the way I do wear glasses and who knows when I will need this option.:D
thanks uncle...at least i don't have to use my glasses that often :D
msgclb: Thanks for noticing the missing symbols in the TaskBar. To center the temperature numbers in the dialog I had to remove the degree C sign but I guess they accidentally got deleted from the TaskBar as well. I'll fix that back up.
When you're running Prime it's nice when you can see the temps from a distance. My next plan is to be able to Minimize this dialog so only the top third of it shows similar to my MHz utility so it doesn't take up too much space on the Desktop.
Thanks HDCHOPPER. Whenever I'm checking out the E8400 thread it's always nice to see that famous quote.
Edit: I found the missing degree C symbols and put them back where they belong in the TaskBar and System Tray areas.
http://www.fileden.com/files/2008/3/...alTempBeta.zip
@uncle: would it be possible to put the min/max temps in a tabbed version? maybe as well as the settings? basically just like how cpu-z shows & could just shift from cpu to motherboard to memory via tabs. at least that would save some desktop space. just a suggestion tho :D
emoners: CPU-Z like tabs is a good idea. I've never used them before so I'd have to open up my C++ book, again!
One of the reasons why I like the Settings as a separate Window is so you can make adjustments to TjMax or the Calibration settings and see exactly what effect that has on your reported temps. At the moment, I'm just going to go for a quick and easy mini option for RealTemp so you can see the important stuff without it overtaking your Desktop.
http://img71.imageshack.us/img71/4161/xpsidebartq3.png
Thxs Uncle for accepting my apologies!
I was looking for Intel's documentation and i didn't find anything about the 45nm TjMax.
So i was following this thread and i saw many people posting the temps of their processors with real temp, and i noticed that real temp shows lower temps than everest and core temp if you don't change TjMax. For example, everest reads my chip at a TjMax of 105c without adjusting it manually, and core temp the same thing, but real temp uses a TjMax of 95c, which reads 10c less on all cores. For example:
Everest and Core Temp: 49-39-49-49. after this i tested with real temp 2.63 and my temps are 39-39-32-40, and prime stable for about 2 hours in real temp is 45-46-38-47. so that is good news from real temp, because with this test the temps are lower, which is nice.
So, pls Uncle, you know there are 5 different temps, one for the CPU and 4 more for each core (in my case, a Quad). my cpu temp is 34c, which increases to 42c when prime stable, and that seems high compared with the 65nm and other 45nm chips i saw in this thread. it's not all about core temps; don't forget the CPU temp too. pls tell me if i'm wrong, but my avg cpu temp is 60c when i OC my q9450 to 3.7ghz with a zalman 9700 and mx-2 thermal paste.
thanks for your knowledge!
See u :)
Do you want to turn your processor into one of those super chips that has a nice low VID?
http://img177.imageshack.us/img177/4905/lowvidve5.png
One quick change and my Q6600 drops VID from 1.325 to 1.1625. Perfect for a screen shot or when heading to EBay. Buyers will pay extra for a low VID chip. :D
CoreTemp, CPU-Z, Everest & RealTemp all report this lower number which means that VID probably changes depending on your motherboard as well as how it is set up. C1E did not need to be enabled in the bios on my board for this to work so give it a try and see if you can create a low VID chip for yourself. Maybe users will start comparing Min and Max VID now.
My daughter told me I was getting a little carried away with the "eye chart" temp numbers so I've used a little restraint with the latest version.
http://img293.imageshack.us/img293/9484/rt264ey6.png
Once I get the mini mode working I'll post an update.
In CPU-Z if you go to the About tab and click on the Registers Dump button it will at least give you your Maximum VID value for whatever that's worth. It reports 1.325 for my Q6600.
I think too many users have been comparing Maximum and Minimum values from different software and different versions of the same software while running on different motherboards. I hope more users post some data so we can see if there's a pattern here. At the moment, most of the VID data that's been posted in forums is useless.
Had to disable the OC and go back to auto to get the VID change to work, but that was interesting. The top pic is the laptop power scheme showing VID 1.15; the bottom pic is the desktop power scheme, which shows my max VID of 1.225.
In CoreTemp 0.99 I found the Register Dump in the Options menu. I can't find any VID in the dump. After the top data shown below, it is all hex. I haven't tried your power option yet.
Attachment 80553
msgclb: The Registers Dump in CPU-Z shows VID. I don't trust CoreTemp since version 0.99 is still getting my T7200 wrong compared to what CPU-Z says.
I noticed a coincidence while doing some testing.
http://img207.imageshack.us/img207/7...irfightmh7.png
Which one of these utilities gives you more useful info per square inch? The recent RealTemp diet has got these two ready to go head to head. I turned off RealTemp's calibration features and set TjMax=100C so at least it would be a fair fight! :D
msi 780i, x3360, realtemp reports vid 1.1500, is this min as bios sets default voltage?
DMI Baseboard
-------------
vendor Micro-Star International Co.,LTD
model MS-7510
revision 1.0
serial To be filled by O.E.M.
DMI BIOS
--------
vendor American Megatrends Inc.
version V1.2B1
date 06/06/2008
Processors Information
------------------------------------------------------------------------------------
Processor 1 (ID = 0)
Number of cores 4 (max 4)
Number of threads 4 (max 4)
Name Intel Xeon X3360
Codename Yorkfield
Specification Intel(R) Xeon(R) CPU X3360 @ 2.83GHz
Package Socket 775 LGA (platform ID = 4h)
CPUID 6.7.7
Extended CPUID 6.17
Core Stepping C1
Technology 45 nm
Core Speed 3825.2 MHz (8.5 x 450.0 MHz)
Rated Bus speed 1800.1 MHz
Stock frequency 2833 MHz
Instructions sets MMX, SSE, SSE2, SSE3, SSSE3, SSE4.1, EM64T
L1 Data cache 4 x 32 KBytes, 8-way set associative, 64-byte line size
L1 Instruction cache 4 x 32 KBytes, 8-way set associative, 64-byte line size
L2 cache 2 x 6144 KBytes, 24-way set associative, 64-byte line size
FID/VID Control yes
FID range 6.0x - 8.5x
max VID 1.238 V
Features XD, VT
I don't have someone watching my back like you do.:)
Both Real Temp 2.63 and CPU-Z dump shows VID of 1.3500v using this E6750.
Attachment 80556
I think you may be right.
I have pics of my E8400 with VID 1.1125 when it first came out, but that was an older version of coretemp, which was probably in error.
And I can make it now 1.15 to 1.225.
I wonder if it is possible to have software report VID range, min to max?
I'll get right on that! I don't know if it's possible to read Min VID from the processor unless your motherboard and the settings you're using supports that. At least I know what to look for now when reading some registers.
loonym: If RealTemp is reporting Minimum VID then if you start up Prime or whatever it should jump and report the Maximum VID. The T7200 mobile chip I tried showed 4 different VID values depending on load when transitioning between Minimum and Maximum VID.
Interesting. On this board and the x3360 it doesn't seem to change depending on load and stays constant at 1.15 according to realtemp or any other software, everest (1.1), coretemp (1.0375), or cpuz (also 1.15)
I use RMClock 2.30 to toggle C1E within Windows. The newer version of this program isn't stable on my computer. Maybe on your board, the C1E setting might make a difference.
Hey guys,
New to the forums here. I've been reading through this entire thread for the last couple days and found that it made for some very good reading/info.
It's been a while since I've been big into overclocking ... I had the same PC for about 5 years -- and recently found a little bit of money to upgrade. It's not a powerhouse PC, but it works for what I use it for (mostly development and surfing).
Unfortunately, I've been "blessed" with an E6400 with L2 stepping. CPU-Z reports it as being a Conroe core (which I somewhat believe), however it only has 2MB of L2 cache (which also leads me to believe it may be an Allendale). In any case, I'm having a difficult time determining the correct TjMax for this chip.
Coretemp 0.99 gave me idle temps of 45C/46C and load temps of 60C/60C (that was with a 100C TjMax I believe). I came across this site and found RealTemp and found that it uses 85C. Following the calibration instructions, RealTemp is now showing me idle temps of 31C/30C and load temps of 46C/46C. That seems about right as the ambient temp in my office is 24C.
Before I got my E6400 I had a P4 531 (Prescott), which I'm inclined to believe ran much hotter than the E6400, and my Gigabyte G-Power Pro kept it running at 39C idle and around 54C under load. I'm currently using AS5.
Since you guys seem to know your stuff, I'll ask: do my temps seem right?
Screenshots:
http://www.maddenleagues.net/idle.jpg
http://www.maddenleagues.net/load.jpg
RealTemp 2.64 is available from the beta testing site.
http://www.fileden.com/files/2008/3/...alTempBeta.zip
It includes a new mini mode which may not be as good looking as a Vista gadget but it's useful all the same. Just position RealTemp where you want it and then double click anywhere on the dialog to enter mini mode and double click again to go back to normal. It also works with the Always on Top option as long as you set that first.
The Windows Task Manager also dumps the title bar, etc. after a double mouse click so I shouldn't get in too much trouble for creating a hidden feature.
http://img183.imageshack.us/img183/5605/rtjrfp4.png
I'll try to add an option so you can start up in mini mode for the next release.
jtaber79: Go into the cpuz.ini file and set:
Sensor=ON
so it reports your actual core voltage and not VID. I believe that RealTemp is closest to the truth for the L2 series but I'm kind of biased and all of the competition seems to disagree. Oh well. If you've gone through my Calibration procedure as outlined in the docs then you'll probably come to the same conclusion that there's no chance that TjMax=100C for your processor. CoreTemp started this nonsense when the E4300 L2 came out to try and correct for sensor issues at low temperatures but this was a mistake. Everyone else blindly followed along.
Looks great! In mini mode you should be able to move the window by clicking anywhere on the box. Don't know if it's a pain to implement that, but it could be helpful.
Great little piece of software. I only have one small problem: RealTemp (any version, even the newest one) always starts maximized if I put a shortcut in my Start Up folder, even if I have ticked the Start Minimized setting or start the shortcut with the Minimized setting under the shortcut properties.
WoZZeR999: Good idea. I'll see what I can do.
roflcopter: Adjusting the shortcut properties shouldn't make a difference. I'm using XP and have RealTemp in my Start Up folder and it starts up Minimized in the System Tray area. Where do you have the RealTemp folder located? Some users have problems because they are using a limited account and then stick the RealTemp folder in a location like C:\RealTemp that they don't have the privilege to read or write to, so their settings are not saved. Try putting the RealTemp folder on your Desktop and make sure Start Minimized is checked. Give me some more details if this doesn't work for you, like what OS you're using, so I can investigate further.
@unc,
how come after downloading 2.64 and overwriting the old files (2.63), i still get 2.63??
Are you using Firefox? unclewebb gave me the following advice to solve it...
http://www.xtremesystems.org/forums/...postcount=1186
I wonder if Firefox 3 will have the same problem.
I'm an administrator on my pc so it shouldn't be a user rights problem. I'm using Vista 64. I tried placing a shortcut in the startup folder and adding RealTemp as a startup item in the registry. But this doesn't help and RealTemp always starts maximized. If I check the .ini file, StartMinimized is set to 1.
Just had another thought (sorry, I just keep coming up with things that I think would be cool to have), when in 'mini mode', have a docking option for corners/edges of the screen (I would say other windows, but I don't want you to be driven insane).
roflcopter: I think this might be a Vista 64 compatibility issue. In XP it works fine and I think it will start minimized to the System Tray or Task Bar in Vista 32 OK. Hopefully someone can confirm that. I'll look into doing this another way. Does anyone know if this feature ever worked in RealTemp when using Vista 64? If so what version of RealTemp?
I made a change a while ago so that it goes directly to the tray but that might be causing a problem in Vista 64 so I'll try to find a work around.
Just tested 2.64 and start minimized worked as it should on vista x64.
unclewebb:
I checked the cpuz.ini file and Sensor=ON is set.
Neither it nor any other utility (I tried Everest as well) will show anything different than 1.325v.
Must be my crappy Intel mobo. :(
I would have to agree that RealTemp is the closest to truth. I've seen your test with the E2160 and the IR gun. You definitely know what you're talking about.
Here's my best test of an E8400. I haven't seen too many real world tests published by the competition.
http://www.xtremesystems.org/forums/...&postcount=573
Looks like CPU-Z is unable to read the actual core voltage on your motherboard. If it can't find a sensor to read, it defaults to displaying the VID, which is the voltage that the processor is asking the motherboard for, but that may not have anything to do with the core voltage it is actually getting. Maybe the author of CPU-Z can help you. He's done an excellent job of correctly reporting core voltage on most motherboards.
Turns out I had an old version of Everest.
Looks like the actual voltage is 1.301v with EIST & C1E off and fluctuates between 1.195v and 1.301v with them on.
Everest says the Sensor Type is: Analog Devices ADT7476 (SMBus 2Eh)
I guess I will e-mail CPU-z and see if they can help.
Here's something that's very interesting. It's kind of long but worth reading if you own a Quad.
I've been using the same Q6600 for the last few months for testing purposes. I've probably stared at and analyzed the DTS data coming out of these sensors more than anyone else in the world. Here is a post from yesterday that shows the typical DTS data from this processor.
http://www.xtremesystems.org/forums/...postcount=1444
The DTS data from sensor 0 and 1 has always been exactly the same at idle. Core3 is usually the same as the first two cores or reads 1 higher. Core2 has always been out to lunch and usually reads 4 higher than the first two cores. This has been consistent for months.
Tonight I had a look at my temps and they're all over the place. I turned off the calibration feature and started up CoreTemp, setting it to report the raw data coming from the digital thermal sensors so I'd have something to compare to. RealTemp reports this data as Distance to TjMax.
http://img129.imageshack.us/img129/6...rtest01nt2.png
CoreTemp and RealTemp are both reading the exact same DTS values from the processor, but now the sensor on core2 is correct and core1 is wrong. The readings coming from these two cores appear to have swapped.
This makes no sense so I set SpeedFan to display DTS and it reports the exact same thing.
http://img442.imageshack.us/img442/7...rtest02iz8.png
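For reference, the DTS value counts down as a core heats up, and the displayed temperature is just TjMax minus DTS. A minimal sketch of that conversion (the TjMax value here is an assumed placeholder for illustration, since the correct per-model value was still an open question at the time):

```python
# Sketch: converting raw DTS readings (Distance to TjMax) into core
# temperatures. TJ_MAX is an assumption for illustration only.
TJ_MAX = 100  # degrees C, assumed value

def dts_to_temp(dts, tj_max=TJ_MAX):
    """DTS counts down toward zero as the core approaches TjMax."""
    return tj_max - dts

# Four raw DTS readings, one per core:
print([dts_to_temp(d) for d in (60, 60, 56, 59)])  # → [40, 40, 44, 41]
```

This is also why getting TjMax wrong shifts every reported temperature by the same fixed offset.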
To try to understand what was going on I started up the multi core version of Prime95. By going into the Task Manager you can use Set Affinity and limit what cores it runs on.
Just to clarify things, a Quad core is basically two separate Dual cores in one package. What I found in previous testing was that if I put all of the load on core0, then core1 would heat up to a similar temperature because it's directly connected to core0, while the two idle cores on the separate die would run cooler.
Now when I run this test, all temperature monitoring software shows that when I put a Prime load to core0, its partner, core1 stays cool and core2 heats up which implies that core0 and core2 are now partners. As I move the load to each individual core it's obvious that the readings being displayed as core0,core2 are coming from one dual core and the readings from core1,core3 are coming from the other dual core.
Anyone with a Quad has seen this before, where two cores run at one temperature while the other two run at a slightly different temperature. There has never been any consistency in which cores run at the same temperature, but now it's starting to make more sense. Either there is a bug within Intel Quad processors where, when software asks to read a sensor, it sometimes returns temperature data from the wrong core, or there is a bug within Visual C++.
Given that 3 different programs are all reporting the same thing, the bug looks more like an Intel issue. One more reason not to remount your heatsink 101 times when the temp data is looking a little odd on a Quad. The temp data is odd!
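The load test described above boils down to a simple rule: pin a load to one core and see which other core's temperature rises the most; that core is its die partner. A rough sketch of that inference (the temperature arrays are made-up illustration values, not measurements from the thread):

```python
def infer_partner(idle_temps, loaded_temps, loaded_core=0):
    """Given per-core temps at idle and with a load pinned to one core,
    guess which other core shares a die with it: the one whose
    temperature rose the most."""
    rises = [lt - it for it, lt in zip(idle_temps, loaded_temps)]
    candidates = [i for i in range(len(rises)) if i != loaded_core]
    return max(candidates, key=lambda i: rises[i])

# Expected quad behavior: core0 loaded, core1 (same die) heats up too.
print(infer_partner([40, 40, 38, 38], [62, 58, 42, 41]))  # → 1
# The "swapped" behavior described above: core2 heats up instead.
print(infer_partner([40, 38, 40, 38], [62, 42, 58, 41]))  # → 2
```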
Thank God for RealTemp. If this new pattern stays the same I can just quickly swap calibration factors for core1 and core2 and I should be good for the next few months. The competition continues to ignore the numerous problems with getting accurate data out of these sensors, and not just the 45nm sensors.
I had noticed that a lot. Either Core 0 and Core 1 were linked, or Core 0 and Core 2 were linked. I have yet to see Core 0 and Core 3 linked.
WoZZeR999: You don't have to look at too many screen shots to see the linking of the cores in a Quad. I always thought that Quads came in two varieties with either core0/core1 linked or core0/core2 linked. It's easy enough to test for this by moving Prime, using the Task Manager, to each core as it is running. Now I've got one processor that after 3 months of use has suddenly changed.
Here's a screen shot from the end of March when I first installed this Q6600 showing the original behavior.
http://www.xtremesystems.org/forums/...&postcount=567
Now core1 and core2 have swapped, and all temp software is reporting the same thing, so it wasn't a change that I made to RealTemp. Maybe Quads are designed to do this after so many hours of service. :shrug: I've seen a lot of hard-to-explain stuff coming from these sensors, but this latest issue is a mystery.
RealTemp 2.65 is available in the beta section:
http://www.fileden.com/files/2008/3/...alTempBeta.zip
Just a quick update so that RealTemp can report current VID as well as Maximum VID by clicking on the toggle switch in the upper right. If you go into the Control Panel -> Power Options and set your processor to Portable/Laptop, you might see a difference in these two values.
Here's what my board shows with two instances of RealTemp running:
http://img411.imageshack.us/img411/8138/rtvidkv7.png
@unc,
Thanks for answering my question above. Yes, I am using Firefox 3, and before I posted that I did clear my cache, but still no go. So what I just did is use a download manager (FlashGet) and it worked fine. I also tried IE and it downloaded the latest one, 2.65.
again keep up the good work and nice to see you also added the multipliers. cheers!
In the Help section it mentions mobile PCs and options such as On Battery or Plugged In, but I don't think these options, or any option related to laptops, are available unless it's a laptop. On my Vista 32, and I believe also on Vista 64, there are three available power plans: Balanced, Power Saver and High Performance. They all have the same options.
Balanced Plan | Power Saver | High Performance
-------------
Default settings
Turn off the display: 20 minutes (1 minute to 5 hours or never)
Put the computer to sleep: 1 hour (1 minute to 5 hours or never)
I can't find any of these under Advanced settings. :shrug: Maybe a more experienced Vista user will figure it out!
Quote:
On the Advanced settings tab ...
Suppose that you frequently use a mobile PC to give presentations, and you want the display to stay on during the presentations. Consequently, you also want the mobile PC to stay awake while you give your presentations.
To keep the display on during presentations
Expand Display, expand Turn off display after, click On battery or Plugged in, and then click the arrow to change the setting to Never. You can also type the word Never in the box.
To prevent the mobile PC from going to sleep during presentations
Expand Sleep, expand Sleep after, click On battery or Plugged in, and then click the arrow to change the setting to Never. You can also type the word Never in the box.
Thanks loonym. If I had a dollar for every time you gave me a thumbs up during this project I'd be a rich guy!
I had a look at your previous post:
http://www.xtremesystems.org/forums/...postcount=1299
where you were getting two different VID numbers depending on what motherboard you were using. It looks like the latest version of RealTemp is reporting the two different VID values that you were previously seeing. I think I found a way to read the Minimum VID as well but it's not documented by Intel. The manual says "Reserved" and leaves you guessing but most of my guesses have been pretty good so far.
msgclb: The Portable / Laptop trick works in XP on Desktop processors. It sometimes drops the multi and drops the reported VID. Some boards need C1E enabled for this to work and some boards like the one I'm using don't.
Nice update. Now I think you should add a command line option to decide whether to center the temperatures (for dual core CPUs) or not (for quad core CPUs), because it looks weird. Just a suggestion... :D
On my Q6600/B3 I see VID = MaxVID.
CPU VID: 1.3250V
CPU VID set in BIOS: 1.3550V
CPU real voltage/full load: 1.290V
PS. Hello to everyone, especially to unclewebb :)
Yeah, thanks again for a nice proggy :)
I really like that you are developing it all the time, changing it for the better.
I would also like the digits centered when using a dual core, 'cause it just doesn't look good :P
And one more thing: when double-clicked, I can't move the window anymore to where I want...
Thanks again uncle!
zorzyk: Your screen shot is a perfect example of a recent bug I found. Not a bug in RealTemp but a bug in all core temperature monitoring programs like CoreTemp, RealTemp, SpeedFan and Everest to name a few of the majors. With the cores being numbered from 0 to 3, there is a bug or feature, possibly at the hardware level where the data returned for the center cores, 1 and 2, is getting swapped. This just started happening on my Q6600 the other day.
If you swap the two center cores on your processor, you get 46,46 on one of the Dual cores in your Quad and 42,41 on the other. When you see a pattern of high, low, high, low, that is usually a pretty good sign of this problem.
The easiest way to test your Quad for this issue is to run Prime95 and then go into the Task Manager and use the Set Affinity... option and set Prime to run on core0 only. During this test, core0 and core1 should increase in temperature because they are physically side by side while core2 and core3 should be at a slightly lower temperature. My Q6600 used to show that until one day it switched. Now when I apply a Prime load only to core0, all temperature monitoring software shows core2 heating up while core1 and core3 are now linked and are running cooler. This is wrong and is one of the reasons why Quad data looks very odd sometimes for many users.
Good news is I've come up with a quick way to test for this so RealTemp should be able to adjust for this and automatically report the cores in their proper positions. I'll post a fresh beta later today so Quad core users can test this out.
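Once the swap has been detected, the correction itself is nothing more than exchanging the two center readings before display. A minimal sketch of that adjustment (this is my illustration of the idea, not RealTemp's actual code):

```python
def correct_core_order(temps, swapped):
    """If the center-core swap has been detected, exchange the readings
    for core1 and core2 so cores 0/1 and 2/3 are reported as die pairs."""
    if swapped:
        t = list(temps)
        t[1], t[2] = t[2], t[1]
        return t
    return list(temps)

# The high, low, high, low pattern becomes high, high, low, low:
print(correct_core_order([46, 42, 46, 41], swapped=True))  # → [46, 46, 42, 41]
```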
On most boards, VID will always be equal to MAX VID, but if you are using a power saving mode and your motherboard supports it, VID can drop while your MAX VID stays the same. Check out loonym's screen shot above for an example of this. I thought it was time for software to report both of these values. I'm hoping that MAX VID is the same on all motherboards for a given processor while its VID can vary from one motherboard to the next. Users have previously been trying to compare VID numbers that weren't consistent.
Ovidiu: I don't like how RealTemp looks when using a Dual core either. It would be easy enough to center the data so it shows up in the center two positions and I might do that shortly. A much better solution would be to redesign the interface and create a dual core specific version so there isn't a bunch of wasted space. That's always been the long range plan which will hopefully get done in the near future.
Infa: Not being able to move the GUI is presently a limitation in mini-mode. Fixing that is on my things to do list as well.
FullSky: Thanks for confirming this. When I first observed my Q6600 suddenly reporting different values than what it had been reporting for the last 3 months, I thought I was losing my marbles! My first thought was that I did something to screw up RealTemp, but then I found that every other temp program confirmed that something had changed. It's hard to use RealTemp's calibration features if the center two cores swap back and forth at random.
Luckily my simple fix to test for and correct this problem has worked 100% so far during testing. CPU-Z is the only utility I've found that correctly handles this issue and reports the cores in their logical order, with core0 and core1 joined as well as core2 / core3. This bug also affects the Set Affinity... options within the XP Task Manager. It could be a hardware feature: when software asks to use core1 it gets core2 instead, and when software asks to use core2 it gets core1. It might be some scheme to balance the load and the temps across all 4 cores. Either that or it's Windows bug #3,435,278.
I'll probably have to include an option to disable this new feature because I'm sure there will be users that will never believe or understand it when RealTemp goes off on another tangent and starts reporting data that is different than all of the competition, again.
Unclewebb, like loonym said, nice job on VID! At idle with laptop power mode VID drops (EIST enabled, auto/stock on my GB P35 DQ6), under load it goes back to max VID. VID max remains constant as it should...so no cheating the VID max on realtemp! (I just realized you can have more than one realtemp open at same time...helps with testing.)
As for the core swapping, I have to say that I have encountered such readings (values are only for demonstration purposes):
1) 46-42-46-41 mostly
2) 46-42-42-46 very rare
3) 46-46-42-41 seldom
These changes in core pairing can be observed when I reset the PC time and time again.
Due to the shifting core configuration I was seeing, I started using different sets of calibration numbers: one set for when core 0 was tied to core 1 and one for when core 0 was tied to core 2. After a while, I learned to live with the effect, dropped the calibration altogether, and started relying only on what core 0 was doing.
The temperature patterns you show in #1 and #3, and their frequencies, are almost exactly what I get (but not the temps themselves). For me, the swap occurs infrequently, mostly after an initial boot, a reboot, or during an extended period of uptime, say 18 hours or more.
Carey
rge: Thanks for showing that it works. I'm pretty sure that both values, Min and Max VID can be read from a processor without a user having to make any adjustments to their Power configurations. I'll probably display the VID range on the Settings page and leave the real time VID on the main page.
FullSky: Of your 3 examples, #3 is correct and my new code will try to identify and correct #1. If that works maybe I will try to test and correct #2 as well.
The simplest test I've found to determine what cores are linked is to run Prime95 / Orthos and use SetAffinity to force it to only run on core0. The core it is linked with will also increase in temperature. When I first got my Q6600, core0 and core1 would go up while core2 and core3 would lag behind. That is what should happen on a Quad but now core0 and core2 go up while core1 and core3 lag. If anyone has a 45nm Quad they might want to try this test and post their temps.
RealTemp 2.66 that I'm working on swaps the center two cores so it looks like it should.
http://img525.imageshack.us/img525/7...testingpg6.png
Core0 is hottest and it heats up core1 to within a degree or two of it. The other two cores are separate and don't heat up as much. With 65nm, there is a very clear difference as you move farther away from the core that is doing the work.
Previous versions of RealTemp and all of the other temp software I tested show high, low, high, low, which is wrong. Be careful when testing, because the Task Manager has this bug as well: clicking on CPU 1 will not get you the real core1 if core0 and core2 are linked.
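For anyone scripting this test instead of clicking through the Task Manager dialog, the affinity value the OS takes is just a bitmask with one bit per logical CPU. A small sketch of building such a mask (how the OS then maps CPU numbers to physical cores is exactly the question at issue in this thread):

```python
def affinity_mask(*cpus):
    """Build the bitmask that Set Affinity... ultimately sets:
    bit n selects logical CPU n."""
    mask = 0
    for c in cpus:
        mask |= 1 << c
    return mask

print(hex(affinity_mask(0)))     # → 0x1  (pin to CPU0 only)
print(hex(affinity_mask(1, 3)))  # → 0xa  (CPU1 and CPU3)
```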
graysky: I try to keep the beta versions sort of low profile so you have to go looking for them. It gives me a chance to test things out without 1001 people screaming the moment I screw something up. I've been using the same URL for a while now so hide that in your favs and you'll be OK.
Go figure. I somehow turned a couple of lines of code to read temps from Core processors into a 60-page thread. :D
Quote:
It took me forever to scroll through all these pages
Hi uncle, thank you again for the hard work and for updating this nice tool :)
Here are some screens of my 45nm QX9650. It seems like the same issue as your Q.
When only CPU2 is selected in Task Manager (affinity CPU2), which should be core#3 by common numbering, it makes very little change in the readings.
When only CPU3 is selected (affinity CPU3, core#4), it seems to raise the readouts for Core#1 and Core#2, aka CPU0 and CPU1, but due to the crappy sensor (I think), at default clocks and voltage it does not rise over 27C, which seems to be the dead (zero) point for that core. (Anyway, I can get it moving at higher clocks and voltage, so it isn't totally stuck.)
(My idle temps at stock setup are 25-25-21-27 by RT.)
So is this true or not? :shrug: I want to believe it is :) My case is open and ambient is ~23-24C.
When CPU0 or CPU1 is selected in Task Manager (affinity CPU0 or CPU1), there is basically no difference which one of them it is: the readings for core#1 and core#2 (by RT or others, e.g. Everest) both rise as if linked.
No calibration was used in these tests. I'm currently using beta v2.64.
The screens are 4 separate runs of Prime95; in each run 1 worker thread (Blend) was started and affinity was set before launching the thread.
(I also did a double check by loading 1 SmallFFT thread in Prime95 and then switching affinity within the running worker thread. It's visible which core is at 100% load in the Task Manager performance window, but the temp results were the same.)
load1-thread-Affinity CPU0
http://img131.imageshack.us/img131/6...cpu0sj4.th.png
load1-thread-Affinity CPU1
http://img520.imageshack.us/img520/2...cpu1ts0.th.png
load1-thread-Affinity CPU2
http://img296.imageshack.us/img296/6...cpu2ef7.th.png
load1-thread-Affinity CPU3
http://img517.imageshack.us/img517/5...cpu3ef1.th.png
----
load4-thread-Affinity CPU-ALL
http://img363.imageshack.us/img363/3...cpuazc1.th.png
My E8400 IR-guns at 95 at DTS=0, like unclewebb's, so its TjMax is likely 95, but I wanted to test a bare die to confirm. I killed my E6850 (soldered IHS) trying to remove the IHS, so I got an E7200: die attach only, no solder.
I used an IR gun (which is accurate to within 1C of a calibrated touch temp probe). I also have a calibrated thermocouple en route (for the Fluke multimeter), which should be here by next Thursday.
Pics 1 and 2 show the E7200 casing temp (IHS still on) in an undervolted state. The casing temp reads a consistent 88.8 to 89.9C most of the time (add the delta to TjMax to the IR reading to get the temp at DTS=0).
Pic 3 is the IHS removed (very easy with a razor blade).
Pic 4 is the IR gun on the bare die. The bare die temp reads nearly the same thing, but more consistently 90 to 90.5C (again, adding the delta to TjMax to the IR reading for the temp at DTS=0).
The point is, Intel documents show that at idle there is essentially no gradient across the die, and thus at DTS=0 the measured die temp = TjMax. And to me this shows that it is accurate to measure die temps with the IHS intact in an undervolted, underclocked state, where the gradient is ~1C or less.
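The arithmetic rge describes is simply: TjMax estimate = IR surface reading + the DTS headroom at the moment of the reading. A sketch (the E8400 DTS value of 21 is inferred from the 73.9C IR / ~95C TjMax figures elsewhere in the thread, not a quoted measurement):

```python
def estimate_tj_max(ir_temp, dts_at_reading):
    """TjMax estimate from an IR-gun measurement: the surface reads
    ir_temp while the sensor still shows dts_at_reading degrees of
    headroom, so TjMax is approximately ir_temp + dts."""
    return ir_temp + dts_at_reading

# E7200 bare-die numbers from the post: die at ~90C with DTS near 0.
print(estimate_tj_max(90.0, 0))                    # → 90.0
# E8400 example: IR 73.9 with an assumed 21 of DTS headroom left.
print(round(estimate_tj_max(73.9, 21), 1))         # → 94.9
```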
i43: Thanks for the results. When I was testing, I had my CPU fan on its lowest setting to help create some heat. A little extra core voltage helps too. Stuck sensors will make the results harder to see. I know with my E8400, when one core was running Prime, the second core would heat up to within 0.5C on average of the first. The gradient from core to core is much smaller with 45nm compared to 65nm.
rge: Nice to know that Intel isn't using solder on the E7200. If you ever get bored do you want to try testing at about 60C? I found with a small case / cpu fan blowing across the IHS that things were able to stabilize and I had plenty of time to measure with the IR gun without the temps moving.
Those new temp fonts are easy to see even when using a camera for screen shots! :D
If my mobo and/or CPU is not dead (I'm having to use the wife's computer for the time being), I will try to get some pics at lower temps, but that may be difficult unless I reattach the IHS. But I think my E7200 TjMax is 90C, unlike the E8400 at 95C. I have a box fan on the floor, and with the IHS on, temps were rising slowly. I know that at 70C the delta to TjMax was 20, and the temp held there a while, but I did not get a pic there; I was trying to get pics near DTS=0.
With the bare die it's much more difficult to control temps, but it's somewhat controllable once around 85C or so. The IHS is an excellent heat sink, I'm finding out.
Unfortunately, during some testing right after I posted, some smoke billowed out from around the CPU.
BTW... people that take off the IHS for temp control... you guys are crazy :D
EDIT: Well, happily, my mobo lives. My E7200, which still smells like burnt electronics, did not make it. Still not sure if it was a reset that set voltage to auto and overheated it... but it happened so quickly; at the time the surface temp read 100C. Interestingly, with no CPU in the socket, the computer does not beep/post but runs all day. With the dead CPU, it just immediately shuts off. I was planning on overclocking the E7200 to death after temp testing... maybe I should have done that the other way around.
rge: Thanks for your sacrifice literally! I might have to buy an E7200 myself to see how it compares to my previous testing but I think I'll keep the IHS on. My basement at 14C helps keep the temps down.
RealTemp 2.66 is finally done and is ready for some weekend testing.
http://www.fileden.com/files/2008/3/...alTempBeta.zip
http://img57.imageshack.us/img57/4259/coreswapdy7.png
New features include the reporting of the Minimum and Maximum VID for your processor in the Settings window. On my Q6600 it is able to read these values while Prime is running. This is experimental code with no documentation from Intel to back up my findings so if the Min / Max numbers look out to lunch on your CPU then let me know. The real time VID is still displayed on the main page.
My Q6600 has the bug where after 3 months the center core temps have been swapped. RT 2.66 tests for this condition and corrects it. The screen shot shows Prime running and SetAffinity... was used to force it to run on core0 only. RT correctly shows core1 heating up while core2 and core3 are cooler like they should be. If you look at the DTS numbers (Distance to TjMax) you can see that RealTemp has swapped the Core1 temp data with Core2 which I believe is correct. If you have a Quad then you can do a Test Sensors test and if your center cores are reversed then it will report that it has swapped them and if everything is OK then nothing new will be reported. I don't think my test for this Quad bug will work on mobile Quad processors so I'll probably include a separate option later to enable or disable it.
I did a couple of bug fixes to Log File output for Dual cores.
I've also tried to fix the problem with RealTemp not minimizing to the System Tray on initial start up when that option is selected. It seems to only be an issue for some users with Vista x64. If you had that bug with previous versions then let me know if this version has fixed it or not.
RealTemp 2.66:
- I did not reset the PC, so nothing changed in core assignment compared to the screenshot in my earlier post. Now Core#0 and #1 are presented in RT as joined together, but in CoreTemp they are swapped,
- on the RT main page VID is displayed as MaxVID (unchanged) rather than as the real VID, which is 1.290V at the moment (CPU at full load, crunching with the Folding@Home SMP client :)),
- on the RT Settings page MinVID is displayed correctly AFAIR; I saw 1.1625V when I calibrated the RT idle temperature.
I won't be taking off any more IHSs myself. I had to do it once just to prove the gradient to myself, so I'm glad I did it. But temps are way too difficult to control with the IHS off, and it is not necessary for temp testing.
It would be interesting, if you get one, to see the temps. I was expecting to see 95 at DTS=0 on the E7200, instead of 90.
I double-checked my E8400 yesterday, since I had everything already set up, and I'm still getting the same ~94.3-95C with it. Again, it's held at temp with the floor box fan, adjusting fan speed/angle to adjust temps. The pic is with the IR reading 73.9 and the RT core at 74.
BTW, the VID feature is awesome now! The max/min of 1.15 to 1.225 and current VID of 1.225 are working perfectly on mine.
Here is pic from yesterday of my E8400 versus realtemp, with same setup.
Just wondering why my XS Bench score is just 981 compared to 1000 for the E8400. My CPU is also an E8400, so shouldn't I get the same score as the base one?
Pictures like that speak volumes. Keep it handy for the non-believers!
Quote:
rge: Pic is with IR reading 73.9, and RT core 74.
You've got me curious about the E7200. I might pick one up for a test if the price is right.
I think this is a common misunderstanding. VID may not have anything to do with the actual core voltage that your processor is receiving. CPU-Z typically reports Core Voltage, or actual voltage. Think of VID as the voltage that the processor is requesting from the motherboard. The motherboard, depending on how it is set up, can use this information or ignore it. If your bios is set to AUTO and you are not overclocking, then motherboards are designed to set your maximum core voltage to your maximum VID. There is always a little bit of voltage droop, so you typically end up with slightly less than the maximum. Your max VID is 1.3250, so if CPU-Z is reporting 1.29 volts then it might be working the way it is designed to. If you are manually setting core voltage in the bios, then VID information is usually ignored.
Quote:
on the RT main page VID is displayed as MaxVID (unchanged) rather than as real VID, which is 1.290V at the moment
With my Q6600 at default MHz and AUTO voltage with C1E enabled, CPU-Z reports core voltage just less than Minimum VID at idle and just less than Maximum VID at full load.
http://img410.imageshack.us/img410/108/idlevidrq0.png
And here it is with Orthos warming up a couple of cores.
http://img292.imageshack.us/img292/1...thosvidqf5.png
A small amount of voltage droop between what the processor asks for and what the motherboard delivers is normal. The new Min / Max VID numbers in RealTemp definitely have some meaning on my CPU and board.
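The droop described above is just the difference between the voltage the processor requests (VID) and the voltage the board actually delivers. A trivial sketch using zorzyk's numbers from earlier in the thread:

```python
def vdroop(vid, measured):
    """Droop between what the CPU requests (VID) and what the
    motherboard delivers (measured core voltage)."""
    return vid - measured

# zorzyk's Q6600: max VID 1.3250V, real full-load voltage 1.290V.
print(round(vdroop(1.3250, 1.290), 3))  # → 0.035
```

A small positive result like this is normal; a large one usually means the board is ignoring VID because the voltage was set manually in the bios.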
This is what you should see after clicking on Test Sensors if RT has swapped your center cores.
http://img95.imageshack.us/img95/7235/quadbugov4.png
jcniest5: What operating system are you using? The bench score is based on Windows XP with a minimum of background junk running. I don't remember seeing any Vista scores that were significantly less but maybe a few Vista users can post their scores for comparison. My E8400 at 3000 MHz as well as my Q6600 at 3000 MHz both score 1000. It's only a single core bench and doesn't take advantage of the extra cache in the E8400 so there is no difference between the two. The only difference is that on a Quad I can run 4 instances of the RealTemp XS Bench at the same time and the scores are all very close together.
http://img171.imageshack.us/img171/2792/quadrnthp1.png
Sorry unclewebb, but it looks like 2.66 does not fix the core swap bug for me.
The following image shows RT 2.66 on the left and CoreTemp on the right. Both show the core swap bug, both are in agreement, and RT is apparently not adjusting for the bug. When I test the sensors, I do not see a message stating the core bug was taken into account.
http://members.cox.net/wifeometer/RT...ity_test_4.jpg
On rebooting, the core swap bug suddenly goes away (which is typical), the temperatures suddenly make sense and both RT and CoreTemp temps are still in agreement.
http://members.cox.net/wifeometer/RT...ity_test_3.jpg
When I reboot again the swap bug returns.
When the bug is not present, running Prime95 with affinity favoring specific cpus has predictable results. When the bug is in effect, running the same tests on single cores produces skewed results.
Keep up the good work, unclewebb! And let me know if I can provide any details you may need. I'd love to see this problem solved.
Carey