Outstanding!!! Another wonderful update to an already great release. :up:
@unc,
thanks for answering my question above. Yes, I am using Firefox 3, but before I posted that I did clear my cache and still no go. So what I just did is use a download manager (FlashGet) and it worked fine. I also tried IE and it downloaded the latest one, 2.65.
Again, keep up the good work, and nice to see you also added the multipliers. Cheers!
In the Help section it mentions mobile PCs and options such as On Battery or Plugged In, but I don't think these options, or any options related to laptops, are available unless the machine is a laptop. On my Vista 32, and I believe also on Vista 64, there are three available power plans: Balanced, Power Saver and High Performance. They all have the same options.
Balanced | Power Saver | High Performance
All three plans share the same default settings:
- Turn off the display: 20 minutes (1 minute to 5 hours, or Never)
- Put the computer to sleep: 1 hour (1 minute to 5 hours, or Never)
I can't find any of these under Advanced settings. :shrug: Maybe a more experienced Vista user will figure it out!
Quote:
On the Advanced settings tab ...
Suppose that you frequently use a mobile PC to give presentations, and you want the display to stay on during the presentations. Consequently, you also want the mobile PC to stay awake while you give your presentations.
To keep the display on during presentations
Expand Display, expand Turn off display after, click On battery or Plugged in, and then click the arrow to change the setting to Never. You can also type the word Never in the box.
To prevent the mobile PC from going to sleep during presentations
Expand Sleep, expand Sleep after, click On battery or Plugged in, and then click the arrow to change the setting to Never. You can also type the word Never in the box.
Thanks loonym. If I had a dollar every time you gave me a thumbs up during this project I'd be a rich guy!
I had a look at your previous post:
http://www.xtremesystems.org/forums/...postcount=1299
where you were getting two different VID numbers depending on what motherboard you were using. It looks like the latest version of RealTemp is reporting the two different VID values that you were previously seeing. I think I found a way to read the Minimum VID as well but it's not documented by Intel. The manual says "Reserved" and leaves you guessing but most of my guesses have been pretty good so far.
msgclb: The Portable / Laptop trick works in XP on Desktop processors. It sometimes drops the multi and drops the reported VID. Some boards need C1E enabled for this to work and some boards like the one I'm using don't.
Nice update, now I think you should make a command line which should decide if to center(dualcore cpu's) or not(quadcore cpu's) the temperatures because it looks weird. Just a suggestion... :D
On my Q6600/B3 I see VID = MaxVID.
CPU VID: 1.3250V
CPU VID set in BIOS: 1.3550V
CPU real voltage/full load: 1.290V
PS. Hello to everyone, especially to unclewebb :)
Yeah thanks again for a nice proggy :)
Really like that you are developing it all the time and changing it for the better.
I would also like the digits centered when using a dual core... 'cause it just doesn't look good :P
And one more thing: when double-clicked, I can't move the window anymore to where I want...
thanks again uncle!
zorzyk: Your screen shot is a perfect example of a recent bug I found. Not a bug in RealTemp, but a bug in all core temperature monitoring programs: CoreTemp, RealTemp, SpeedFan and Everest, to name a few of the majors. With the cores numbered from 0 to 3, there is a bug, or feature, possibly at the hardware level, where the data returned for the center cores, 1 and 2, gets swapped. This just started happening on my Q6600 the other day.
If you swap the two center cores on your processor you get 46,46 on one set of the Dual cores in your Quad and 42,41 on the other Dual core. When you see a pattern of high, low, high, low that is usually a pretty good sign of this problem.
The easiest way to test your Quad for this issue is to run Prime95 and then go into the Task Manager and use the Set Affinity... option and set Prime to run on core0 only. During this test, core0 and core1 should increase in temperature because they are physically side by side while core2 and core3 should be at a slightly lower temperature. My Q6600 used to show that until one day it switched. Now when I apply a Prime load only to core0, all temperature monitoring software shows core2 heating up while core1 and core3 are now linked and are running cooler. This is wrong and is one of the reasons why Quad data looks very odd sometimes for many users.
Good news is I've come up with a quick way to test for this so RealTemp should be able to adjust for this and automatically report the cores in their proper positions. I'll post a fresh beta later today so Quad core users can test this out.
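The high / low pattern described above is enough to sketch the correction in code. To be clear, this is a toy reconstruction of the idea, not RealTemp's actual (unpublished) test, and the 2C margin is an assumption:

```python
# Sketch of the pattern heuristic described above: under a load pinned
# to core 0, its die-neighbor (core 1) should also be warm, and the
# other pair (cores 2 and 3) cooler. If instead the readings alternate
# high, low, high, low, the two center readings are likely swapped.
# NOT RealTemp's real detection code; a toy reconstruction with an
# assumed 2C margin.

def center_cores_swapped(temps, margin=2.0):
    """temps = [c0, c1, c2, c3] measured with a load pinned to core 0."""
    c0, c1, c2, c3 = temps
    # Expected: c0 ~ c1 (same die), both warmer than c2/c3.
    # Swapped symptom: c2 tracks c0 while c1 sits down with c3.
    return (c2 - c1) > margin and abs(c0 - c2) < abs(c0 - c1)

def correct_order(temps):
    """Return temps with cores 1 and 2 exchanged if the pattern says so."""
    if center_cores_swapped(temps):
        c0, c1, c2, c3 = temps
        return [c0, c2, c1, c3]
    return list(temps)

if __name__ == "__main__":
    print(correct_order([46, 41, 46, 42]))  # alternating pattern: swap applied
    print(correct_order([46, 46, 42, 41]))  # already sane: unchanged
```

A real implementation would of course sample temperatures over time rather than trust a single snapshot.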
On most boards, VID will always be equal to MAX VID but if you are using a power saving mode and your motherboard supports it, VID can drop while your MAX VID will stay the same. Check out loonym's screen shot above for an example of this. I thought it was time for software to report both of these values. I'm hoping that MAX VID is the same on all motherboards for a processor while it's VID can vary from one motherboard to the next. Users have previously been trying to compare VID numbers that weren't consistent.
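For the curious, the real-time VID most tools report is conventionally pulled from the low bits of the IA32_PERF_STATUS MSR (0x198). The exact VID-to-voltage table is not documented for every processor and voltage regulator spec, so the base and step constants below are assumptions chosen to match the voltages quoted in this thread:

```python
# Sketch: decoding a VID field from a raw MSR value.
# ASSUMPTIONS: IA32_PERF_STATUS (0x198) holds the current VID in its
# low byte, and the VID-to-voltage table is linear. Neither the base
# voltage nor the step size is documented for every CPU / VR combo,
# so treat these constants as illustrative, not authoritative.

VID_BASE_V = 0.7125   # assumed base of the linear VID table
VID_STEP_V = 0.0125   # assumed volts per VID step

def vid_to_volts(vid, base=VID_BASE_V, step=VID_STEP_V):
    """Convert a raw VID code to the nominal requested voltage."""
    return round(base + vid * step, 4)

def current_vid(perf_status_msr):
    """Extract the VID field (assumed low 8 bits) of IA32_PERF_STATUS."""
    return perf_status_msr & 0xFF

if __name__ == "__main__":
    raw = 0x0931  # hypothetical MSR value: multiplier 9, VID code 0x31
    vid = current_vid(raw)
    print(f"VID code {vid:#04x} -> {vid_to_volts(vid)} V")
```

With these assumed constants, VID code 49 (0x31) decodes to the 1.3250V max VID quoted above, and code 36 to the 1.1625V min; remember VID is only what the CPU requests, not what the board delivers.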
Ovidiu: I don't like how RealTemp looks when using a Dual core either. It would be easy enough to center the data so it shows up in the center two positions and I might do that shortly. A much better solution would be to redesign the interface and create a dual core specific version so there isn't a bunch of wasted space. That's always been the long range plan which will hopefully get done in the near future.
Infa: Not being able to move the GUI is presently a limitation in mini-mode. Fixing that is on my things to do list as well.
FullSky: Thanks for confirming this. When I first observed my Q6600 suddenly reporting different values than what it's been reporting for the last 3 months I thought I was losing my marbles! My first thought was that I did something and screwed up RealTemp but then I found that every other temp program was confirming that something had changed. Hard to use RealTemp's calibration features if the center two cores swap back and forth at random.
Luckily my simple fix to test for and to correct this problem is working 100% so far during testing. CPU-Z is the only utility I've found that is correctly handling this issue and reporting the cores in their logical order, with core0 and core1 being joined as well as core2 / core3. This bug also affects the Set Affinity... options within the XP Task Manager. It could be a hardware feature, so when software asks to use core1 it gives them core2 instead, and when software asks to use core2 it returns core1. It might be some scheme to balance the load and the temps across all 4 cores. Either that or it's Windows bug #3,435,278.
I'll probably have to include an option to disable this new feature because I'm sure there will be users that will never believe or understand it when RealTemp goes off on another tangent and starts reporting data that is different than all of the competition, again.
Unclewebb, like loonym said, nice job on VID! At idle with the laptop power mode, VID drops (EIST enabled, auto/stock on my GB P35-DQ6); under load it goes back to max VID. Max VID remains constant, as it should... so no cheating the max VID in RealTemp! (I just realized you can have more than one RealTemp open at the same time... helps with testing.)
As for the core swapping, I have to say that I have encountered readings like these (values are for demonstration purposes only):
1) 46-42-46-41 mostly
2) 46-42-42-46 very rare
3) 46-46-42-41 seldom
These changes in core pairing can be observed when I reset the PC time and time again.
Due to the shifting core configuration I was seeing, I started using different sets of calibration numbers: one set for when core 0 was tied to core 1, and one for when core 0 was tied to core 2. After a while, I learned to live with the effect, dropped the calibration altogether, and started relying only on what core 0 was doing.
The temperature patterns you show in #1 and #3, and their frequencies, are almost exactly what I get (but not the temps themselves). For me, the swap occurs infrequently, mostly after an initial boot, a reboot, or during an extended period of uptime, say 18 hours or more.
Carey
rge: Thanks for showing that it works. I'm pretty sure that both values, Min and Max VID can be read from a processor without a user having to make any adjustments to their Power configurations. I'll probably display the VID range on the Settings page and leave the real time VID on the main page.
FullSky: Of your 3 examples, #3 is correct and my new code will try to identify and correct #1. If that works maybe I will try to test and correct #2 as well.
The simplest test I've found to determine what cores are linked is to run Prime95 / Orthos and use SetAffinity to force it to only run on core0. The core it is linked with will also increase in temperature. When I first got my Q6600, core0 and core1 would go up while core2 and core3 would lag behind. That is what should happen on a Quad but now core0 and core2 go up while core1 and core3 lag. If anyone has a 45nm Quad they might want to try this test and post their temps.
RealTemp 2.66 that I'm working on swaps the center two cores so it looks like it should.
http://img525.imageshack.us/img525/7...testingpg6.png
Core0 is hottest and it heats up core1 to within a degree or two of it. The other two cores are separate and don't heat up as much. With 65nm, there is a very clear difference as you move farther away from the core that is doing the work.
Previous versions of RealTemp and all of the other temp software I tested shows high, low, high, low which is wrong. Be careful when testing because the Task Manager has this bug as well. Clicking on CPU 1 will not get you the real core1 if core0 and core2 are linked.
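If you'd rather script the affinity half of this test than click through Task Manager, something like the following works. Note that os.sched_setaffinity is Linux-only; on Windows you'd use Task Manager's Set Affinity... as described above, or a helper such as psutil's cpu_affinity:

```python
# Scripted version of the "pin a load to core 0" test described above.
# os.sched_setaffinity is Linux-only; Windows users should use Task
# Manager's Set Affinity... dialog or psutil instead.
import os
import time

def pin_to_cpu(cpu):
    """Restrict this process to a single logical CPU (0 = self)."""
    os.sched_setaffinity(0, {cpu})

def busy_loop(seconds):
    """Burn CPU on whichever core we are pinned to."""
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass

if __name__ == "__main__":
    pin_to_cpu(0)
    print("running on CPUs:", sorted(os.sched_getaffinity(0)))
    busy_loop(2.0)  # watch which core's temperature rises meanwhile
```

While the loop runs, the core that heats up alongside the loaded one tells you which cores are physically paired; beware that if the swap bug is active, logical CPU numbers won't match physical positions.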
graysky: I try to keep the beta versions sort of low profile so you have to go looking for them. It gives me a chance to test things out without 1001 people screaming the moment I screw something up. I've been using the same URL for a while now so hide that in your favs and you'll be OK.
Go figure. I somehow turned a couple of lines of code to read temps from Core processors into a 60 page thread. :D
Quote:
It took me forever to scroll through all these pages
Hi uncle, thank you again for the hard work on updating this nice tool :)
Here are some screens of my 45nm QX9650; seems like the same issue as your Q.
When only CPU2 is selected via Task Manager (affinity CPU2), which should be core #3 by the usual numbering, it makes very little change in the readings.
When only CPU3 is selected (affinity CPU3, core #4), it seems to raise the readouts for core #1 and core #2 (aka CPU0 and CPU1), but due to the crappy sensor (I think)... at default clocks and voltage it does not rise over 27C, which seems to be the dead (stuck) point for that core. (Anyway, I can get it moving at higher clocks and voltage, so it isn't totally stuck.)
(My idle temps at stock setup are 25-25-21-27 by RT.)
So is this true or not? :shrug: I want to believe it is :) ... my case is open and ambient is ~23-24C.
When only CPU0 or CPU1 is selected via Task Manager (affinity CPU0 or CPU1), there is basically no difference which one of them you pick; the readings for core #1 and core #2 (by RT or others, e.g. Everest) both rise as if linked.
No calibration was used in these tests; I'm currently using beta v2.64.
The screens are 4 separate runs of Prime95. Each run had 1 worker thread (Blend) selected, and affinity was set before launching the thread. (I also double-checked by loading 1 Small FFT thread in Prime95 and then switching affinity on the running worker thread; it is visible which core is at 100% load in the Task Manager performance window, but the temperature results were the same.)
load1-thread-Affinity CPU0
http://img131.imageshack.us/img131/6...cpu0sj4.th.png
load1-thread-Affinity CPU1
http://img520.imageshack.us/img520/2...cpu1ts0.th.png
load1-thread-Affinity CPU2
http://img296.imageshack.us/img296/6...cpu2ef7.th.png
load1-thread-Affinity CPU3
http://img517.imageshack.us/img517/5...cpu3ef1.th.png
----
load4-thread-Affinity CPU-ALL
http://img363.imageshack.us/img363/3...cpuazc1.th.png
Although my E8400 IR-guns at 95 at DTS=0, like unclewebb's, and its TjMax is likely 95, I wanted to test a bare die to confirm. I killed my E6850 (soldered IHS) trying to remove the IHS, so I got an E7200: die attach only, no solder.
My IR gun is accurate to within 1C of a calibrated touch temp probe. I also have a calibrated thermocouple en route (for my Fluke multimeter); it should be here by next Thursday.
Pics 1 and 2 are the E7200 casing temp (IHS still on) in an undervolted state. The casing temp reads a consistent 88.8 to 89.9C most of the time (add the delta to TjMax to the IR reading for the temp at DTS=0).
Pic 3 is the IHS removed (very easy with a razor blade).
Pic 4 is the IR gun on the bare die. The bare die temp reads nearly the same thing, but more consistently 90 to 90.5C (again, adding the delta to TjMax to the IR reading for the temp at DTS=0).
The point is, Intel's documents show that at idle there is no gradient across the die, and thus die temp = TjMax when DTS reads 0. And to me this shows that it is accurate to measure die temps with the IHS intact in an undervolted, underclocked state, where the gradient is ~1C or less.
i43: Thanks for the results. When I was testing I had my CPU fan on its lowest setting to help create some heat. A little extra core voltage helps too. Stuck sensors will make the results harder to see. I know with my E8400, when one core was running Prime, the second core would heat up to within 0.5C on average of the other one. The gradients from core to core are much smaller with 45nm compared to 65nm.
rge: Nice to know that Intel isn't using solder on the E7200. If you ever get bored do you want to try testing at about 60C? I found with a small case / cpu fan blowing across the IHS that things were able to stabilize and I had plenty of time to measure with the IR gun without the temps moving.
Those new temp fonts are easy to see even when using a camera for screen shots! :D
If my mobo and/or CPU isn't dead (I'm having to use my wife's computer for the time being), I will try to get some pics at lower temps, but it may be difficult unless I reattach the IHS. I think my E7200's TjMax is 90C, unlike the E8400 at 95C. I have a box fan on the floor, and with the IHS on, temps were rising slowly. I know at 70C the delta to TjMax was 20 and the temp held there a while, but I didn't get a pic there; I was trying to get pics near DTS=0.
Bare die, much more difficult to control temps, but somewhat controllable once around 85C or so, the IHS is an excellent heat sink I am finding out.
Unfortunately on some testing right after I posted, some smoke billowed out from around cpu.
BTW...people that take off IHS for temp control...you guys are crazy:D
EDIT: Well, happily, my mobo lives. My E7200, which still smells like burnt electronics, did not make it. I'm still not sure if it was a reset that set the voltage to Auto and overheated it, but it happened quickly; at the time, the surface temp read 100C. Interestingly, with no CPU in the socket the computer doesn't beep/POST but runs all day. With the dead CPU it just immediately shuts off. I was planning on overclocking the E7200 to death after temp testing... maybe I should have done it the other way around.
rge: Thanks for your sacrifice, literally! I might have to buy an E7200 myself to see how it compares to my previous testing, but I think I'll keep the IHS on. My basement at 14C helps keep the temps down.
RealTemp 2.66 is finally done and is ready for some weekend testing.
http://www.fileden.com/files/2008/3/...alTempBeta.zip
http://img57.imageshack.us/img57/4259/coreswapdy7.png
New features include the reporting of the Minimum and Maximum VID for your processor in the Settings window. On my Q6600 it is able to read these values while Prime is running. This is experimental code with no documentation from Intel to back up my findings so if the Min / Max numbers look out to lunch on your CPU then let me know. The real time VID is still displayed on the main page.
My Q6600 has the bug where, after 3 months, the center core temps have been swapped. RT 2.66 tests for this condition and corrects it. The screen shot shows Prime running with SetAffinity... used to force it to run on core0 only. RT correctly shows core1 heating up while core2 and core3 stay cooler, like they should. If you look at the DTS numbers (Distance to TjMax) you can see that RealTemp has swapped the Core1 temp data with Core2, which I believe is correct. If you have a Quad, run a Test Sensors test: if your center cores are reversed it will report that it has swapped them, and if everything is OK nothing new will be reported. I don't think my test for this Quad bug will work on mobile Quad processors, so I'll probably include a separate option later to enable or disable it.
I did a couple of bug fixes to Log File output for Dual cores.
I've also tried to fix the problem with RealTemp not minimizing to the System Tray on initial start up when that option is selected. It seems to only be an issue for some users with Vista x64. If you had that bug with previous versions then let me know if this version has fixed it or not.
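For reference, the DTS (Distance to TjMax) numbers shown in the screen shot come from the IA32_THERM_STATUS MSR (0x19C), whose bits 22:16 hold the Digital Readout in degrees below TjMax. Reading the MSR itself needs ring-0 access (a driver on Windows, /dev/cpu/N/msr on Linux); this sketch only decodes an already-captured raw value, and the 95C TjMax default is the value assumed in this thread for the E8400, not a universal constant:

```python
# Decode the Digital Readout (DTS) field of IA32_THERM_STATUS (0x19C).
# Bits 22:16 give the distance below TjMax in degrees C, so
# core temp = TjMax - readout. The tjmax default of 95 is the E8400
# value assumed in this thread; other models differ (e.g. 90 is
# suggested above for the E7200).

def digital_readout(therm_status_msr):
    """Extract the Digital Readout field, bits 22:16."""
    return (therm_status_msr >> 16) & 0x7F

def core_temp(therm_status_msr, tjmax=95):
    """Convert distance-to-TjMax into an absolute temperature."""
    return tjmax - digital_readout(therm_status_msr)

if __name__ == "__main__":
    raw = 0x0015_0000  # hypothetical raw MSR value: readout = 0x15 = 21
    print(f"DTS = {digital_readout(raw)}, temp = {core_temp(raw)} C")
```

This is why the choice of TjMax matters so much for calibration: the sensor only ever reports the distance, and every absolute temperature is derived from it.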
RealTemp 2.66:
- I did not reset the PC, so nothing changed in core assignment compared to the screenshot in my earlier post. Now Core #0 and #1 are presented in RT as joined together, but in CoreTemp they are swapped,
- on the RT main page VID is displayed as MaxVID (unchanged) rather than as the real VID, which is 1.290V at the moment (CPU under full load crunching with the Folding@Home SMP client :)),
- on the RT Settings page MinVID is displayed correctly AFAIR; I saw 1.1625V when I calibrated the RT idle temperature.
I won't be taking off any more IHSes myself. I had to do it once just to prove the gradient to myself, and I'm glad I did, but temps are way too difficult to control with the IHS off, and it isn't necessary for temp testing.
It would be interesting, if you get one, to see the temps; I was expecting to see 95 at DTS 0 instead of 90 on the E7200.
I double-checked my E8400 yesterday, since I had everything already set up, and I'm still getting the same ~94.3-95C with it. Again, it's held at temp with the floor box fan, adjusting fan speed/angle to adjust temps. The pic is with the IR reading 73.9 and the RT core at 74.
BTW... the VID feature is awesome now! The max/min of 1.15 to 1.225 and the current VID of 1.225 are working perfectly on mine.
Here is pic from yesterday of my E8400 versus realtemp, with same setup.
Just wondering why my XS Bench score is only 981 compared to 1000 for the E8400. My CPU is also an E8400, so shouldn't I get the same score as the baseline one?
Pictures like that speak volumes. Keep it handy for the non-believers!
Quote:
rge: Pic is with IR reading 73.9, and RT core 74.
You've got me curious about the E7200. I might pick one up for a test if the price is right.
I think this is a common misunderstanding. VID may not have anything to do with the actual core voltage that your processor is receiving. CPU-Z typically reports Core Voltage, the actual delivered voltage. Think of VID as the voltage that the processor is requesting from the motherboard. The motherboard, depending on how it is set up, can use this information or ignore it. If your bios is set to AUTO and you are not overclocking, motherboards are designed to set your maximum core voltage to your maximum VID. There is always a little bit of voltage droop, so you typically end up with slightly less than the maximum. Your max VID is 1.3250, so if CPU-Z is reporting 1.29 volts then it might be working exactly the way it was designed to. If you are manually setting the core voltage in the bios then VID information is usually ignored.
Quote:
on the RT main page VID is displayed as MaxVID (unchanged) rather than as real VID, which is 1.290V at the moment
With my Q6600 at default MHz and AUTO voltage with C1E enabled, CPU-Z reports core voltage just less than Minimum VID at idle and just less than Maximum VID at full load.
http://img410.imageshack.us/img410/108/idlevidrq0.png
And here it is with Orthos warming up a couple of cores.
http://img292.imageshack.us/img292/1...thosvidqf5.png
A small amount of voltage droop between what the processor asks for and what the motherboard delivers is normal. The new Min / Max VID numbers in RealTemp definitely have some meaning on my CPU and board.
This is what you should see after clicking on Test Sensors if RT has swapped your center cores.
http://img95.imageshack.us/img95/7235/quadbugov4.png
jcniest5: What operating system are you using? The bench score is based on Windows XP with a minimum of background junk running. I don't remember seeing any Vista scores that were significantly less but maybe a few Vista users can post their scores for comparison. My E8400 at 3000 MHz as well as my Q6600 at 3000 MHz both score 1000. It's only a single core bench and doesn't take advantage of the extra cache in the E8400 so there is no difference between the two. The only difference is that on a Quad I can run 4 instances of the RealTemp XS Bench at the same time and the scores are all very close together.
http://img171.imageshack.us/img171/2792/quadrnthp1.png
Sorry unclewebb, but it looks like 2.66 does not fix the core swap bug for me.
The following image shows RT 2.66 on the left and CoreTemp on the right. Both show the core swap bug, both are in agreement, and RT is apparently not adjusting for the bug. When I test the sensors, I do not see a message stating the core bug was taken into account.
http://members.cox.net/wifeometer/RT...ity_test_4.jpg
On rebooting, the core swap bug suddenly goes away (which is typical), the temperatures suddenly make sense and both RT and CoreTemp temps are still in agreement.
http://members.cox.net/wifeometer/RT...ity_test_3.jpg
When I reboot again the swap bug returns.
When the bug is not present, running Prime95 with affinity favoring specific cpus has predictable results. When the bug is in effect, running the same tests on single cores produces skewed results.
Keep up the good work, unclewebb! And let me know if I can provide any details you may need. I'd love to see this problem solved.
Carey