Look at VID in outdated Core Temp 0.96.1! :rofl:
VID changes depending on load :p:
CoreTemp is only a guide. It shows gains from higher MHz or volts. Relative to itself, it does the job.
VID is always FIXED! :yepp:
From unclewebb:
Quote:
I think Core Temp is assuming that it is some sort of Core 2 processor so it tries to do a VID calculation on whatever data it finds. There's still no known way to read VID from Core i5/i7/i9 CPUs.
I know it *should* be fixed :) I meant my VID DOES change, according to CT. Wrong, but funny
For Core i7 / i5, Intel moved the multiplier to where VID used to be located. If you use an old monitoring program, it will look in this CPU register and see a multiplier value where VID info used to be located. It then converts that into a meaningless VID number. If Core Temp 0.96.1 shows the VID changing on your Core i CPU, what it's really showing is that your multiplier is changing. Intel is usually very consistent from one generation to the next but once in a while they move things around which is why it's always a good idea to upgrade your software when a new series of CPUs comes out.
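unclewebb's explanation can be sketched in a few lines: an old tool reads the register, finds multiplier bits where it expects VID bits, and runs its Core 2 VID formula on them. The register layout (multiplier in bits [15:8] of a perf-status MSR) and the VID formula below are illustrative assumptions, not verified against Intel documentation.

```python
# Why an outdated monitor shows a "changing VID" on Core i5/i7 (sketch).
# Assumption: the current multiplier sits in bits [15:8] of a raw
# perf-status register value; a Core 2-era tool decodes those same bits
# with a hypothetical Core 2-style VID formula instead.

def current_multiplier(perf_status: int) -> int:
    """What a Core i-aware tool would read: the multiplier field."""
    return (perf_status >> 8) & 0xFF

def core2_style_vid(raw: int) -> float:
    """A hypothetical Core 2-style decode: base voltage + 12.5 mV steps."""
    return 0.825 + raw * 0.0125

perf_status = 0x1800            # example raw value: field = 0x18 = 24
print(current_multiplier(perf_status))  # 24 -> the real multiplier
print(core2_style_vid(24))              # 1.125 -> the bogus "VID"
```

So when the multiplier bounces between turbo bins, the misdecoded "VID" bounces with it, which is exactly the funny behavior reported above.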
Can anybody who can tweak the Gigabyte BIOS help me save some vcore? To keep 4 GHz I need too much vcore, like 1.38 V. Crazy..
Built a new i7 860 system!
My #1 requirement for the build was a system that would encode x264 significantly faster than my current E8400 Wolfdale @ 3.6 GHz.
So far, I'm impressed. Seems around 2.5x faster than my E8400. I've applied a mild Overclock to match core for core speed of my old system, and had the CPU at 100% load (encoding dual x264) for over 24 hours with no problems.
http://www.myalbumbank.com/albums/us...302/PC%7E0.png
@sarlock
Can I ask what your proc batch is?
@ezaleina
Batch#: L925B478T
I assume the "VIN1"/"Vcore2" readings are the VTT voltage? I need to get around to playing with my new system, I'm just too busy with a new game coming out every week. :)
The first (1.15) is the CPU VCore, the second (1.63) is the RAM Voltage.
BTW, love your avatar. Futurama rocks.
I'm getting 20x200, sitting at 4 GHz on 1.3 vcore atm with my 860 on a new Asus Maximus III Formula. Lowered the QPI to 3200 to increase stability, which also allowed me to run lower volts on the CPU, so a nice drop in temps.
VERY happy with these new CPUs atm.
If anyone is using a P55-GD80, I recommend updating the BIOS if you're using stock 1.4. 1.4 isn't bad for an average user, but if you're going to OC, you're going to want a newer version. There's a bunch of little things that got tweaked, one of which is the vcore droop, which was broken in 1.4. I'm not sure if 1.5 fixed it, but it was obviously fixed in the 1.5b8 that K404 is using and it's fine in the 1.6b7 that I installed today. Made a world of difference coming from 1.4
I'm not doing anything hard with mine, testing it out at 3.6Ghz at the moment. That's good enough for me and I don't want to push the voltage much if I can help it because of the supposed Foxconn 1156 socket issue. (knock on wood)
If anyone else is planning to encode x264, here's some info.
These temperature statistics are in Celsius and were obtained using Gigabyte's own Easy Tune 6 software.
The temperatures shown are during 100% load across all 8 logical cores, stressed using 1920 x 1080 x264 encodes.
3.62 GHz (181x20), 1.152 Vcore - MAX Temp - 48 degrees
3.80 GHz (181x21), 1.232 Vcore - MAX Temp - 58 degrees
It seems 3.6 GHz is the sweet spot where temperatures stay fairly cool. Moving up to 3.8 GHz is only a 5% clock-speed increase, and in my tests it yielded roughly the same increase in encoding speed (4.8%), but at the cost of 10 degrees more heat.
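The arithmetic behind that trade-off is easy to check (figures taken from the numbers quoted above):

```python
# Clock-speed gain from 3.62 GHz (181x20) to 3.80 GHz (181x21).
base, oc = 3.62, 3.80  # GHz
clock_gain = (oc / base - 1) * 100
print(round(clock_gain, 1))  # ~5.0% more clock, for ~4.8% faster encodes
```

Encoding scales almost linearly with clock here, so the only real cost of the last bin is the extra 10 degrees.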
Also, with HT on, the x264 encoding speed was roughly 15% faster during my tests. (as compared to having HT disabled.)
3.6 is good enough for me, but it's good to know 3.8 is still within safe limits, and I can use it if I really need to (which I usually won't).
FYI, Core Temp, SpeedFan, RealTemp, and CPUID all show different temperatures, so I chose to use the Gigabyte software for the tests, hoping that it'd be the most accurate, since they made my mobo.
One last FYI side note, compared to C2D (E8400) clock for clock, I usually get around 2.6x faster encoding speed with the i7 860.
The Gigabyte software is probably the least accurate. The only important temperature is the core temperature and both Core Temp 0.99.5 and RealTemp are able to read your core temperature correctly. I can't say the same about some of those other programs you tested. Speedfan seems to be the least accurate based on your screen shot. It doesn't seem to be using the correct TJMax which is written into each CPU.
http://www.fileden.com/files/2008/3/...alTempBeta.zip
Good to know. Coretemp and Realtemp read about 10 degs higher than the Gigabyte software, which is why I figured my 3.8 GHz load temps were about as high as I'd feel comfortable going.
Sarlock, so far your chip looks like a nice clocker! I have a chip on the way that is the same batch # as yours. Can you tell me what vcore your chip will need to do 4GHz please? Judging by your 3.8GHz run it should be under 1.3v which is pretty nice. :up:
PS: My 860 batch # is L925B479... should be here next week.
Thanks
Just got my 870 today with an Asus P7P55 Deluxe. But with the x24 multi set, CPU-Z says it's running x24, 3DMark06 says x22, and Core Temp says x23 :P What's going on? What multi is it really running?
Snowman89: You sound like a good candidate for RealTemp and i7 Turbo. Time for you to see what your multi is really up to.
http://www.fileden.com/files/2008/3/...alTempBeta.zip
On these new CPUs if you have C3/C6 enabled, your multiplier can be very dynamic based on load and can change hundreds of times a second. Old school software that samples your multiplier once per second is usually going to be wrong or I guess right for that one particular instance in time but doesn't do a good job of telling you what's really going on inside your CPU.
Run a single Super PI bench and take a screen shot about half way through it and I'll tell you what your multi is really up to and what these new tools are telling you.
An i7-870 has a default multiplier of 22. 3DMark06 came out long before these new CPUs so it doesn't report the turbo boost that is available.
With 3 or 4 cores active, an i7-870 can use a +2 turbo boost. Most CPUs only get a +1 turbo boost when all 4 cores are active so that's probably why Core Temp is assuming that you are only running at 23X.
If you enable C3/C6 then this CPU can also use a 27X multiplier when only a single core is in the active state and the other 3 are inactive in the C3/C6 state. When a second core becomes active, this will drop down to a maximum of 26 and when 3 or 4 cores are active, you'll be back at a maximum of 24X.
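The turbo bins described above can be written as a simple lookup. These multipliers are for an i7-870 with C3/C6 enabled, exactly as stated in the post; treat the table as illustrative rather than authoritative.

```python
# i7-870 max turbo multiplier vs number of active cores (C3/C6 enabled),
# per the description above: 27x single-core, 26x two cores, 24x otherwise.
MAX_MULT = {1: 27, 2: 26, 3: 24, 4: 24}

for active_cores, mult in sorted(MAX_MULT.items()):
    print(f"{active_cores} active core(s): up to {mult}x")
```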
With C3/C6 enabled, this transitioning to and from different multipliers is happening internally hundreds of times a second. You need some modern software that decided to follow Intel's Turbo White Paper released in November 2008 to accurately determine the correct multiplier. The method is kind of complex so most software doesn't bother doing this correctly.
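The white-paper method boils down to comparing two hardware counters over an interval: one that ticks at the base (non-turbo) rate and one that ticks at the actual rate, so their ratio gives a load-weighted average multiplier instead of a single once-per-second sample. Here is a minimal sketch of that ratio; the counter deltas are made-up sample values, not real MSR reads.

```python
# Load-weighted average multiplier from APERF/MPERF-style counter deltas,
# in the spirit of Intel's November 2008 turbo white paper. The deltas
# below are invented example numbers for illustration.

def effective_multiplier(base_mult: float, d_aperf: int, d_mperf: int) -> float:
    """MPERF advances at the base rate, APERF at the actual rate, so
    scaling the base multiplier by their ratio averages out the
    hundreds of turbo transitions per second."""
    return base_mult * d_aperf / d_mperf

# A core that spent the interval almost entirely in a 27x turbo state:
print(round(effective_multiplier(22, 1_227_272, 1_000_000), 2))
```

This is why tools that sample the multiplier register once per second disagree with each other: they each catch a different instant, while the counter ratio reflects the whole interval.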
Here is the pic.
--> http://img263.imageshack.us/img263/5204/870n.jpg
Another question: I'm having a hard time getting QPI 200 stable. I can make it screen stable but nothing more. I'm running a P7P55 Deluxe mobo. The only voltages I've changed are vcore and IMC voltage, running 1.3 V on the CPU and 1.175 V on the IMC. What else should I change? How high can I push the vcore?
I've disabled SpeedStep and set the multi to x24 in the BIOS.
Everything looks good. i7 Turbo reports that all 8 threads are using the 24.000 multiplier with no sagging issues.
RealTemp and CPU-Z agree.
You might have disabled Enhanced Intel SpeedStep (EIST) in the bios but i7 Turbo shows that it is still enabled. That's not uncommon. A lot of motherboards provide this option but it doesn't actually do anything. EIST is usually needed for turbo mode to operate correctly so leaving it enabled in the bios shouldn't make any difference on your computer. When EIST is enabled in the bios, just go into the Control Panel -> Power Options and set the Minimum processor state to a high number like 100% if you don't want your multiplier dropping down at idle.
Nice overclock so far at that voltage. Hopefully the other guys can help you get it higher.
The Intel datasheet for these CPUs lists the Absolute Maximum core voltage as 1.55 volts so you have lots of room to play yet as long as the temps are in control.
Okay, thanks :) Here's my max so far: http://valid.canardpc.com/show_oc.php?id=810722
Although, what voltages should I really change? I've read something about VTT but I can't even find the setting in my BIOS :p: Well, I'm gonna play some more with this setup and hopefully learn some more :)
Snowman, nice clocks. Unfortunately I think your chip might need more volts to be Intel burn test stable. Can you confirm? Is that an ES chip?
Oh, and sorry unclewebb! I use Core Temp because it has an LCDC plugin for the LCD on the front of my case. But I want you to know that I prefer RealTemp. :D
So here is my retail 860 chip, which I think is nearly identical to the ES chip that Snowman has. At 4GHz it only took 1.23vcore for basic stability, but 1.32 to be stress test stable. :shrug:
http://i279.photobucket.com/albums/k...eri7860cpu.jpg
http://i279.photobucket.com/albums/k...8604GHz20k.jpg
here's my stuff
I received a SLBJJ S-SPEC (B1), with a stock VID of 1.18, and FPO/Batch# is L927B308.
http://i38.tinypic.com/2zivr7t.jpg
First OC I did; it needed a little more tweaking to get it stable. That's just an early screenshot. Cooling was my Thermaltake V1.
a comparison
Core i7 920 @ 4 GHz (191x21) + GTX 275 (stock clocks)
http://i36.tinypic.com/2ib2t7k.png
Core i7 860 @ 4 GHz (191x21) + GTX 275 (stock clocks)
http://i38.tinypic.com/2sadv2u.jpg