But did you get any gain from setting Extra to 1.1V?
People are talking about 1.1V, but I thought 3D already defaulted to 1.1V instead of 1.05V.
I am still trying to find out whether people are actually getting any better results from the NiBiToR mod. Everything is a bit confusing.
*CPU: Xeon X5650 @ 4.3 Ghz | Cooler: Thermaltake Water 2.0 Extreme
*Asus Rampage III Formula | RAM: 36GB DDR3 (Tracer LED + Hyper X Savage)
*Video Cards: Gigabyte Aorus 1080ti
Sound Card: Sound Blaster Z | PSU: Corsair HX1000W | Display: BenQ PD3200u | JVC RS520 projector
*Case: CoolerMaster HAF X (932 side panel) | Others: Roccat Kone AIMO | Roccat Alumic | Logitech G15 |Cameras: Sony A7R3 | RX100 V
After flashing my 8800 GT with 1.1V set in NiBiToR, I measured the VGPU at 1.12V idle and 1.15V under load.
However, I have no idea what the readings were before the flash.
Intel C2D E8400 @ 4000MHz ~1,192V
Thermaltake Big Typhoon 120
MSI P35 NEO2-FIR (bios v1.8)
ADATA 2*2GB DDR2 800+ EE
ASUS Geforce 8800GT 512MB + Accelero S1
Seasonic S12-600
You can't set a higher voltage in the BIOS on the GTX; you have to hard mod.
*CPU: Xeon X5650 @ 4.3 Ghz | Cooler: Thermaltake Water 2.0 Extreme
*Asus Rampage III Formula | RAM: 36GB DDR3 (Tracer LED + Hyper X Savage)
*Video Cards: Gigabyte Aorus 1080ti
Sound Card: Sound Blaster Z | PSU: Corsair HX1000W | Display: BenQ PD3200u | JVC RS520 projector
*Case: CoolerMaster HAF X (932 side panel) | Others: Roccat Kone AIMO | Roccat Alumic | Logitech G15 |Cameras: Sony A7R3 | RX100 V
I clocked 725 core / 1750 shader at default voltage. With 1.1V I'm at 755/1825, ATITool artifact-free. 1.2V didn't get me anything better. Mavke (the NiBiToR creator) thinks that only the Extra clocks and voltages are working (3D, Throttle and 2D are not), so the VID you are talking about is pointless; only the Exact Extra 1.05V entry matters. I'll try to find my DMM to see what vGPU reads on the board with 1.2V set in the BIOS.
See screens of the default BIOS below.
Post by Mavke:
Can you share your NVIDIA reference BIOS with us? We haven't seen anyone with the NVIDIA board yet. And just to explain how the BIOS logic on the voltage works: it is all controlled by the available VID lines, with each VID having its own voltage level. On the GeForce 8800 GT there are only 4 levels available so far, as with most GeForce 8x00 series cards.
GeForce 8800 GT VID Voltage Levels...
- VID 00 -> 0.95V
- VID 01 -> 1.00V
- VID 02 -> 1.05V
- VID 03 -> 1.10V
And currently that is all there is in the reference voltage table known so far for the GeForce 8800 GT. I have also heard some talk about higher voltages, and it seems 1.15V and 1.20V could work as well, but since we don't have a GeForce 8800 GT yet (we are still looking to get one) we are not sure which VID xx those would be; they can't be the ones listed above.
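To make that table a bit more concrete, here is a minimal sketch in Python of the lookup being described. The voltage values come from the list above; the function and variable names are just mine for illustration, this is not how NiBiToR or the card firmware is actually written:

```python
# Sketch of the known GeForce 8800 GT reference VID-to-voltage table
# (values from the post above; names here are hypothetical).
GEFORCE_8800GT_VIDS = {
    0b00: 0.95,  # VID 00
    0b01: 1.00,  # VID 01
    0b10: 1.05,  # VID 02
    0b11: 1.10,  # VID 03
}

def vgpu_for_vid(vid: int) -> float:
    """Return the core voltage the regulator would apply for a given VID code."""
    try:
        return GEFORCE_8800GT_VIDS[vid]
    except KeyError:
        raise ValueError(f"VID {vid:02b} is not in the known reference table")

if __name__ == "__main__":
    for vid in range(4):
        print(f"VID {vid:02d} -> {vgpu_for_vid(vid):.2f}V")
```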
Last edited by F@32; 11-11-2007 at 02:48 PM. Reason: Added Mavke's post
Sony KDL40 // ASRock P67 Extreme4 1.40 // Core i5 2500K //
G.Skill Ripjaws 1600 4x2Gb // HD6950 2GB // Intel Gigabit CT PCIe //
M-Audio Delta 2496 // Crucial-M4 128Gb // Hitachi 2TB // TRUE-120 //
Antec Quattro 850W // Antec 1200 // Win7 64 bit
Nice copy/paste, guys... It seems you are all following the thread on MVKTech closely but don't really discuss it there, rather here. To make it clear to everyone: in every BIOS you can add/change the voltage table, but whether it works depends on whether the voltage circuitry can recognize the different VID levels. And yes, everything revolves around the VID level, not the label you put next to it in NiBiToR.
So if VID 00 gives 1.00V, and you use NiBiToR to change the label of VID 00 from 1.00V to 1.20V, that makes no change at all: the VID is still 00, and that is what the card looks at, not the label next to it. On the GeForce 8800 GT, VID 00 is 0.95V and the highest, VID 03, is 1.10V. So you can do what you want, but none of VIDs 00 to 03 will give 1.20V. But maybe VID 04 would... if possible and if the card can work with it.
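To put that same point in code form, here is a rough Python illustration (made-up names, not anything from the actual BIOS or NiBiToR): the label is just text, and the regulator only decodes the VID code.

```python
from dataclasses import dataclass

@dataclass
class VoltageTableEntry:
    vid: int    # the code the card's voltage circuitry actually reads
    label: str  # the text NiBiToR shows next to it, purely informational

# Relabelling VID 00 from "1.00V" to "1.20V" in the BIOS image...
entry = VoltageTableEntry(vid=0b00, label="1.20V")

# ...still yields 0.95V on an 8800 GT, because the hardware only sees vid=00.
REFERENCE_TABLE = {0b00: 0.95, 0b01: 1.00, 0b10: 1.05, 0b11: 1.10}
print(f"label says {entry.label}, card applies {REFERENCE_TABLE[entry.vid]:.2f}V")
```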
Hey man, the correct term is "quoting", not "copying/pasting". "Copy/paste" is when there is no reference to the original material and no credit is given to the author.
So you're saying that even if I updated the voltage table, the maximum VID 03 is capped at 1.1V? Then it makes sense why adding 1.2V and changing VID 03 to 1.2V makes no difference to the overclock.
Mavke, I will have to re-register on your site; somehow I didn't get the activation e-mail for the F@32 account. Anyway, I hope you have no problem with me posting quotes and links to your posts.
Regards,
- F@32
Sony KDL40 // ASRock P67 Extreme4 1.40 // Core i5 2500K //
G.Skill Ripjaws 1600 4x2Gb // HD6950 2GB // Intel Gigabit CT PCIe //
M-Audio Delta 2496 // Crucial-M4 128Gb // Hitachi 2TB // TRUE-120 //
Antec Quattro 850W // Antec 1200 // Win7 64 bit
Yes, and it is not capped. The voltage circuitry on an NVIDIA-based graphics card only looks at the VID xx (the label, 1.1V or 0.95V, is just for you guys to understand what each VID gives in terms of voltage). The card just looks at VID xx, and on the GeForce 8800 GT, VID 00 is 0.95V, so with the core set to VID 00 the GPU voltage circuitry recognizes VID 00 and runs at 0.95V. And VID 03 is just 1.1V... So if there were a VID 04, then maybe we could get 1.15V...
Changing VIDs in NiBiToR has been pointless since the NVIDIA 6800 days. It never worked for me.
Last edited by kiwi; 11-12-2007 at 10:07 AM.
...
Of course, that is why on an Inno3D with the default BIOS the overclock was limited to 700MHz, while with a modified voltage of 1.1V the maximum overclock was 725MHz... So you tell me. I have also done some testing with Point of View, and on their cards they had similar results too, even getting up to almost 800MHz on the core.
So there is no point in giving VID 03 a voltage table entry over 1.1V?
Right on, mate. It works with the GeForce 7800 GT as well, which could be raised via the BIOS to 1.5V from the default 1.4V. So yes, some cards can still take a BIOS voltage increase, but it is just one version out of a full range. :-(
Flashed it once and it didn't work, but it seems I didn't save the BIOS properly.
In the voltage table you can add more VIDs; would this work?
OK, got it right this time: 1.12V under load.
Sony KDL40 // ASRock P67 Extreme4 1.40 // Core i5 2500K //
G.Skill Ripjaws 1600 4x2Gb // HD6950 2GB // Intel Gigabit CT PCIe //
M-Audio Delta 2496 // Crucial-M4 128Gb // Hitachi 2TB // TRUE-120 //
Antec Quattro 850W // Antec 1200 // Win7 64 bit
I will give this a go tomorrow to see if it works.
オタク
"Perfection is a state you should always try to attain, yet one you can never reach." - me =)
I assigned 1.2V to VID 03 (see here), but my artifact-free clocks didn't improve... 755/1825/1000 at 50C under load, water cooled.
Last edited by F@32; 11-12-2007 at 05:28 PM.
Sony KDL40 // ASRock P67 Extreme4 1.40 // Core i5 2500K //
G.Skill Ripjaws 1600 4x2Gb // HD6950 2GB // Intel Gigabit CT PCIe //
M-Audio Delta 2496 // Crucial-M4 128Gb // Hitachi 2TB // TRUE-120 //
Antec Quattro 850W // Antec 1200 // Win7 64 bit
Could you send the BIOS that works for you via email, Felipe? I don't know if I will have time to play around with NiBiToR much today :/
オタク
"Perfection is a state you should always try to attain, yet one you can never reach." - me =)