Guys, if you're going to talk about voltages, never ever consider what the software is telling you.
Simply put, it's wrong in most cases if not all.
Coding 24/7... Limited forums/PMs time.
-Justice isn't blind, Justice is ashamed.
Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P.), Juan J. Guerrero
That is something, and others are trying to find an excuse for how the Phenom could consume less power.
But it is interesting to note that in Crysis the Phenom II and C2Q draw 70+ W more than in the CPU test, while the i7 draws only 45 W or less extra. I thought all 3D and CPU results would scale linearly, but I was wrong; presumably a game loads the GPU and platform as well as the CPU, and stresses the CPU differently than a pure CPU test does.
Intel Core i7 920 @ 3.8GHz 1.28V (Core Contact Freezer)
Asus X58 P6T
6GB OCZ Gold DDR3-1600MHz 8-8-8-24
XFX HD5870
WD 1TB Black HD
Corsair 850TX
Cooler Master HAF 922
Excuse me?
If a mobo overvolts the CPU, it consumes more power. Period. That's a fact.
If a mobo undervolts the CPU, it consumes less power. That's another fact.
Yep, that is true. But they changed the voltage in the BIOS and then measured the power consumption; that has nothing to do with the default Vcore the chip should run at. It's just like me undervolting my CPU and then claiming it consumes less power at stock clocks. You need to measure the real Vcore. My 3800+ X2, for example, has a stock voltage of 1.35V, and I run it at 1.2V, all set in the BIOS but never actually measured; it consumes less power on my custom settings, but that has nothing to do with stock settings. It is similar to the fact that if you overclock your CPU it will run faster. Next time we will ask for power consumption tests with CPUs at stock clocks and undervolted to the max!
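For a rough sense of why the actual Vcore matters so much: CMOS dynamic power scales roughly as P ≈ α·C·V²·f, so at a fixed clock it goes with the square of the core voltage. Below is a minimal sketch applying that square law to the 3800+ X2 voltages quoted above; the 100 W baseline is a made-up placeholder, not a measurement.

```python
# Rough sketch of CMOS dynamic power: P_dyn ~ alpha * C * V^2 * f.
# At a fixed clock, power therefore scales with the square of Vcore.
# The 100 W baseline below is a made-up placeholder, not a measured value.

def scaled_power(baseline_watts, v_old, v_new):
    """Estimate dynamic power at v_new given the power measured at v_old."""
    return baseline_watts * (v_new / v_old) ** 2

stock_v = 1.35      # stock Vcore of the 3800+ X2 quoted above
undervolt_v = 1.20  # undervolted Vcore from the same post
baseline_w = 100.0  # hypothetical dynamic power at stock Vcore

print(f"{scaled_power(baseline_w, stock_v, undervolt_v):.1f} W")  # ~79.0 W, about 21% less
```

The same square law is why a board that quietly overvolts by even 0.05 V can visibly inflate a "stock" power-consumption figure.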
I was wondering what benches I can post of my chip. I am in the USA and don't want to violate the NDA.
~1~
AMD Ryzen 9 3900X
Gigabyte X570 AORUS ELITE
Trident-Z 3200 CL14 16GB
AMD Radeon VII
~2~
AMD Ryzen ThreadRipper 2950x
Asus Prime X399-A
G.Skill Flare-X 3200MHz CL14, 64GB
AMD RX 5700 XT
Hmm, well I suppose not, though the person (the president of my work) that gave it to me said I was allowed to post a picture or anything (noob). I thought about doing some 3.0GHz benches comparing it against my 9950 @ 3.0GHz with the same NB, HT, and RAM settings. I only have the one board right now, so I will have to swap CPUs to do so.
Seems we made our greatest error when we named it at the start
for though we called it "Human Nature" - it was cancer of the heart
CPU: AMD X3 720BE @ 3.4GHz
Cooler: Xigmatek S1283 (terrible mounting system for AM2/3)
Motherboard: Gigabyte 790FXT-UD5P (F4)
RAM: 2x 2GB OCZ DDR3 1600MHz Gold 8-8-8-24
GPU: HD5850 1GB
PSU: Seasonic M12D 750W
Case: Cooler Master HAF 932 (aka Dusty)
Yeah, I'd just hold off on the benchies. We've only got a week until Phenom II's release; show us what you've got then.
He was being pretty picky about me posting info, even a screenshot or two. He won't even tell me how he got it. I guess I could block out the info like everyone else and post a screenshot.
Edit: I am loading Windows 7 and want to see how it is. Here is all I did. I only ran a 1M SuperPi, so I can't say for sure how stable it is...
These are all fake! Look again, especially at the in-game benchmarks, for example Crysis Warhead.
Benchmark at http://www.hwbox.gr/showthread.php?t=3189&garpg=22:
1280x1024: >125 FPS
1680x1050: ~24 FPS
Benchmark at http://guru3d.com/article/top-10-gam...ore-216-test/7:
1280x1024: <50 FPS
1600x1200: ~32 FPS
Guru3D test system: ASUS X58 ROG Rampage II Extreme, Core i7 965 @ 3750 MHz, GeForce GTX 260 Core 216 896 MB / Radeon HD 4870 1024 MB, 3072 MB (3x1024 MB) DDR3 1800 MHz OCZ, 1200 W PSU, Dell 3007WFP (up to 2560x1600).
-----------------------------------------------------------
Why? Who will explain? Thanks!
When AMD had 64-bit and Intel had only 32-bit, they tried to tell the world there was no need for 64-bit. Until they got 64-bit.
When AMD had IMC and Intel had FSB, they told the world "there is plenty of life left in the FSB" (actual quote, and yes, they had *math* to show it had more bandwidth). Until they got an IMC.
When AMD had dual core and Intel had single core, they told the world that consumers don't need multi core. Until they got dual core.
When Intel was using MCM, they said it was a better solution than native dies. Until they got native dies. (To be fair, we knocked *unconnected* MCM, and still do; we never knocked MCM as a technology, so hold your flames.)
by John Fruehe
Because you are being a little bit of a ****.
We ran Crysis with lower details.
At Guru3D they ran it at 1280x1024 with 2xAA and Gamer details, and at 1600x1200 (a higher resolution than 1680x1050) with Gamer details and 2xAA.
We ran it at 1280x1024 with Performance details and no AA, and at 1680x1050 at Enthusiast without AA. The settings differ, so the FPS numbers are not comparable.
OK now?
I wrote my answer here too:
http://vozforums.com/showthread.php?t=146108
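To make the point concrete, here is a tiny hypothetical sketch: the two runs above differ in AA and detail preset, so any FPS comparison between them should be rejected outright. The FPS figures are approximate values from the posts above; the field names are invented for illustration.

```python
# Sketch: guard against comparing benchmark runs with mismatched settings.
# FPS values are approximate figures from the posts above; fields are invented.
hwbox = {"res": "1280x1024", "aa": 0, "preset": "Performance", "fps": 125}
guru3d = {"res": "1280x1024", "aa": 2, "preset": "Gamer", "fps": 50}

def comparable(a, b):
    """Two runs are only comparable if resolution, AA, and preset all match."""
    return all(a[k] == b[k] for k in ("res", "aa", "preset"))

if comparable(hwbox, guru3d):
    print("FPS delta:", hwbox["fps"] - guru3d["fps"])
else:
    print("Settings differ; FPS numbers are not comparable.")
```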
Don't worry about him; I swear he is related to lo squartatore.
When will the clock-for-clock comparisons arrive? I don't mean to be impatient, I'm just excited!
For my part I know nothing with any certainty, but the sight of the stars makes me dream.
Oh! In DX9c the i7 runs faster.
But I don't see Deneb's score at 1680x1050! NDA?
http://img229.imageshack.us/img229/11/shi11ub0.jpg
In the i7 benchmark, you can reduce the QPI speed.