Yes, they compared it to an i7 920, an X4 9950, a Q9300, and a Q9400. They managed to run benchmarks at 3.6 GHz on the Phenom 940. The new Phenom beat the Q9400 and Q9300 in just about every benchmark. It consumed about 50 W more at 2D load than the Q9400 and Q9300, and consumed less power than an i7 920.
An Italian review: http://www.dinoxpc.com/Tests/articol...dex.asp?id=866
Nice, thanks for the link.
DFM, process tweaks, etc. are all good, but my point is that there is no way a 4 GHz Phenom II will only dissipate 90 W at load. If you are implying that this first batch of PII is "tweaked for low power," then they must also be "tweaked for higher-speed parts" as well, since they are supposedly running at 4 GHz at 90 W?? The proof is in the pudding: if this chip could run at that speed at that wattage, we would already have seen it released as an FX chip, regardless of what they're tuning their process to do. $$$$$$
Who said it would only be 90 W at 4 GHz?
Edit: just read that review, and it looks pretty good. I'm confused as to why they did the gaming tests at such a low resolution. Really, what's the point of 1024x768? Does anyone still run that? That part of the review was pointless, IMO. The power consumption looks good too, and is completely different from hwbox's results. It's interesting that the 920 and 940 use the same amount of power in standby and idle and are close at load; that suggests it won't heat up too badly when it overclocks. I also saw that they listed both the 940 and the 920 as unlocked, so I don't know about that.
Last edited by roofsniper; 01-07-2009 at 03:38 PM.
hwbox used a 79-T for power consumption as well. What I'm happy about is that, coming from my Phenom 9600 running at 2.3 GHz, upgrading to a Phenom II at 3 GHz uses less power, and judging by the power-consumption difference between the 920 and 940, it looks like I might even be able to OC it a bit and still get lower numbers than I do now. It only went up 2.3 W from 2.8 GHz to 3 GHz on the Phenom II, while on the original Phenom the step from 2.5 GHz to 2.6 GHz is 3.45 W.
Last edited by roofsniper; 01-07-2009 at 03:45 PM.
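The small per-step deltas quoted above are roughly what the classic dynamic-power relation P ≈ C·V²·f predicts when only frequency moves at constant voltage. A minimal sketch (the 1.35 V voltage and 65 W baseline are illustrative assumptions, not figures from either review):

```python
# Rough sanity check of the deltas quoted above using the simplified
# dynamic-power model P ~ C * V^2 * f. The 65 W baseline and 1.35 V
# are assumed values for illustration, not measured review data.

def dynamic_power(base_watts, base_mhz, base_volts, mhz, volts):
    """Scale a baseline dynamic-power figure to a new frequency/voltage."""
    return base_watts * (mhz / base_mhz) * (volts / base_volts) ** 2

# Frequency-only bump at constant voltage: 2.8 GHz -> 3.0 GHz.
p28 = 65.0
p30 = dynamic_power(p28, 2800, 1.35, 3000, 1.35)
print(round(p30 - p28, 1))  # prints 4.6 (a few watts, same ballpark as the posts)
```

This only covers the dynamic portion; once voltage has to rise to hold a higher clock stable, the V² term makes power climb much faster, which is why small stock-speed steps look so cheap.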
I know, it's as if he intentionally skips some posts. Anyway, did you see that 170 W consumption at around 3.85 GHz? Interestingly, the load temp was around 50°C, but it could not Prime at 4 GHz, according to Coolaler.
Now, if you listen to Informal, high voltage is no problem for PII, which begs the question: if it can take high voltage and runs cool at 1.6 V, why won't it Prime at 4 GHz? The devil is in there somewhere; don't ask me to find it.
Maybe, but still, what does that tell you? If you want to know gaming results, then you want to know what your CPU will get at normal resolutions. As we saw with the hwbox review, at 1680x1050 Deneb got about the same performance as competing Intel CPUs, and sometimes even better. Showing it at 1024x768 is a whole different story.
Yes, the hwbox numbers were the same because the GPU became the bottleneck, so those numbers actually say less about the CPU and more about the GPU. The low-resolution numbers, I think, are more informative if you want to future-proof your rig: if, say, you upgrade to a faster card in a year or so, the chances of your CPU becoming the bottleneck are lower.
Not necessarily. The lower-resolution numbers don't test exactly the same things the higher ones do. It just seems that if you are going to post gaming benchmarks, you should post what people actually game at. 1280x1024 was the highest I saw; does that make me want to buy it when I'm running 1680x1050? I want to know how it performs at the resolutions I run and the resolutions most others run as well. If they were doing the gaming benchmarks to show how future-proof it is, you would think they would show it at the high resolutions.
There are too many resolutions to test, and that sort of thing is usually part of GPU tests, not CPU tests.
E.g., I'm only interested in 1920x1200, but since at that resolution most cards are already at their limit, it won't tell you how good the CPU is or what you can expect if you later upgrade your GPU. (That's just my viewpoint on the topic.)
Yeah, it just seems that with new GPUs coming out about once a year and new CPUs about once every two years, if the CPUs were equal at the higher resolutions, then putting in a new video card wouldn't make that big a difference, if any. And you can always overclock if your CPU becomes a bottleneck. I'm more interested in how it performs now than in how it performs years from now, when most likely I'll have a new CPU anyway.
What I mean is that if a CPU wins a benchmark at a lower resolution, it doesn't mean it will be better for gaming. I see testing at a lower resolution more as a different test than as a test of how future-proof the CPU is: in one situation the CPU is dealing with a lot of small frames, while in the other it's dealing with far fewer, larger ones.
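The bottleneck argument being debated above can be made concrete with a toy frame-time model: a frame takes as long as the slower of the CPU work (roughly resolution-independent) and the GPU work (which scales with pixel count). All timings below are invented purely for illustration, not benchmark data:

```python
# Toy CPU-vs-GPU bottleneck model. The CPU cost per frame is treated as
# resolution-independent; the GPU cost grows with the pixel count.
# All numbers are made up to illustrate the argument, not measured.

def fps(cpu_ms, gpu_ms_per_mpix, width, height):
    mpix = width * height / 1e6
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * mpix)  # slower unit gates the frame
    return 1000.0 / frame_ms

# A faster CPU (8 ms/frame) vs. a slower one (10 ms/frame), same GPU:
for w, h in [(1024, 768), (1680, 1050)]:
    print((w, h), round(fps(8, 9, w, h), 1), round(fps(10, 9, w, h), 1))
# At 1024x768 the two CPUs produce different FPS; at 1680x1050 both
# are GPU-bound and tie, which is both sides' point at once.
```

This is why low-resolution runs expose CPU differences that vanish at playable resolutions, and also why those differences can reappear later behind a faster GPU.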
They shouldn't have used a gx for power consumption.
They also could only overclock to 3.6 GHz... seriously? Everyone here has gotten that easily on stock cooling or worse.

They don't seem to get that you can overclock the NB, HyperTransport, etc. when overclocking Phenoms; I guess review sites are just used to Intel.

Minimum FPS should be used in reviews as well, since from what I've seen CPU power influences that both before and after overclocking, and FPS stability is very important for gaming.

I also forgot to mention that when it comes to power consumption, there is no way it would be as bad as it was if they had turned on Cool'n'Quiet, which apparently now runs the processor at 1 GHz and 1 V and should dramatically reduce idle power consumption.
Last edited by Caveman787; 01-07-2009 at 07:08 PM.
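The Cool'n'Quiet point can be ballparked with the same V²·f dynamic-power scaling. Dropping from 3 GHz to the 1 GHz / 1 V idle state mentioned above cuts dynamic power to a small fraction of the full-load figure; note the 1.35 V full-speed voltage below is an assumed value, not one from the reviews:

```python
# Rough estimate of Cool'n'Quiet idle savings via P ~ V^2 * f.
# The 1 GHz / 1.0 V idle state comes from the post above; the 1.35 V
# full-speed voltage is an assumption for illustration.

def relative_power(f1, v1, f2, v2):
    """Dynamic power at state 2 as a fraction of state 1."""
    return (f2 / f1) * (v2 / v1) ** 2

frac = relative_power(3.0, 1.35, 1.0, 1.0)
print(f"idle dynamic power ~ {frac:.0%} of full load")  # ~18%
```

Static leakage and platform power don't scale this way, so measured idle draw won't fall quite that far, but it shows why reviews run with Cool'n'Quiet disabled overstate idle consumption.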