Yes, even though no OC was tried, this was a promising review.
Deneb is a clear improvement over Agena.
(The Italian conclusion was somewhat pessimistic according to Babelfish).
Look at the good power consumption!
Green CPU! (Idle). :D
hwbox used a 79-T for power consumption as well. What I'm happy about is that upgrading from my Phenom 9600 running at 2.3GHz to a Phenom II at 3GHz uses less power, and judging by the power consumption difference between the 920 and 940, it looks like I might even be able to OC it a bit and still get lower numbers than now. It only went up 2.3W from 2.8GHz to 3GHz on the Phenom II, while on the original Phenom, going from 2.5GHz to 2.6GHz costs 3.45W.
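A quick back-of-envelope sketch of the per-clock scaling those deltas imply. The only inputs are the wattage figures quoted above; everything else is plain arithmetic, and the per-100MHz framing is just my way of normalizing the two different step sizes:

```python
# Watt deltas quoted in the post; the clock steps differ, so normalize per 100 MHz.
phenom2_delta_w   = 2.3           # W added going 2.8 GHz -> 3.0 GHz (Phenom II)
phenom2_delta_mhz = 3000 - 2800
phenom1_delta_w   = 3.45          # W added going 2.5 GHz -> 2.6 GHz (original Phenom)
phenom1_delta_mhz = 2600 - 2500

p2 = phenom2_delta_w / (phenom2_delta_mhz / 100)   # W per 100 MHz
p1 = phenom1_delta_w / (phenom1_delta_mhz / 100)

print(f"Phenom II: {p2:.2f} W per 100 MHz")   # 1.15
print(f"Phenom:    {p1:.2f} W per 100 MHz")   # 3.45
```

At those rates the Phenom II costs roughly a third as much power per extra 100MHz, which is why a mild OC could plausibly still land below the old chip's draw.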
I know, it's as if he intentionally skips some posts. :shakes: Anyway, did you see that 170W consumption at 3.85GHz or so? Interestingly, the temp was around 50°C under load, but it could not Prime at 4GHz, according to Coolaler.
Now if you listen to Informal, high voltage is no problem for PII, which raises the question: if it can take high voltage and runs cool at 1.6V, why won't it Prime at 4GHz? The devil in it is somewhere; don't ask me to find it. :shrug:
Maybe, but still, what does that say? If you want to know gaming results, then you want to know what your CPU will get at normal resolutions. As we saw in the hwbox review, at 1680x1050 Deneb got about the same performance as competing Intel CPUs, and sometimes even better. Showing it at 1024x768 is a whole new story.
Yes, the hwbox numbers were the same because the GPU became the bottleneck, so those numbers actually say less about the CPU and more about the GPU. I think the low-resolution numbers are more informative if you want to future-proof your rig: say you upgrade to a faster card in a year or so, the chances of your CPU becoming the bottleneck are lower.
Not necessarily. The lower-resolution numbers don't test exactly the same things as the higher ones do. It just seems that if you are going to post gaming benchmarks, you post what people actually game at. 1280x1024 was the highest I saw, and does that make me want to buy it when I'm running 1680x1050? I want to know how it performs at the resolutions I run, and the resolutions most others run as well. If they were making the gaming benchmarks to show how future-proof it would be, you would think they would show it at the high resolutions.
There are too many resolutions to test, and that sort of thing is usually part of GPU tests, not CPU tests.
E.g. I'm only interested in 1920x1200, but since at that resolution most cards reach their limit, it won't tell you how good the CPU is or what you can expect if you later upgrade your GPU. ;) (That's just my viewpoint on the topic.) ;)
Yeah, it just seems that with new GPUs coming about once a year and new CPUs about once every two years, if the CPUs were equal at the higher resolutions, then putting in a new video card wouldn't make that big a difference, if any. And you can always overclock if your CPU becomes a bottleneck. I'm more interested in how it performs now than in how it performs years from now, when most likely I'll have a new CPU anyway.
What I mean is that if a CPU wins a benchmark at a lower resolution, it doesn't mean it will be better for gaming. I see testing at a lower resolution as a different test rather than a test of how future-proof it is: in one situation the CPU is dealing with a lot of small frames, while in the other it's dealing with far fewer, larger ones.
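The bottleneck argument running through the last few posts can be sketched with a deliberately crude model: treat the observed frame rate as capped by whichever of the CPU or GPU is slower at a given resolution. All the numbers below are made up for illustration; only the shape of the result matters:

```python
def observed_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Toy model: the frame rate is limited by whichever side is slower."""
    return min(cpu_cap, gpu_cap)

# Hypothetical caps: two CPUs with different headroom, and a GPU whose
# ceiling drops as the resolution rises.
cpu_a, cpu_b = 150.0, 120.0
gpu_cap = {"1024x768": 200.0, "1680x1050": 90.0}

for res, cap in gpu_cap.items():
    print(res, observed_fps(cpu_a, cap), observed_fps(cpu_b, cap))
# 1024x768  -> 150.0 vs 120.0 (CPU difference visible)
# 1680x1050 -> 90.0  vs 90.0  (GPU ceiling hides the CPU difference)
```

Which reading you prefer depends on the question: the high-resolution tie tells you what you'd see today, while the low-resolution gap hints at what a faster future GPU might uncover.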
They shouldn't have used a GX for power consumption.
They also could only overclock to 3.6GHz... seriously? Everyone here has gotten that easily on stock cooling or worse.
They don't seem to get that you can overclock the NB, HyperTransport, etc. when overclocking Phenoms. I guess review sites are just used to Intel.
Minimum FPS should be used in reviews as well, since CPU power influences it before and after overclocking from what I've seen, and FPS stability is very important for gaming.
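To make the minimum-FPS point concrete, here's a small sketch of how average and minimum FPS diverge on a frame-time trace. The trace is hypothetical (a smooth ~60fps run with a single 50ms stutter), but it shows why an average alone can look fine while the game still hitches:

```python
def fps_stats(frame_times_ms):
    """Average and minimum FPS from a list of per-frame render times in ms."""
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

# Hypothetical trace: 59 smooth frames at ~16.7 ms, plus one 50 ms stutter.
trace = [16.7] * 59 + [50.0]
avg_fps, min_fps = fps_stats(trace)
print(f"avg {avg_fps:.1f} fps, min {min_fps:.1f} fps")
# The average stays near 59 fps; the minimum drops to 20 fps,
# which is the dip you actually feel while playing.
```

That's why a review quoting only averages can rank two CPUs as equal even when one of them stutters noticeably more.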
I also forgot to mention: when it comes to power consumption, there is no way it'd be as bad as it was if they had turned on Cool'n'Quiet, which now runs the processor at 1GHz and 1V apparently, and should dramatically reduce idle power consumption.
I really don't care. Whether I know what I'm talking about or not, he will still find a way to bash me. It's fine with me though, since I can't read a word he says, and even if I did, it'd be pointless. lmao, just like when he says he knows more about AMD than me. :rofl:
The best processor is the one that delivers the most performance in the lowest-FPS areas. One way to filter out low FPS is to use a slow video card and/or increase the resolution.
Phenom is a very good gaming processor. It handles threading very well; the L3 cache on Phenom is 32-way set associative, and on Phenom II it is 48-way set associative. I think Core 2 is 8-way set associative (I don't remember now).
Phenom has more places to put a given memory block in the cache; on Core 2 the memory doesn't have as many places, and when memory use increases, Core 2 needs to go to main memory a lot more often compared to Phenom. The cache on Intel CPUs is more sensitive when you multitask or when games use a lot of memory from different areas (like when there are fights etc. in the game).
Test a game for 10 minutes where there is one action scene lasting 1 minute, and Intel will gain a lot more FPS during the 9 minutes of low activity in the game.
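The associativity argument above can be illustrated with a toy LRU set-associative cache model. Everything here is illustrative, not a model of any real CPU: the set count, line size, and access stream are made up, and I'm only borrowing the 8-way and 48-way figures quoted in the posts to show how more ways absorb accesses that all collide on one set:

```python
from collections import deque

def misses(addresses, ways, num_sets, line=64):
    """Count misses in a tiny LRU set-associative cache for an access stream."""
    sets = [deque(maxlen=ways) for _ in range(num_sets)]
    miss = 0
    for addr in addresses:
        tag = addr // line          # cache-block number
        s = sets[tag % num_sets]    # set this block maps to
        if tag in s:
            s.remove(tag)           # hit: pull to most-recently-used position
        else:
            miss += 1               # miss: oldest entry falls off the left
        s.append(tag)
    return miss

# 16 blocks that all collide on one set, touched round-robin twice.
stream = [i * 64 * 1024 for i in range(16)] * 2
print(misses(stream, ways=8,  num_sets=1024))   # 32: 8 ways thrash, every access misses
print(misses(stream, ways=48, num_sets=1024))   # 16: only the cold misses remain
```

The pattern is pathological on purpose; real workloads rarely thrash one set this hard, but it shows the mechanism the poster is describing: with more ways per set, conflicting blocks can coexist instead of evicting each other.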
Benchmarking at higher resolutions also shows how well the Phenom II scales with the GPU. 1440x900 should be the lowest resolution benched, IMO.
Review back up at Hexus.
And HardOCP.
And Guru3d