
So...Tell me what's Good about Phenom



akaBruno
03-14-2008, 02:05 PM
We all know they're bad overclockers...

How do they stack up otherwise?

Bruno

AliG
03-14-2008, 02:20 PM
They're decent performers, slightly better than K8 clock-for-clock in most places (not all, though), but still no Conroe killer. The original K10 design was cut down due to high costs and low yields.

Honestly, I would wait for the 45nm Phenoms. It's just not worth it for you to downgrade from a 2.9GHz dual core to a weaker quad core when the quad core will only be of use in certain instances. If what I hear is true, then the 45nm K10.5 Phenoms will be the ones you want.

akaBruno
03-14-2008, 03:19 PM
Well thanks for sayin my old rig runs well.

In fact... I'm on an old Tbred that's kickin butt right now. I just brought it back from the dead.

I'd like to try it out on one of those new AGP ATI cards that supposedly run Crysis?

AMD all the way.

AliG
03-14-2008, 06:31 PM
Don't even bother trying to run Crysis on an AGP 3850, you'd be lucky to get 20 FPS at the lowest settings.

My advice is to either wait for the 45nm Phenoms and the RV770, or consider waiting for K11 (Nehalem will be here sooner, but this is the AMD forum so I think I'll keep my mouth shut lol).

Swatrecon_
03-14-2008, 08:23 PM
Yeah, I wouldn't bother with the 3850 either; you're limited both by the bus and the card, especially with Crysis.

JumpingJack
03-14-2008, 09:18 PM
So the lowdown ... the core is improved over the K8. A few review sites have made attempts to quantify the details, but a direct comparison is difficult.

You will read about the TLB erratum ... it is of no real consequence. I have been running a 9600 BE for a couple of months and it has been rock solid stable, with no indication of problems. It has locked up on me twice, but in neither case would I have blamed the TLB erratum, rather just a bug in the software or the OS. The B3 stepping that fixes it should be out soon, or so the rumor mill states.
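
If you want to check which stepping you actually have, here is a minimal sketch (GCC on x86, using <cpuid.h>) that decodes CPUID leaf 1. It assumes the affected B2 Agena chips report family 10h, model 2, stepping 2, and that the fixed B3 silicon reports stepping 3 - treat those values as an assumption, not gospel.

    /* Minimal sketch: decode family/model/stepping from CPUID leaf 1.
       Assumption: Agena B2 = family 10h, model 2, stepping 2 (TLB erratum),
       B3 = stepping 3 (fixed). */
    #include <stdio.h>
    #include <cpuid.h>

    int main(void)
    {
        unsigned int eax, ebx, ecx, edx;

        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            fprintf(stderr, "CPUID leaf 1 not supported\n");
            return 1;
        }

        unsigned int stepping = eax & 0xF;
        unsigned int model    = (eax >> 4) & 0xF;
        unsigned int family   = (eax >> 8) & 0xF;

        if (family == 0xF) {                  /* AMD extended encoding */
            family += (eax >> 20) & 0xFF;     /* 0xF + 0x1 = family 10h */
            model  += ((eax >> 16) & 0xF) << 4;
        }

        printf("Family %Xh, model %u, stepping %u\n", family, model, stepping);

        if (family == 0x10 && model == 2)
            printf("Agena: stepping 2 = B2 (TLB erratum), 3 = B3 (fixed)\n");

        return 0;
    }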

In multithreaded code they perform well overall. If you are a gamer, you will get the same overall gaming experience you would with an equivalently clocked dual core (if the game is single threaded), and good performance if the game is multithreaded.

They are indeed not great overclockers; some sites and a few forum users have reported hitting 3 GHz.

Jack

cdawall
03-15-2008, 09:11 AM
Don't even bother trying to run Crysis on an AGP 3850, you'd be lucky to get 20 FPS at the lowest settings.

My advice is to either wait for the 45nm Phenoms and the RV770, or consider waiting for K11 (Nehalem will be here sooner, but this is the AMD forum so I think I'll keep my mouth shut lol).

Ummm, no, you can run Crysis just fine on AGP. I ran a 7800GS @ 500/740 and a 3400+ @ stock and had Crysis at 1024x768 with all med/high settings.


Yeah, I wouldn't bother with the 3850 either; you're limited both by the bus and the card, especially with Crysis.

Even a 3850 won't max out an AGP 8x bus, and yes, I know he is not going to be running it at max settings, but he could do 1280x1024 with medium settings just fine.

KTE
03-15-2008, 11:42 AM
They are OK - surely not what AMD had hoped for, but the design is good, ahead of its time, technologically advanced. Four clock domains and four individual PLL adjustment domains is crazy work to get functioning, plus separate power planes, two separate VRM input levels, and optional P5-P0 power states with switching between them being a millisecond job - excellent on this front. Idle/CnQ power and the lowest quad core power per MHz/voltage you will get to date on any modern CPU. It's the most tweakable CPU out there ATM; the options on the CPU are wowzer if someone takes the time to understand them, which is why you haven't seen most users try them, or even understand them, yet. 100-200MHz stable, in real time, at 0.4V is unbelievable. That is a very fine achievement for a CPU: allowing that much control, stable, for an end user to switch between at a very low TDP (hardly more than a 2W AC gain at full load). HTPC use, I hear - most >2GHz CPUs physically cannot go below 1GHz, but here you can choose what MHz/volts you want to set and switch through them at a click.

The major issues to iron out are mainly IMC/SB600 related, but also transistor material choices. CTI and STT incorporating SGoI with the next B3 step (supposedly) may help quite a lot with power at a given frequency. SiGe is a major addition to transistor technologies.
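
If anyone wants to play with the P-state switching themselves under Linux, here's a minimal sketch in C. It assumes the powernow-k8 cpufreq driver and the "userspace" governor are available; the sysfs paths are the standard cpufreq ones, but the 1100000 kHz value is just a placeholder - use whichever frequency your own chip actually lists.

    /* Minimal sketch: list cpu0's available P-state frequencies and request one.
       Assumes the cpufreq "userspace" governor; needs root. The frequency
       written below is only an example value. */
    #include <stdio.h>

    #define CPU0 "/sys/devices/system/cpu/cpu0/cpufreq/"

    static int write_str(const char *path, const char *val)
    {
        FILE *f = fopen(path, "w");
        if (!f) { perror(path); return -1; }
        fprintf(f, "%s\n", val);
        return fclose(f);
    }

    int main(void)
    {
        char freqs[256];
        FILE *f = fopen(CPU0 "scaling_available_frequencies", "r");
        if (f) {
            if (fgets(freqs, sizeof freqs, f))
                printf("Available frequencies (kHz): %s", freqs);
            fclose(f);
        }

        /* Hand control to the userspace governor, then request a low P-state. */
        if (write_str(CPU0 "scaling_governor", "userspace") == 0)
            write_str(CPU0 "scaling_setspeed", "1100000");

        return 0;
    }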

Price was good until two weeks ago. The 9600BE was nearly 1/4 the price of Intel's first non-native 65nm quad at 2.67GHz. It really fought well against C2Q prices 16 months on, but the Q6600 here dropped in price recently and is now only around $10-15 more expensive than the Phenom 9600BE (maybe because Intel will finally release some retail Yorkfields). So the pricing now runs majorly against Phenom around here; Wolfdale/Q6600 is the primary choice.

Official TDP (not user measured) is as poor for the clock as Intel's first 65nm quad B3 stepping in Nov. '06, minus the roughly 35W they have on their NB. That chip was a poor overclocker on air too. The difference was that it had design headroom, so colder meant higher MHz; it only needed a die shrink and/or a newer stepping. K10h doesn't - it has a limitation beyond the die shrink (a future problem).

The problem with materials/node = current leakage/low MHz.

K10h by design is far more advanced than K8; by material choice it's not, and by performance it is ahead with multi-threaded code, but not by a massive margin like Core 2 was over Netburst. It is well suited for HTPC/video/imaging/compression/decompression and anything memory intensive - I have tested this, e.g. 3dsMax 2008, and it is a close call in many departments. But it's not something which outright beats Core 2. I mean, I can knock 14k+ out of Cinebench 10 with my 65nm C2Q on air, and my daily rig hits 13,500 on it. That's only one streamlined bench which has always favoured Intel's arch (Apophysis, for instance, won't), but it's a level of performance nothing really comes close to, and nothing out there has improved enough to take special note of.

Biggest problem: 16 months too late to compete. They now need to advance 16 months' worth of tech inside 6 months to compete.

AliG: with high physics/particles, low shaders, medium textures and medium on most everything else, on a C2Q at 3.4GHz with an HD2600XT at 1024x768, no AA/AF, and taking out the millisecond very-highs and very-lows, I get a consistent 49-76 FPS. No stuttering or lag. It ain't too bad unless the res/detail/IQ is pushed up. I've never seen sub-48 FPS at these settings and have seen 100+ FPS with them.

AliG
03-15-2008, 11:54 AM
Ummm, no, you can run Crysis just fine on AGP. I ran a 7800GS @ 500/740 and a 3400+ @ stock and had Crysis at 1024x768 with all med/high settings.



I highly doubt you got decent framerates. I don't call anything below a constant 30 FPS decent, nor anything around 15 or below playable. Yes, you can play Crysis (with the PCI-E version, mind you); I beat Oblivion with a 6600GT at 1920x1200 at 10 FPS lol! But was it anywhere near as enjoyable as it should have been? No.