You're comparing the power consumption of a 65W chip vs. a 125W chip.
That's not what I was asking for, nor what is being discussed in this topic.
Early morning for you there?
Let's see:
AMD does rate and use max TDP for its desktop lineup (unlike the ACP it uses for server parts). Next, I said it is unlike Intel, since we are not sure, even today, how Intel comes up with their numbers (they use that internal burn-in tool, I suppose).
Quote:
Originally Posted by me
I also said: "not to say that a 45nm 65W part consumes 65W, though," which means Intel's 45nm parts usually use less power than rated, which has also been known for a while. Hence I said the rating on the box is for "cooling reference".
So please tell me what was wrong or incorrect in my post. And please don't skip an answer or just say "Ehmm, no" when you are cornered; you do this hit-and-run so many times it's not even funny.
Ignore the AMD CPU. It doesn't matter, and it's just a poor excuse on your part. Now tell me instead: how do those 4 Intel CPUs compare to their TDP?
The same applies to 65nm CPUs. Try reading the link I gave: 115W for a 125W AMD CPU, 50W tops for a 65W Intel CPU. It doesn't take much math skill to see who is closest to their max TDP value, both in raw watts and in percentage.
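For anyone who wants the percentages spelled out, here's a quick sanity check of those two figures (only the numbers quoted above; the part labels are just placeholders):
Code:
# Back-of-the-envelope check of the figures quoted above:
# measured load power as a fraction of the rated TDP.

chips = {
    "AMD 125W part":  {"measured_w": 115, "rated_tdp_w": 125},
    "Intel 65W part": {"measured_w": 50,  "rated_tdp_w": 65},
}

for name, c in chips.items():
    pct = 100 * c["measured_w"] / c["rated_tdp_w"]
    headroom = c["rated_tdp_w"] - c["measured_w"]
    print(f"{name}: {pct:.0f}% of rated TDP, {headroom}W headroom")

# AMD 125W part: 92% of rated TDP, 10W headroom
# Intel 65W part: 77% of rated TDP, 15W headroom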
Can't wait for the reviews. I'm looking for a low-watt system for a router/NAS/print/whatever setup.
You just love the AMD threads, huh Shintai?
Why are we comparing TDP anyway? It's 99% worthless for determining how much power a system uses.
The method AMD is using to determine TDP is unknown:
http://www.amd.com/us-en/assets/cont...docs/33954.pdf, page 10
Quote:
TDP. Thermal Design Power. The thermal design power is the maximum power a processor can
draw for a thermally significant period while running commercially useful software. The
constraining conditions for TDP are specified in the notes in the thermal and power tables.
In order to know their method we need to know the following two variables:
1) thermally significant period
2) commercially useful software
Basically, AMD adopted Intel's way of measuring TDP.
Second, I've posted this before (getting tired of repeating it): we had this TDP vs. ACP discussion a while back, and I posted quite a few interesting links there that everyone went silent on. And yet you dare to keep posting about it, stating you're right... I might have missed something epic here.
Re-reading the links I posted shows Intel's TDP is not the max. Actually, it seems they pull their numbers out of 'nowhere': sometimes they are spot on (which is still higher, percentage-wise, than AMD's real TDP), sometimes they rate the TDP higher than the chip will ever pull, as in your example... but sometimes they actually rate the TDP too low. So I'm really wondering what exactly your point was.
Rammsteiner
If you do not know the people at Xbit Labs and how they work, you have no right to judge their credibility. Secondly, TDP numbers don't come from nowhere, because motherboard and heatsink vendors must know these numbers to design their products, and Intel must know these numbers to design packaging. As for Intel's methods, some details can be found in the tech docs:
I'm sure AMD will provide more in-depth details on their methodology upon business request.
Quote:
This specification is the Thermal Design Power and is the estimated maximum possible
expected power generated in a component by a realistic application. It is based on
extrapolations in both hardware and software technology over the life of the component. It
does not represent the expected power generated by a power virus. Studies by Intel
indicate that no application will cause thermally significant power dissipation exceeding
this specification, although it is possible to concoct higher power synthetic workloads that
write but never read. Under realistic read/write conditions, this higher power workload can
only be transient and is accounted in the AC (max) specification.
It's just the way AMD works.
Look, the Phenoms are all rated at the same TDP, business or not.
http://products.amd.com/en-us/Deskto...&id=405&id=406
It's not really a big deal; this is VIA's space, and no one should be able to encroach on it, especially with a desktop derivative.
But then again, TDP doesn't mean much; you don't fold on a battery.
From the same document:
Quote:
Thermal Design Power (TDP) and IDD max are the limits at the highest Tcase max in the specified range for the
corresponding OPN. Products conform to the TDP and IDD Max limits at all valid nominal voltages. The relationship
of Tcase max and Thermal Profile to TDP for a specific device is defined in Table 26.
By no means was I trying to downplay Xbit Labs. I more meant that I think it's funny how opinions on Xbit Labs flipped a full 180 degrees after their Deneb FX posts, yet the same group of people use Xbit Labs as a link to back themselves up. I think that's quite, well, weird :rolleyes:
I know TDP numbers do come from somewhere after all, but IMO it's also strange if they rate a TDP for a certain CPU and it doesn't actually match. It's like 'Oh, this core uses this much, so we rate all the CPUs at that number'; that's what I meant by 'nowhere'.
I mean, for lower-end CPUs it would be quite stupid, to say the least, to rate the TDP way higher than the actual draw, but it's just as stupid to underrate a high-end CPU.
It's still ambiguous... you are quoting the conditions under which it is derived, which is basically a load line; gODJO is quoting the definition (i.e., what it means by AMD's standards).
You could put the max voltage and clock on the CPU, run something as simple as Solitaire, and call it commercially relevant software. That would give you a completely different number than running, say, Prime95.
He is correct: unless it is specified what the thermally significant period is (is it one second, 10 seconds, 5 days?) and what load they are actually running (SuperPi 1M, Solitaire, Prime95, SPECfp_rate?), there is no real way to understand the methods AMD used to establish their TDP spec.
Another way of putting it... would you, BrowncoatGR, please repeat AMD's measurement to verify TDP and show the data to the forum? You can't; you don't have enough information.
This is not to say Intel is any better; they are just as vague...
I do understand so far that TDP stands for the amount of watts to be cooled (not specifically the actual amount of watts used). Don't know if that's close, though. How it's used in industry, I have no idea, actually. What do you mean by that, exactly? For motherboard manufacturers to build their products? Or should I be looking in a completely different direction? No need for an epic book-length work; a quick explanation of what you meant will do ;)
Watts used = watts to cool on a CPU... well, maybe besides 0.0001W in radiation.
TDP = thermal DESIGN power.
In short, no matter what the CPU actually uses, the cooling solution can cool 130W of heat. (That's also equal to 130W used by a CPU.)
But that's the reason things are specced the way they are: so you don't have to have 117 different cooling solutions, but can do with 3, for example.
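A rough sketch of that binning idea, if it helps; the three cooler classes here (65W/95W/130W) are just illustrative tiers, not either vendor's official list:
Code:
# Illustrative sketch of TDP binning: CPUs get slotted into a few
# cooler classes rather than each getting a unique rating.
# The class values are assumptions for the example, not official tiers.

COOLER_CLASSES_W = [65, 95, 130]

def rated_tdp(actual_max_draw_w: float) -> int:
    """Return the smallest cooler class that covers the chip's draw."""
    for tdp in COOLER_CLASSES_W:
        if actual_max_draw_w <= tdp:
            return tdp
    raise ValueError("needs a bigger cooling solution")

# A 50W chip and a 62W chip both ship as "65W TDP" parts, which is
# why measured power often sits well under the number on the box.
for draw in (50, 62, 88, 117):
    print(f"{draw}W actual -> rated {rated_tdp(draw)}W TDP")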
You are correct, of course. Neither company specifies how they actually calculate TDP (and I can't see why, really; I don't see how this is sensitive data). Initially I interpreted that 'max' to mean that while the conditions are met, the CPU will never exceed the TDP. After reading it again, I don't think that is correct. If you apply the first definition to the part I quoted, it makes the second statement a lot more ambiguous.
As for Intel, I've been thinking that their high TDP rating for 45nm CPUs might be due to the cooling needs of those CPUs, and not that they lumped all their CPUs together as some people suggest. Couldn't a hotspot on the CPU cause Intel to conclude that the CPU needs better cooling than the chip's overall thermal dissipation would suggest? Granted, that's what heatspreaders are for, but how effective are they really?
Well, sorta, but not quite... it is not an amount of watts the processor must be cooled down by, it is the rate at which the cooler must dissipate energy... TDP references the cooling solution necessary to keep the CPU at normal operating temperatures.
While not trivial, modeling the dynamic flow of energy through solids (to the fins) and via convection to air is doable; it is classic thermal physics to model heat transfer from a high-energy source to a lower-energy sink -- the 2nd law of thermodynamics. It is routine for a mechanical engineer to design a chunk of metal and a fan with a certain airflow that will remove x amount of energy per unit time.
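For the curious, here's roughly what that design calculation boils down to at steady state; the case-limit and ambient temperatures below are made-up example values, not datasheet figures from either vendor:
Code:
# Rough sketch of the classic steady-state cooling calculation:
# the heatsink needs a low enough case-to-ambient thermal resistance
# (theta, in C/W) to hold Tcase under its limit while dumping TDP watts.
# Example temperatures are assumptions for illustration only.

def required_theta_ca(tdp_w: float, tcase_max_c: float, tambient_c: float) -> float:
    """Max allowed case-to-ambient thermal resistance, in C/W."""
    return (tcase_max_c - tambient_c) / tdp_w

# e.g. a 130W TDP part, 62C case limit, 38C air inside the chassis:
theta = required_theta_ca(tdp_w=130, tcase_max_c=62, tambient_c=38)
print(f"Cooler must achieve <= {theta:.3f} C/W")  # ~0.185 C/W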
So both Intel and AMD provide their customers a spec for the thermal solution which, if met, guarantees the processor will work -- so there you are correct. The repercussions of not getting that spec right come down to who owns the responsibility. If AMD or Intel under-spec the solution and the processor fails, then they are liable... if the system designer (OEM) fails to meet the spec, then they have no recourse but to eat the costs.
nVidia is feeling this problem now, having taken a 200-million-dollar charge for not ensuring they provided an adequate cooling specification (or designing the product to fall within an acceptable margin below the cooling specification).
For Joe Enthusiast, the TDP is actually meaningless -- we typically outfit our systems with 3rd-party coolers anyway, and almost all reputable HSF makers well exceed even the highest-end TDP criteria.
Jack
This is not a bad reason, I suspect; perhaps another reason would be the new gate materials and different transistors. While one can design and test a transistor ad nauseam in the lab, how it will fare in the field is another question. They may have decided it best to force an extreme cooling solution, which would mean that, on average, the temps would run lower than typical.
In terms of the IHS, another good point. They help, but they never completely eliminate any particular hot spot. There are some good 'thermal imaging' papers out there; I can try to dig one up and post it... I have seen studies on the net for both Athlon and C2D.
Jack
For desktop (nettop), AMD will have a good solution. Ultimately, Intel is pushing Atom into way more form factors than nettops. In netbooks, AMD will be hindered by higher power consumption, but helped by a better IGP (assuming people want to use an Internet device as a gaming device).
AMD should push their nettop initiatives with Vista.
Granted, for now the offering's only for desktops, but in the future, when the mobile version comes, they'll definitely have bragging rights.
Intel seems to want a big separation between the two markets, which means that whatever Atom's gonna be coupled with (be it 945GC or Moorestown), it won't get even basic 3D graphics. Their loss, I guess.
I think you kinda nailed it here. What Intel wanted to design was a processor with ultra-low power and ultra-low cost, but acceptable enough to do silly routine stuff... i.e. net browsing, emailing, maybe watching a DVD... and they did not want to put in so much performance that it might stress the ASPs of the higher-end products.
I don't have a Nano board yet (I plan on swiping one when they appear), and I will probably see if I can pick up the new AMD part, just to put them on the bench and check them out.
My personal take on the whole Atom thing is this... the first incarnation is good enough for netbooks and, because of the low manufacturing costs, for the essential line of desktop boards. In this area, Via and AMD can offer compelling (certainly better) alternatives. However, Intel has been clear that they want to move this product down into much smaller form factors... there, every watt counts, and even at the top bin of 2-4W, this is too much for those apps. Atom 2.0 will go sub-1W, I suspect.
Overall, Via and AMD will offer up great alternatives, but the market overlap between AMD/Via and Atom will ultimately be very small and focused on nettops. Atom will dominate the smaller form factors based on power and cost alone.