Originally Posted by fornowagain
Don't confuse power with energy. The watt is joules per second, simply a rate of energy conversion. The watt-hour is a unit of energy: the amount converted by one watt running for one hour.
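To make the units concrete, here's a minimal sketch of the arithmetic (the 400 W figure is just the example draw used in this thread):

power_w = 400.0          # steady draw of the PC, in watts (J/s)
seconds = 3600           # one hour

energy_j = power_w * seconds    # 1,440,000 J of energy converted
energy_wh = energy_j / 3600     # 400 Wh -- a watt-hour is energy, not power

print(f"{energy_j:.0f} J = {energy_wh:.0f} Wh")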
Consider the system for a single slice of time, an instant. 400 W goes in, and some is lost through inefficient conversion: at 80% efficiency, 400 x 0.8 = 320 W reaches the components and 80 W is dissipated immediately as heat in the supply. The current then flows through conductors with resistance, converting to heat. Each voltage step or regulation stage releases heat. The CPU transistors pass current and change state, converting to heat. The fans convert electrical energy to kinetic energy, with friction losses at the bearings; the moving air gains momentum and then slows through friction, again ending up as heat.

To simplify, say it's a well-insulated room (no radiation can escape), so a closed system. Energy in must equal energy out, and with no energy leaving, the total energy in the room must increase at a rate of 400 J/s. All of the energy used in that instant is either stored or converted, and assuming none is stored mechanically or electrically, and since energy can't be destroyed, the net effect is a rise in room temperature via radiation (direct heat as infrared, plus other electromagnetic frequencies, e.g. light and radio), convection (hot air) and conduction (the PC itself gets warm).

Now in reality the room leaks energy: you left the door open, the walls absorb infrared, and radio noise gets out. Once temperatures stabilize you've reached the point of equilibrium, where the energy leaving the room equals the energy being used by the computer; otherwise the room would just get hotter and hotter.
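If you want to see the equilibrium argument play out numerically, here's a toy sketch treating the room as a single thermal mass leaking heat in proportion to the indoor/outdoor temperature difference (Newton's law of cooling). The heat capacity and leak coefficient are assumed round numbers for illustration, not measurements:

# Toy model: room as one lumped thermal mass, leaking heat outside.
# C_ROOM and K_LOSS are assumed values, purely for illustration.
P_IN = 400.0        # W, everything the PC draws ends up as heat
C_ROOM = 150_000.0  # J/K, assumed heat capacity of air + furnishings
K_LOSS = 20.0       # W/K, assumed leak rate through walls and door
T_OUT = 20.0        # deg C, outside temperature
DT = 10.0           # s, simulation time step

t_room = T_OUT
for _ in range(int(24 * 3600 / DT)):          # simulate one day
    leak = K_LOSS * (t_room - T_OUT)          # W currently escaping the room
    t_room += (P_IN - leak) * DT / C_ROOM     # first law: dE = (in - out) * dt

# Equilibrium is where leak == P_IN, i.e. T = T_OUT + P_IN / K_LOSS = 40 C
print(f"after a day: {t_room:.2f} C, "
      f"predicted equilibrium: {T_OUT + P_IN / K_LOSS:.2f} C")

With these numbers the room settles about 20 C above outside; the point is just that the temperature stops rising exactly when the leak carries away the full 400 W.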
The subject to read up on is conservation of energy, i.e. the first law of thermodynamics.