Endymio
They're not fully constrained to the inductor, or even to the card itself, true. But they still eventually interact with something and become waste heat. You point out they're not significant (I believe the FCC limits emissions to a few hundred microvolts/meter at the test distance), but even ignoring those limits, the work-energy equation still has to balance.

The arithmetic mean for the component may be average, but the environmental losses aren't (i.e. the e/m fields aren't fully constrained).
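To put a rough number on how small that radiated term is next to the overall energy balance, here's a back-of-the-envelope sketch. The ~500 µV/m limit, 3 m test distance, and 300 W card power are my own illustrative assumptions, not figures from this thread:

```python
import math

E_FIELD_V_PER_M = 500e-6      # assumed Class B-style emission limit, V/m at the test distance
TEST_DISTANCE_M = 3.0         # assumed measurement distance, m
FREE_SPACE_IMPEDANCE = 377.0  # ohms, impedance of free space
CARD_POWER_W = 300.0          # assumed total board power, for comparison

# Far-field power density of a plane wave: S = E^2 / Z0
power_density = E_FIELD_V_PER_M ** 2 / FREE_SPACE_IMPEDANCE          # W/m^2

# Crude upper bound on total radiated power: that density over a full sphere
radiated_power = power_density * 4 * math.pi * TEST_DISTANCE_M ** 2  # W

print(f"Radiated power bound: {radiated_power:.1e} W")
print(f"Fraction of card power: {radiated_power / CARD_POWER_W:.1e}")
# ~7e-8 W, roughly a 2.5e-10 fraction: it still ends up as heat in whatever
# absorbs it, but it's invisible in the overall energy balance.
```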
I believe you're misinterpreting statements in that paper such as "The [classical] expression above will overestimate the total power dissipated in a quasi-ballistic device..." Certainly, classical-scale calculations of both resistance and charge flow become tricky at the nanoscale, and you cannot treat the gate of a nanoscale device as a closed system and blindly apply Joule's Law. But I didn't see anything in that paper contradicting that law; in fact, it specifically states the law should be applied "with care". The calculated power dissipation may indeed change ... but of the power that is dissipated, all of it eventually becomes heat. That's inherent in conservation of energy.

Hence, the total energy required is the potential change + resistive + capacitive losses, but it's only the resistive loss that really becomes heat. But of course, even this is far too simplistic a view. This paper provides far more depth on the matter (and it's a very good read):
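Coming back to the quasi-ballistic point for a moment, here's a toy sketch of what I mean. In that picture, much of the Joule heating lands in the contacts/reservoirs rather than in the channel itself, but the total still works out to I·V. The 40/60 split below is purely illustrative, not a number from the paper:

```python
def dissipation_split(current_a: float, bias_v: float, channel_fraction: float):
    """Split the total dissipated power I*V between channel and contacts."""
    total = current_a * bias_v          # conservation of energy: all of this becomes heat
    channel = channel_fraction * total  # portion dissipated locally in the channel
    contacts = total - channel          # remainder dissipated in the reservoirs/contacts
    return channel, contacts, total

channel, contacts, total = dissipation_split(current_a=1e-3, bias_v=0.7,
                                             channel_fraction=0.4)
print(f"channel: {channel*1e3:.2f} mW, contacts: {contacts*1e3:.2f} mW, "
      f"total: {total*1e3:.2f} mW")
# However that split shifts, channel + contacts always sums to I*V: the location
# of the heating moves, not the amount of energy that ends up as heat.
```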
Let's take capacitive losses, for instance. An ideal capacitor has none. But in a real-world capacitor the current leads the voltage by slightly less than 90 degrees, and that small in-phase component is a loss. That loss becomes heat ... which is why an overdriven capacitor can heat up enough to explode (that's why can-type caps all have the 'X' scored atop them: to keep them from becoming little bombs in a worst-case scenario). These losses, like those from any non-conservative force, are dissipated into another form of energy ... in this case, heat.
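A rough sketch of how that loss works out numerically; the 10 µF, 100 kHz, 1 V RMS ripple, and tan δ = 0.05 values are made-up illustrative figures, not from any datasheet:

```python
import math

C = 10e-6          # capacitance, F (assumed)
F_RIPPLE = 100e3   # ripple frequency, Hz (assumed)
V_RMS = 1.0        # RMS ripple voltage across the cap, V (assumed)
TAN_DELTA = 0.05   # dissipation factor of the dielectric (assumed)

omega = 2 * math.pi * F_RIPPLE

# The loss tangent is equivalent to a series resistance: ESR = tan(delta)/(omega*C)
esr = TAN_DELTA / (omega * C)

# Ideal cap: current leads voltage by exactly 90 deg, so average power is zero.
# Real cap: the phase falls short of 90 deg by delta, and the in-phase component
# dissipates P = V_rms^2 * omega * C * tan(delta) as heat.
p_loss = V_RMS ** 2 * omega * C * TAN_DELTA

print(f"ESR ~ {esr * 1e3:.1f} mOhm, dissipated power ~ {p_loss:.2f} W")
# Every bit of that goes into warming the can; none of it is recovered.
```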
Now let's take potential change. An increase becomes stored potential energy (hence the name), not heat, true. But a real-world circuit switching in the gigahertz range changes state several billion times a second. Half that time, any arbitrary point in the circuit is rising in potential; the other half, it's falling. Unless you integrate over a half-cycle or shorter period, your average potential change is zero.
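To make the "integrate over whole cycles" point concrete, here's a minimal sketch of the usual switched-capacitance bookkeeping. The 1 nF aggregate capacitance, 1 V swing, and 3 GHz toggle rate are assumptions of mine, not figures for any real chip:

```python
C = 1e-9   # aggregate switched node capacitance, F (assumed)
V = 1.0    # supply / swing voltage, V (assumed)
F = 3e9    # toggle frequency, Hz (assumed)

# Charging a capacitance C up to V through any resistance draws C*V^2 from the
# supply: half of it (0.5*C*V^2) ends up stored in the field, and the other half
# is dissipated in the charging path regardless of how big that resistance is.
e_drawn_per_charge = C * V**2
e_stored = 0.5 * C * V**2
e_heat_charge = e_drawn_per_charge - e_stored

# On the discharge half-cycle, the stored 0.5*C*V^2 is dumped into the pull-down
# path as heat, and the node returns to its starting potential.
e_heat_discharge = e_stored

net_stored_change = 0.0                               # back where it started
e_heat_per_cycle = e_heat_charge + e_heat_discharge   # = C*V^2

print(f"Heat per full cycle: {e_heat_per_cycle:.1e} J "
      f"(net change in stored energy: {net_stored_change} J)")
print(f"Average power at {F/1e9:.0f} GHz: {e_heat_per_cycle * F:.1f} W")
# Averaged over whole cycles, the potential-change term washes out and
# everything drawn from the supply shows up as heat.
```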
If you're measuring a closed system, it has to be. I too have taught thermodynamics, and I hope I wasn't teaching it wrong.

I've done demonstrations for students of thermodynamics, using TRTI on power amps, and there's never a direct 1:1 ratio between the measured energy consumption and the calculated heat loss (within the granularity of the system).
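For what the closed-system claim amounts to in practice, here is a minimal first-law bookkeeping check. The 300 W draw, ten-minute run, 178 kJ of recovered heat, and 2% tolerance are invented numbers for illustration, not measurements from any demonstration:

```python
def energy_balance_ok(e_in_j: float, e_heat_j: float, rel_tol: float = 0.02) -> bool:
    """First-law check: does the recovered heat account for the input energy,
    to within the measurement granularity?"""
    return abs(e_in_j - e_heat_j) <= rel_tol * e_in_j

# e.g. 300 W drawn for 600 s (180 kJ in) vs. 178 kJ of heat recovered
print(energy_balance_ok(e_in_j=300 * 600, e_heat_j=178e3))  # True: balances within 2%
```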