Another in my off-and-on series about condensed matter physics concepts.
Everyone has an intuitive grasp of what they mean by "temperature", but for the most part only physicists know the rigorous definition. Temperature, colloquially, is some measure of the energy stored in a system. If two systems at different temperatures are placed "in contact" (so that energy can flow between them via microscopic interactions, like atoms vibrating into each other), there is a net flow of energy from the high-temperature system to the low-temperature system until the temperatures equilibrate. Fun fact: the nerves in your skin don't actually sense temperature; rather, they sense the flow of thermal energy. Metals below body temperature generally feel cold to the touch because they are very effective at conducting away thermal energy; plastics at the same temperature feel warmer because they are much worse thermal conductors.
Anyway, temperature is defined more rigorously than this touchy-feely business. Consider a system (e.g., a gas) that has a certain amount of total energy, E. That system can have many configurations (e.g., positions and momenta of the gas molecules) that all have the same total energy. Often there are so many configurations in play that we keep track of the log of the number of available configurations, which we call (to within a constant that gives us nice units) S. Now, what happens if we give the system a little more total energy? Well, (almost always) this changes the number of available configurations. (In the gas example, more of the molecules now have access to higher momenta.) How many more configurations? For a small change, the change in E is proportional to the change in S, and the proportionality factor is exactly T, the temperature. At low temperatures, a given change in E implies a comparatively large change in the number of available configurations; conversely, at high temperatures, the same change in E doesn't increase the number of available configurations very much. (I know that someone is going to object to my gas example, since even for a single molecule in a box there are, classically, an infinite number of possible configurations even if we only keep track of positions. Don't worry about that too much - we have mathematically solid ways to deal with this.)
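In symbols (a compact restatement of the above; the symbol Ω for the number of available configurations, and the fixed volume V and particle number N, are my notation, left implicit in the prose):

```latex
S = k_B \ln \Omega , \qquad
\frac{1}{T} \equiv \left( \frac{\partial S}{\partial E} \right)_{V,\,N}
\qquad \Longleftrightarrow \qquad
dE = T\, dS \quad \text{(fixed } V, N\text{)} .
```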
Those in the know will already be aware that S is the entropy of the system. The requirement that the total S of a closed system always increase or stay the same ends up implying exactly the familiar properties of temperature and average energy flow that we know from everyday experience.
18 comments:
Nice work! I always think our little corner of physics gets left out of all the "exciting" physics that the HEP people play with.
Random tidbit: it's occasionally the case that as the energy of the system increases, the entropy can *decrease* - for example, in any spin system in a magnetic field. For such states, we would have to consistently assign a *negative* temperature, as measured on the Kelvin scale. Further, at the point of maximal energy, the temperature would have to be *negative zero*, in symmetry with the lowest-energy state being positive zero. I've always thought that negative-temperature systems don't get the recognition they deserve, even though every CD drive only works because of them.
Thanks. I thought about discussing negative absolute temperatures, but decided to stick with strictly equilibrium stuff. I may write more about that idea later....
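For the curious, here is a toy numerical sketch of genneth's point (my own construction, not his exact setup): N two-level "spins" where each excitation costs an energy eps, so that E = n*eps with n spins excited. Counting configurations with the binomial coefficient and reading off T from 1/T = dS/dE shows the sign flip at half filling.

```python
from scipy.special import gammaln

# Toy model: N two-level "spins"; each excitation costs an energy eps,
# so E = n * eps with n spins excited.  The number of configurations is
# the binomial coefficient C(N, n), so S / kB = ln C(N, n).
N, eps = 1000, 1.0

def S_over_kB(n):
    # log of the binomial coefficient C(N, n), via log-gamma for stability
    return gammaln(N + 1) - gammaln(n + 1) - gammaln(N - n + 1)

for n in (100, 400, 600, 900):
    # 1/T = dS/dE, estimated with a centered finite difference (units: kB = 1)
    inv_T = (S_over_kB(n + 1) - S_over_kB(n - 1)) / (2.0 * eps)
    print(f"n = {n:4d}   E = {n * eps:6.0f}   T = {1.0 / inv_T:+8.2f}")

# Below half filling (n < N/2) the temperature comes out positive; above
# half filling (a population inversion) it comes out negative, heading
# toward "negative zero" as n approaches N, just as the comment says.
```

The sign change at half filling is exactly the population-inversion condition that makes a laser diode work, hence the CD drive remark.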
"until the temperatures equilibrate"

A 0.5 mm diameter, 5 cm long aluminum wire and a 0.5 mm diameter, 5 cm long gold wire have their near ends spot-welded together. Their far ends are shorted through a 1 ohm, 1 watt resistor. Assume a 20 C isothermal environment with unlimited heat capacity and large thermal conductivity. How much time passes until the junction comes into thermal equilibrium with its proximate separate wires?
Al, I do understand about thermocouples. Notice, too, that I talked about a closed system.
dang. i got beat by genneth for the mention of negative temp in spin systems!
Actually, Al, now that I think about your question, I wonder if I don't know what you're asking. Putting a loop like you suggest in an isothermal environment will do nothing - no current will flow beyond a transient when the loop is first closed. The transient results because the chemical potentials of the electrons have to equilibrate throughout the loop.
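To put the "no steady current" claim in symbols (standard thermocouple bookkeeping; I'll write the Seebeck coefficients as \mathcal{S} to avoid colliding with the entropy S): the EMF around the loop is

```latex
\mathcal{E} \;=\; \oint \mathcal{S}(T)\, \frac{dT}{dx}\, dx
\;=\; \int_{T_1}^{T_2} \bigl[\, \mathcal{S}_{\mathrm{Au}}(T) - \mathcal{S}_{\mathrm{Al}}(T) \,\bigr]\, dT ,
```

which vanishes when both junctions sit at the same temperature (T_1 = T_2, as in Al's isothermal bath), leaving only the initial chemical-potential-equilibration transient.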
Are you sure that thermal sensation is sensation of heat flux? I always assumed the reason metal felt cold was the following. At the interface between two bodies of different thermal conductivity I should equate heat fluxes (in steady state), leading to a jump in temperature gradient. Thus with a better conductor, the temperature at the skin's surface will be lower, all else being equal.
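Austen's steady-state matching, in symbols (the flux-continuity condition restates the comment; the contact-temperature formula is the standard semi-infinite-body result, which I'm adding):

```latex
k_1 \left. \frac{\partial T}{\partial x} \right|_{1}
= k_2 \left. \frac{\partial T}{\partial x} \right|_{2} ,
\qquad
T_{\mathrm{contact}} = \frac{e_1 T_1 + e_2 T_2}{e_1 + e_2} ,
\quad e_i = \sqrt{k_i\, \rho_i\, c_i} ,
```

so a high-effusivity material like a metal pins the skin surface close to its own temperature, consistent with Austen's picture.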
I have a bit of a problem with this description. As you note, to the extent that most people have an intuition for temperature, it relates to how energetic internal jiggling is, which I take to be the energy per degree of freedom. That intuition is precisely correct in cases like the one that you described. By introducing entropy, you've made the definition more complicated and made it depend on the sophisticated and abstract concept of the log of the number of states. But in the end, by considering only a case where entropy and energy changes are proportional, you're restricting yourself to a system where all that generality doesn't buy any insight. Most people would be just as well off--for this system--thinking about energy per degree of freedom. Of course that does not explain why things come to the same temperature.
A thermoelectric generator shorted through a resistance, wholly within a closed system, need not reach thermal equilibrium. Vacuum diode generator: p-type 6H silicon carbide (7.2 eV work function) collector and micron-spaced emitter paved with n-doped hydrogen-terminated diamond [111] facets with negative vacuum electron affinity.
"Using negative electron affinity diamond emitters to mitigate space charge in vacuum thermionic energy conversion devices" Diamond and related materials 15(11-12) 2082 (2006)
The diamond will be cooler than the SiC, forever, despite blackbody photon radiative coupling. Right? (Pt, Os, Ir, Au collectors are almost as good and make nice ohmic contacts to its external lead.) A thermocouple is the same, though less extreme, case.
Thanks for sharing the bit about nerves being sensitive to heat flow rather than temperature. It makes perfect intuitive sense.
Austen, I think you're more right than me, actually, according to this. Hmm. The argument that I'd heard before (can't find a primary citation right now) was that the nerves in different layers of skin can differentially detect thermal gradients (and thus heat flux) rather than having a simple, direct response to temperature.
Don, fair enough. I struggled with this one, and should probably rewrite it. My problem with "temperature = how much things are jiggling about" is that it really does miss some important points. Let me put it this way: that's a fair description as long as equipartition is true.
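For concreteness, the equipartition statement I have in mind (the standard classical result; f counts the quadratic degrees of freedom):

```latex
\langle E \rangle = \frac{f}{2}\, k_B T ,
```

which holds only while every one of those degrees of freedom is classically accessible. It fails once quantization freezes modes out (e.g., the vibrations of diatomic molecules at room temperature), and then "temperature = jiggling" stops being a safe mental model.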
Al, I get your point about the difference between T, chemical potential, and the fact that \mu/T (not just \mu) must really be equal between systems that are in diffusive equilibrium. Still, you don't mean to suggest that there's steady-state current flow in your system, do you?
I don't really like this perspective because it involves defining temperature in terms of quantities that are not measurable and obscures the true physical nature of temperature. I think it is best to define temperature in macroscopic and operational terms, using the zeroth law of thermodynamics. The attached slides are from my undergraduate lectures. The key ideas are
* the zeroth law allows us to assign a single number to a thermodynamic system that has the important property that this number will tell us whether or not the system will change when it is brought into thermal contact with another system.
* a thermometer is just a thermodynamic system with one state variable.
Hello Ross - thanks for the slides. I understand your misgivings, but I have to disagree with some of your statements. For example, you cannot completely specify the state of a gas just with pressure and density. As you know, equations of state (like the ideal gas law) are very useful but, in fact, do not provide the same complete description of a system that a true thermodynamic potential (e.g., E(S,V,N) or F(T,V,N)) does. I also don't like the statement that a thermometer is a system with only one state variable. Rather, a good (secondary) thermometer is a system that is sufficiently small such that (1) when brought into thermal contact with a system of interest, the main system's temperature is hardly changed while the thermometer's approaches that of the main system; and (2) there is some macroscopic parameter (e.g., the volume of mercury in a little tube) that may be related simply to the thermometer temperature. [A primary thermometer is a device that holds to (1), and has some fundamental relation directly to temperature, such as Johnson-Nyquist noise.]
Hey, as an experimentalist I really like parameters that I can measure directly, too. Still, entropy and internal energy are perfectly well-defined, and can be inferred from experiments.
System equilibrium for the shorted diode is a temperature gradient across cathode and anode, as you showed. The surrounding thermal bath warms the cathode and cools the anode through blackbody radiation. Ambient temp thermal promotion is a dribble of electrons/sec, a loophole through an epsilon. However, current flows forever.
A short arc lamp's tungsten spear cathode is cooled by electron evaporation through the work function. Its massive corrugated tungsten blob anode is heated by electron condensation through the work function plus acceleration through the voltage gradient. Prototype multi-kilowatt short arc lamps (e.g., the 30 million candlepower Night Sun) could not be engineered - they all went high order in an armored test rig, for months. The first working lamp was diddled by a quartzblower and an electrode former who were skilled rather than educated.
i am not convinced that the mathematical ways to deal with this are "solid enough" to satisfy a mathematician. as i perceive it there are some potentially inconvenient divergences which are "approximated" somehow and swept under the rug ... not that i feel mathematicians have any other way to work this out ... finally, amazingly, all this adds up and predicts the real world with astonishing accuracy.
please correct me if i am wrong
I feel bad about being grumpy in my earlier comment, so here's an alternative approach:
Temperature reflects how much internal energy an object has available for sharing with other objects. If two objects of different temperatures are brought together, heat flows from the hotter to the colder until their temperatures are equal.
This equalization of temperatures is an example of the second law of thermodynamics, which mandates an inexorable increase of entropy, or randomness, in the universe. In most cases, giving an object more internal energy increases its entropy, because its atoms can arrange themselves in more ways.
Heat energy moves from one object to another only if it increases the entropy of the target more than it decreases the entropy of the object it leaves, so that the total entropy increases. Thus the temperature is related to the amount of energy it takes to increase the entropy by a certain amount. A higher-temperature object requires more energy, so moving the energy to a lower-temperature object gives more “bang for the buck,” in the sense of increasing the overall entropy. This process continues until the temperatures are equal, so that heat flow doesn't further increase the entropy.
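To make the "bang for the buck" quantitative (my gloss on this comment's argument): moving a small amount of heat \delta Q from a hot object to a cold one changes the total entropy by

```latex
dS_{\mathrm{tot}} = \frac{\delta Q}{T_{\mathrm{cold}}} - \frac{\delta Q}{T_{\mathrm{hot}}}
= \delta Q \left( \frac{1}{T_{\mathrm{cold}}} - \frac{1}{T_{\mathrm{hot}}} \right) > 0
\qquad \text{whenever } T_{\mathrm{hot}} > T_{\mathrm{cold}} ,
```

and the flow stops exactly when the two temperatures are equal and the parenthesis vanishes.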
It's important to remember that there's no law against decreasing the entropy of one part of a larger system, as long as the entropy of the rest of the system increases even more. The complex and highly structured arrangement of living creatures like us is a vivid example of such a lower-entropy state, financed by dumping extra heat into the world around us (raising its entropy).
Many familiar objects can be regarded as a bunch of more-or-less independent entities, such as the molecules of a gas. In these cases, the temperature (the energy available for sharing) is directly related to the average motional energy of the constituents. In other cases, however, this familiar relationship between temperature and internal jiggling breaks down, and scientists need to use the more complicated definition of temperature based on the entropy.
temperature is not proportional to the number of configurations (actually the log of the number) but inversely proportional to it (which makes sense with the second part of your sentence).
Probably a dumb question, but if something is 114.8°F at the nanoscale, and it is in contact with some healthy tissue, say, the brain, is that brain tissue at risk of damage from the heat? Also, if a whole bunch of nanoscale particles are heated to that temperature, could the damage spread from a targeted tumor to healthy cells?