A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?

Monday, September 29, 2008

This has to be one of the most useless things I've ever seen on the arXiv. Basically, the authors point out that there is absolutely no chance of the helium coolant of the LHC magnet system suddenly deciding to explode. Gee, really?! It's just sad that someone felt compelled to write this.

This paper reminds me of the old Annals of Improbable Research article, "The Effect of Peanut Butter on the Rotation of the Earth".
Sunday, September 28, 2008
A subtle statistical mechanics question
A faculty colleague of mine posed a statistical physics question for me, since I'm teaching that subject to undergraduates this semester, and I want to throw it out there to my readership. I'll give some context, explain the question, and then explain why it's actually rather subtle. If someone has a good answer or a reference to a good (that is, rigorous) answer, I'd appreciate it.
In statistical physics one of the key underlying ideas is the following: For every macroscopic state (e.g., a pressure of 1 atmosphere and a temperature of around 300 K for the air in your room), there are many microstates (in this example, there are many possible arrangements of positions and momenta of oxygen and nitrogen molecules in the room that all look macroscopically about the same). The macroscopic states that we observe are those that have the most possible microstates associated with them. There is nothing physically forbidden about having all of the air in your room just in the upper 1 m of space; it's just that there are vastly more microstates where the air is roughly evenly distributed, so that's what we end up seeing.
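As a toy numerical illustration of this counting argument (my example, not anything from the post): put N distinguishable particles independently into the left or right half of a room and compare how many microstates correspond to each macrostate.

```python
from math import comb

# Toy model: N distinguishable particles, each independently in the left
# or right half of a room.  (Real rooms have ~10^27 molecules, which makes
# the imbalance below astronomically more lopsided.)
N = 100

total_microstates = 2 ** N        # every possible left/right assignment
even_split = comb(N, N // 2)      # microstates with exactly half on each side
all_in_one_half = 1               # the lone all-on-one-side microstate

# Even at N = 100, the 50/50 macrostate has ~1e29 times more microstates
# than the all-on-one-side macrostate.
print(even_split / all_in_one_half)
print(even_split / total_microstates)
```

Already at N = 100 the even split outnumbers the all-on-one-side arrangement by a factor of about 10^29, which is why you never see the air spontaneously crowd into the top of the room.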
Crucial to actually calculating anything with this idea is being able to count microstates. For pointlike particles, that means counting up how many possible positions and momenta they can have. Classically this is awkward because position and momentum are continuous variables - there are infinitely many possible positions and momenta even for one particle. Quantum mechanically, the uncertainty principle constrains things more, since we can never know the position and momentum precisely at the same time. So, the standard way of dealing with this is to divide up phase space (position x momentum) into "cells" of size h^d, where h is Planck's constant and d is the dimensionality. For 3d, we use h^3. Planck's constant comes into it via the uncertainty principle. Here's an example of a typical explanation.
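One standard way to see where the h^3 comes from (a textbook mode-counting argument, sketched here numerically - not anything specific to this post): with periodic boundary conditions in a box of side L, the allowed momenta form a lattice with spacing h/L, so the number of states below a given momentum approaches the phase-space volume divided by h^3.

```python
from math import pi

# Periodic boundary conditions in a box of side L: allowed momenta are
# p = (h/L) * (n_x, n_y, n_z) with integer n_i.  The number of states with
# |p| < p_max is the number of integer lattice points inside a sphere of
# radius n_max = p_max * L / h -- i.e. (phase-space volume) / h^3.

def mode_count(n_max):
    """Count integer lattice points strictly inside a sphere of radius n_max."""
    r2 = n_max ** 2
    n = int(n_max) + 1
    return sum(1 for nx in range(-n, n + 1)
                 for ny in range(-n, n + 1)
                 for nz in range(-n, n + 1)
                 if nx * nx + ny * ny + nz * nz < r2)

n_max = 20.0
exact = mode_count(n_max)
estimate = 4 / 3 * pi * n_max ** 3   # = V * (4/3) * pi * p_max^3 / h^3
print(exact, estimate, exact / estimate)   # ratio approaches 1 as n_max grows
```

The lattice count converges to the sphere-volume estimate as n_max grows, which is exactly the statement that each state occupies one cell of size h^3 in phase space.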
Here's the problem: why h^3, when we learn in quantum mechanics that the uncertainty relation in 1d is (delta p)(delta x) >= hbar/2 (which is h/4 pi, for the nonexperts), not h? Now, for many results in classical and quantum statistical mechanics, the precise number used here is irrelevant. However, that's not always the case. For example, when one calculates the temperature at which Bose condensation takes place, the precise number actually matters. Since h^3 really does work for 3d, there must be some reason why it's right, rather than hbar^3 or some related quantity. I'm sure that there must be a nice geometrical argument, or some clever 3d quantum insight, but I'm having trouble getting this to work. If anyone can enlighten me, I'd appreciate it!
UPDATE: Thanks to those commenting on this. I'm afraid I wasn't as clear as I'd wanted to be above; let me try to refine my question. I know that one can start from particle-in-a-box quantum mechanics, or assume periodic boundary conditions, and count up the allowed plane-wave modes within a volume. This is equivalent to Igor's discussion (in the first comment) of applying the old-time Bohr-Sommerfeld quantization condition (that periodic orbits have actions quantized in units of h). My question is, really, why does h show up here, when we know that the minimal uncertainty product is actually hbar/2? Or, put another way, should all of the stat mech books that argue that the h^3 comes from uncertainty be reworded to say instead that it comes from Bohr-Sommerfeld quantization?
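For what it's worth, here's a quick numerical sketch of why the cell size genuinely matters for Bose condensation (the numbers below are my own illustrative choices for a dilute Rb-87 cloud, not anything from the post): the ideal-gas T_c follows from counting one state per h^3, and a cell of size c^3 would rescale T_c by (c/h)^2.

```python
from math import pi

# Ideal-gas BEC result: T_c = (2*pi*hbar^2 / (m*k_B)) * (n / zeta(3/2))**(2/3),
# which comes from counting one state per h^3 of phase space.  If the cell
# were c^3 instead, T_c would scale as (c/h)^2 -- so the choice matters here.

h = 6.62607015e-34        # Planck's constant, J s
hbar = h / (2 * pi)
k_B = 1.380649e-23        # Boltzmann constant, J/K
m = 1.443e-25             # Rb-87 atomic mass, kg (illustrative system)
n = 1e19                  # number density, m^-3 (typical dilute cold gas)
zeta_3_2 = 2.612          # Riemann zeta(3/2)

T_c = (2 * pi * hbar ** 2 / (m * k_B)) * (n / zeta_3_2) ** (2 / 3)
print(T_c)                    # roughly 1e-7 K for these numbers
print(T_c / (2 * pi) ** 2)    # what hbar^3-sized cells would predict: ~40x lower
```

Using hbar^3 cells instead of h^3 would shift the predicted transition temperature by a factor of (2 pi)^2, about 40 - far outside experimental error bars, which is why the h^3 convention can't just be an order-of-magnitude fudge.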
Thursday, September 25, 2008
A mini book review
Recently I acquired a copy of Electrical Transport in Nanoscale Systems by Max Di Ventra, a new textbook aimed at graduate students. I haven't yet had time to read through it in detail, but what I've seen so far is impressive. The book provides a thorough introduction to the various formalisms appropriate for understanding nanoscale transport, including the usual stuff (Drude, Kubo, Landauer-Buttiker, nonequilibrium Green's functions (NEGF)) and other, more sophisticated approaches that treat transport fundamentally as a nonequilibrium quantum statistical mechanics problem (dynamic density functional theory, a hydrodynamic approximation for the electron liquid, and a detailed look at the interactions between the electrons and the ions). I also appreciate the effort to point out that truly nanoscale systems really are more complicated than, and different from, "ordinary" mesoscopic systems. The only significant omission (intentional, in large part to avoid doubling the size of the book) is the comparative lack of discussion of strong correlation effects (e.g., Kondo physics). (A good complementary book for those interested in the latter topic is the one by Bruus and Flensberg.) It's not exactly light entertainment, but the writing is clear and pedagogical.
Update: By coincidence, Supriyo Datta just put up a nice long review of the NEGF approach. He also has a full book-length treatment written with a very pedagogical focus.
(For those curious about my own book efforts, it's slowly coming along. Slowly.)
Saturday, September 20, 2008
Science funding.
This article confirms my previous impressions, and is very depressing. This past week the government promised roughly 200 years' worth of the entire NSF annual budget to bail out the banking system. Since 2003 the US government has spent another 200 years' worth of the entire NSF annual budget in Iraq. After two years of "level funding", and the certainty that there will be no real budget passed before the election, what we really need is the prospect of another year of frozen budgets.
In related news, I've come to the realization that my research program is "too big to fail".
Update: I might as well put all of my nonscience stuff in one posting. Looking at the text of the proposed financial bailout bill here, I am aghast because of this section:
Decisions by the Secretary pursuant to the authority of this Act are non-reviewable and committed to agency discretion, and may not be reviewed by any court of law or any administrative agency.
Let me get this straight. The Secretary of the Treasury gets incredibly broad authority to use up to $700 billion to prop up the financial markets in essentially any way he decides is appropriate, and his decisions are explicitly not reviewable ever by anyone, including the judicial branch?! I'm no lawyer, but isn't this, umm, absolutely insane?
Wednesday, September 17, 2008
Because I'm a big musical nerd...
... I couldn't pass this up. Very well done, though someone should point out to the Obama supporters behind this that things didn't work out too well for most of the characters singing this in Les Miserables.
I will return to actual physics blogging soon, once the immediate disarray settles out.
Sunday, September 14, 2008
Ike follow-up
Well, that was interesting. Thankfully we're all fine and our house is undamaged. The prospect of being without power for an extended period continues to suck, to put it bluntly: 90 degree weather, near-100% humidity, and no air conditioning or refrigeration. On the plus side, my university still has power and AC. On the downside, they've disabled card-key access to most buildings, and water service (i.e., sanitary plumbing) is spotty on campus.
Friday, September 12, 2008
Hurricane Ike
Hello - for those readers who don't know, I live in Houston, which is about to get hit by Hurricane Ike. I'm hopeful that this won't be a big deal, but there's always the chance that I'll be without electricity for a few days. So, blogging may be slow. In the mean time, check out this cool site for following tropical storm systems, and this explanation of how hurricanes are heat engines.
Thursday, September 11, 2008
Ahh, the Gulf coast.
You know, I lived the first twenty-nine years of my life without having to pay close attention to stuff like this.
Wednesday, September 10, 2008
Important online resource
The internet is definitely the best way to keep up with current events. Check here often (look at the link text). (Thanks, Dan.)
Tuesday, September 09, 2008
Final Packard highlights + amusing article
One of my former professors, Michael Peskin, has a nice article about why the LHC will not destroy the earth. He taught me graduate-level mechanics, and my brain still hurts from his take-home final.
A last few things I learned at the Packard meeting:
- The stickleback is a very useful fish for addressing the question: if natural selection removes variation in phenotypes, why do we still see so much variation?
- There are structures on the membranes of many cells (the primary cilium; the protein known as rhomboid) that seem to have really profound effects on many cellular processes. Understanding how and why they do what they do demonstrates why systems biology is hard.
- It may be possible to do some kind of "safe" cryptographic key exchange based on functions that are not algebraic (as opposed to usual RSA-type encryption which is based on the asymmetry in difficulty between multiplication and factorization).
- There are deep connections between random permutations and the distribution of the number of prime factors.
- It's possible to run live small animals (zebrafish, C. elegans) through microfluidic assay systems in massively parallel fashion.
- Stem cell differentiation can apparently be influenced by the mechanical properties (e.g., squishy vs. hard) of the substrate. Weird.
- Artificial sieve structures can be very useful for electrophoresis of long segments of DNA.
- There may be clever ways to solve strongly correlated electronic structure problems using tensor networks.
- Natural synthesis of useful small molecules (e.g., penicillin, resveratrol) is pretty amazing. Makes me want to learn more about bacteria, actinomycetes, and fungi.
- By clever trading of time and statistics for intensity, 3d superresolution imaging is possible under some circumstances.
- DNA can be used as a catalyst.
- Some bacteria in biofilms secrete molecules that look like antibiotic byproducts, but may actually serve as a way of carrying electrons long distances so that the little buggers far from the food source can still respire.
- Virus chips are awesome.
- Don't ever get botfly larvae growing in your scalp. Ever.
- Tensegrity structures can be very useful for biomimetic machines.
- Sub-mm arrays are going to be a boon for astronomy.
- It looks like much of the Se and Br in the universe was actually produced by the same compact object mergers that give short gamma ray bursts.
- Dark energy remains a major enigma in physics and astrophysics. It's a big one.
Sunday, September 07, 2008
Packard highlights
Some things I learned at my final Packard meeting:
- The density of stars in a globular cluster is just absurd - something like 10^4-10^6 stars in a volume 10 light-years on a side. Wow.
- The joint between the stem and base of a wine glass is a perfect lens for demonstrating the types of Einstein rings that one sees in gravitational lensing.
- In the protoplanetary disk phase of solar system formation, elements get mixed on very rapid timescales (like around 1000 years).
- Bacteria are much better at using ^40Ca in their metabolism than ^44Ca, and it's not at all clear how this works kinetically.
- The global climate of 3-5 million years ago, in the early Pliocene, is a good test case for comparison with global warming models. Bad news for me: if the trends can really be mapped onto today, the hurricane rate is likely to increase by a factor of two.
- Using isotopic analysis (!), it is possible to put an error bound on how many people the lions in the Field Museum actually ate: 41 +/- 11. Anecdotal evidence had put the number between 15 and 135.
- We're all going to be able to get our genomes sequenced very soon, since the rate at which DNA can be sequenced (base pairs per day, for example) has gone up by five orders of magnitude in the last five years.
Wednesday, September 03, 2008
Packard meeting 2008
I'm in Park City for my last annual meeting as a Packard Fellow. As I've said before, I can't speak highly enough of the Packard Foundation. Their support has jumpstarted or otherwise contributed to pretty much my entire research program, and through their annual meetings I've gotten to meet some fascinating people and hear excellent talks in the sciences, mathematics, and engineering. This is a particularly special meeting because it's the 20th anniversary of the Packard fellowship program, and they've invited back all the previous fellows. I'll try to post some highlights over the next few days.