Wednesday, December 27, 2017

The Quantum Labyrinth - a review

Because of real life constraints I'm a bit slow off the mark compared to others, but I've just finished reading The Quantum Labyrinth by Paul Halpern, and wanted to get some thoughts down about it.  The book is a bit of a superposition between a dual biography of Feynman and Wheeler, and a general history of the long-term impact of what started out as their absorber theory.  

The biographical aspects of Feynman have been well trodden before by many, including Feynman himself and, rather more objectively, James Gleick.   Feynman helped create his own legend (safecracking, being a mathematically prodigious, bongo-playing smart-ass).  The bits called back in the present work that resonate with me now (perhaps because of my age) are how lost he was after his first wife's death, his insecurity about whether he was really getting anything done after QED, his embracing of family life with his third wife, and his love of teaching - both as theater and as a way to feel accomplishment when research may be slow going.  

From other books I'd known a bit about Wheeler, who was still occasionally supervising physics senior theses at Princeton when I was an undergrad.  The backstory about his brother's death in WWII as motivation for Wheeler's continued defense work after the war was new to me.   Halpern does a very good job conveying Wheeler's style - coining pithy epigrams ("Spacetime tells matter how to move; matter tells spacetime how to curve.", "The boundary of a boundary is zero.") and jumping from topic to topic with way outside the box thinking.  We also see him editing his students' theses and papers to avoid antagonizing people.  Interesting.

From the Feynman side, the absorber theory morphed into path integrals, his eponymous diagrams, and his treatment of quantum electrodynamics.   The book does a good job discussing this, though like nearly every popularization, occasionally the analogies, similes, and metaphors end up sacrificing accuracy for the sake of trying to convey physical intuition.    From the Wheeler angle, we get to learn about attempts at quantizing gravity, geons, wormholes, and the many worlds interpretation of quantum mechanics.

It's a fun read that gives you a sense of the personalities and the times for a big chunk of twentieth century theoretical physics, and I'm impressed with Halpern's ability to convey these things without being a professional historian.  

Tuesday, December 19, 2017

The state of science - hyperbole doesn't help.

It seems like every few weeks these days there is a breathless essay or editorial saying science is broken, or that science as a whole is in the midst of a terrible crisis, or that science is both broken and in the midst of a terrible crisis.  These articles do have a point, and I'm not trying to trivialize anything they say, but come on - get a grip.  Science, and its cousin engineering, have literally reshaped society in the last couple of hundred years.  We live in an age of miracles so ubiquitous we don't notice how miraculous they are.  More people (in absolute numbers and as a percentage of the population) are involved in some flavor of science or engineering than ever before.

That does mean that yes, there will be more problems in absolute numbers than before, too, because the practice of science and engineering is a human endeavor.  Like anything else done by humans, that means there will be a broad spectrum of personalities involved, that not everyone will agree with interpretations or ideas, that some people will make mistakes, and that occasionally some objectionable people will behave unethically.   Decisions will be made and incentives set up that may have unintended consequences (e.g., trying to boost Chinese science by rewarding high-impact papers leads to a perverse incentive to cheat).   This does not imply that the entire practice of science is hopelessly flawed and riddled with rot, any more than a nonzero malpractice rate implies that all of medicine is a disaster.

Why is there such a sense of unease right now about the state of science and the research enterprise?  I'm not a sociologist, but here's my take.

Spreading information, good and bad, can happen more readily than ever before.  People look at sites like PubPeer and come away with the impression that the sky is falling, when in fact we should be happy that there now exists, for the first time ever, a venue for pointing out potential problems.  We are now able to learn about flawed studies and misconduct far more effectively than even twenty years ago, and that changes perceptions.  This seems to be similar to the disconnect between perception of crime rates and actual crime rates.

Science is, in fact, often difficult.  People can be working with complex systems, perhaps more complicated than their models assume.   This means that sometimes there can be good (that is, legitimate) reasons why reproducing someone's results can be difficult.  Correlation doesn't equal causation; biological and social phenomena can be incredibly complex, with many underlying degrees of freedom and often only a few quantifiable parameters.  In the physical sciences we often look askance at those fields and think that we are much better, but laboratory science in physics and chemistry can be genuinely challenging.  (An example from my own career:  We were working with a collaborator whose postdoc was making some very interesting nanoparticles, and we saw exciting results with them, including features that coincided with a known property of the target material.  The postdoc went on to a faculty position and the synthesis got taken over by a senior grad student.  Even following very clear directions, it took over 6 months before the grad student's particles had the target composition and we reproduced the original results, because of some incredibly subtle issue with the synthesis procedure that had changed unintentionally and "shouldn't" have mattered.)

Hyperbolic self-promotion and reporting are bad.   Not everything is a breakthrough of cosmic significance, not every advance is transformative, and that's ok.  Acting otherwise sets scientists and engineers up for a public backlash from years of overpromising and underdelivering.   The public ends up with the perception that scientists and engineers are hucksters.  Just as bad, the public ends up with the idea that "science" is just as valid a way of looking at the world as astrology, despite the fact that science and engineering have actually resulted in technological society.  Even worse, in the US it is becoming very difficult to disentangle science from politics, again despite the fact that one is (at least in principle) a way of looking at the world and trying to determine what the rules are, while the other can be driven entirely by ideology.  This discussion of permissible vocabulary is indicative of a far graver threat to science as a means of learning about the universe than actual structural problems with science itself.  Philosophical definitions aside and practical ones to the fore, facts are real, and have meaning, and science is a way of constraining what those facts are.

We can and should do better.  Better at being rigorous, better at making sure our conclusions are justified and knowing their limits of validity, better at explaining ourselves to each other and the public, better at policing ourselves when people transgress in their scientific ethics or code of conduct.

None of these issues, however, imply that science itself as a whole is hopelessly flawed or broken, and I am concerned that by repeatedly stating that science is broken, we are giving aid and comfort to those who don't understand it and feel threatened by it.


Saturday, December 16, 2017

Finding a quantum phase transition, part 2

See here for part 1.   Recall, we had been studying electrical conduction in V5S8, a funky material that is metallic, but on one type of vanadium site has local magnetic moments that order in a form of antiferromagnetism (AFM) below around 32 K.  We had found a surprising hysteresis in the electrical resistance as a function of applied magnetic field.  That is, at a given temperature, over some magnetic field range, the resistance takes different values depending on whether the magnitude of H is being swept up or back down. 

One possibility that springs to mind when seeing hysteresis in a magnetic material is domains - the idea that the magnetic order in the material has broken up into regions, and that the hysteresis is due to the domains rearranging themselves.  What speaks against that in this case is the fact that the hysteresis happens over the same field range when the field is in the plane of the layered material as when the field is perpendicular to the layers.   That'd be very weird for domain motion, but makes much more sense if the hysteresis is actually a signature of a first-order metamagnetic transition, a field-driven change from one kind of magnetic order to another.   First order phase transitions are the ones that have hysteresis, like when water can be supercooled below zero Celsius.

That's also consistent with the fact that the field scale for the hysteresis starts at low fields just below the onset of antiferromagnetism, and very rapidly goes to higher fields as the temperature falls and the antiferromagnetic state is increasingly stable.   Just at the ordering transition, when the AFM state is just barely favored over the paramagnetic state, it doesn't necessarily take much of a push to destabilize AFM order.... 

There was one more clue lingering in the literature.  In 2000, a paper reported a mysterious hysteresis in the magnetization as a function of H down at 4.2 K and way out near 17-18 T.  Could this be connected to our hysteresis?  Well, in the figure here at each temperature we plot a dot for the field that is at the middle of our hysteresis, and a horizontal bar to show the width of the hysteresis, including data for multiple samples.  The red data point is from the magnetization data of that 2000 paper.  

A couple of things are interesting here.   Notice that the magnetic field apparently required to kill the AFM state extrapolates to a finite value, around 18 T, as T goes to zero.  That means that this system has a quantum phase transition (as promised in the post title).  Moreover, in our experiments we found that the hysteresis seemed to get suppressed as the crystal thickness was reduced toward the few-layer limit.  That may suggest that the transition trends toward second order in thin crystals, though that would require further study.  That would be interesting, if true, since second order quantum phase transitions are the ones that can show quantum criticality.  It would be fun to do more work on this system, looking out there at high fields and thin samples for signatures of quantum fluctuations....
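To make the extrapolation step concrete, here is a minimal sketch (not our actual analysis code) of how one might fit the hysteresis midpoint field versus temperature to an empirical boundary form and read off the T = 0 intercept.  The data arrays, the functional form, and the fit starting values are all illustrative assumptions, not our measurements.

```python
# Minimal sketch: fit an empirical phase boundary H_c(T) = Hc0*(1 - (T/TN)**p)
# to hysteresis-midpoint data and extrapolate to T = 0.  All numbers below are
# illustrative placeholders, not measured values.
import numpy as np
from scipy.optimize import curve_fit

T_K   = np.array([10.0, 15.0, 20.0, 25.0, 28.0, 30.0])  # temperatures (K), illustrative
H_mid = np.array([16.5, 14.8, 12.0,  8.5,  5.0,  2.0])  # midpoint fields (T), illustrative

def boundary(T, Hc0, TN, p):
    return Hc0 * (1.0 - (T / TN) ** p)

(Hc0, TN, p), _ = curve_fit(boundary, T_K, H_mid, p0=[18.0, 32.0, 2.0])
print(f"Extrapolated T -> 0 critical field: {Hc0:.1f} T (fit Neel temperature: {TN:.1f} K)")
```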

The bottom line:  There is almost certainly a lot of interesting physics to be done with magnetic materials approaching the 2d limit, and there are likely other phases and transitions lurking out there waiting to be found.

Saturday, December 09, 2017

Finding a quantum phase transition, part 1

I am going to try to get the post frequency back up now that some tasks are getting off the to-do list....

Last year, we found what seems to be a previously undiscovered quantum phase transition, and I think it's kind of a fun example of how this kind of science gets done, with a few take-away lessons for students.  The paper itself is here.

My colleague Jun Lou and I had been interested in low-dimensional materials with interesting magnetic properties for a while (back before it was cool, as the hipsters say).  The 2d materials craze continues, and a number of these are expected to have magnetic ordering of various kinds.  For example, even down to atomically thin single layers, Cr2Ge2Te6 is a ferromagnetic insulator (see here), as is CrI3 (see here).  The 2d material VS2 had been predicted to be a ferromagnet in the single-layer limit.  

In the pursuit of VS2, Prof. Lou's student Jiangtan Yuan found that the vanadium-sulphur phase diagram is rather finicky, and we ended up with a variety of crystals of V5S8 with thicknesses down to about 10 nm (a few unit cells).  

[Lesson 1:  Just because they're not the samples you want doesn't mean that they're uninteresting.]   

It turns out that V5S8  had been investigated in bulk form (that is, mm-cm sized crystals) rather heavily by several Japanese groups starting in the mid-1970s.  They discovered and figured out quite a bit.  Using typical x-ray methods they found the material's structure:  It's better to think of V5S8  as V0.25VS2.  There are VS2 layers with an ordered arrangement of vanadium atoms intercalated in the interlayer space.  By measuring electrical conduction, they found that the system as a whole is metallic.   Using neutron scattering, they showed that there are unpaired 3d electrons that are localized to those intercalated vanadium atoms, and that those local magnetic moments order antiferromagnetically below a Neel temperature of 32 K in the bulk.  The moments like to align (antialign) along a direction close to perpendicular to the VS2 layers, as shown in the top panel of the figure.   (Antiferromagnetism can be tough to detect, as it does not produce the big stray magnetic fields that we all associate with ferromagnetism. )

If a large magnetic field is applied perpendicular to the layers, the spin configuration with one sublattice anti-aligned with the field becomes energetically very unfavorable.  It becomes favorable for the spins to find some way to avoid that antialignment while still keeping the antiferromagnetic order.  The result is a spin-flop transition, when the moments keep their antiferromagnetism but flop down toward the plane, as in the lower panel of the figure.  What's particularly nice in this system is that this ends up producing a kink in the electrical resistance vs. magnetic field that is a clear, unambiguous signature of the spin flop, and therefore a way of spotting antiferromagnetism electrically.
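For a rough sense of the field scale involved, the textbook mean-field estimate says the spin-flop field is set by the geometric mean of the (large) exchange field and the (small) anisotropy field.  Here's a minimal sketch of that standard estimate; the particular field values are assumptions for illustration, not numbers for V5S8.

```python
# Minimal sketch of the mean-field spin-flop field, H_sf = sqrt(2*H_E*H_A - H_A**2),
# where H_E is the exchange field and H_A the uniaxial anisotropy field.
# The values below are illustrative, not V5S8 parameters.
import math

H_E = 20.0   # exchange field (tesla), illustrative
H_A = 0.1    # anisotropy field (tesla), illustrative

H_sf = math.sqrt(2.0 * H_E * H_A - H_A**2)
print(f"Estimated spin-flop field: {H_sf:.2f} T")
# For weak anisotropy, H_sf ~ sqrt(2*H_E*H_A): a few tesla, i.e., a kink in R(H)
# at fields readily reached in the lab.
```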

My student Will Hardy figured out how to make reliable electrical contact to the little, thin V5S8 crystals (not a trivial task), and we found the physics described above.  However, we also stumbled on a mystery that I'll leave you as a cliff-hanger until the next post:  Just below the Neel temperature, we didn't just find the spin-flop kink.  Instead, we found hysteresis in the magnetoresistance, over an extremely narrow temperature range, as shown here.

[Lesson 2:  New kinds of samples can make "old" materials young again.]

[Lesson 3:  Don't explore too coarsely.  We could easily have missed that entire ~ 2.5 K temperature window within which the hysteresis is visible over our accessible magnetic field range.] 

Tune in next time for the rest of the story....

Tuesday, November 28, 2017

Very busy time....

Sorry for the light blogging - between departmental duties and deadline-motivated writing, it's been very difficult to squeeze in much blogging.  Hopefully things will lighten up again in the next week or two.   In the meantime, I suggest watching old episodes of the excellent show Scrapheap Challenge (episode 1 here).  Please feel free to put in suggestions of future blogging topics in the comments below.  I'm thinking hard about doing a series on phases and phase transitions.

Friday, November 17, 2017

Max the Demon and the Entropy of Doom

My readers know I've complained/bemoaned repeatedly about how challenging it can be to explain condensed matter physics on a popular level in an engaging way, even though that's the branch of physics that arguably has the greatest impact on our everyday lives.  Trying to take such concepts and reach an audience of children is an even greater, more ambitious task, and teenagers might be the toughest crowd of all.  A graphic novel or comic format is one visually appealing approach that is a lot less dry and perhaps less intimidating than straight prose.   Look at the success of xkcd and Randall Munroe!   The APS has had some reasonable success with their comics about their superhero Spectra.  Prior to that, Larry Gonick had done a very nice job on the survey side with the Cartoon Guide to Physics.  (On the parody side, I highly recommend Science Made Stupid (pdf) by Tom Weller, a key text from my teen years.  I especially liked Weller's description of the scientific method, and his fictional periodic table.)

Max the Demon and the Entropy of Doom is a new entry in the field, by Assa Auerbach and Richard Codor.  Prof. Auerbach is a well-known condensed matter theorist who usually writes more weighty tomes, and Mr. Codor is a professional cartoonist and illustrator.  The book is an entertaining explanation of the laws of thermodynamics, with a particular emphasis on the Second Law, using a humanoid alien, Max (the Demon), as an effective superhero.  

The comic does a good job, with nicely illustrated examples, of getting the point across about entropy as counting how many (microscopic) ways there are to do things.  One of Max's powers is the ability to see and track microstates (like the detailed arrangement and trajectory of every air molecule in this room), when mere mortals can only see macrostates (like the average density and temperature).    It also illustrates what we mean by temperature and heat with nice examples (and a not very subtle at all environmental message).   There's history (through the plot device of time travel), action, adventure, and a Bad Guy who is appropriately not nice (and has a connection to history that I was irrationally pleased about guessing before it was revealed).   My kids thought it was good, though my sense is that some aspects were too conceptually detailed for a 12-year-old and others a bit too cute for a world-weary 15-year-old.  Still, a definite good review from a tough crowd, and efforts like this should be applauded - overall I was very impressed.
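The microstate-counting idea at the heart of the book is easy to make quantitative.  Here's a minimal sketch (my own toy example, not something from the book) showing how fast the multiplicity of a macrostate grows with the number of two-state objects:

```python
# Minimal sketch of "entropy counts microstates": for N coins (or spin-1/2 moments),
# the number of microstates with n heads is the binomial coefficient, and
# S = k_B * ln(Omega).
from math import comb, log

k_B = 1.380649e-23  # J/K
N = 100

for n in (0, 25, 50):
    omega = comb(N, n)          # microstates consistent with the macrostate "n heads"
    S = k_B * log(omega) if omega > 1 else 0.0
    print(f"n = {n:3d}: Omega = {omega:.3e}, S = {S:.3e} J/K")
# The n = N/2 macrostate has overwhelmingly more microstates, which is why that's
# what you actually observe macroscopically.
```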

Tuesday, November 07, 2017

Taxes and grad student tuition

As has happened periodically over the last couple of decades (I remember a scare about this when Newt Gingrich's folks ran Congress in the mid-1990s), a tax bill has been put forward in the US House that would treat graduate student tuition waivers like taxable income (roughly speaking).   This is discussed a little bit here, and here.

Here's an example of why this is an ill-informed idea.  Suppose a first-year STEM grad student comes to a US university, and they are supported by, say, departmental fellowship funds or a TA position during that first year.  Their stipend is something like $30K.  These days the university waives their graduate tuition - that is, they do not expect the student to pony up tuition funds.  At Rice, that tuition is around $45K.  Under the proposed legislation, the student would end up getting taxed as if their income was $75K, when their actual gross pay is $30K.   
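To put rough numbers on it, here's a minimal sketch using a single illustrative flat marginal rate rather than the actual bracket structure, deductions, or credits (all of which I'm deliberately leaving out); the only point is the scale of the extra tax relative to the student's actual take-home pay.

```python
# Minimal sketch of the arithmetic.  The 25% marginal rate is a purely
# illustrative assumption, not the real 2017 bracket structure.
stipend        = 30_000   # actual cash income ($)
tuition_waiver = 45_000   # waived tuition ($), money the student never sees
marginal_rate  = 0.25     # illustrative flat rate

tax_now      = marginal_rate * stipend
tax_proposed = marginal_rate * (stipend + tuition_waiver)

print(f"Tax on stipend alone:          ${tax_now:,.0f}")
print(f"Tax if the waiver is 'income': ${tax_proposed:,.0f}")
print(f"Extra tax out of a ${stipend:,} stipend: ${tax_proposed - tax_now:,.0f}")
```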

That would be extremely bad for both graduate students and research universities.  Right off the bat this would create unintended (I presume) economic incentives for grad students to drop out of their programs, and/or for universities to play funny games with what they say is graduate tuition.   

This has been pitched multiple times before, and my hypothesis is that it's put forward by congressional staffers who do not understand graduate school (and/or think that this is the same kind of tuition waiver as when a faculty member's child gets a vastly reduced tuition for attending the parent's employing university).  Because it is glaringly dumb, it has been fixed whenever it's come up before.  In the present environment, the prudent thing to do would be to exercise caution and let legislators know that this is a problem that needs to be fixed.

Tuesday, October 31, 2017

Links + coming soon

Real life is a bit busy right now, but I wanted to point out a couple of links and talk about what's coming up.
  • I've been looking for ways to think about and discuss topological materials that might be more broadly accessible to non-experts, and I found this paper and videos like this one and this one.  Very cool, and I'm sorry I'd missed it back in '15 when it came out.
  • In the experimental literature talking about realizations of Majorana fermions in the solid state, a key signature is a peak in the conductance at zero voltage - that's an indicator that there is a "zero-energy mode" in the system.  There are other ways to get zero-bias peaks, though, and nailing down whether this has the expected properties (magnitude, response to magnetic fields) has been a lingering issue.  This seems to nail down the situation more firmly.
  • Discussions about "quantum supremacy" strictly in terms of how many qubits can be simulated on a classical computer right now seem a bit silly to me.  Ok, so IBM managed to simulate a handful of additional qubits (56 rather than 49).  It wouldn't shock me if they could get up to 58 - supercomputers are powerful and programmers can be very clever (a naive memory-scaling estimate is sketched just after this list).  Are we going to get a flurry of news stories every time about how this somehow moves the goalposts for quantum computers?    
  • I'm hoping to put out a review of Max the Demon and the Entropy of Doom, since I received my beautifully printed copies this past weekend.
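On that quantum supremacy point, here's a minimal sketch of the naive memory scaling for brute-force classical simulation; it deliberately ignores the partitioning tricks that groups like IBM's actually use, so treat it only as an illustration of why each extra qubit is costly.

```python
# Minimal sketch: a full n-qubit state vector has 2**n complex amplitudes.
# Assuming 16 bytes per amplitude (complex double), the memory needed is:
def state_vector_petabytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude / 1e15

for n in (49, 56, 58):
    print(f"{n} qubits: ~{state_vector_petabytes(n):,.0f} PB for the naive state vector")
# Each added qubit doubles the memory requirement, which is why simulating a few
# more qubits is a real feat of cleverness rather than a shift of the goalposts.
```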

Wednesday, October 25, 2017

Thoughts after a NSF panel

I just returned from a NSF proposal review panel.  I had written about NSF panels back in the early days of this blog here, back when I may have been snarkier.

  • Some things have gotten better.  We can work from our own laptops, and I think we're finally to the point where everyone at these things is computer literate and can use the online review system.  The program officers do a good job making sure that the reviews get in on time (ahead of the meeting).
  • Some things remain the same.  I'm still mystified at how few people from top-ranked programs (e.g., Harvard, Stanford, MIT, Cornell, Caltech, Berkeley) I see at these.  Maybe I just don't move in the right circles.  
  • Best quote of the panel:  "When a review of one of my papers or proposals starts with 'Author says' rather than 'The author says', I know that the referee is Russian and I'm in trouble."
  • Why does the new NSF headquarters have tighter security screening than Reagan National Airport?  
  • The growth of funding costs and eight years of numerically flat budgets have made this process more painful.  Sure looks like morale is not great at the agency.  Really not clear where this is all going to go over the next few years.  There was a lot of gallows humor about having "tax payer advocates" on panels.  (Everyone on the panel is a US taxpayer already, though apparently that doesn't count for anything because we are scientists.)
  • NSF is still the most community-driven of the research agencies. 
  • I cannot overstate the importance of younger scientists going to one of these and seeing how the system works, so you learn how proposals are evaluated.




Monday, October 23, 2017

Whither science blogging?

I read yesterday of the impending demise of scienceblogs, a site that has been around since late 2005 in one form or other.  I guess I shouldn't be surprised, since some of its bloggers have shifted to other sites in recent years, such as Ethan Siegel and Chad Orzel, who largely migrated to Forbes, and Rhett Allain, who went to Wired.  Steinn Sigurðsson is going back to his own hosted blog in the wake of this.

I hope this is just indicative of a poor business model at Seed Media, and not a further overall decline in blogging by scientists.  It's wonderful that online magazines like Quanta and Aeon and Nautilus are providing high quality, long-form science writing.  Still, I think everyone benefits when scientists themselves (in addition to professional science journalists) carve out some time to write about their fields.



Friday, October 20, 2017

Neutron stars and condensed matter physics

In the wake of the remarkable results reported earlier this week regarding colliding neutron stars, I wanted to write just a little bit about how a condensed matter physics concept is relevant to these seemingly exotic systems.

When you learn high school chemistry, you learn about atomic orbitals, and you learn that electrons "fill up" those orbitals starting with the lowest energy (most deeply bound) states, two electrons of opposite spin per orbital.  (This is a shorthand way of talking about a more detailed picture, involving words like "linear combination of Slater determinants", but that's a detail in this discussion.)  The Pauli principle, the idea that (because electrons are fermions) all the electrons can't just fall down into the lowest energy level, leads to this.  In solid state systems we can apply the same ideas.  In a metal like gold or copper, the density of electrons is high enough that the highest kinetic energy electrons are moving around at ~ 0.5% of the speed of light (!).  

If you heat up the electrons in a metal, they get more spread out in energy, with some occupying higher energy levels and some lower energy levels being empty.   To decide whether the metal is really "hot" or "cold", you need a point of comparison, and the Fermi energy gives you that.  If the ratio of the thermal energy scale \(k_{\mathrm{B}}T\) to the depth of the lowest energy levels (essentially the Fermi energy \(E_{\mathrm{F}}\)) is much less than one, then the electrons are said to be "degenerate".  In common metals, \(E_{\mathrm{F}}\) is several eV, corresponding to a temperature of tens of thousands of Kelvin.  That means that even near the melting point of copper, the electrons are effectively very cold.

Believe it or not, a neutron star is a similar system.  If you squeeze a bit more than one solar mass into a sphere 10 km across, the gravitational attraction is so strong that the electrons and protons in the matter are crushed together to form a degenerate ball of neutrons.  Amazingly, by our reasoning above, the neutrons are actually very very cold.  The Fermi energy for those neutrons corresponds to a temperature of nearly \(10^{12}\) K.  So, right up until they smashed into each other, those two neutron stars spotted by the LIGO observations were actually incredibly cold, condensed objects.   It's also worth noting that the properties of neutron stars are likely affected by another condensed matter phenomenon, superfluidity.   Just as electrons can pair up and condense into a superconducting state under some circumstances, it is thought that cold, degenerate neutrons can do the same thing, even when "cold" here might mean \(5 \times 10^{8}\) K.
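If you want to check these numbers, the free-fermion estimate takes only a few lines.  Here's a minimal sketch of the Fermi temperature for conduction electrons in copper and for neutrons at neutron-star densities; it uses the nonrelativistic formula throughout, so the neutron-star value is only a rough estimate, and the 1.4-solar-mass, ~10-km-radius figures are my own round-number assumptions for a typical neutron star.

```python
# Minimal sketch: Fermi energy E_F = hbar^2 (3 pi^2 n)^(2/3) / (2 m) and the
# corresponding Fermi temperature T_F = E_F / k_B (nonrelativistic estimate).
import numpy as np

hbar = 1.055e-34   # J*s
k_B  = 1.381e-23   # J/K
m_e  = 9.109e-31   # kg
m_n  = 1.675e-27   # kg

def fermi_temperature(n, m):
    E_F = hbar**2 * (3.0 * np.pi**2 * n) ** (2.0 / 3.0) / (2.0 * m)
    return E_F / k_B

# Copper: ~8.5e28 conduction electrons per m^3
print(f"Copper electrons:      T_F ~ {fermi_temperature(8.5e28, m_e):.1e} K")

# Neutron star: ~1.4 solar masses of neutrons in a sphere of ~10 km radius (assumption)
M = 1.4 * 1.989e30                        # kg
V = (4.0 / 3.0) * np.pi * (1.0e4) ** 3    # m^3
n_neutrons = (M / m_n) / V                # neutrons per m^3
print(f"Neutron star neutrons: T_F ~ {fermi_temperature(n_neutrons, m_n):.1e} K")
```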

Sunday, October 15, 2017

Gravitational waves again - should be exciting

There is going to be a big press conference tomorrow, apparently to announce that LIGO/VIRGO has seen an event (binary neutron star collision) directly associated with a gamma ray burst in NGC 4993.  Fun stuff, and apparently the worst-kept secret in science right now.  This may seem off-topic for a condensed matter blog, but there's physics in there which isn't broadly appreciated, and I'll write a bit about it after the announcement.

Tuesday, October 10, 2017

Piezo controller question - followup.

A couple of weeks ago I posted:

Anyone out there using a Newport NPC3SG controller to drive a piezo positioning stage, with computer communication successfully talking to the NPC3SG?  If so, please leave a comment so that we can get in touch, as I have questions.

No responses so far.  This is actually the same unit as this thing:
https://www.piezosystem.com/products/piezo_controller/piezo_controller_3_channel_version/nv_403_cle/

In our unit from Newport, communications simply don't work properly.  Timeout problems.  The LabVIEW code supplied by Newport (the same code paired with the link above) has these problems, as do many other ways of trying to talk with the instrument.  Has anyone out there had success in using a computer to control and read this thing?   At issue is whether this is a hardware problem with our unit, or whether there is a general problem with these.  The vendor has been verrrrrrrrry slow to figure this out.

Sunday, October 08, 2017

The Abnormal Force

How does the chair actually hold you up when you sit down?  What is keeping your car tires from sinking through the road surface?  What is keeping my coffee mug from falling through my desk?  In high school and first-year undergrad physics, we teach people about the normal force - that is a force that acts normal (perpendicular) to a surface, and it takes on whatever value is needed so that solid objects don't pass through each other.

The microscopic explanation of the normal force is that the electrons in the atoms of my coffee mug (etc.) interact with the electrons in the atoms of the desk surface, through a combination of electrostatics (electrons repel each other) and quantum statistics (the Pauli principle means that you can't just shuffle electrons around willy-nilly).  The normal force is "phenomenological" shorthand.  We take the observation that solid objects don't pass through each other, deduce that whatever is happening microscopically, the effect is that there is some force normal to surfaces that touch each other, and go from there, rather than trying to teach high school students how to calculate it from first principles.  The normal force is an emergent effect that makes sense on macroscopic scales without knowing the details.  This is just like how we teach high school students about pressure as a useful macroscopic concept, without actually doing a statistical calculation of the average perpendicular force per area on a surface due to collisions with molecules of a gas or a liquid.  

You can actually estimate the maximum reasonable normal force per unit area.  If you tried to squeeze the electrons of two adjacent atoms into the volume occupied by one atom, even without the repulsion of like charges adding to the cost, the Pauli principle means you'd have to kick some of those electrons into higher energy levels.  If a typical energy scale for doing that for each electron was something like 1 eV, and you had a few electrons per atom, and the areal density of atoms is around \(10^{14}\) per cm\(^{2}\), then we can find the average force \(F_{\mathrm{av}}\) required to make a 1 cm\(^{2}\) area of two surfaces overlap with each other.   We'd have \(F_{\mathrm{av}} d \sim 10^{15}\) eV, where \(d\) is the thickness of an atom, around 0.3 nm.   That's around \(5.3 \times 10^{5}\) newtons per cm\(^{2}\), or around 5.3 GPa.   That's above almost all of the yield stresses for materials (usually worrying about tension rather than compression) - that just means that the atoms themselves will move around before you really push electrons around.
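Here's a minimal sketch reproducing that order-of-magnitude arithmetic, with the same round numbers as in the text:

```python
# Minimal sketch of the estimate above: ~1e15 eV of Pauli "cost" per cm^2 of
# forced overlap (roughly 1 eV per electron, a few electrons per atom, ~1e14
# atoms per cm^2), spread over an atomic thickness d ~ 0.3 nm.
eV = 1.602e-19                 # J
energy_per_cm2 = 1e15 * eV     # J per cm^2 of overlap
d = 0.3e-9                     # m

force_per_cm2 = energy_per_cm2 / d              # newtons per cm^2
pressure_GPa  = force_per_cm2 / 1e-4 / 1e9      # 1 cm^2 = 1e-4 m^2; Pa -> GPa

print(f"Force per cm^2 of contact: ~{force_per_cm2:.1e} N")   # ~5e5 N
print(f"Equivalent pressure:       ~{pressure_GPa:.1f} GPa")  # ~5 GPa
```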

Very occasionally, when two surfaces are brought together, there is a force that arises at the interface that is not along the normal direction.  A great example of that is in this video, which shows two graphite surfaces that spontaneously slide in the plane so that they are crystallographically aligned.  That work comes from this paper.

As far as I can tell, there is no official terminology for such a spontaneous in-plane force.  In the spirit of one of my professional heroes David Mermin, who coined the scientific term boojum, I would like to suggest that such a transverse force be known as the abnormal force.  (Since I don't actually work in this area and I'm not trying to name the effect after myself, hopefully the barrier to adoption will be lower than the one faced by Mermin, who actually worked on boojums :-)  ).

Tuesday, October 03, 2017

Gravitational radiation for the win + communicating science

As expected, LIGO was recognized by the Nobel Prize in physics this year.  The LIGO experiment is an enormous undertaking that combines elegant, simple theoretical ideas; incredible engineering and experimental capabilities; and technically virtuosic numerical theoretical calculations and data analysis techniques.  It's truly a triumph.

I did think it was interesting when Natalie Wolchover, one of the top science writers out there today, tweeted:   Thrilled they won, thrilled not to spend this morning speed-reading about some bizarre condensed matter phenomenon.

This sentiment was seconded by Peter Woit, who said he thought she spoke for all science journalists.

Friendly kidding aside, I do want to help.  Somehow it's viewed as comparatively easy and simple to write about this, or this, or this, but condensed matter is considered "bizarre".  

Sunday, October 01, 2017

Gravitational radiation redux + Nobel speculation

This past week, there was exciting news that the two LIGO detectors and the VIRGO interferometer had simultaneously detected the same event, a merger of black holes estimated to have taken place 1.6 billion lightyears away.  From modeling the data, the black hole masses are estimated at around 25 and 30 solar masses, and around 2.7 solar masses worth of energy (!) was converted in the merger into gravitational radiation.  The preprint of the paper is here.  Check out figure 1.  With just the VIRGO data, the event looks really marginal - by eye you would be hard pressed to pick it out of the fluctuating detector output.  However, when that data is thrown into the mix with that from the two LIGO detectors (which are completely independent of VIRGO), the case is quite strong.

This is noteworthy for (at least) two reasons.  First, there has been some discussion about the solidity of the previously reported LIGO results - this paper (see here for a critique of relevant science journalism) argues that there are some surprising correlations in the noise background of the two detectors that could make you wonder about the analysis.  After all, the whole point of having two detectors is that a real event should be seen by both, while one might reasonably expect background jitter to be independent since the detectors are thousands of miles apart.  Having a completely independent additional detector in the mix should be useful in quantifying any issues.  Second, having the additional detector helps nail down the spot in the sky where the gravitational waves appear to originate.  This image shows how previous detections could only be localized by two detectors to a band spanning lots of the sky, while this event can be localized down to a spot spanning a tenth as much solid angle.    This is key to turning gravitational wave detectors into serious astronomy tools, by trying to link gravitational event detection to observations across the electromagnetic spectrum.  There were rumors, for example, that LIGO had detected what was probably a neutron star collision (smaller masses, but far closer to earth), the kind of event thought to produce dramatic electromagnetic signatures like gamma ray bursts.

On that note, I realized Friday that this coming Tuesday is the announcement of the 2017 Nobel in physics.  Snuck up on me this time.  Speculate away in the comments.  Since topology in condensed matter was last year's award, it seems likely that this year will not be condensed matter-related (hurting the chances of people like Steglich and Hosono for heavy fermion and iron superconductors, respectively).  Negative index phenomena might be too condensed matter related.   The passing last year of Vera Rubin and Deborah Jin is keenly felt, and makes it seem less likely that galactic rotation curves (as evidence for dark matter) or ultracold fermions would get it this year.  Bell's inequality tests (Aspect, Zeilinger, Clauser) could be there.   The LIGO/VIRGO combined detection happened too late in the year to affect the chances of this being the year for gravitational radiation (which seems a shoo-in soon).

Tuesday, September 26, 2017

The terahertz gap

https://commons.wikimedia.org/wiki/File:Thz_freq_in_EM_spectrum.png?uselang=en-gb
At a thesis proposal talk yesterday, I realized that I hadn't ever written anything specifically about terahertz radiation (THz, or if you're trying to market something, t-rays).   Terahertz (\(10^{12}\) Hz) is the frequency of electromagnetic radiation higher than microwaves, but lower than what is traditionally labeled the far infrared.  Sometimes called "mm wave" radiation (1 THz would be a free-space wavelength of about 0.3 mm or 300 microns), THz is potentially very useful for communications (pdf, from here), imaging (here, here, here), and range detection (see here for an impressive google project; or here for an article about THz for self-driving cars), among other things.  It's also right around the frequency range of a lot of vibrations in molecules and solids, so it can be used for spectroscopy, though it's also around the energy range where water vapor in the atmosphere can be an efficient absorber.

This frequency region is an awkward middle ground, however.  That's sometimes why it's referred to as the "terahertz gap".

We tend to produce electromagnetic radiation by one of two approaches.  Classically, accelerating charges radiate electromagnetic waves.  In the low frequency limit, there are various ways to generate voltages that oscillate - we can in turn use those to drive oscillating currents and thus generate radio waves, for example.  See here for a very old school discussion.  It is not trivial to shake charges back and forth at THz frequencies, however.  It can be done, but it's very challenging.  One approach to generating a pulse of THz radiation is to use a photoconductive antenna.  Take two electrodes close together on a semiconductor substrate, with a voltage applied between them.  Smack the semiconductor with an ultrafast optical pulse that has a frequency high enough to photoexcite a bunch of charge carriers - those then accelerate from the electric field between the electrodes and emit a pulse of radiation, including THz frequencies.

The other limit we often take in generating light is to work with some quantum system that has a difference in energy levels that is the same energy as the photons we want to generate.  This is the limit of atomic emission (say, having an electron drop from the 2p orbital to the 1s orbital of a hydrogen atom, and emitting an ultraviolet photon of energy around 10 eV) and also the way many solid state devices work (say, having an electron drop from the bottom of the conduction band to the top of the valence band in InGaAsP to produce a red photon of energy around 1.6 eV in a red LED).  The problem with this approach for THz is that the energy scale in question is very small - 1 THz is about 4 milli-electron volts (!).  As far as I know, there aren't naturally occurring solids with energy level splittings that small, so the approach from this direction has been to create artificial systems with such electronic energy gaps - see here.   (Ironically, there are some molecular systems with transitions considerably lower in energy than the THz that can be used to generate microwaves, as in this famous example.)
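The awkwardness of the THz regime is easiest to see from the unit conversions.  Here's a minimal sketch of the photon energy, free-space wavelength, and equivalent temperature of 1 THz:

```python
# Minimal sketch of the unit conversions behind the "terahertz gap".
h   = 6.626e-34   # J*s
c   = 2.998e8     # m/s
k_B = 1.381e-23   # J/K
eV  = 1.602e-19   # J

f = 1.0e12  # 1 THz

print(f"Photon energy: {h * f / eV * 1e3:.1f} meV")   # ~4 meV
print(f"Wavelength:    {c / f * 1e3:.2f} mm")          # ~0.3 mm
print(f"hf / k_B:      {h * f / k_B:.0f} K")           # ~48 K
# ~4 meV is tiny compared with typical semiconductor band gaps (~1 eV), which is
# why there is no simple off-the-shelf interband emitter at these frequencies.
```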

It looks like THz is starting to take off for technologies, particularly as more devices are being developed for its generation and detection.  SiGe-based transistors, for example, can operate at very high intrinsic speeds, and like in the thesis proposal I heard yesterday, these devices are readily made now and can be integrated into custom chips for exactly the generation and detection of radiation approaching a terahertz.  Exciting times.


Friday, September 22, 2017

Lab question - Newport NPC3SG

Anyone out there using a Newport NPC3SG controller to drive a piezo positioning stage, with computer communication successfully talking to the NPC3SG?  If so, please leave a comment so that we can get in touch, as I have questions.

Monday, September 18, 2017

Faculty position at Rice - theoretical astro-particle/cosmology

Assistant Professor Position at Rice University in

Theoretical Astro-Particle Physics/Cosmology


The Department of Physics and Astronomy at Rice University in Houston, Texas, invites applications for a tenure-track faculty position (Assistant Professor level) in Theoretical Astro-Particle physics and/or Cosmology. The department seeks an outstanding individual whose research will complement and connect existing activities in Nuclear/Particle physics and Astrophysics groups at Rice University (see http://physics.rice.edu). This is the second position in a Cosmic Frontier effort that may eventually grow to three members. The successful applicant will be expected to develop an independent and vigorous research program, and teach graduate and undergraduate courses. A PhD in Physics, Astrophysics or related field is required.

Applicants should send the following: (i) cover letter; (ii) curriculum vitae (including electronic links to 2 relevant publications); (iii) research statement (4 pages or less); (iv) teaching statement (2 pages or less); and (v) the names, professional affiliations, and email addresses of three references.  To apply, please visit: http://jobs.rice.edu/postings/11772.  Applications will be accepted until the position is filled, but only those received by Dec 15, 2017 will be assured full consideration. The appointment is expected to start in July 2018.  Further inquiries should be directed to the chair of the search committee, Prof. Paul Padley (padley@rice.edu).

Rice University is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability or protected veteran status.



Faculty position at Rice - experimental condensed matter

Faculty Position in Experimental Condensed Matter Physics Rice University


The Department of Physics and Astronomy at Rice University in Houston, TX invites applications for a tenure-track faculty position in experimental condensed matter physics.  The department expects to make an appointment at the assistant professor level. This search seeks an outstanding individual whose research interest is in hard condensed matter systems, who will complement and extend existing experimental and theoretical activities in condensed matter physics on semiconductor and nanoscale structures, strongly correlated systems, topological matter, and related quantum materials (see http://physics.rice.edu/). A PhD in physics or related field is required. 

Applicants to this search should submit the following: (1) cover letter; (2) curriculum vitae; (3) research statement; (4) teaching statement; and (5) the names, professional affiliations, and email addresses of three references. For full details and to apply, please visit: http://jobs.rice.edu/postings/11782. Applications will be accepted until the position is filled. The review of applications will begin October 15, 2017, but all those received by December 1, 2017 will be assured full consideration. The appointment is expected to start in July 2018.  Further inquiries should be directed to the chair of the search committee, Prof. Emilia Morosan (emorosan@rice.edu).  

Rice University is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability or protected veteran status.

Friday, September 15, 2017

DOE experimental condensed matter physics PI meeting, day 3

And from the last half-day of the meeting:

  • Because the mobile electrons in graphene have an energy-momentum relationship similar to that of relativistic particles, the physics of electrons bound to atomic-scale defects in graphene has much in common with the physics that sets the limits on the stability of heavy atoms - when the kinetic energy of the electrons in the innermost orbitals is high enough that relativistic effects become very important.  It is possible to examine single defect sites with a scanning tunneling microscope and look at the energies of bound states, and see this kind of physics in 2d.  
  • There is a ton of activity concentrating on realizing Majorana fermions, expected to show up in the solid state when topologically interesting "edge states" are coupled to superconducting leads.  One way to do this would be to use the edge states of the quantum Hall effect, but usually the magnetic fields required to get in the quantum Hall regime don't play well with superconductivity.  Graphene can provide a way around this, with amorphous MoRe acting as very efficient superconducting contact material.  The results are some rather spectacular and complex superconducting devices (here and here).
  • With an excellent transmission electron microscope, it's possible to carve out atomically well defined holes in boron nitride monolayers, and then use those to create confined potential wells for carriers in graphene.  Words don't do justice to the fabrication process - it's amazing.  See here and here.
  • It's possible to induce and see big collective motions of a whole array of molecules on a surface that each act like little rotors.
  • In part due to the peculiar band structure of some topologically interesting materials, they can have truly remarkable nonlinear optical properties.
My apologies for not including everything - side discussions made it tough to take notes on everything, and the selection in these postings is set by that and not any judgment of excitement.  Likewise, the posters at the meeting were very informative, but I did not take notes on those.

Wednesday, September 13, 2017

DOE experimental condensed matter PI meeting, day 2

More things I learned:

  • I've talked about skyrmions before.  It turns out that by coupling a ferromagnet to a strong spin-orbit coupling metal, one can stabilize skyrmions at room temperature.  They can be visualized using magnetic transmission x-ray microscopy - focused, circularly polarized x-ray studies.   The skyrmion motion can show its own form of the Hall effect.  Moreover, it is possible to create structures where skyrmions can be created one at a time on demand, and moved back and forth in a strip of that material - analogous to a racetrack memory.
  • Patterned arrays of little magnetic islands continue to be a playground for looking at analogs of complicated magnetic systems.  They're a kind of magnetic metamaterial.  See here.  It's possible to build in frustration, and to look at how topologically protected magnetic excitations (rather like skyrmions) stick around and can't relax.
  • Topological insulator materials, with their large spin-orbit effects and surface spin-momentum locking, can be used to pump spin and flip magnets.  However, the electronic structure of both the magnet and the TI are changed when one is deposited on the other, due in part to interfacial charge transfer.
  • There continues to be remarkable progress on the growth and understanding of complex oxide heterostructures and interfaces - too many examples and things to describe.
  • The use of nonlinear optics to reveal complicated internal symmetries (talked about here) continues to be very cool.
  • Antiferromagnetic layers can be surprisingly good at passing spin currents.  Also, I want to start working on yttrium iron garnet, so that I can use this at some point in a talk.
  • It's possible to do some impressive manipulation of the valley degree of freedom in 2d transition metal dichalcogenides, creating blobs of complete valley polarization, for example.  It's possible to use an electric field to break inversion symmetry in bilayers and turn some of these effects on and off electrically.
  • The halide perovskites actually can make fantastic nanocrystals in terms of optical properties and homogeneity.

Tuesday, September 12, 2017

DOE experimental condensed matter PI meeting, day 1

I'm pressed for time, so this is brief, but here are some things I learned yesterday:
  • An electric field perpendicular to the plane can split and shift the Landau levels of bilayer graphene.  See here.
  • The quantum Hall effect in graphene and other 2d systems still has a lot of richness and life in it.
  • I have one word for you...."polaritons".
  • It's possible to set up a tunneling experiment, from one "probe" 2d electron gas that has a small, tight Fermi surface, into a "sample" 2d electron gas of interest.  By playing with the in-plane magnetic field, the tunneling electrons can pick up momentum in the plane as they tunnel.  The result is, the tunneling current as a function of voltage and transverse fields lets you map out exactly the "sample" electronic states as a function of energy and momentum, like ARPES without the PES part.  See here.
  • Squeezing mechanically to apply pressure can actually produce dramatic changes (quantum phase transitions) in unusual fractional quantum Hall states.
  • How superconductivity dies in the presence of disorder, magnetic field, and temperature remains very rich and interesting.  The "Bose metal", when magnetic field kills global phase coherence without completely ripping apart Cooper pairs, can be an important part of that transition.  For related work, see here.
  • One should be very careful in interpreting ARPES data.  It's entirely possible that not everything identified as some exotic topological material really fits the bill - see here.  On the other hand, sometimes you do see real topologically interesting band structure.
  • The DOE still has laptops running Windows XP.

Sunday, September 10, 2017

DOE Experimental Condensed Matter PI meeting, 2017

The Basic Energy Sciences program is part of the US Department of Energy's Office of Science, and they are responsible for a lot of excellent science research funding.  The various research areas within BES have investigator meetings every two years, and at the beginning of this coming week is the 2017 PI meeting for the experimental condensed matter physics program.  As I've done in past years,  I will try to write up a bulleted list of things I learn.   (See here, here, and here for the 2013 meeting; see here, here, here, and here for the 2015 meeting).

Good luck and stay safe to those in Florida about to get hit by Hurricane Irma.  It's very different than Harvey (much more of a concern about wind damage and storm surge, much less about total rainfall), but still very dangerous.

Lastly, Amazon seems to have my book available for a surprisingly low price right now ($62, though the list is $85).  I (and my publisher) still have no idea how they can do this without losing money.  

Sunday, September 03, 2017

Capillary action - the hidden foe in the physics of floods

There is an enormous amount of physics involved in storms and floods.   The underlying, emergent properties of water are key to much of this.

An individual water molecule can move around, and it can vibrate and rotate in various ways, but it's not inherently wet.  Only when zillions of water molecules get together does something like "wetness" of water even take on meaning.  The zillions of molecules are very egalitarian:  They explore all possible microscopic arrangements (including how they're distributed in space and how they're moving) that are compatible with their circumstances (e.g., sitting at a particular temperature and pressure).  Sometimes the most arrangements correspond to the water molecules being close together as a liquid - the water molecules are weakly attracted to each other if they get close together; at other temperatures and pressures, the most arrangements correspond to the water molecules being spread out as a gas.    Big tropical systems are basically heat engines, powered by the temperature difference between the surface layers of seawater and the upper atmosphere.  That difference in temperatures leads to net evaporation, driving water into the gas phase (by the gigaton, in the case of Hurricane Harvey).  Up in the cold atmosphere, the water condenses again into droplets, heating the air.  If those droplets are small enough, the forces from adjacent air molecules bouncing off the droplets slow the droplets to the point where they are borne aloft by large-scale breezes - that's why clouds don't fall down even though they're made of water droplets.

There is another feature that comes from the attraction of water molecules to each other, and the attraction between water molecules and their surroundings.   Because of the intermolecular attraction, water molecules would have less energy if they were close together, and therefore having a water-air interface costs energy.  One result is surface tension - the tendency for liquid droplets to pull into small blobs that minimize their (liquid/vapor interface) surface area.

However, sometimes the attractive interaction between a water molecule and some surface can be even stronger than the interaction between the water molecule and other water molecules.  When that happens, a water droplet on such a surface will spread out instead of "beading up".  The surface is said to be hydrophilic.  See here.  This is why some surfaces "like" to get wet, like your dirty car windshield.

Sneaking in here is actually the hidden foe that is known all too well to those who have ever dealt with flooding.  You've seen it daily, even if you've never consciously thought about it.  It's capillary action.  A network of skinny pores or very high surface area hydrophilic material can wick up water like crazy.  Again, the water is just exploring all possible microscopic arrangements, and it so happens that in a high surface area, hydrophilic environment, many many arrangements involve the water being spread out as much as possible on that surface.  This can be to our advantage sometimes - it helps get water to the top of trees, and it makes paper towels work well for drying hands.  However, it can also cause even a couple of cm of floodwater indoors to ruin the bottom meter of sheetrock, or bring water up through several cm of insulation into wood floors, or transport water meters up carpeted stairs.   Perhaps it will one day be economically and environmentally feasible to make superhydrophobic wall and flooring material, but we're not there yet.
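To put a number on the wicking, the standard capillary-rise formula \(h = 2 \gamma \cos\theta / (\rho g r)\) shows how skinny pores can lift water a meter or more.  Here's a minimal sketch; the pore radii are illustrative assumptions, not measured properties of sheetrock or wood.

```python
# Minimal sketch of capillary rise, h = 2*gamma*cos(theta)/(rho*g*r), for water
# in a hydrophilic pore.  Pore radii below are illustrative assumptions.
import math

gamma = 0.072    # surface tension of water (N/m)
rho   = 1000.0   # density of water (kg/m^3)
g     = 9.8      # m/s^2
theta = 0.0      # contact angle (radians); 0 = perfectly wetting (idealization)

for r in (1e-4, 1e-5, 1e-6):   # pore radii: 100 um, 10 um, 1 um
    h = 2.0 * gamma * math.cos(theta) / (rho * g * r)
    print(f"pore radius {r * 1e6:6.1f} um -> capillary rise ~ {h:5.2f} m")
```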

(To all my Houston readers, I hope you came through the storm ok!  My garage had 0.8m of water, which killed my cars, but the house is otherwise fine, and the university + lab did very well.)

Friday, August 25, 2017

Hurricanes, heat engines, etc.

Looks like it's going to be a wet few days, with the arrival of Harvey.   I've mentioned previously that hurricanes and tropical storm systems are heat engines - they basically use the temperature difference between the heated water in the ocean and the cooler air in the upper atmosphere to drive enormous flows of matter (air currents, water in vapor and liquid form).  A great explanation of how this works is here.  Even with very crude calculations one can see that the power involved in a relatively small tropical rain event is thousands of gigawatts, hundreds of times greater than the power demands of a major city.   Scaling up to a hurricane, you arrive at truly astonishing numbers.  It's likely that Harvey is churning along at an average power some 200 times greater than the electrical generating capacity of the planet (!).  Conservative predictions right now are for total rainfall of maybe 40 cm across an area the size of the state of Louisiana, which would be a total of about 5.2e10 metric tons of water.   Amazing.  I'm planning to write more in the future about some of this, time permitting.

Update:  For what it's worth, Vox has an article about Harvey, and they say it deposited 14-15 trillion gallons of water.  Each gallon is 3.78 kg, meaning that the total mass of water deposited for 14 trillion gallons is 5.3e10 metric tons.  How's that for estimating accuracy in the above?
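Here's a minimal sketch of the estimates above; the rainfall depth and area are the rough figures from the post, and the four-day duration is my own assumption for turning the released latent heat into an average power.

```python
# Minimal sketch: total rainfall mass and the average latent-heat power release.
# Depth and area are the rough figures quoted above; the duration is an assumption.
depth    = 0.4        # m of rain
area     = 1.35e11    # m^2, roughly the area of Louisiana
rho      = 1000.0     # kg/m^3
L_vap    = 2.26e6     # J/kg released when water vapor condenses
duration = 4 * 24 * 3600.0   # ~4 days of rainfall (assumption)

mass   = depth * area * rho          # kg
energy = mass * L_vap                # J dumped into the atmosphere
power_TW = energy / duration / 1e12

print(f"Rainfall mass:             ~{mass / 1e3:.1e} metric tons")  # ~5e10
print(f"Average latent-heat power: ~{power_TW:.0f} TW")             # hundreds of TW
```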