Saturday, February 29, 2020

APS March Meeting cancelled

Hello all - I have just heard from Dan Arovas, program chair of the APS March Meeting, that the APS has decided to cancel the meeting, which was scheduled to begin tomorrow: "Just finished a Zoom meeting with APS CEO Kate Kirby, APS presidential line, secretary treasurer, counselor. APS is preparing a statement for release to the press. Right now you can help by informing all your students, postdocs, and colleagues. The web site will be updated as soon as possible."

This is a response to COVID-19. As I post this, the meeting website has not yet been updated.  I will post more when I learn more.

Update: The text of the APS email: "Due to rapidly escalating health concerns relating to the spread of the coronavirus disease (COVID-19), the 2020 APS March Meeting in Denver, CO, has been canceled. Please do not travel to Denver to attend the March Meeting. More information will follow shortly."

Update: APS website now confirms.

Update: Here is the text of the letter from the APS president and CEO about the decision.
To the Board, the Council and Unit Leaders of APS:
You have probably already heard that on Saturday, February 29, the APS Leadership decided to cancel the 2020 March Meeting in Denver. We are writing to give you some of the details that led to this difficult decision, which was made in consultation with the APS senior management and the March Meeting program chair.
APS leadership has been monitoring the spread of the coronavirus disease (COVID-19) in the days leading up to the meeting. As you know, a large number of March Meeting attendees come from outside the US. Many have already canceled their attendance, particularly those from China, where travel to the meeting is not currently possible. In addition, we had many planning to come from countries where the CDC has upgraded its warning to level 3 as recently as the day of our decision, yesterday February 29. Even more were coming from countries where the virus appears to be establishing itself in the general population, so that the warning level could rise during the course of our meeting, which might significantly delay their return travel or even lead to quarantines.
In this case the safety of the attendees has to be a primary concern. There is a reasonable expectation that in a meeting with many thousands of participants, some will fall ill. This always happens of course, but it presently takes some time to establish whether an illness is seasonal flu or COVID-19, and many attendees who have come into contact might need to be quarantined during the testing. In light of this danger, we realized that ordinary social events such as the evening receptions would have to be cancelled out of caution.
We appreciate the high cost of our decision, both for the APS and also the attendees. We don’t know the actual loss yet, but the APS portion alone is certain to be in the millions of dollars. We want to assure the APS Board, Council, and Unit Leaders, that we have considered this carefully. Our society is strong financially, and we can absorb this loss. The welfare of our community is certainly a greater concern.
We know you have many questions about the path forward following this decision. We will continue to communicate and confer with you regularly in the coming weeks, as we all come to terms with the need to find new ways to maintain strong international science contacts.
Phil Bucksbaum, APS President
Kate Kirby, APS CEO

Monday, February 24, 2020

BAHFest 2020 at Rice, Sunday March 8 UPDATE: postponed.

Update:  This event is going to be postponed until the fall semester.

For those in the Houston area:

Spread the word - 

Created by SMBC's Zach Weinersmith, BAHFest is a celebration of well-argued and thoroughly researched but completely incorrect scientific theories. Our brave speakers present their bad theories in front of a live audience and a panel of judges with real science credentials, who together determine who takes home the coveted BAHFest trophy. And eternal glory, of course. If you'd like to learn more about the event, you can check out these articles from the Wall Street Journal and NPR's Science Friday.

Here are some examples from past shows:

Our keynote for this year's event is the hilarious Phil Plait (AKA the Bad Astronomer)! Phil will be signing copies of his book "Death from the Skies" before and after the show. 

The event is brought to you by BAHFest, and the graduate students in Rice University's Department of BioSciences. Click here for more information about the show, including how to purchase tickets. We hope to see you there! 

[Full disclosure:  I am one of the judges at this year's event.]

Saturday, February 22, 2020

Brief items

As we head out of a very intense week here and toward the March APS meeting, a few brief items:

  • Speaking of the March Meeting, I hear (unofficially) that travel restrictions due to the coronavirus have made a big dent - over 500 talks may be vacant, and the program committee is working hard to explore options for remote presentation.  (For the record, I fully endorse the suggestion that all vacant talks be delivered in the form of interpretive dance by Greg Boebinger.)
  • There will be many talks about twisted bilayers of various 2D materials at the meeting, and on that note, this PRL (arxiv version here) shows evidence of "strange metallicity" in magic-angle bilayer graphene at temperatures above the correlated insulator state(s).
  • Following indirectly on my post about condensed matter and Christmas lights, I want to point out another example of how condensed matter physics (in the form of semiconductor physics and the light emitting diode) has changed the world for the better in ways that could never have been anticipated.  This video shows and this article discusses the new film-making technique pioneered in the making of The Mandalorian.  Thanks to the development of organic LED displays, infrared LEDs for motion tracking, and lots of processing power, it is possible to create a floor-to-ceiling wraparound high definition electronic backdrop.  It's reconfigurable in real time, produces realistic lighting on the actors and props, and will make a lot of green screen compositing obsolete.  Condensed matter:  This is The Way.
  • Superconducting gravimeters have been used to check to see if there are compact objects (e.g., hunks of dark matter, or perhaps microscopic black holes) orbiting inside the earth.  I remember reading about this issue while in college.  Wild creative idea of the day:  Maybe we should use helioseismology to try to infer whether there are any such objects orbiting inside the sun....

Thursday, February 13, 2020

Film boiling and the Leidenfrost point

While setting up my eddy current bounce demonstration, I was able to film some other fun physics.

Heat transfer and two-phase (liquid+gas) fluid flow make for a complicated business that has occupied the time of many scientists and engineers for decades.  A liquid that is boiling at a given pressure is pinned to a particular temperature - that's the way the first-order liquid-vapor transition works.  Water at atmospheric pressure boils at 100 C; adding energy to the liquid water at 100 C via heat transfer converts water into vapor rather than increasing the temperature of the liquid.  

Here we are using liquid nitrogen (LN2), which boils at 77 K = -196 C at atmospheric pressure, and are trying to cool a piece of copper plate that initially started out much warmer than that.  When the temperature difference between the copper and the LN2 is sufficiently high, there is a large heat flux that creates a layer of nitrogen vapor between the copper and the liquid.  This is called film boiling.   You've seen this in practice if you've ever put a droplet of water into a really hot skillet, or dumped some LN2 on the floor.  The droplet slides around with very low friction because it is supported by that vapor layer.  

Once the temperature difference between the copper and the LN2 becomes small, the heat flux is no longer sufficient to support film boiling (the Leidenfrost point), and the vapor layer collapses - that brings more liquid into direct contact with the copper, leading to more vigorous boiling and agitation.  That happens at about 45 seconds into the video.  Then, once the copper is finally at the same temperature as the liquid, boiling ceases and everything gets calm.  
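For a sense of scale, here is a back-of-the-envelope energy balance for how much LN2 boils away while cooling the plate.  The plate mass is an illustrative assumption (not a number from the video), and using the room-temperature specific heat of copper overestimates the answer, since the specific heat falls substantially on cooling.

```python
# Rough energy balance: LN2 boiled off while cooling a copper plate
# from room temperature to 77 K.  Plate mass is an assumed value.
c_cu = 385.0        # J/(kg K), copper specific heat (room-T value; it
                    # drops a lot on cooling, so this is an overestimate)
m_cu = 1.0          # kg, assumed plate mass (illustrative)
dT = 293.0 - 77.0   # K, temperature change
L_vap = 199e3       # J/kg, latent heat of vaporization of N2 at 77 K
rho_ln2 = 807.0     # kg/m^3, density of liquid nitrogen

heat_removed = m_cu * c_cu * dT   # J that must flow into the LN2
m_n2 = heat_removed / L_vap       # kg of LN2 converted to vapor
print(f"LN2 boiled off: {m_n2:.2f} kg (~{m_n2 / rho_ln2 * 1000:.2f} L)")
```

Even for a modest plate, that's an appreciable fraction of a liter of liquid, which is why the vigorous boiling near the Leidenfrost point is so visible.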

For a more technical discussion of this, see here.  It's written up on a site about nuclear power because water-based heat exchangers are a key component of multiple power generation technologies.  

Tuesday, February 11, 2020

Eddy currents - bouncing a magnet in mid-air

Changing a magnetic field that permeates a conductor like a metal will generate eddy currents.  This is called induction, and it was discovered by Michael Faraday nearly 200 years ago.   If you move a ferromagnet near a conductor, the changing field produces eddy currents and those eddy currents create their own magnetic fields, exerting forces back on the magnet.  Here is a rather dramatic demo of this phenomenon, shamelessly stolen by me from my thesis adviser.

In the video, you can watch in slow motion as I drop a strong Nd2Fe14B magnet from about 15 cm above a 2 cm thick copper plate.  The plate is oxygen-free, high-purity copper, and it has been cooled to liquid nitrogen temperatures (77 K = -196 C).   That cooling suppresses lattice vibrations and increases the conductivity of the copper by around a factor of 20 compared with room temperature.  (If cooled to liquid helium temperatures, 4.2 K, the conductivity of this kind of copper goes up to something like 200 times its room temperature value, and is limited by residual scattering from crystalline grain boundaries and impurities.)

As the magnet falls, the magnetic flux \(\Phi\) through the copper increases, generating a circumferential electromotive force and driving eddy currents.  Those eddy currents produce a magnetic field directed to repel the falling magnet.  The currents become large enough that the resulting upward force brings the magnet to a halt about 2 cm above the copper (!).  At that instant, \(d\Phi/dt = 0\), so the inductive EMF is zero.  However, the existing currents keep going because of the inductance of the copper.  (Treating the metal like an inductor-resistor circuit, the timescale for the current to decay is \(L/R\), and \(R\) is quite small.)  Those continuing currents generate magnetic fields that keep pushing up on the magnet, making it accelerate upward.  The magnet bounces "in mid air".  Of course, the copper isn't a perfect conductor, so much of the energy is "lost" to resistively heating the copper, and the magnet gradually settles onto the plate.  If you try this at room temperature, the magnet clunks into the copper, because the copper conductivity is worse and the eddy currents decay so rapidly that the repulsive force is insufficient to bounce the magnet before it hits the plate.
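For a rough sense of that \(L/R\) timescale, one can caricature the dominant eddy-current path as a single circular loop of copper.  The geometry below is my own illustrative guess, not a measurement, but it shows why the cooling matters so much:

```python
import math

# Crude L/R estimate, treating the eddy-current path as one circular
# loop of round cross section.  Geometry numbers are illustrative.
mu0 = 4 * math.pi * 1e-7   # H/m, vacuum permeability
r_loop = 0.02              # m, assumed effective loop radius
a_wire = 0.005             # m, assumed effective "wire" radius
rho_300K = 1.7e-8          # ohm*m, copper resistivity at room temperature
rho_77K = rho_300K / 20    # the factor-of-20 conductivity increase at 77 K

# Self-inductance of a circular loop of round cross section
L = mu0 * r_loop * (math.log(8 * r_loop / a_wire) - 2)

def tau(rho):
    """L/R decay time for the loop at resistivity rho."""
    R = rho * (2 * math.pi * r_loop) / (math.pi * a_wire**2)
    return L / R

print(f"tau(300 K) ~ {tau(rho_300K) * 1e3:.1f} ms")
print(f"tau(77 K)  ~ {tau(rho_77K) * 1e3:.1f} ms")
```

At room temperature the currents die in about a millisecond, while at 77 K the decay time is tens of milliseconds - comparable to the dynamical timescale of the falling magnet, which is why the bounce only works with the cold plate.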

(Later I'll make a follow-up post about other neat physics that happens while setting up this demo.)



Sunday, February 09, 2020

Updated: Advice on choosing a grad school

I realized it's been several years since I've run a version of this, and it's the right season....

This is written on the assumption that you have already decided, after careful consideration, that you want to get an advanced degree (in physics, though much of this applies to any other science or engineering discipline).  This might mean that you are thinking about going into academia, or it might mean that you realize such a degree will help prepare you for a higher paying technical job outside academia.  Either way,  I'm not trying to argue the merits of a graduate degree.
  • It's ok at the applicant stage not to know exactly what you want to do.  While some prospective grad students are completely sure of their interests, that's more the exception than the rule.  I do think it's good to have narrowed things down a bit, though.  If a school asks for your area of interest from among some palette of choices, try to pick one (rather than going with "undecided").  We all know that this represents a best estimate, not a rigid commitment.
  • If you get the opportunity to visit a school, you should go.  A visit gives you a chance to see a place, get a subconscious sense of the environment (a "gut" reaction), and most importantly, an opportunity to talk to current graduate students.  Always talk to current graduate students if you get the chance - they're the ones who really know the score.  A professor should always be able to make their work sound interesting, but grad students can tell you what a place is really like.
  • International students may have a very challenging time being able to visit schools in the US, between the expense (many schools can help defray costs a little but cannot afford to pay for airfare for trans-oceanic travel) and visa challenges.  Trying to arrange Skype discussions with people at the school is a possibility, but that can also be challenging.  I understand that this constraint tends to push international students toward making decisions based heavily on reputation rather than up-close information.  
  • Picking an advisor and thesis area are major decisions, but it's important to realize that those decisions do not define you for the whole rest of your career.  I would guess (and if someone had real numbers on this, please post a comment) that the very large majority of science and engineering PhDs end up spending most of their careers working on topics and problems distinct from their theses.  Your eventual employer is most likely going to be paying for your ability to think critically, structure big problems into manageable smaller ones, and do research, rather than for the particular detailed technical knowledge from your doctoral thesis.  A personal anecdote:  I did my graduate work on the ultralow temperature properties of amorphous insulators.  I no longer work at ultralow temperatures, and I don't study glasses either; nonetheless, I learned a huge amount in grad school about the process of research that I apply all the time.
  • Always go someplace where there is more than one faculty member with whom you might want to work.  Even if you are 100% certain that you want to work with Prof. Smith, and that the feeling is mutual, you never know what could happen, in terms of money, circumstances, etc.  Moreover, in grad school you will learn a lot from your fellow students and other faculty.  An institution with many interesting things happening will be a more stimulating intellectual environment, and that's not a small issue.
  • You should not go to grad school because you're not sure what else to do with yourself.  You should not go into research if you will only be satisfied by a Nobel Prize.  In both of those cases, you are likely to be unhappy during grad school.  
  • I know grad student stipends are low, believe me.  However, it's a bad idea to make a grad school decision based purely on a financial difference of a few hundred or a thousand dollars a year.  Different places have vastly different costs of living - look into this.  Stanford's stipends are profoundly affected by the cost of housing near Palo Alto and are not an expression of generosity.  Pick a place for the right reasons.
  • Likewise, while everyone wants a pleasant environment, picking a grad school largely based on the weather is silly.
  • Pursue external fellowships if given the opportunity.  It's always nice to have your own money and not be tied strongly to the funding constraints of the faculty, if possible.  (It's been brought to my attention that at some public institutions the kind of health insurance you get can be complicated by such fellowships.  In general, I still think fellowships are very good if you can get them.)
  • Be mindful of how departments and programs are run.  Is the program well organized?  What is a reasonable timetable for progress?  How are advisors selected, and when does that happen?  Who sets the stipends?  What are TA duties and expectations like?  Are there qualifying exams?  Where have graduates of that department gone after the degree?  Know what you're getting into!  Very often, information like this is available now in downloadable graduate program handbooks linked from program webpages.   
  • It's fine to try to communicate with professors at all stages of the process.  We'd much rather have you ask questions than the alternative.  If you don't get a quick response to an email, it's almost certainly due to busy-ness, and not a deeply meaningful decision by the faculty member.  For a sense of perspective:  even before I was chair, I would get 50+ emails per day of various kinds not counting all the obvious spam that gets filtered. 
There is no question that far more information is now available to would-be graduate students than at any time in the past.  Use it.  Look at departmental web pages, look at individual faculty member web pages.  Make an informed decision.  Good luck!

Wednesday, January 29, 2020

Charles Lieber

As one of the few surviving nano-related blogs, I feel somewhat obligated to write a post about this.  Charles Lieber, chair of the department of chemistry and chemical biology at Harvard, was arrested yesterday by the FBI on charges of fraud.  Lieber is one of the premier nano researchers in the world.  The relevant documents are here (pdf) and they make for quite a read.

In brief, Lieber is alleged to have signed on to China's Thousand Talents program with an affiliation at Wuhan University of Technology back in 2011.  This involved setting up a joint research lab in Wuhan and regular interactions, including having WUT students come to Harvard.  That in itself is not necessarily problematic.  Much more concerning is the claim that WUT would pay $50K/month (plus living expenses) for his involvement, and the stipulation in the agreement that he would be working at least nine months/yr with them.  That alone would raise serious conflict-of-commitment and percentage-effort issues.  Worse is the allegation that this went on for years, none of this was disclosed appropriately, and in fact was denied to both DOD and (via Harvard internal folks) NIH. 

These allegations are shocking, and the story is hard to fathom for multiple reasons. 

Putting on my department chair hat, I can't help but think about how absolutely disruptive this will be for his students and postdocs, since he was placed on immediate leave.  It will be a nontrivial task for the department and the Faculty of Arts and Sciences at Harvard to come up with a way to transition the students to other advising and pay circumstances, and even more challenging for the postdocs.  What a mess.

Wednesday, January 22, 2020

Stretchy bioelectronics and ptychographic imaging - two fun talks

One of the great things about a good university is the variety of excellent talks that you can see. 

Yesterday we had our annual Chapman Lecture on Nanotechnology, in honor of Rice alum Richard Chapman, who turned down a first-round draft selection to the Detroit Lions to pursue a physics PhD and a career in engineering.  This year's speaker was Zhenan Bao from Stanford, whom I know from back in my Bell Labs postdoc days.  She spoke about her group's remarkable work on artificial skin:  biocompatible, ultraflexible electronics including active matrices of touch sensors, transistors, etc.  Here are a few representative papers that give you some idea of the kind of work that goes into this: Engineering semiconducting polymers to have robust elastic properties while retaining high charge mobilities; a way of combining conducting polymers (PEDOT) with hydrogels so that you can pattern them and then hydrate to produce super-soft devices; a full-on demonstration of artificial skin for sensing applications.  Very impressive stuff. 

Today, we had a colloquium by Gabe Aeppli of ETH and the Paul Scherrer Institute, talking about x-ray ptychographic imaging.  Ptychography is a simple enough idea.  Use a coherent source of radiation to illuminate some sample at some spot, and with a large-area detector, measure the diffraction pattern.  Now scan the spot over the sample (including perhaps rotating the sample) and record all those diffraction patterns as well.  With the right approach, you can combine all of those diffraction patterns and invert to get the spatial distribution of the scatterers (that is, the matter in the sample).  Sounds reasonable, but these folks have taken it to the next level (pdf here).   The video I'm embedding here is the old result from 2017.  The 2019 paper I linked here is even more impressive, able to image, nondestructively, in 3D, individual circuit elements within a commercial integrated circuit at nanoscale resolution.  It's clear that a long-term goal is to be able to image, non-destructively, the connectome of brains. 
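A toy version of the ptychographic forward model is easy to write down: a localized probe multiplies the object, and at each scan position the detector records only the intensity of the far-field diffraction pattern.  The sketch below is my own illustration, not the authors' code; real reconstruction algorithms such as ePIE then exploit the overlap between neighboring scan positions to recover the phase that the detector throws away.

```python
import numpy as np

# Toy ptychography forward model: scan a Gaussian probe over a complex
# object and record the phase-less far-field intensities.
rng = np.random.default_rng(0)
N, w = 64, 16                                          # object size, probe width (px)
obj = np.exp(1j * rng.uniform(0, 2 * np.pi, (N, N)))   # a random phase object
y, x = np.mgrid[0:N, 0:N]

# Overlapping scan grid (step = probe width, so neighbors overlap)
positions = [(cy, cx) for cy in range(8, N, 16) for cx in range(8, N, 16)]
patterns = []
for cy, cx in positions:
    probe = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * (w / 2) ** 2))
    # Detector measures |FFT(probe * object)|^2 -- intensity only
    patterns.append(np.abs(np.fft.fft2(probe * obj)) ** 2)

print(f"recorded {len(patterns)} diffraction patterns, each {patterns[0].shape}")
```

The redundancy in this data set (each pixel of the object is illuminated from several probe positions) is what makes the phase-retrieval inversion well posed.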


Monday, January 20, 2020

Brief items

Here are some items of interest:

  • An attempt to lay out a vision for research in the US beyond Science: The Endless Frontier.  The evolving roles of the national academies are interesting, though I found the description of the future of research universities to be rather vague - I'm not sure growing universities to the size of Arizona State is the best way to provide high quality access to knowledge for a large population.  It still feels to me like an eventual successful endpoint for online education could be natural language individualized tutoring ("Alexa, teach me multivariable calculus."), but we are still a long way from there.
  • Atomic-resolution movies of chemistry are still cool.
  • Dan Ralph at Cornell has done a nice service to the community by making his lecture notes available on the arxiv.  The intent is for these to serve as a supplement to a solid state course such as one out of Ashcroft and Mermin, bringing students up to date about Berry curvature and topology at a similar level to that famous text.
  • This preprint tries to understand an extremely early color photography process developed by Becquerel (the photovoltaic one, who was the father of the radioactivity Becquerel).  It turns out that there are systematic changes in reflectivity spectra of the exposed Ag/AgCl films depending on the incident wavelength.  Why the reflectivity changes that way remains a mystery to me after reading this.
  • On a related note, this led me to this PNAS paper about the role of plasmons in the daguerreotype process.  Voila, nanophotonics in the 19th century.
  • This preprint (now out in Nature Nano) demonstrates incredibly sensitive measurements of torques on very rapidly rotating dielectric nanoparticles.  This could be used to see vacuum rotational friction.
  • The inventors of chemically amplified photoresists have been awarded the Charles Stark Draper prize.  Without that research, you probably would not have the computing device sitting in front of you....

Tuesday, January 14, 2020

The Wolf Prize and how condensed matter physics works

The Wolf Prize in Physics for 2020 was announced yesterday, and it's going to Pablo Jarillo-Herrero, Allan MacDonald, and Rafi Bistritzer, for twisted bilayer graphene.  This prize is both well-deserved and a great example of how condensed matter physics works.  

MacDonald and Bistritzer did key theory work (for example) highlighting how the band structure of twisted bilayer graphene would become very interesting for certain twist angles - how the moire pattern from the two layers would produce a lateral periodicity, and that interactions between the layers would lead to very flat bands.  Did they predict every exotic thing that has been seen in this system?  No, but they had the insight to get key elements, and the knowledge that flat bands would likely lead to many competing energy scales, including electron-electron interactions, the weak kinetic energy of the flat bands, the interlayer coupling, effective magnetic interactions, etc.  Jarillo-Herrero was the first to implement this with sufficient control and sample quality to uncover a remarkable phase diagram involving superconductivity and correlated insulating states.  Figuring out what is really going on here and looking at all the possibilities in related layered materials will keep people busy for years.   (As an added example of how condensed matter works as a field, Bistritzer is in industry working for Applied Materials.)
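The geometry behind those flat bands is simple to estimate: two lattices of constant \(a\) twisted by angle \(\theta\) produce a moire superlattice of period \(\lambda = a/(2\sin(\theta/2))\).  A couple of lines of Python show how dramatic the magic-angle regime is:

```python
import math

# Moire superlattice period for twisted bilayer graphene:
# lambda = a / (2 sin(theta/2)), a = 0.246 nm (graphene lattice constant)
a = 0.246  # nm

def moire_period(theta_deg):
    theta = math.radians(theta_deg)
    return a / (2 * math.sin(theta / 2))

for theta in (5.0, 1.1, 0.5):
    print(f"theta = {theta:4.1f} deg -> moire period ~ {moire_period(theta):5.1f} nm")
```

At the magic angle near 1.1 degrees the superlattice period is around 13 nm, more than fifty times the atomic spacing, so the moire potential defines enormous unit cells with correspondingly narrow bands.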

All of this activity and excitement, thanks to feedback between well-motivated theory and experiment, is how the bulk of physics that isn't "high energy theory" actually works.  

Monday, January 13, 2020

Popular treatment of condensed matter - topics

I'm looking more seriously at trying to do some popularly accessible writing about condensed matter.  I have a number of ideas about what should be included in such a work, but I'm always interested in other peoples' thoughts on this.   Suggestions? 

Sunday, January 05, 2020

Brief items

Happy new year.  As we head into 2020, here are a few links I've been meaning to point out:

  • This paper is a topical review of high-throughput (sometimes called combinatorial) approaches to searching for new superconductors.   The basic concept is simple enough:  co-deposit multiple different elements in a way that deliberately produces compositional gradients across the target substrate.  This can be done via geometry of deposition, or with stencils that move during the deposition process.  Then characterize the local properties in an efficient way across the various compositional gradients, looking for the target properties you want (e.g., maximum superconducting transition temperature).  Ideally, you combine this with high-throughput structural characterization and even annealing or other post-deposition treatment.  Doing all of this well in practice is a craft.  
  • Calling back to my post on this topic, Scientific American has an article about wealth distribution based on statistical mechanics-like models of economies.   It's hard for me to believe that some of these insights are really "new" - seems like many of these models could have been examined decades ago....
  • This is impressive.  Jason Petta's group at Princeton has demonstrated controlled entanglement between single-electron spins in Si/SiGe gate-defined quantum dots separated by 4 mm.  That may not sound all that exciting; one could use photons to entangle atoms separated by km, as has been done with optical fiber.  However, doing this on-chip using engineered quantum dots (with gates for tunable control) in an arrangement that is in principle scalable via microfabrication techniques is a major achievement.
  • Just in case you needed another demonstration that correlated materials like the copper oxide superconductors are complicated, here you go.  These investigators use an approach based on density functional theory (see here, here, and here), and end up worrying about energetic competition between 26 different electronic/magnetic phases.  Regardless of the robustness of their specific conclusions, just that tells you the inherent challenge of those systems:  Many possible ordered states all with very similar energy scales.

Monday, December 30, 2019

Energy scales and crystals in science fiction

Crystals are fascinating.  Somehow, for reasons that don't seem at all obvious at first glance, some materials grow in cool shapes as solids, with facets and obvious geometric symmetries.  This was early support for the idea of atoms, and it's no wonder at all that people throughout history have looked upon obviously crystalline materials as amazing, possibly connected with magical powers.

In science fiction (or maybe more properly science fantasy), crystals show up repeatedly as having special properties, often able to control or direct energies that seem more appropriate for particle physics.  In Star Trek, dilithium crystals are able to channel and control the flow of matter-antimatter reactions needed for warp drive, the superluminal propulsion system favored by the Federation and the Klingon Empire.  In Star Wars, kyber crystals are at the heart of lightsabers, and were also heavily mined by the Empire for use in the planet-killing main weapon of the Death Star.

In real life, though, crystals don't do so well in interacting with very high energy electromagnetic or particle radiation.  Yes, it is possible for crystals to scatter x-rays and high energy electrons - that's the way x-ray diffraction and electron diffraction work.  On very rare occasions, crystals can lead to surprising nuclear processes, such as all the iron atoms in a crystal sharing the recoil when an excited iron nucleus spits out a gamma ray, as in the Mössbauer effect.   Much more typically, though, crystals are damaged by high energy radiation - if the energy scale of the photon or other particle is much larger than the chemical energy scales that hold atoms in place (a few eV) or bind core electrons (a few tens of eV), then the cool look and spatial arrangement of the atoms really doesn't matter, and atoms get kicked around.  The result is the creation of vacancies or interstitial defects, some of which can even act as "color centers", so that otherwise colorless Al2O3, for example, can take on color after being exposed to ionizing radiation in a reactor.
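That energy-scale comparison is easy to make quantitative with the conversion \(E = hc/\lambda \approx 1240\ \mathrm{eV\,nm}/\lambda\):

```python
# Photon energies via E = hc / lambda, with hc ~ 1239.84 eV*nm
HC = 1239.84  # eV*nm

examples_nm = {"red light": 650.0,
               "Cu K-alpha x-ray": 0.154,
               "1 pm gamma ray": 1e-3}

energies = {name: HC / lam for name, lam in examples_nm.items()}
for name, E in energies.items():
    print(f"{name:18s}: {E:11.1f} eV")
# Chemical bonds: a few eV.  Core-electron binding: a few tens of eV.
# X-rays and gamma rays dwarf the scales that hold a lattice together.
```

Visible light sits right at the chemical-bond scale (which is why it mostly just excites electrons), while x-rays and gamma rays carry thousands to millions of times more energy per photon than any bond can absorb gracefully.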

Ahh well.  Crystals are still amazing even if they can't propel starships faster than light.

(Happy new year to my readers!  I'm still trying to be optimistic, even if it's not always easy.)


Sunday, December 22, 2019

Condensed matter and Christmas decorations - 'tis the season

Modern outdoor decorations owe quite a bit to modern science - polymers; metallurgy; electric power for the lighting, fans, sensors, and motors which make possible the motion-actuated inflatable Halloween decorations that scare my dog....  Condensed matter physics has, as in many areas of society, had a big impact on Christmas decorations that is so ubiquitous and pervasive that no one even thinks about it.  In particular, I'm thinking about the light emitting diode and its relative, the diode laser.  I'm pretty sure that Nick Holonyak and Shuji Nakamura never imagined that LEDs would pave the way for animated multicolor icicle decorations.  Likewise, I suspect that the inventors discussed here (including Holonyak) never envisioned laser projected holiday lighting.  So, the next time someone asks if any of this quantum stuff or basic research is useful, remember that these inherently quantum devices have changed the world in all kinds of ways that everyone sees but few observe. 

Wednesday, December 18, 2019

Materials and neuromorphic computing

(In response to a topic suggestion from the Pizza Perusing Physicist....)

Neuromorphic computing is a trendy concept aimed at producing computing devices that are structured and operate like biological neural networks.  

In standard digital computers, memory and logic are physically separated and handled by distinct devices, and both are (nearly always) based on binary states and highly regular connectivity.  That is, logic gates take inputs that are two-valued (1 or 0), and produce outputs that are similarly two-valued; logic gates have no intrinsic memory of past operations that they've conducted; memory elements are also binary, with data stored as a 1 or 0; and everything is organized in a regular, immutable pattern - memory registers populated and read by clocked, sequential logic gates via a bus.

Natural neural networks, on the other hand, are very different.  Each neuron can be connected to many others via synapses.  Somehow memory and logic are performed by the same neuronal components.   The topology of the connections varies with time - some connections are reinforced by repeated use, while others are demoted, in a continuous rather than binary way.  Information traffic involves temporal trains of pulses called spikes.  
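The standard minimal caricature of those spiking dynamics is the leaky integrate-and-fire neuron; here is a sketch with purely illustrative parameters:

```python
# Leaky integrate-and-fire neuron.  The membrane potential v leaks
# toward rest, integrates a constant input current, and emits a spike
# (then resets) when it crosses threshold.  Parameters are illustrative.
DT, TAU = 1.0, 20.0                    # time step, membrane time constant
V_REST, V_THRESH, V_RESET = 0.0, 1.0, 0.0

def run(current, steps=200):
    v, spike_times = V_REST, []
    for t in range(steps):
        v += (DT / TAU) * (V_REST - v) + DT * current  # leak + drive
        if v >= V_THRESH:
            spike_times.append(t)                      # fire...
            v = V_RESET                                # ...and reset
    return spike_times

spikes = run(current=0.06)
print(f"{len(spikes)} spikes at t = {spikes}")
```

Stronger input drives faster spiking, so information is carried in the timing and rate of the pulse train rather than in a static binary level.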

All of these things can be emulated with standard digital computers.  Deep learning methods do this, with nodes in multiple layers playing the roles of neurons, and weighted links between nodes modeling the synaptic connections and their strengths.  This is all a bit opaque and doesn't necessarily involve simulating the spiking dynamics at all.  Implementing neural networks via standard hardware also loses some of the perceived benefits of biological neural nets, like very good power efficiency.

In the last few years, as machine learning and big data have become increasingly important, there has been a push to try to implement in device hardware architectures that look a lot more like the biological analogs.  To do this, you might want nonvolatile memory elements that can also be used for logic, and can have continuously graded values of "on"-ness determined by their history.  Resistive switching memory elements, sometimes called memristors (though that is a loaded term - see here and here), can fit the bill, as in this example.  Many systems can act as resistive switches, with conduction changes often set by voltage-driven migration of ions or vacancies in the material.
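As a cartoon of the kind of element described above, here is a minimal Python sketch.  It is entirely schematic - the class name, rates, and bounds are invented for illustration and do not model any real device - but it captures the two properties that matter: the stored state is continuously graded rather than binary, and it is set by the history of applied voltage pulses, loosely mimicking voltage-driven ion or vacancy migration.

```python
class ResistiveSwitch:
    """Toy memristor-like element (schematic, not a real device model).

    The conductance g drifts with the history of applied voltage pulses
    and is bounded between g_min and g_max, so the stored state is
    analog, nonvolatile, and history-dependent."""

    def __init__(self, g_min=0.1, g_max=1.0, rate=0.05):
        self.g_min, self.g_max, self.rate = g_min, g_max, rate
        self.g = g_min  # start in the "off" (more insulating) state

    def pulse(self, voltage):
        # Positive pulses potentiate (raise g), negative pulses depress it,
        # loosely mimicking voltage-driven ion migration.
        self.g += self.rate * voltage
        self.g = min(self.g_max, max(self.g_min, self.g))
        return self.g

    def current(self, read_voltage=0.1):
        # A small read voltage probes the stored analog state.
        return self.g * read_voltage


elem = ResistiveSwitch()
for _ in range(5):          # repeated "use" strengthens the connection...
    elem.pulse(+1.0)
high = elem.current()
for _ in range(3):          # ...while reverse pulses weaken it, continuously
    elem.pulse(-1.0)
low = elem.current()
print(high, low)            # graded memory: high > low
```

The point of the toy is only the contrast with a binary memory cell: the same two-terminal element acts as both a weight (for logic/inference) and a memory, with "on"-ness set by its past.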

On top of this, there has been a lot of interest in using strongly correlated materials in such applications.  There are multiple examples of correlated materials (typically transition metal oxides) that undergo dramatic metal-insulator transitions as a function of temperature.  These materials then offer a chance to emulate spiking - driving a current can switch such a material from the insulating to the metallic state via local Joule heating or more nontrivial mechanisms, and then revert to the insulating state.  See the extensive discussion here.  

Really implementing all of this at scale is not simple.  The human brain involves something like 100,000,000,000 neurons, and connections run in three dimensions.  Getting large numbers of effective solid-state neurons with high connectivity via traditional 2D planar semiconductor-style fab (basically necessary if one wants to have many millions of neurons) is not easy, particularly if it requires adapting processing techniques to accommodate new classes of materials.

If you're interested in this and how materials physics can play a role, check out this DOE report and this recent review article.

Sunday, December 08, 2019

Brief items

Here are some tidbits that came across my eyeballs this past week:

  • I just ran into this article from early in 2019.  It touches on my discussion about liquids, and is a great example of a recurring theme in condensed matter physics.  The authors look at the vibrational excitations of liquid droplets on surfaces.  As happens over and over in physics, the imposition of boundary conditions on the liquid motion (e.g., wetting conditions on the surface and approximately incompressible liquid with a certain surface tension) leads to quantization of the allowed vibrations.  Discrete frequencies/mode shapes/energies are picked out due to those constraints, leading to a "periodic table" of droplet vibrations.  (This one looks moderately like atomic states, because spherical harmonics show up in the mode description, as they do when looking at atomic orbitals.)
  • Another article from the past, this one from 2014 in the IEEE Spectrum.  It talks about how we arrived at the modern form for Maxwell's equations.  Definitely a good read for those interested in the history of physics.  Maxwell's theory was developing in parallel with what became vector calculus, and Maxwell's original description (like Faraday's intuition) was very mechanistic rather than abstract.
  • Along those lines, this preprint came out recently promoting a graphical pedagogical approach to vector calculus.  The spirit at work here is that Feynman's graphical diagrammatic methods were a great way to teach people perturbative quantum field theory, and so perhaps a diagrammatic scheme for vector calc could be good.  I'm a bit of a skeptic - I found the approach by Purcell to be very physical and intuitive, and this doesn't look simpler to me.
  • This preprint about twisted bilayer graphene and the relationship between superconductivity and strongly insulating states caught my eye, and I need to read it carefully.  The short version:  While phase diagrams showing superconductivity and insulating states as a function of carrier density make it tempting to think that SC evolves out of the insulating states via doping (as likely in the cuprates), the situation may be more complicated.
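As a concrete instance of the boundary-condition quantization in the droplet item above: for a free, inviscid spherical drop of radius \(R\), density \(\rho\), and surface tension \(\sigma\), Rayleigh's classic result picks out discrete oscillation modes labeled by the spherical harmonic index \(l\),

\[ \omega_{l}^{2} = l(l-1)(l+2)\,\frac{\sigma}{\rho R^{3}}, \qquad l = 2, 3, \ldots \]

The sessile drops on surfaces studied in the paper modify the details (the wetting boundary condition changes the mode shapes and frequencies), but the logic - constraints plus boundary conditions yield a discrete spectrum - is the same.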

Saturday, November 30, 2019

What is a liquid?

I wrote recently about phases of matter (and longer ago here).  The phase that tends to get short shrift in the physics curriculum is the liquid, and this is actually a symptom indicating that liquids are not simple things. 

We talk a lot about gases, and they tend to be simple in large part because they are low density systems - the constituents spend the overwhelming majority of their time far apart (compared to the size of the constituents), and therefore tend to interact with each other only very weakly.  We can even look in the ideal limit of infinitesimal particle size and zero interactions, so that the only energy in the problem is the kinetic energy of the particles, and derive the Ideal Gas Law.  
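That derivation fits in two lines: the pressure is the rate of momentum transfer to the walls per unit area, \(P = \frac{1}{3}\frac{N}{V} m \langle v^{2} \rangle\) for \(N\) particles of mass \(m\), and equipartition gives \(\frac{1}{2} m \langle v^{2} \rangle = \frac{3}{2} k_{\mathrm{B}} T\), so

\[ P V = N k_{\mathrm{B}} T . \]

Everything follows from kinetic energy alone, because the ideal gas has no other energy scale in the problem.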

There is no such thing as an Ideal Liquid Law.  That tells you something about the complexity of these systems right there.

A classical liquid is a phase of matter in which the constituent particles have a typical interparticle distance comparable to the particle size, and therefore interact strongly, with both a "hard core repulsion" so that the particles are basically impenetrable, and usually some kind of short-ranged attraction, whether from van der Waals forces or from longer-ranged, stronger interactions.  The kinetic energy of the particles is sufficiently large that they don't bond rigidly to each other and therefore move past and around each other continuously.  However, the density is so high that you can't even do very well by only worrying about pairs of interacting particles - you have to keep track of three-body, four-body, etc. interactions somehow.

The very complexity of these strongly interacting collections of particles leads to the emergence of some simplicity at larger scales.  Because the particles are cheek-by-jowl and impenetrable, liquids are about as incompressible as solids.  The lack of tight bonding and enough kinetic energy to keep everyone moving means that, on average and on scales large compared to the particle size, liquids are homogeneous (uniform properties in space) and isotropic (uniform properties in all directions).  When pushed up against solid walls by gravity or other forces, liquids take on the shapes of their containers.  (If the typical kinetic energy per particle can't overcome the steric interactions with the local environment, then particles can get jammed.  Jammed systems act like "rigid" solids.)

Because of the constant interparticle collisions, energy and momentum get passed along readily within liquids, leading to good thermal conduction (the transport of kinetic energy of the particles via microscopic, untraceable amounts we call heat) and viscosity (the transfer of transverse momentum between adjacent layers of particles just due to collisions - the fluid analog of friction).  The lack of rigid bonding interactions means that liquids can't resist shear; layers of particles slide past each other.  This means that liquids, unlike solids, don't have transverse sound waves.   The flow of particles is best described by hydrodynamics, a continuum approach that makes sense on scales much bigger than the particles.
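In equations, the hydrodynamic description of an incompressible Newtonian liquid is the Navier-Stokes equation, in which the shear viscosity \(\eta\) encodes exactly the transverse momentum transfer described above:

\[ \rho \left( \frac{\partial \mathbf{v}}{\partial t} + (\mathbf{v} \cdot \nabla)\mathbf{v} \right) = -\nabla P + \eta \nabla^{2} \mathbf{v}, \qquad \nabla \cdot \mathbf{v} = 0 . \]

All the microscopic complexity of the many-body collisions is swept into two numbers, \(\rho\) and \(\eta\) - emergent simplicity at the continuum scale.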

Quantum liquids are those for which the quantum statistics of the constituents are important to the macroscopic properties.  Liquid helium is one such example.  Physicists have also adopted the term "liquid" to mean any strongly interacting, comparatively incompressible, flow-able system, such as the electrons in a metal ("Fermi liquid").  

Liquids are another example of emergence that is deep, profound, and so ubiquitous that people tend to look right past it.  "Liquidity" is a set of properties so well-defined that a small child can tell you whether something is a liquid by looking at a video of it; those properties emerge largely independent of the microscopic details of the constituents and their interactions (water molecules with hydrogen bonds; octane molecules with van der Waals attraction; very hot silica molecules in flowing lava); and none of those properties are obvious if one starts with, say, the Standard Model of particle physics.


Monday, November 25, 2019

General relativity (!) and band structure

Today we had a seminar at Rice by Qian Niu of the University of Texas, and it was a really nice, pedagogical look at this paper (arxiv version here).   Here's the basic idea.

As I wrote about here, in a crystalline solid the periodic lattice means that single-particle electronic states look like Bloch waves, labeled by some wavevector \(\mathbf{k}\), of the form \(u_{\mathbf{k}}(\mathbf{r}) \exp(i \mathbf{k}\cdot \mathbf{r})\) where \(u_{\mathbf{k}}\) is periodic in space like the lattice.  It is possible to write down semiclassical equations of motion of some wavepacket that starts centered around some spatial position \(\mathbf{r}\) and some (crystal) momentum \(\hbar \mathbf{k}\).  These equations tell you that the momentum of the wavepacket changes with time as due to the external forces (looking a lot like the Lorentz force law), and the position of the wavepacket has a group velocity, plus an additional "anomalous" velocity related to the Berry phase (which has to do with the variation of \(u_{\mathbf{k}}\) over the allowed  values of \(\mathbf{k}\)).
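Written out in their standard form (with \(\varepsilon_{n}(\mathbf{k})\) the energy of band \(n\) and \(\mathbf{\Omega}_{n}(\mathbf{k})\) its Berry curvature), those semiclassical equations of motion are

\[ \dot{\mathbf{r}} = \frac{1}{\hbar} \frac{\partial \varepsilon_{n}}{\partial \mathbf{k}} - \dot{\mathbf{k}} \times \mathbf{\Omega}_{n}(\mathbf{k}), \qquad \hbar \dot{\mathbf{k}} = -e \left( \mathbf{E} + \dot{\mathbf{r}} \times \mathbf{B} \right), \]

where the \(-\dot{\mathbf{k}} \times \mathbf{\Omega}_{n}\) term is the anomalous velocity.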

The paper asks the question: what are the semiclassical equations of motion for a wavepacket if the lattice is actually distorted a bit as a function of position in real space?  That is, imagine a strain gradient, or some other lattice deformation.  In that case, the wavepacket can propagate through regions where the lattice is varying spatially on very long scales while still being basically periodic on shorter scales still long compared to the Fermi wavelength.

It turns out that the right way to tackle this is with the tools of differential geometry, the same tools used in general relativity.  In GR, when worrying how the coordinates of a particle change as it moves along, there is the ordinary velocity, and then there are other changes in the components of the velocity vector because the actual geometry of spacetime (the coordinate system) is varying with position.  You need to describe this with a "covariant derivative", and that involves Christoffel symbols.  In this way, gravity isn't a force - it's freely falling particles propagating as "straight" as they can, but the actual geometry of spacetime makes their trajectory look curved based on our choice of coordinates.
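Concretely, the statement that freely falling particles move "as straight as they can" is the geodesic equation,

\[ \frac{d^{2} x^{\mu}}{d\tau^{2}} + \Gamma^{\mu}_{\alpha\beta} \frac{dx^{\alpha}}{d\tau} \frac{dx^{\beta}}{d\tau} = 0 , \]

in which everything that looks like a gravitational force lives in the Christoffel symbols \(\Gamma^{\mu}_{\alpha\beta}\) - the price of taking derivatives in a coordinate system whose basis varies from point to point.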

For the semiclassical motion problem in a distorted lattice, something similar happens.  You have to worry about how the wavepacket evolves both because of the local equations of motion, and because the wavepacket is propagating into a new region of the lattice where the \(u_{\mathbf{k}}\) functions are different because the actual lattice is different (and that also affects the Berry phase anomalous velocity piece).   Local rotations of the lattice can lead to an effective Coriolis force on the wavepacket; local strain gradients can lead to effective accelerations of the wavepacket.

(For more fun, you can have temporal periodicity as well.  That means you don't just have Bloch functions in 3d, you have Bloch-Floquet functions in 3+1d, and that's where I fell behind.)

Bottom line:  The math of general relativity is an elegant way to look at semiclassical carrier dynamics in real materials.  I knew that undergrad GR course would come in handy....


Friday, November 22, 2019

Recent results on the arxiv

Here are a few interesting results I stumbled upon recently:
  • This preprint has become this Science paper that was published this week.  The authors take a cuprate superconductor (YBCO) and use reactive ion etching to pattern an array of holes in the film.  Depending on how long they etch, they can kill global superconductivity but leave the system such that it still behaves as a funny kind of metal (resistance decreasing with decreasing temperature), with some residual resistance at low temperatures.  The Hall effect in this metallic state produces no signal - a sign that there is a balance between particle-like and hole-like carriers (particle-hole symmetry).  For magnetic field perpendicular to the film, they also see magnetoresistance with features periodic in flux through one cell of the pattern, with a periodicity that indicates the charge carriers have charge 2e.  This is an example of a "Bose metal".  Neat!  (The question about whether there are pairs without superconductivity touches on our own recent work.)
  • This preprint was recently revised (and thus caught my eye in the arxiv updates).  In it, the authors are using machine learning to try to find new superconductors.  The results seem encouraging.  I do wonder if one could do a more physics-motivated machine learning approach (that is, something with an internal structure to the classification scheme and the actual weighting procedure) to look at this and other related problems (like identifying which compounds might be growable via which growth techniques).
  • This preprint is not a condensed matter topic, but has gotten a lot of attention.  The authors look at a particular nuclear transition in 4He, and find a peculiar angular distribution for the electron-positron pairs that come out.  The reason this is of particular interest is that this paper by the same investigators looking at a nuclear transition in 8Be three years ago found something very similar.  If one assumes that there is a previously unobserved boson (a dark matter candidate perhaps) of some sort with a mass of around 17 MeV that couples in there, that could explain both results.  Intriguing, but it would be great if these observations were confirmed independently by a different group.  

Tuesday, November 12, 2019

Advice on proposal writing

Many many people have written about how to write scientific grant proposals, and much of that advice is already online.   Rather than duplicate that work, and recognizing that sometimes different people need to hear advice in particular language, I want to link to some examples.

  • Here (pdf) is some advice straight from the National Science Foundation about how to write a compelling proposal.  It's older (2004) and a bit out of date, but the main points are foundational.
  • This is a very good collection of advice that has been updated (2015) to reflect current practice about NSF.  
  • Here are lecture notes from a course at Illinois that touched on this as well, generalizing beyond the NSF.
Fundamentally, sponsored academic research is an odd thing.  You are trying to convince an agency or foundation with finite (often very finite) resources that allocating some of their precious support to you will be a good thing.  Limiting the conversation to the often ill-named "basic research" (see here and the book therein for a discussion of "basic" vs "applied"), this means work where the primary products of the research are (i) fundamental advances in our understanding of some system or phenomenon; (ii) personnel trained in scientific/engineering/research knowledge and skills; (iii) scholarly publications (and/or patents for more technology-focused topics) that report the results, with the intent of propagating the work to the community and having an impact.

This last one has taken a pre-eminent position of importance because it's something that can be readily counted and measured.  There is a rough rule that many program officers at NSF and DOE will tell you: averaging over their programs, they get roughly one high-impact paper per $100K total cost.  They would like more, of course.

Talk with program officers before writing and submitting - know the audience.  Program officers (including foundation ones) tend to take real pride in their portfolios.  Everyone likes funding successful, high-impact, exciting, trend-setting work.  Still, particular program officers have areas of emphasis, in part so that there is not duplication of effort or support within an agency or across agencies.  (This is especially true in areas like high energy theory, where if you've got DOE funding, you essentially can't get NSF support, and vice versa.)  You will be wasting your time if you submit to the wrong program or pitch your idea to the wrong reviewing audience.   NSF takes a strong line that their research directions are broadly set by the researchers themselves, via their deep peer review process (mail-in reviews, in-person or virtual panel discussions) and workshops that define programmatic goals.  DOE likewise has workshops to help define major challenges and open questions, though my sense is that the department takes a more active role in delineating priorities.   The DOD is more goal-directed, with program officers having a great deal of sway on topics of interest, and the prospect that such research may transition closer to technology-readiness.  Foundations are idiosyncratic, but a common refrain is that they prefer to fund topics that are not already supported by federal agencies.

Think it through, and think like a referee.  When coming up with an idea, do your best to consider in some detail how you would actually pull this off.  How could you tell if it works?  What would the implications be of success?  What are the likely challenges and barriers?  If some step doesn't go as planned, is it a show-stopper, or are there other ways to go?  As an experimentalist:  Do you have the tools you need to do this?  How big a signal are you trying to detect?   Remember, referees are frequently asked to evaluate strengths and weaknesses of technical approach.  Better to have this in mind while at an early stage of the process.

Clearly state the problem, and explain the proposal's organization.  Reviewers might be asked to read several proposals in a short timeframe.  It seems like a good idea to say up front, in brief (like in a page or so):  What is the problem?  What are the open scientific/engineering questions you are specifically addressing?  What is your technical approach?  What will the results mean?  Then, explain the organization of the proposal (e.g., section 2 gives a more detailed introduction to the problem and open questions; section 3 explains the technical approach, including a timeline of proposed work; etc.).  This lets readers know where to find things. 

I'll confess:  I got this organizational approach by emulating the structure of an excellent proposal that I reviewed a number of years ago.  It was really terrific - clear; pedagogical, so that a non-expert in that precise area could understand the issues and ideas; very cleanly written; easy-to-read figures, including diagrams that really showed how the ideas would work.   Reviewing proposals is very helpful in improving your own.  Very quickly you will get a sense of what you think makes a good or bad proposal.  NSF is probably the most open to getting new investigators involved in the reviewing process. 

Don't wait until the last minute.  You know that classmate of yours from undergrad days, the one who used to brag about how they waited until the night before to blitz through a 20 page writing assignment?  Amazingly, some of these people end up as successful academics.  I genuinely don't know how they do it, because these days research funding is so competitive and proposals are detailed and complicated.  There are many little formatting details that agencies enforce now.  You don't want to get to an hour before the deadline and realize that all of your bibliographic references are missing a URL field.   People really do read sections like data management plans and postdoctoral mentoring plans - you can't half-ass them.   Also, while it is unlikely to sink a really good proposal, it definitely comes across badly to referees if there are missing or mislabeled references, figures, etc. 

I could write more, and probably will amend this down the line, but work calls and this is at least a start.



Thursday, November 07, 2019

Rice Academy of Fellows 2020

As I had posted a year ago:  Rice has a university-wide competitive postdoctoral fellow program known as the Rice Academy of Fellows.   Like all such things, it's very competitive.  The new application listing has gone live here with a deadline of January 3, 2020.  Applicants have to have a faculty mentor, so in case someone is interested in working with me on this, please contact me via email.  We've got some fun, exciting stuff going on!

Friday, November 01, 2019

Sorry for the hiatus

My apologies for the unusually long hiatus in posts.  Proposal deadlines + department chair obligations + multiple papers in process made the end of October very challenging.   Later next week I expect to pick up again.  Suggested topics (in the comments?) are always appreciated.  I realize I've never written an advice-on-grant-proposal-writing post.  On the science side, I'm still mulling over the most accessible way to describe quantum Hall physics, and there are plenty of other "primer" topics that I should really write at some point.

If I hadn't been so busy, I would've written a post during the baseball World Series about how the hair of Fox Sports broadcaster Joe Buck is a study in anisotropic light scattering.  Viewed straight on, it's a perfectly normal color, but when lit and viewed from an angle, it's a weirdly iridescent yellow - I'm thinking that this really might have interesting physics behind it, in the form of some accidental structural color.

Thursday, October 17, 2019

More items of interest

This continues to be a very very busy time, but here are a few interesting things to read:

Monday, October 07, 2019

"Phase of matter" is a pretty amazing emergent concept

As we await the announcement of this year's physics Nobel tomorrow morning (last chance for predictions in the comments), a brief note:

I think it's worth taking a moment to appreciate just how amazing it is that matter has distinct thermodynamic phases or states.

We teach elementary school kids that there are solids, liquids, and gases, and those are easy to identify because they have manifestly different properties.  Once we know more about microscopic details that are hard to see with unaided senses, we realize that there are many more macroscopic states - different structural arrangements of solids; liquid crystals; magnetic states; charge ordered states; etc.

When we take statistical physics, we learn descriptively what happens.  When you get a large number of particles (say atoms for now) together, the macroscopic state that they take on in thermal equilibrium is the one that corresponds to the largest number of microscopic arrangements of the constituents under the given conditions.  So, the air in my office is a gas because, at 298 K and 101 kPa, there are many many more microscopic arrangements of the molecules with that temperature and pressure that look like a gas than there are microscopic arrangements of the molecules that correspond to a puddle of N2/O2 mixture on the floor. 
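The counting statement has a compact form: with \(\Omega\) the number of microstates consistent with a given macrostate, the entropy is \(S = k_{\mathrm{B}} \ln \Omega\), and at fixed temperature and pressure the equilibrium phase is the one that minimizes the Gibbs free energy

\[ G = U - T S + P V , \]

which is why the gas wins in my office: its vastly larger \(\Omega\) (and hence \(S\)) more than compensates for the cohesive energy a puddle would gain.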

Still, there is something special going on.  It's not obvious that there should have to be distinct phases at all, and such a small number of them.  There is real universality about solids - their rigidity, resistance to shear, high packing density of atoms - independent of details.  Likewise, liquids with their flow under shear, comparative incompressibility, and general lack of spatial structure.  Yes, there are detailed differences, but any kid can recognize that water, oil, and lava all have some shared "liquidity".  Why does matter end up in those configurations, and not end up being a homogeneous mush over huge ranges of pressure and temperature?  This is called emergence, because while it's technically true that the standard model of particle physics undergirds all of this, it is not obvious in the slightest how to deduce the properties of snowflakes, raindrops, or water vapor from there.    Like much of condensed matter physics, this stuff is remarkable (when you think about it), but so ubiquitous that it slides past everyone's notice pretty much of the time.

Saturday, September 28, 2019

Items of interest

As I struggle with being swamped this semester, some news items:
  • Scott Aaronson has a great summary/discussion about the forthcoming google/John Martinis result about quantum supremacy.  The super short version:  There is a problem called "random circuit sampling", where a sequence of quantum gate operations is applied to some number of quantum bits, and one would like to know the probability distribution of the outcomes.  Simulating this classically becomes very very hard as the number of qubits grows.  The google team apparently just implemented the actual problem directly using their 53-qubit machine, and could infer the probability distribution by directly sampling a large number of outcomes.   They could get the answer this way in 3 min 20 sec for a number of qubits where it would take the best classical supercomputer 10000 years to simulate.  Very impressive and certainly a milestone (though the paper is not yet published or officially released).  This has led to some fascinating semantic discussions with colleagues of mine about what we mean by computation.  For example, this particular situation feels a bit to me like comparing the numerical solution to a complicated differential equation (i.e. some Runge-Kutta method) on a classical computer with an analog computer using op-amps and R/L/C components.  Is the quantum computer here really solving a computational problem, or is it being used as an experimental platform to simulate a quantum system?  And what is the difference, and does it matter?  Either way, a remarkable achievement.  (I'm also a bit jealous that Scott routinely has 100+ comment conversations on his blog.)
  • Speaking of computational solutions to complex problems.... Many people have heard about chaotic systems and why numerical solutions to differential equations can be fraught with peril due to, e.g., rounding errors.  However, I've seen two papers this week that show just how bad this can be.  This very good news release pointed me to this paper, which shows that even using 64-bit precision doesn't save you from issues in some systems.  Also this blog post points to this paper, which shows that n-body gravitational simulations have all sorts of problems along these lines.  Yeow.
  • SpaceX has assembled their mammoth sub-orbital prototype down in Boca Chica.  This is going to be used for test flights up to 22 km altitude, and landings.  I swear, it looks like something out of Tintin or The Conquest of Space.  Awesome.
  • Time to start thinking about Nobel speculation.  Anyone?
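On the chaos item above: a minimal illustration (a sketch of the underlying issue, not either paper's actual calculation) is the logistic map at \(r = 4\), which is fully chaotic with Lyapunov exponent \(\ln 2\).  An initial difference of \(10^{-12}\) - roughly the scale of a double-precision rounding error - doubles on average with every iteration, so within a few dozen steps the two trajectories bear no relation to each other:

```python
def logistic(x, r=4.0):
    # Fully chaotic for r = 4; nearby trajectories separate ~2x per step.
    return r * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-12   # second trajectory "perturbed" by a rounding-scale error
early_sep, late_sep = None, 0.0
for n in range(60):
    x, y = logistic(x), logistic(y)
    if n == 9:
        early_sep = abs(x - y)                 # still tiny after 10 steps
    if n >= 40:
        late_sep = max(late_sep, abs(x - y))   # order one by steps 40-60

print(early_sep)   # still far below 1e-6
print(late_sep)    # order one: all memory of the initial condition is gone
```

The same exponential amplification acts on every rounding error made along the way, which is exactly why naive floating-point integration of chaotic systems (n-body gravity included) can be quantitatively meaningless even when each individual step looks accurate.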

Wednesday, September 18, 2019

DOE Experimental Condensed Matter PI Meeting, day 3 and wrapup

On the closing day of the PI meeting, some further points and wrap-up:

  • I had previously missed work that shows that electric field can modulate magnetic exchange in ultrathin iron (overview).
  • Ferroelectric layers can modulate transport in spin valves by altering the electronic energetic alignment at interfaces.  This can result in some unusual response (e.g., the sign of the magnetoresistance can flip with the sign of the current, implying spin-diode-like properties).
  • Artificial spin ices are still cool model systems.  With photoelectron emission microscopy (PEEM), it's possible to image ultrathin, single-domain structures to reveal their magnetization noninvasively.  This means movies can be made showing thermal fluctuations of the spin ice constituents, revealing the topological character of the magnetic excitations in these systems.
  • Ultrathin oxide membranes mm in extent can be grown, detached from their growth substrates, and transferred or stacked.  When these membranes are really thin, it becomes difficult to nucleate cracks, allowing the membranes to withstand large strains (several percent!), opening up the study of strain effects on a variety of oxide systems.
  • Controlled growth of stacked phthalocyanines containing transition metals can generate nice model systems for studying 1d magnetism, even using conventional (large-area) methods like vibrating sample magnetometry.
  • In situ oxide MBE and ARPES, plus either vacuum annealing or ozone annealing, has allowed the investigation of the BSCCO superconducting phase diagram over the whole range of dopings, from severely underdoped to so overdoped that superconductivity is completely suppressed.  In the overdoped limit, analyzing the kink found in the band dispersion near the antinode, it seems superconductivity is suppressed at high doping because the coupling (to the mode that causes the kink) goes to zero at large doping.  
  • It's possible to grow nice films of C60 molecules on Bi2Se3 substrates, and use ARPES to see the complicated multiple valence bands at work in this system.  Moreover, by doing measurements as a function of the polarization of the incoming light, the particular molecular orbitals contributing to those bands can be identified.
  • Through careful control of conditions during vacuum filtration, it's possible to produce dense, locally crystalline films of aligned carbon nanotubes.  These have remarkable optical properties, and with the anisotropy of their electronic structure plus ultraconfined character, it's possible to get exciton polaritons in these into the ultrastrong coupling regime.
Overall this was a very strong meeting - the variety of topics in the program is impressive, and the work shown in the talks and posters was uniformly interesting and of high quality. 

Tuesday, September 17, 2019

DOE Experimental Condensed Matter PI Meeting, Day 2

Among the things I heard about today, as I wondered whether newly formed Tropical Storm Imelda would make my trip home a challenge:

  • In "B20" magnetic compounds, where the crystal structure is chiral (lacking mirror and inversion symmetry), a phase can form under some circumstances that is a spontaneous lattice of skyrmions.  By adding disorder through doping, it is possible to un-pin that lattice.
  • Amorphous cousins of those materials still show anomalous Hall effect (AHE), even though the usual interpretation these days of the AHE is as a consequence of Berry phase in momentum space that is deeply connected to having a lattice.  It's neat to see that some Berry physics survives even when the lattice does not.
  • There is a lot of interest in coupling surface states of topological insulators to ferromagnets, including using spin-orbit torque to switch the magnetization direction of a ferromagnetic insulator.
  • You could also try to switch the magnetization of \(\alpha\)-Fe\(_{2}\)O\(_{3}\) using spin-orbit torques, but watch out when you try to cram too much current through a 2 nm thick Pt film.
  • The interlayer magnetic exchange in van der Waals magnets continues to be interesting and rich.
  • Heck, you could look at several 2D materials with various kinds of reduced symmetry, to see what kinds of spin-orbit torques are possible.
  • It's always fun to find a material where there are oscillations in magnetization with applied field even though the bulk is an insulator.
  • Two-terminal devices made using (Weyl superconducting) MoTe2 show clear magnetoresistance signatures, indicating supercurrents carried along the material edges.
  • By side-gating graphene structures hooked up to superconductors, you can also make a superconducting quantum interference device using edge states of the fractional quantum Hall effect.
  • In a similar spirit, coupling a 2D topological insulator (1T'-WTe2) to a superconductor (NbSe2) makes it possible to use scanning tunneling spectroscopy to see induced superconducting properties in the edge state.
  • Just in time, another possible p-wave superconductor.
  • In a special stack sandwiching a TI between two magnetic TI layers, it's possible to gate the system to break inversion symmetry, and thus tune between quantum anomalous Hall and "topological Hall" response.
  • Via a typo on a slide, I learned of the existence of the Ohion, apparently the smallest quantized amount of Ohio.
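For readers who want a reminder of the Berry-phase picture of the anomalous Hall effect mentioned above: the standard textbook expression (not something from the talks themselves) writes the intrinsic anomalous Hall conductivity of a 2D crystal as a Brillouin-zone integral of the Berry curvature \(\Omega_{n}^{z}(\mathbf{k})\), weighted by the band occupation \(f_{n}(\mathbf{k})\):

\[ \sigma_{xy}^{\mathrm{AHE}} = -\frac{e^{2}}{\hbar} \sum_{n} \int_{\mathrm{BZ}} \frac{d^{2}k}{(2\pi)^{2}} \, f_{n}(\mathbf{k}) \, \Omega_{n}^{z}(\mathbf{k}) \]

The integral over a well-defined Brillouin zone is what makes the survival of the AHE in amorphous systems, where there is no crystalline momentum, such an interesting puzzle.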


DOE experimental condensed matter PI meeting, day 1

The first day of the DOE ECMP PI meeting was very full, including two poster sessions.  Here are a few fun items:

  • Transition metal dichalcogenides (TMDs) can have very strongly bound excitons, and if two different TMDs are stacked, you can have interlayer excitons, where the electron and hole reside in different TMD layers, perhaps separated by a layer or two of insulating hBN.  Those interlayer excitons can have long lifetimes, undergo Bose condensation, and have interesting optical properties.  See here, for example.   
  • Heterojunctions of different TMDs can produce moire lattices even with zero relative twist, and the moire coupling between the layers can strongly affect the optical properties via the excitons.
  • Propagating plasmons in graphene can have surprisingly high quality factors (~ 750), and combined with their strong confinement have interesting potential.
  • You can add AlAs quantum wells to the list of materials systems in which it is possible to have very clean electronic transport and see fractional quantum Hall physics, which is a bit different because of the valley degeneracy in the AlAs conduction band (that can be tuned by strain).
  • And you can toss WSe2 in there, too - after building on this and improving material quality even further.
  • There continues to be progress in trying to interface quantum Hall edge states with superconductors, with the end goal of possible topological quantum computing.  A key question is understanding how the edge states undergo Andreev processes at superconducting contacts.
  • Application of pressure can take paired quantum Hall states (like those at \(\nu = 5/2, 7/2\)) and turn them into unpaired nematic states, a kind of quantum phase transition.  
  • With clever (and rather involved) designs, it is possible to make high quality interferometers for fractional quantum Hall edge states, setting the stage for detailed studies of exotic anyons.

Sunday, September 15, 2019

DOE Experimental Condensed Matter PI meeting, 2019

The US Department of Energy's Basic Energy Sciences component of the Office of Science funds a lot of basic scientific research, and for the last decade or so it has had a tradition of regular gatherings of its funded principal investigators for a number of programs.  Every two years there has been a PI meeting for the Experimental Condensed Matter Physics program, and this year's meeting starts tomorrow.

These meetings are very educational (at least for me) and, because of their modest size, a much better networking setting than large national conferences.  In past years I've tried to write up brief highlights of the meetings (for 2017, see a, b, c; for 2015 see a, b, c; for 2013 see a, b).   I will try to do this again; the format of the meeting has changed to include more poster sessions, which makes summarizing trickier, but we'll see.

update:  Here are my write-ups for day 1, day 2, and day 3.

Tuesday, September 10, 2019

Faculty position at Rice - Astronomy

Faculty position in Astronomy at Rice University

The Department of Physics and Astronomy at Rice University invites applications for a tenure-track faculty position in astronomy in the general field of galactic star formation and planet formation, including exoplanet characterization. We seek an outstanding theoretical, observational, or computational astronomer whose research will complement and extend existing activities in these areas within the department. In addition to developing an independent and vigorous research program, the successful applicant will be expected to teach, on average, one undergraduate or graduate course each semester, and contribute to the service missions of the department and university. The department expects to make the appointment at the assistant professor level. A Ph.D. in astronomy/astrophysics or related field is required.

Applications for this position must be submitted electronically at http://jobs.rice.edu/postings/21236. Applicants will be required to submit the following: (1) cover letter; (2) curriculum vitae; (3) statement of research; (4) teaching statement; (5) PDF copies of up to three publications; and (6) the names, affiliations, and email addresses of three professional references. We will begin reviewing applications December 1, 2019. To receive full consideration, all application materials must be received by January 10, 2020. The appointment is expected to begin in July 2020.

Rice University is an Equal Opportunity Employer with a commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability, or protected veteran status. We encourage applicants from diverse backgrounds to apply.

Friday, September 06, 2019

Faculty position at Rice - Theoretical Biological Physics

Faculty position in Theoretical Biological Physics at Rice University

As part of the Vision for the Second Century (V2C2), which is focused on investments in research excellence, Rice University seeks faculty members, preferably at the assistant professor level, starting as early as July 1, 2020, in all areas of theoretical biological physics. Successful candidates will lead dynamic, innovative, and independent research programs supported by external funding, and will excel in teaching at the graduate and undergraduate levels, while embracing Rice’s culture of excellence and diversity.

This search will consider applicants from all science and engineering disciplines. Ideal candidates will pursue research with strong intellectual overlap with physics, chemistry, biosciences, bioengineering, chemical and biomolecular engineering, or other related disciplines. Applicants pursuing all styles of theory and computation integrating the physical and life sciences are encouraged to apply.

For full details and to apply, please visit https://jobs.rice.edu/postings/21170.  Applicants should please submit the following materials: (1) cover letter, including the names and contact information for three references, (2) curriculum vitae, (3) research statement, and (4) statement of teaching philosophy. Application review will commence no later than October 15, 2019 and continue until the position is filled. Candidates must have a PhD or equivalent degree and outstanding potential in research and teaching. We particularly encourage applications from women and members of historically underrepresented groups who bring diverse cultural experiences and who are especially qualified to mentor and advise members of our diverse student population.

Rice University, located in Houston, Texas, is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability, or protected veteran status.