My student (with theorist collaborators) had a paper published online in Nature Nanotechnology yesterday, and this gives me an excuse to talk about using metal nanostructures as optical antennas. The short version: using metal electrodes separated by a sub-nanometer gap as a kind of antenna, we have been able to get local enhancement of the electromagnetic intensity by roughly a factor of a million (!), and we have been able to determine that enhancement experimentally via tunneling measurements.
As I've discussed previously, light can excite collective excitations (plasmons) of the electronic fluid in a metal. Because these plasmons involve displacing the electrons relative to the ions, they are associated with local electric fields at the metal surface. When the incident light is resonant with the natural frequency of these modes, the result can be local electromagnetic fields near the metal that can significantly exceed the fields from the incident light. These enhanced local fields can be useful for many things, from spectroscopy to nonlinear optics. One way to get particularly large field enhancements is to look at the region separating two very closely spaced plasmonic structures. For example, closely spaced metal nanoparticles have been used to enhance fields sufficiently in the interparticle gap to allow single-molecule Raman spectroscopy (see here and here).
A major challenge, however, has been to get an experimental measure of those local fields in such gaps. That is where tunneling comes in. In a tunnel junction, electrons are able to "tunnel" quantum mechanically from one electrode to the other. The resulting current as a function of voltage may be slightly nonlinear, meaning that (unlike in a simple resistor) the second derivative of current with respect to voltage (d²I/dV²) is non-zero. From a simple math argument, the presence of a nonlinearity like this means that an AC voltage applied across the junction gives rise to a DC current proportional to the nonlinearity, a process called "rectification". What we have done is turn this around. We use low-frequency (kHz) electronic measurements to determine the nonlinearity. We then measure the component of the DC current due to light shining on the junction (for experts: we can do this with lock-in methods at the same time as measuring the nonlinearity). We can then use the measured nonlinearity and photocurrent to determine the optical-frequency voltage that must be driving the tunneling photocurrent. From the tunneling conductance, we can also estimate the distance scale over which tunneling takes place. Dividing the optical-frequency voltage by that distance gives us the optical-frequency electric field at the tunneling gap, which can be compared with the field from the incident light to get the enhancement.
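The rectification argument can be sketched numerically. Here's a minimal, illustrative Python sketch (every number below is made up for illustration, not taken from our measurements): expand the tunneling I-V to second order, average over one cycle of an AC drive to see that the DC current is (1/4)(d²I/dV²)V₀², and then invert that relation, as we do experimentally, to recover the optical-frequency voltage and local field.

```python
import numpy as np

# Toy tunneling I-V, expanded to second order in V (illustrative values):
g1 = 1e-6   # dI/dV in siemens
g2 = 2e-5   # d2I/dV2 in S/V -- the nonlinearity measured at kHz frequencies

def current(v):
    return g1 * v + 0.5 * g2 * v**2

# An AC voltage V0*cos(wt) across the junction, averaged over one cycle:
V0 = 0.1  # volts; stands in for the optical-frequency voltage amplitude
t = np.linspace(0, 2 * np.pi, 100000, endpoint=False)
I_dc = current(V0 * np.cos(t)).mean()

# The cos term averages to zero; cos^2 averages to 1/2, leaving the
# rectified DC current I_dc = (1/4) * (d2I/dV2) * V0^2.
print(I_dc, 0.25 * g2 * V0**2)  # the two agree

# Turning it around: from a measured photocurrent and nonlinearity,
# infer V0, then divide by the tunneling distance to get the local field.
V0_inferred = np.sqrt(4 * I_dc / g2)
gap = 0.5e-9                      # meters; sub-nm tunneling scale (illustrative)
E_local = V0_inferred / gap       # V/m at the gap
E_incident = 1e7                  # V/m from the laser (illustrative)
print((E_local / E_incident)**2)  # intensity enhancement goes as |E|^2
```

The inversion in the last few lines is the heart of the method: everything on the right-hand side is measurable at low frequency except the photocurrent itself.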
It's not at all obvious on the face of it that this should work. After all, the analysis relies on the idea that the tunneling nonlinearity measured at kHz frequencies is still valid at frequencies nearly 10¹² times higher. Experimentally, the data show that this does work, however, and our theorist colleagues are able to explain why.
When you think about it, it's pretty amazing. The radiation intensity in the little nanogap between our electrodes can be hundreds of thousands or millions of times higher than that from the incident laser. Wild stuff, and definitely food for thought.