Saturday, September 28, 2019

Items of interest

As I struggle with being swamped this semester, some news items:

- Scott Aaronson has a great summary/discussion of the forthcoming Google/John Martinis result on quantum supremacy. The super short version: there is a problem called "random circuit sampling", where a sequence of quantum gate operations is applied to some number of quantum bits, and one would like to know the probability distribution of the outcomes. Simulating this classically becomes extremely hard as the number of qubits grows. The Google team apparently just implemented the actual problem directly on their 53-qubit machine, and could infer the probability distribution by directly sampling a large number of outcomes. They could get the answer this way in 3 min 20 sec for a number of qubits where it would take the best classical supercomputer 10,000 years to simulate (a toy sketch of what the sampling problem looks like is below, after this list). Very impressive and certainly a milestone (though the paper is not yet published or officially released). This has led to some fascinating semantic discussions with colleagues of mine about what we mean by computation. For example, this particular situation feels a bit to me like comparing the numerical solution to a complicated differential equation (i.e. some Runge-Kutta method) on a classical computer with an analog computer using op-amps and R/L/C components. Is the quantum computer here really solving a computational problem, or is it being used as an experimental platform to simulate a quantum system? And what is the difference, and does it matter? Either way, a remarkable achievement. (I'm also a bit jealous that Scott routinely has 100+ comment conversations on his blog.)
- Speaking of computational solutions to complex problems.... Many people have heard about chaotic systems and why numerical solutions to differential equations can be fraught with peril due to, e.g., rounding errors. However, I've seen two papers this week that show just how bad this can be. This very good news release pointed me to this paper, which shows that even 64-bit precision doesn't save you from trouble in some systems. Also, this blog post points to this paper, which shows that n-body gravitational simulations have all sorts of problems along these lines (a minimal Lorenz-system illustration is sketched below, after this list). Yeow.
- SpaceX has assembled their mammoth sub-orbital prototype down in Boca Chica. This is going to be used for test flights up to 22 km altitude, and landings. I swear, it looks like something out of Tintin or The Conquest of Space. Awesome.
- Time to start thinking about Nobel speculation. Anyone?
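To make the "random circuit sampling" idea a bit more concrete, here is a purely illustrative Python sketch of my own (not the actual Google/Martinis circuits): random single-qubit rotations and controlled-Z gates are applied to an n-qubit state vector, and measurement outcomes are then sampled from the resulting probability distribution. The point is that the brute-force classical approach has to track 2^n complex amplitudes, which is why the classical cost explodes with qubit number.

```python
# Toy illustration of "random circuit sampling" by brute-force classical
# state-vector simulation (my own sketch -- NOT the actual Sycamore circuits).
# The state vector holds 2**n complex amplitudes, which is why the classical
# cost explodes as the number of qubits n grows.
import numpy as np

rng = np.random.default_rng(0)

def apply_1q(state, gate, qubit, n):
    """Apply a 2x2 unitary to one qubit of an n-qubit state vector."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [qubit]))
    psi = np.moveaxis(psi, 0, qubit)
    return psi.reshape(-1)

def apply_cz(state, q1, q2, n):
    """Apply a controlled-Z between qubits q1 and q2 (diagonal, hence cheap)."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    psi[tuple(idx)] *= -1.0
    return psi.reshape(-1)

def sample_random_circuit(n=10, depth=8, shots=20):
    """Build a random circuit, then sample measurement outcomes from it."""
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                                   # start in |00...0>
    for _ in range(depth):
        for q in range(n):                           # random single-qubit rotations
            theta, phi = rng.uniform(0.0, 2.0 * np.pi, size=2)
            gate = np.array([[np.cos(theta), -np.exp(1j * phi) * np.sin(theta)],
                             [np.exp(-1j * phi) * np.sin(theta), np.cos(theta)]])
            state = apply_1q(state, gate, q, n)
        for q in range(0, n - 1, 2):                 # entangle neighboring qubits
            state = apply_cz(state, q, q + 1, n)
    probs = np.abs(state) ** 2
    probs /= probs.sum()                             # guard against rounding drift
    return rng.choice(2 ** n, size=shots, p=probs)   # sampled bitstrings (as integers)

print(sample_random_circuit())
```

Even at n = 10 this is instantaneous, but the memory and time both scale as 2^n, which is the whole point of the supremacy demonstration.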
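And for the chaos/rounding item, here is a minimal illustration of my own (not taken from either paper) of how finite precision alone can change the answer for a chaotic system: the same fixed-step RK4 integration of the Lorenz equations, run once in float32 and once in float64.

```python
# Minimal illustration (my own, not from either paper) of precision sensitivity
# in a chaotic system: the same fixed-step RK4 integration of the Lorenz
# equations, run in float32 and float64.  Identical initial conditions,
# identical algorithm -- yet the trajectories end up in different places.
import numpy as np

def lorenz(v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz system, keeping the input's dtype."""
    x, y, z = v
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z],
                    dtype=v.dtype)

def rk4_final_state(dtype, t_end=40.0, dt=1e-3):
    """Integrate from (1, 1, 1) to t_end with classical RK4 at the given precision."""
    nsteps = int(round(t_end / dt))
    v = np.array([1.0, 1.0, 1.0], dtype=dtype)
    h = dtype(dt)
    for _ in range(nsteps):
        k1 = lorenz(v)
        k2 = lorenz(v + 0.5 * h * k1)
        k3 = lorenz(v + 0.5 * h * k2)
        k4 = lorenz(v + h * k3)
        v = (v + (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)).astype(dtype)
    return v

print("float64:", rk4_final_state(np.float64))
print("float32:", rk4_final_state(np.float32))   # same equations, different answer
```

The rounding difference starts out at the level of the last bit and gets amplified exponentially by the positive Lyapunov exponent, so by t = 40 the two runs bear no resemblance to each other.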
15 comments:
I have to say the Google paper is going to be a great milestone for quantum computing. I just hope the hype won't cause letdowns and disinterest down the road, when the field runs into problems and needs public support.
On the Nobel: my guess is Quantum Information, e.g. Aspect and others.
>>
Time to start thinking about Nobel speculation. Anyone?
<<
Such shameless vote courting, Doug. You're better than that!
:)
Swamped as you are, I don't expect you want to have 50 comments each day to moderate and perhaps to reply to. I'm currently getting e-mail notifications after Sabine's post about MWI: you don't want that extra work.
I think your first comment is almost correct, "this particular situation feels a bit to me like comparing the numerical solution to a complicated differential equation (i.e. some Runge-Kutta method) on a classical computer with an analog computer using op-amps and R/L/C components", with the proviso that noise is an important part of the way the solution is found, so not just a differential equation and an analog computer of the R/L/C kind. Monte Carlo and variants thereof are a better analogy, IMO. There are many ways to model such systems, and people can have their favorites, of which mine is to use a random field, because such a field can be extremely close to a quantum field, as I argue in Physica Scripta 94 (2019) 075003 (12pp), "Classical states, quantum field measurement", arXiv:1709.06711. In answer to your questions "what is the difference, and does it matter?", however, I think the answers are "so little as to make no difference" and "no": QM modeling is very effective.
IMO, your next bullet point is vital. Specifying the properties of a system by specifying its dynamics is problematic because approximations are almost always asymptotic series. Wilson & Kogut offer what seems to me to be good advice: that it might be helpful to develop a formalism in which "the behavior of the system is determined primarily by the fact that there is cooperative behavior, plus the nature of the degrees of freedom themselves. The interaction Hamiltonian plays only a secondary role." Ask how different wave numbers at different scales are correlated with each other, in mathematical detail, without asking what the dynamics might be that causes those correlations to be as they are; of course for some people this will not be anywhere near enough explanation, but if it's mathematically better defined than renormalization, I'm for it.
See? You want a hundred comments that engage closely with your post, but instead you'll more likely get a hundred comments that are as wayward as the above or more. Part of the reason people comment on her posts a lot is that she often answers, most often very briefly, but not infrequently at some length, so that there's a sense of real engagement. She's been doing that for a long time. Peter Woit, OTOH, moderates ferociously, so that comments that he thinks are not absolutely to the point never appear. Your version of a blog is wonderful just as it is, however! More condensed matter focus is more what the world needs than another blog hacking away at fashionable topics about QM.
Berry and Aharonov
Thanks all. Peter, it might be more accurate to say that I’m jealous of the magnitude of his readership, rather than wishing I had dozens of comments to which to reply. Regarding the comparison with analog computers: I still think there is a vital distinction between, e.g., computationally solving the equations of motion for a rolling coin to determine its trajectory, and actually rolling a coin and measuring its trajectory. In some sense the present experiment is a validation of quantum simulation more than quantum computation. That’s still impressive and a landmark. Implementing Shor’s algorithm to factor some enormous number, for instance, would feel, to my taste, more like a computing demonstration.
The Nobels this year do seem to be particularly hard to predict. But why not topological insulators? Or has that ship sailed?
Berry. By himself.
Berry and Aharonov has been my go-to pick for a long time, but I keep being wrong. Peter, as more of an optics person than me, do you think that Berry is out of the running because Pancharatnam did intellectually related work and is dead? Aspect and other Bell’s Inequality folks are also a perennial favorite.
That's a good point, but dunno regarding Berry vis-à-vis Pancharatnam... In an ironic twist, it is Berry who was mostly responsible (AFAIK) for publicizing Pancharatnam's work. Those papers of Berry's are ridiculously even-handed and fair, but my impression is that Berry understood much, much more than Pancharatnam. It would be amusing to know whether Berry himself thinks he deserves a Nobel prize.
Clauser, Aspect, and... Zeilinger? That would be a good one.
https://www.currentscience.ac.in/Downloads/article_id_067_04_0220_0223_0.pdf
Embarrassingly white and male:
1. Josephson devices: Martinis, Devoret, Clarke (or substitute Schoelkopf?)
2. Meta-optics: Pendry, Joannopoulos, Capasso
3. Quantum dots: Ekimov, Efros, Brus
@Grumpy: if Federico wins, it will be for the quantum cascade laser (QCL) idea, not for meta-optics. Which would be well deserved as a stand-alone prize, in my opinion.
Other possibilities in optics:
(a) Pendry and Smith for metamaterials (unlikely that Joannopoulos would share that, I think)
(b) Yablonovitch (and perhaps Sajeev John) for photonic crystal stuff
If Louis Brus wins, my sense is that it would more likely be the chemistry prize not the physics prize.
On the other hand, last year's prize was optics, so this year's prize almost certainly won't be.
Another optics possibility:
Ferenc Krausz, Paul Corkum, and maybe also Margaret Murnane, for high-harmonic generation and attosecond science. That topic is bound to warrant a prize eventually. Maybe it's too soon. Definitely too soon after Mourou/Strickland.
DanM, agreed on both fronts re: HHG/ultrafast (deserving but won't be this year).
Doubt photonic crystals will win on their own, which is why I thought they might be combined with other meta-optics. But you're probably right that Capasso wouldn't be included in that one but could win for the QCL.
I've been trying to guess Nobels for the last ~10 years and am wrong literally every year.
I was right once, when Zewail won the Chemistry prize. The day before he won, I swore that I'd be in a bad mood for a year if he won. Unfortunately, people heard me, so I had to stay in a bad mood for a whole year. That's more challenging than it sounds, unless, perhaps, your name is "Grumpy".