Several items of note:
- Quanta Magazine remains a generally outstanding source of science articles and opinion pieces. In this opinion column, high energy theorist Robert Dijkgraaf gives his views on whether we are reaching "the end of physics". Spoilers: he thinks not, and condensed matter physics, with its emergence of remarkable phenomena from underlying building blocks, is one reason.
- Similarly, I should have pointed earlier to this interesting article by Natalie Wolchover, who asked a number of physicists to define what they mean by "a particle". I understand the mathematical answer ("a particle is an irreducible representation of the Poincaré group", meaning that it's an object defined by particular numbers describing how it changes, or doesn't, under translations in time and space and under rotations). That also lends itself to a nice definition of a quasiparticle (such an object, but one that results from the collective action of underlying degrees of freedom, rather than existing in the universe's vacuum). As an experimentalist, though, I confess a fondness for other perspectives.
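For the mathematically inclined: in the standard Wigner classification, those "particular numbers" are the eigenvalues of the two Casimir invariants of the Poincaré group, which commute with all translations, rotations, and boosts. Schematically (with $P^\mu$ the four-momentum and $W^\mu$ the Pauli-Lubanski vector):

$$
P^\mu P_\mu = m^2, \qquad W^\mu W_\mu = -m^2\, s(s+1),
$$

so a (massive) particle is labeled by its mass $m$ and spin $s$, the quantities that stay fixed no matter how you translate, rotate, or boost the state.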
- Springer Nature has released its approach for handling open access publication. I don't think I'm alone in thinking that its fee structure is somewhere between absurd and obscene. It's simply absurd to think that the funding agencies (at least in the US) are going to allow people to budget €9,500 for a publication charge. That's equivalent to four months of support for a graduate student in my discipline. Moreover, the publisher is proposing to charge a non-refundable €2,190 fee just to have a manuscript evaluated for "guided open access" at Nature Physics. Not that I lack confidence in the quality of my group's work, but how could I possibly justify spending that much for a 75% probability of a rejection letter? Given that they do not pay referees, am I really supposed to believe that finding referees, soliciting reviews, and tracking manuscript progress costs the publisher €2,190 per submission?
- It's older news, but this approach to computation is an interesting one. Cerebras is implementing neural networks in hardware, doing so with wafer-scale processors (!) containing trillions (!) of transistors and hundreds of thousands of cores. There must be some impressive fault tolerance built into their network training approach, because otherwise I'd be surprised if even the amazing manufacturing reliability of the semiconductor industry could produce a decent yield of these processors.
- Older still, one of my colleagues brought this article to my attention, about someone trying to come up with a way to play grandmaster-level chess after only a short period of preparation. I don't buy into the hype, but it was an interesting look at how easy it now seems to be to pick up machine learning coding skills. (Instead of deeply studying chess, the idea was to find a compact decision algorithm, simple enough to memorize and evaluate mentally, by training a machine learning system against a dedicated chess engine.)
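To give a feel for the idea, here is a deliberately toy sketch of that kind of "distillation": query an engine-like oracle for its preferred move in many positions, then search for the most compact rule (a single-feature threshold) that reproduces its choices. Everything here is hypothetical for illustration: the feature names, the stand-in oracle, and the decision-stump search are my own stand-ins, not what the article's author actually did (a real attempt would query an actual chess engine over real board positions).

```python
# Toy sketch (all names hypothetical): distill an engine-like oracle's
# move preference into a compact, human-memorizable decision rule.
import random

random.seed(0)

FEATURES = ["material_diff", "king_safety", "center_control"]

def engine_move(pos):
    """Stand-in 'engine': secretly prefers attacking when ahead in material."""
    return "attack" if pos["material_diff"] > 0 else "defend"

def random_position():
    # A "position" is just a dict of integer-valued features here.
    return {f: random.randint(-5, 5) for f in FEATURES}

# Collect training data by querying the oracle on random positions.
data = [(p, engine_move(p)) for p in (random_position() for _ in range(500))]

# Brute-force the best single-feature threshold rule ("decision stump"):
# "if feature > t, attack; else defend" -- compact enough to memorize.
best = None
for f in FEATURES:
    for t in range(-5, 6):
        acc = sum((p[f] > t) == (m == "attack") for p, m in data) / len(data)
        if best is None or acc > best[0]:
            best = (acc, f, t)

acc, f, t = best
print(f"learned rule: attack if {f} > {t} (agreement {acc:.0%})")
```

Since the toy oracle really does follow a one-feature rule, the search recovers it exactly; the article's premise, of course, is that real chess is nowhere near this compressible, which is where my skepticism comes in.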