I have recently been wondering about the periodic table. How come we can talk about orbitals for complex nuclei, say copper, when there should exist a lot of correlation between electrons in different shells? The entanglement between these electrons should prevent one from meaningfully talking about orbitals, very naively. In other words, do we have a compelling explanation for Madelung's rule from quantum mechanics? I understand techniques such as DFT can predict the correct electron density, but not the separate wave functions.
I am a fourth-year Ph.D. student looking to apply for postdocs soon. I have three papers in preparation, including one that has been submitted. However, none have been published yet. I have presented my work at conferences and have gotten positive feedback on my results from fellow scientists.
Naively, it seems to me that the business of academia places a lot more emphasis on the number of papers one publishes than it does on the science that those papers are about.
This concerns me with regard to applying for postdocs, and eventually faculty positions. I feel like the job pressures would push me to think more about publishing ten 'average' papers a year, rather than publishing a few papers that I put a lot of effort into. This is especially true with the sequester coming up, and funding becoming ever more competitive.
Furthermore, I feel like I would be discouraged from taking risks or doing work that will not produce immediate publications. P.W. Anderson in his latest book, "More And Different", wrote about how if Schrodinger, Heisenberg and Dirac were assistant professors in modern America, they would never have been able to invent Wave Mechanics.
I guess my question is: am I being overly pessimistic? What advice would you give to young scientists concerned about the pressure to publish? Is it really a zero-sum game? I feel that a lot of young people have these concerns but do not articulate them.
On a blog discussion here some months ago, you mentioned you were going to discuss what engineers need from freshman physics. Did you have any further thoughts on that?
Thanks for the responses so far. Plenty of food for thought here.
Cesar, you're right that in some ways it's a miracle that the atomic orbital structure we learn about when solving the hydrogen atom for one electron is applicable for, e.g., uranium, with 92 electrons in an atomic-scale volume. It's the same main reason why the free electron picture works as well as it does for metals like aluminum and even gold. The Pauli exclusion principle leads to two consequences here. First, the kinetic energy tends to grow faster with increasing electron density than the interelectron interactions. Second, the requirement that the many-electron wavefunction be antisymmetric under exchange of any two electrons tends to keep the electrons comparatively far apart. It is impressive, though, isn't it!
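The density scaling behind that first point can be sketched with free-electron-gas numbers. This is my own back-of-the-envelope estimate, not part of the original comment: the kinetic energy per electron (set by the Fermi energy) grows like n^(2/3), while the typical interelectron Coulomb energy grows only like n^(1/3), so their ratio rises as the gas gets denser.

```python
import math

# Back-of-the-envelope scaling (my own numbers, not part of the original
# comment): for a free-electron gas the kinetic energy per electron scales as
# n^(2/3) via the Fermi energy, while the typical Coulomb energy per electron,
# e^2/(4 pi eps0 r_s), scales as n^(1/3). Their ratio therefore *grows* with
# density - one way to see why dense electron systems look surprisingly free.
HBAR = 1.054571817e-34      # J s
M_E = 9.1093837015e-31      # electron mass, kg
E_CHARGE = 1.602176634e-19  # C
EPS0 = 8.8541878128e-12     # F/m

def fermi_energy(n):
    """Fermi energy (J) of a free-electron gas at number density n (m^-3)."""
    return HBAR**2 * (3 * math.pi**2 * n) ** (2 / 3) / (2 * M_E)

def coulomb_scale(n):
    """Typical interelectron Coulomb energy (J) at the mean spacing r_s."""
    r_s = (3 / (4 * math.pi * n)) ** (1 / 3)
    return E_CHARGE**2 / (4 * math.pi * EPS0 * r_s)

for n in (1e27, 1e29, 1e31):  # spanning roughly metallic densities
    print(n, fermi_energy(n) / coulomb_scale(n))
```

For aluminum-like densities (n of order 1e29 per cubic meter) the two scales are comparable, around 10 eV each, but the trend is the point: pushing the density up makes the kinetic term win.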
Doru, I actually haven't published much in open access journals, but I try to put as much of my work as possible up on the arxiv. Open access is a great idea, but the current business model in many open access journals is a bit broken. I can't afford an extra several thousand dollars in publication costs per paper for the sake of open access. At the same time, putting a paper up on the arxiv costs me basically nothing and makes the work accessible to anyone with an internet connection.
Anon@3:25, you're not alone in having these concerns. I do think that most good people are aware of the timescale for publication, for example - that your big work from your thesis may not be all the way into print when you're giving talks about it to land a postdoc. Regarding the risk vs reward aspects of scientific publishing these days, there is cause for concern, but the situation isn't as bad as, e.g., Anderson outlines, in my view. It is true that assistant professors in the US are under many pressures and have lots of demands on their time, compared to, e.g., Heisenberg at his academic start. It's a fair question to ask whether that environment is the best for iconoclastic geniuses. It is also true that in my view there is too much emphasis (particularly at the hiring stage, but also at the promotion stage) on the idea that Science and Nature papers are the end-all and be-all. It is not a zero-sum game, though, in the sense that someone else's success does not take away success from you. (Well, I suppose there is a little of that, in the sense that there are a fixed number of, e.g., Nature covers per year, but that's the exception rather than the rule.) The adage is, professors are hired on promise and tenured on accomplishment. While publications in good journals and citations are not perfect metrics, they're better than nothing. It would be extremely risky for a junior faculty member to spend all of their time pursuing a "homerun" project that would not result in publications for the first five years. (That would also not be good for the students working on it, by the way.) Part of being successful is having a portfolio of projects that produce some results along the way. Remember, even Andrew Wiles waited until he had tenure before he spent several years chasing Fermat. (I also think Anderson is exaggerating for effect. Heisenberg and Pauli were recognized early as off-scale genius prodigies. They would've done fine.) 
I am happy to talk more about this, by the way.
Anzel, I really should write a full post about this. In short, engineers should come away learning quantitative reasoning and the crucial ideas of approximations (how to make them, and when do the ones underlying their favorite formulas break down) and limiting cases. The stereotype of engineer undergrads as people who just want to memorize the right formula is not the way to end up with good engineers.
Anon@10:43, that's tricky. I've certainly had ideas that didn't pan out. We had to yank back the last paper I'd worked on as a grad student after it had been accepted (but prior to publication) in PRL because we realized (thanks to follow-up experiments by my successor in the lab) that our assumptions about one tricky piece of thermometry were breaking down just where things seemed to be getting interesting. On the success front, I'm very pleased with how the simultaneous Raman and electronic transport measurements have worked out, particularly as it has become clearer recently that in many ways we got lucky (with our particular choice of device shape, for example).
Anon@5:24, I've posted about this a couple of times here and here. Short version: good marketing, not as fundamental as the inductor or the capacitor, and the colloquial definition has broadened now to include all history-dependent electronic elements.
"I did make it to John Martinis' talk about whether materials are good enough to build a superconducting quantum computer. It sounds like there is cause for cautious optimism, but wow is it going to be a difficult engineering task"
I am wondering about your perspective on the recent purchase of a D-wave "quantum computer" by Google (and Lockheed).
Last I heard - this was a few years ago - these D-wave machines were a bit of a joke. As far as I know, the company has never proven that they work as true quantum computers.
So I was really surprised to see that well-respected companies actually spent hard-earned cash on the equipment. For Lockheed Martin, one plausible explanation is that they are really interested in the early patents behind these systems.
But do you really believe that D-wave is able to solve NP-complete optimization problems better than classical computers?
I am concerned about the invasion of string theorists into the field of condensed matter. Can we expect anything positive from this development?
More recently, close connections between condensed matter problems and string theory have emerged. The AdS/CFT correspondence, discovered by string theorists, connects a string theory and gravity to a quantum field theory without gravity in one lower dimension. This correspondence has been applied to quantum condensed matter systems, revealing new insights into quantum phase transitions (transitions at zero temperature) and strongly correlated electron states.
Anon@2:48 - From what I remember, there are serious engineering issues with device uniformity, control of oxide properties, and even things like optimized layout to minimize the numbers of wirebonds. Martinis et al. want to implement an error correction scheme based on surface codes, which, like most such approaches, requires more physical qubits than logical qubits. This makes the layout particularly challenging.
Unknown, the ordinary Seebeck effect is the creation of a voltage (through the buildup of a charge imbalance) due to a temperature difference. The spin Seebeck effect is, I believe, the buildup of a net spin imbalance across some structure due to a temperature difference across the structure. The energy dependence of the density of states (the breaking of particle-hole symmetry) is important in the Seebeck coefficient. Naively (meaning I haven't really looked at this), I'd guess that the spin splitting of the band structure in a ferromagnet could lead to a different Seebeck coefficient for the different spin species. Likewise, the presence of strong spin-orbit coupling should matter. Sorry - I haven't had time to read up on this.
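To make that guess concrete, here is a toy estimate of my own (not from the post) using the Mott formula in the degenerate limit, with an assumed power-law sigma(E) and an assumed exchange splitting; every number below is illustrative, not a real material parameter.

```python
import math

# Toy estimate of spin-dependent Seebeck coefficients (my own sketch, not from
# the post). In the degenerate limit the Mott formula gives
#   S = -(pi^2 kB^2 T / 3e) * d ln(sigma(E))/dE   evaluated at E_F.
# Assume a power law sigma_s(E) ~ (E - edge_s)^p for each spin band, with the
# band edges offset by an exchange splitting; all numbers are illustrative.
KB = 1.380649e-23           # Boltzmann constant, J/K
E_CHARGE = 1.602176634e-19  # elementary charge, C

def mott_seebeck(temperature, dln_sigma_dE):
    """Seebeck coefficient (V/K) from the logarithmic derivative of sigma (1/J)."""
    return -(math.pi**2 * KB**2 * temperature / (3 * E_CHARGE)) * dln_sigma_dE

p = 1.5                                   # assumed power-law exponent
E_F = 5.0 * E_CHARGE                      # Fermi energy, 5 eV (illustrative)
edge_up, edge_down = 0.0, 1.0 * E_CHARGE  # assumed 1 eV exchange splitting
T = 300.0                                 # K

# d ln(sigma)/dE = p / (E_F - edge), so the spin with the closer band edge
# has the larger logarithmic derivative and hence the larger |S|.
S_up = mott_seebeck(T, p / (E_F - edge_up))
S_down = mott_seebeck(T, p / (E_F - edge_down))
print(S_up, S_down)
```

With these made-up numbers the two channels come out a microvolt or so per kelvin apart, which is the qualitative point: a spin-split band structure hands each spin species its own effective Seebeck coefficient.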
Anon@6:35, while I don't think of multiscale modeling as a discipline unto itself, it's a very important general problem. For example, right now there is no nice way even to do something relatively straightforward to state, like solve Maxwell's equations efficiently on length scales spanning from deep sub-wavelength to several wavelengths. That's even a simple case, solving one kind of problem across a large range of scales. Once you have to do, e.g., optical properties + electronic and heat transport across multiple intrinsic length scales, then things are very difficult. Describing that as ad hoc linking of models is rather pejorative. The people doing this right are not making ad hoc assumptions, but well-grounded ones.
Anon@6:45, if string-based math (and we can decide whether things like conformal field theory results are automatically stringy) lets people map hard strong-coupling (CM) problems onto simple weak-coupling (gravitational) analogs, then that's a good thing. However, it would be great if Subir Sachdev and the other practitioners of this would write a readable-for-non-field-theorists discussion of how these mappings work. Maybe such an article exists, but I haven't found it.
Would you care to speculate on the long-term future of computing technology over the next century? What types of new devices might replace/augment the traditional transistor? For example, do you think photonic logic might be used at the nanoscale? Surface plasmons? Individual spins/molecules/atoms?
Ray Kurzweil has conjectured that the full onset of the Technological Singularity will occur by the year 2045 C.E. What probability would you assign to Kurzweil's conjecture being correct? By the end of the year 2018 C.E., how many states in the U.S.A. do you think will have licensed Google's driverless automobiles?
Grumpy, my best guess is, for mainstream computing applications over the next couple of decades, silicon all the way down (to a few nm critical feature size), followed by 3d integration and architecture advances to allow cooling in such geometries. Longer term, probably some kind of quantum computation in niche applications. A rise in optical interconnect technology, but right now I don't see nanoscale photonic logic happening any time soon.
David Brown, while I respect Kurzweil's contributions in computing (e.g., optical character recognition), on this topic frankly I think he's a bit around the bend. I'm a bit of a sci-fi fan, but the Singularity as envisioned by some just isn't something I worry about. I think the probability that the Singularity (I'd be curious for your precise definition - strong AI, mind uploading, ubiquitous molecular nanofab or at least 3d printing, a post-scarcity society?) happens for a significant part of the world's population by 2045 is very, very small. Less than 1%. Regarding driverless cars, at most one state (California), though I think you are going to see some serious irrationality about risk from the general public on this one. Wait until the first accident where a driverless car has a failure or design glitch and kills a pedestrian. It won't matter what the ordinary per-mile-driven accident rates are - there will be big-dollar lawsuits and a backlash. I do think driverless cars will happen eventually, but five years is way too fast.
There are at least five linear transport laws, named after Ohm, Fick, Fourier, Darcy, and Newton. Newton’s Law is not F = ma, it is the transport of momentum across a velocity gradient. The laws have much in common. All of them go nonlinear if you try to go too fast. At low temperatures, Ohm’s Law and Darcy’s Law become supertransporters: something moves without a potential gradient.
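For concreteness, the shared linear-response form of those five laws (these are the standard textbook statements, not from the comment above) can be written as:

```latex
\begin{align*}
\text{Ohm:} & \quad \mathbf{J} = -\sigma \nabla V \\
\text{Fick:} & \quad \mathbf{J}_c = -D \nabla c \\
\text{Fourier:} & \quad \mathbf{q} = -k \nabla T \\
\text{Darcy:} & \quad \mathbf{q} = -\frac{\kappa}{\mu} \nabla P \\
\text{Newton (viscosity):} & \quad \tau = \mu \frac{\partial u}{\partial y}
\end{align*}
```

In each case a flux is proportional to a gradient through a material-dependent linear coefficient; "going nonlinear if you try to go too fast" means driving the system hard enough that higher-order terms in the gradient start to matter.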
In the old Engineering Library at Princeton, I read a speculation that cooled materials become supertransporters of Newtonian momentum, and for some materials super transport extends above room temperature. Consider the legs of your desk; they shrinketh not but they transporteth momentum. Gravity distorts telescope lenses less than a quarter wavelength of light during 100 years.
Later, I went back to the E-Quad library and couldn’t find the original reference.
The second edition of Pitzer and Brewer’s rewrite of Lewis and Randall’s Thermodynamics lists on page 513 eight pairs of intensive and extensive properties. For each pair, the product of intensive times extensive has the units of energy. (Their list can be extended; how about E = mc^2?)
However, right after their list, Pitzer and Brewer had to punt. They don’t list magnetic properties; the intensive-extensive pair does not have the units of energy. The intensive SI Tesla seems OK. But the conventional extensive side of the pair has units of ampere meter. Their product is not energy. On the other hand, magnetic moment has the units of ampere (turns) times the area of the coil. That’s much more intuitive than ampere-meter and the product has the units of energy.
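The units bookkeeping in that comment can be mechanized. The little check below is my own (units represented as exponent tuples over kg, m, s, A); it confirms that tesla times ampere-meter comes out as a force, while tesla times ampere-meter-squared (a magnetic moment, current times area) comes out as an energy.

```python
# Represent an SI unit as its exponents of (kg, m, s, A) and multiply units
# by adding exponents. My own check of the intensive-extensive pairing
# discussed above, not a calculation from Pitzer and Brewer.
def mul(a, b):
    return tuple(x + y for x, y in zip(a, b))

JOULE  = (1, 2, -2, 0)    # kg m^2 s^-2
PASCAL = (1, -1, -2, 0)   # kg m^-1 s^-2
M3     = (0, 3, 0, 0)     # m^3
TESLA  = (1, 0, -2, -1)   # kg s^-2 A^-1
AMP_M  = (0, 1, 0, 1)     # ampere meter
AMP_M2 = (0, 2, 0, 1)     # ampere meter^2: magnetic moment, current x area

print(mul(PASCAL, M3) == JOULE)     # True: P-V is a proper conjugate pair
print(mul(TESLA, AMP_M) == JOULE)   # False: this product is a force (newtons)
print(mul(TESLA, AMP_M2) == JOULE)  # True: B times magnetic moment is energy
```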
Would “Killer E&M” need to have its textbook rewritten?
Wasn’t Ken Pitzer the President of Rice? BEE & AITCH
The mineral pyrite, fool’s gold, has the formula FeS2. Pyrite recently turned up in primary lithium batteries. What are the valences of iron and sulfur in pyrite?
With some pain, in high-school chemistry we all learned to balance oxidation-reduction equations. Up for discussion is the rumor that anyone with a program that solves simultaneous linear equations can balance a redox reaction. It’s built into the TI-84 pocket calculator.
For an example, try balancing the equation for a mixture of ammonium nitrate and nitromethane (CH3NO2). You want to end up with only nitrogen gas, water vapor, and carbon dioxide. It’s not very nice, but that was the recipe for the Oklahoma City bomb.
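The linear-algebra trick is easy to sketch. Here is a minimal example of my own (not the TI-84 routine): write one balance equation per element, with reactant coefficients positive and product coefficients negative, and a balanced reaction is a null-space vector of the resulting matrix. For the ammonium nitrate / nitromethane reaction above:

```python
from fractions import Fraction
from math import gcd, lcm

# Element-balance matrix for  a NH4NO3 + b CH3NO2 -> c N2 + d H2O + e CO2.
# Rows are elements, columns are species; products enter with a minus sign,
# so a balanced reaction is a null-space vector of this matrix.
#          NH4NO3  CH3NO2   N2   H2O  CO2
ELEMENTS = [
    [2,      1,     -2,    0,   0],   # N
    [4,      3,      0,   -2,   0],   # H
    [3,      2,      0,   -1,  -2],   # O
    [0,      1,      0,    0,  -1],   # C
]

def balance(matrix):
    """Smallest positive-integer null-space vector, via exact Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in matrix]
    n_rows, n_cols = len(m), len(m[0])
    pivots, r = [], 0
    for c in range(n_cols):
        piv = next((i for i in range(r, n_rows) if m[i][c] != 0), None)
        if piv is None:
            continue  # free column
        m[r], m[piv] = m[piv], m[r]
        inv = m[r][c]
        m[r] = [x / inv for x in m[r]]
        for i in range(n_rows):
            if i != r and m[i][c] != 0:
                f = m[i][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    free = next(c for c in range(n_cols) if c not in pivots)
    sol = [Fraction(0)] * n_cols
    sol[free] = Fraction(1)           # set the free coefficient, back-read the rest
    for row, pc in zip(m, pivots):
        sol[pc] = -row[free]
    scale = lcm(*(f.denominator for f in sol))  # clear denominators...
    ints = [int(f * scale) for f in sol]
    g = gcd(*ints)                              # ...and reduce to lowest terms
    return [i // g for i in ints]

print(balance(ELEMENTS))  # -> [3, 2, 4, 9, 2]
```

That is, 3 NH4NO3 + 2 CH3NO2 -> 4 N2 + 9 H2O + 2 CO2 (needs Python 3.9+ for the multi-argument gcd/lcm).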
@Doug replying to Grumpy & David Brown: I hope Kurzweil is wrong, but I am not sure his prediction is nothing to worry about. My definition of the full onset of the Singularity is when (and if) the net AI IQ exceeds the sum of all the human IQs by a factor of at least one billion. The human brain might have about 100 billion neurons and 100 trillion synapses. The nervous system of the nematode C. elegans might have about 300 neurons. Thus my definition is roughly when (and if) the net AI IQ makes the net human IQ seem like a nematode level. (Bad news if it happens.)
http://www.singularityweblog.com/17-definitions-of-the-technological-singularity/
http://en.wikipedia.org/wiki/Google_driverless_car
http://www.ted.com/talks/sebastian_thrun_google_s_driverless_car.html
According to Wikipedia, Nevada, Florida, and California have already licensed the testing of Google's driverless automobiles on public streets. One such car has driven across the Golden Gate Bridge in ordinary traffic.
Anon re: Dwave - sorry - somehow blogger ate my original response to you and I didn't notice. Short version: I basically agree w/ Scott Aaronson on most things. Dwave's approach to publicity doesn't sit terribly well with me, though I understand it. Just because Google and LMCO spend money on something doesn't automatically make it correct, and both companies are willing to look at long-shot investment as a form of insurance.
Anon@4:25pm - There is a natural symbiosis between theorists and experimentalists. The best experimentalists understand a lot of theory, and know when to worry about quantitative precision vs. qualitative scaling; the best theorists understand quite a bit about relevant experiments.
Anon 8:57-9:13: 1) No, I don't think there's a deep connection between the Gibbs phase rule and geometric solids, though I'm prepared to learn if there is something deep there. 2) Don't know what you're asking here. The "laws" you mention (except Newton) are all related to linear-order solutions to transport-like equations. I suppose you could say that a "rigid body" is a super-transporter of momentum, but I don't see how that's a useful way to think about things. 3) Magnetic units are awful, but if you properly treat magnetic work self-consistently you can find appropriate thermodynamic relations. See here: http://cdn.intechopen.com/pdfs/33437/InTech-Thermodynamics_of_electric_and_magnetic_systems.pdf Yes, Pitzer was president at Rice, before my time. 4) Wikipedia is your friend: http://en.wikipedia.org/wiki/Pyrite 5) Again, not sure what you're asking, though this comment sounds like Uncle Al a bit.
Anon 9:01 - Be careful. Science is pretty sloppy with the word "Law". When I teach freshman physics, I call Ohm's Law "Ohm's observation that was approximately true for materials of interest in 1827". Ohm also has a law of acoustics that is widely held to be untrue.
Doug- Thanks for your response to my question about D-wave. Knowing how fragile these nanoscale circuits can be, I wonder what will happen when one of the Josephson junctions of their fancy computer fails, and the $10M machine stops "annealing". I hope they have a good warranty!
Could you please comment on those recent experiments on MoS2 films? I read that MoS2 may be better than graphene for electronics applications because it has a large bandgap, while graphene does not.
Does MoS2 have drawbacks that might make its large-scale use problematic? Do you think one could achieve current densities in planar MoS2 that could compete with Si FinFETs?
And, lastly, is it an interesting material from the physicist's point of view?
Anon@4:40 - Perhaps that's a better question to ask a theorist. I think there are plenty of problems out there (e.g., general mechanism of high-Tc; computationally better ways to deal w/ exchange/correlations in DFT; strong coupling problems; quantum systems out of equilibrium).
Anon@12:58, I don't know of a book that really takes the experimental perspective like that. There are a couple of very nice books about low temperature experiments (Pobell; Richardson and Smith), but if there's a meso/nano one, I don't know it. Just yesterday a colleague was bemoaning even the lack of a really definitive work about the experimental details of scanning tunneling spectroscopy.
Curious, MoS2 and other related dichalcogenides are very interesting. Spin orbit physics, correlation effects like charge density waves, the ability to intercalate things between the layers - all of these make these materials exciting. As for large scale use, I've said before there is an enormous economic inertia behind silicon. Given that growth of high quality, large-area MoS2 and the technology for making complementary devices on it are both rather immature, it seems unlikely that it will be a major player in conventional electronics applications for a long, long time. Still, it's early days yet.
Anon@3:24, well, the last revisions to Landau and Lifshitz are about fifty years in the past as far as I know. That means that it omits many things that have become very useful in modern experimental biology. That being said, LL is a tremendous set of books, and if you've really mastered it, you should be in great shape for a quantitative approach to many disciplines.
I noticed you've got a few publications in open journals. What do you think about open access publishing?
Doru Constantin
http://blitiri.blogspot.com/
What would you say has been the greatest moment in your scientific career so far and what was your worst/biggest failure?
What’s up with the story of the “memristor” – the so-called fourth fundamental circuit element?
Regarding your March Meeting wrap-up post (http://nanoscale.blogspot.ca/2013/03/aps-march-meeting-day-4-and-wrapup-for.html) and the Martinis talk on superconducting quantum computing: what are the specific challenges?
What is the spin Seebeck effect?
Is so-called multiscale modelling a credible field of study at this time, or is it simply an ad hoc coupling of phenomenological models?
Here is a review article that Subir Sachdev wrote last year:
http://arxiv.org/abs/1108.1197
It's got some specialist stuff in there, but I found it to be the most helpful 'general' intro to AdS/CFT for condensed matter.
Would you care to speculate on the long-term future of computing technology over the next century? What types of new devices might replace/augment the traditional transistor? For example, do you think photonic logic might be used at the nanoscale? Surface plasmons? Individual spins/molecules/atoms?
Ray Kurzweil has conjectured that the full onset of the Technological Singularity will occur by the year 2045 C.E. What probability would you assign to Kurzweil's conjecture being correct? By the end of the year 2018 C.E., how many states in the U.S.A. do you think will have licensed Google's driverless automobiles?
Grumpy, my best guess is, for mainstream computing applications over the next couple of decades, silicon all the way down (to a few nm critical feature size), followed by 3d integration and architecture advances to allow cooling in such geometries. Longer term, probably some kind of quantum computation in niche applications. A rise in optical interconnect technology, but right now I don't see nanoscale photonic logic happening any time soon.
David Brown, while I respect Kurzweil's contributions in computing (e.g., optical character recognition), on this topic frankly I think he's a bit around the bend. I'm a bit of a sci-fi fan, but the Singularity as envisioned by some just isn't something I worry about. I think the probability that the Singularity (I'd be curious for your precise definition - strong AI, mind uploading, ubiquitous molecular nanofab or at least 3d printing, a post-scarcity society?) happens for a significant part of the world's population by 2045 is very very small. Less than 1%. Regarding driverless cars, at most one state (California), though I think you are going to see some serious irrationality about risk from the general public on this one. Wait until the first accident where a driverless car has a failure or design glitch and kills a pedestrian. It won't matter what the ordinary per-mile-driven accident rates are - there will be big-dollar lawsuits and a backlash. I do think driverless cars will happen eventually, but five years is way too fast.
I would like your opinion on the relationship between theorists and experimentalists in condensed matter.
It seems there exists a substantial divide.
http://condensedconcepts.blogspot.ca/2013/05/listen-to-experimentalists-sometimes.html#comment-form
F + P = C + 2 Gibbs
F + V = E + 2 Euler
Are these two equations the same thing? Are Gibbs' degrees of Freedom, number of Phases, and Components related to Euler's Faces, Vertices, and Edges?
Love,
Deeth Keynes-Neff
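Purely as arithmetic, the Gibbs phase rule side of the question is easy to spot-check. A minimal Python sketch (the function name is my own invention, just for illustration), using pure water as the worked example:

```python
def gibbs_degrees_of_freedom(components: int, phases: int) -> int:
    """Gibbs phase rule: F = C - P + 2 (the 2 counts temperature and pressure)."""
    return components - phases + 2

# Pure water (one component): a single liquid phase leaves two free knobs
# (T and P); liquid-vapor coexistence leaves one (the vapor-pressure curve);
# the triple point, with three phases, leaves none.
print(gibbs_degrees_of_freedom(1, 1))  # 2
print(gibbs_degrees_of_freedom(1, 2))  # 1
print(gibbs_degrees_of_freedom(1, 3))  # 0
```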
There are at least five linear transport laws, named after Ohm, Fick, Fourier, Darcy, and Newton. Newton’s Law is not F = ma, it is the transport of momentum across a velocity gradient. The laws have much in common. All of them go nonlinear if you try to go too fast. At low temperatures, Ohm’s Law and Darcy’s Law become supertransporters: something moves without a potential gradient.
In the old Engineering Library at Princeton, I read a speculation that cooled materials become supertransporters of Newtonian momentum, and for some materials super transport extends above room temperature. Consider the legs of your desk; they shrinketh not but they transporteth momentum. Gravity distorts telescope lenses less than a quarter wavelength of light during 100 years.
Later, I went back to the E-Quad library and couldn’t find the original reference.
Regards,
Que es, Dee
The second edition of Pitzer and Brewer's rewrite of Lewis and Randall's Thermodynamics lists on page 513 eight pairs of intensive and extensive properties. For each pair, the product of intensive times extensive has the units of energy. (Their list can be extended; how about E = mc^2?)
However, right after their list, Pitzer and Brewer had to punt. They don't list magnetic properties, because the product of the conventional intensive-extensive pair does not have the units of energy. The intensive SI Tesla seems OK. But the conventional extensive side of the pair has units of ampere meter. Their product is not energy. On the other hand, magnetic moment has the units of ampere (turns) times the area of the coil. That's much more intuitive than ampere-meter, and the product has the units of energy.
Would “Killer E&M” need to have its textbook rewritten?
Wasn’t Ken Pitzer the President of Rice?
BEE & AITCH
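The units bookkeeping in the comment above is easy to check mechanically. A minimal Python sketch, tracking SI base-unit exponents by hand (the dictionaries and the multiply helper are ad hoc, purely for illustration):

```python
# SI base-unit exponents (kg, m, s, A) for the quantities in question.
TESLA     = {'kg': 1, 's': -2, 'A': -1}   # T = kg / (s^2 * A)
MOMENT    = {'A': 1, 'm': 2}              # magnetic moment, A * m^2
AMP_METER = {'A': 1, 'm': 1}              # the "conventional" extensive unit
JOULE     = {'kg': 1, 'm': 2, 's': -2}

def multiply(u, v):
    """Add the exponents of two unit dicts, dropping bases that cancel."""
    out = {}
    for d in (u, v):
        for base, exp in d.items():
            out[base] = out.get(base, 0) + exp
    return {b: e for b, e in out.items() if e != 0}

print(multiply(TESLA, MOMENT) == JOULE)     # True: T * (A*m^2) = J
print(multiply(TESLA, AMP_METER) == JOULE)  # False: T * (A*m) = N, not J
```

This confirms the comment's complaint: tesla times magnetic moment (ampere times coil area) is an energy, while tesla times ampere-meter comes out as a force.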
Here’s one from condensed matter:
The mineral pyrite, fool's gold, has the formula FeS2. Pyrite recently turned up in primary lithium batteries. What are the valences of iron and sulfur in pyrite?
Enjoy,
Rox for Jacques
With some pain, in high-school chemistry we all learned to balance oxidation-reduction equations. Up for discussion is the rumor that anyone with a program that solves simultaneous linear equations can balance a redox reaction. It’s built into the TI-84 pocket calculator.
For an example, try balancing the equation for a mixture of ammonium nitrate and nitromethane (CH3NO2). You want to end up with only nitrogen gas, water vapor, and carbon dioxide. It's not very nice, but that was the recipe for the Oklahoma City bomb.
Sincerely,
Born in Oklahoma City
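The rumor holds up: conservation of each element gives one linear equation per element, and the integer coefficients fall out of any exact linear solver. A sketch in plain Python for the mixture above (the matrix layout and the balance helper are my own, for illustration):

```python
from fractions import Fraction
from math import lcm

# Columns: NH4NO3, CH3NO2, N2, H2O, CO2 (product columns negated).
# Rows: conservation of N, H, O, C.
A = [[2, 1, -2,  0,  0],
     [4, 3,  0, -2,  0],
     [3, 2,  0, -1, -2],
     [0, 1,  0,  0, -1]]

def balance(A):
    """Fix the last species' coefficient at 1, solve the remaining
    square system exactly, then rescale to the smallest integers."""
    n = len(A[0])
    # Move the last column to the right-hand side.
    M = [[Fraction(row[j]) for j in range(n - 1)] + [Fraction(-row[-1])]
         for row in A]
    for col in range(n - 1):                      # Gauss-Jordan elimination
        piv = next(r for r in range(col, len(M)) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        for r in range(len(M)):
            if r != col and M[r][col] != 0:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    coeffs = [M[i][-1] / M[i][i] for i in range(n - 1)] + [Fraction(1)]
    scale = lcm(*(c.denominator for c in coeffs))
    return [int(c * scale) for c in coeffs]

print(balance(A))  # [3, 2, 4, 9, 2]
```

That gives 3 NH4NO3 + 2 CH3NO2 -> 4 N2 + 9 H2O + 2 CO2, which indeed balances every element.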
Hi Doug,
I think I found the answer to my previous question about the D-wave news... Scott Aaronson has a very informative post about the "quantum annealer":
http://www.scottaaronson.com/blog/?p=1400
Vive l'internet!
@Doug replying to Grumpy & David Brown: I hope Kurzweil is wrong, but I am not sure his prediction is nothing to worry about. My definition of the full onset of the Singularity is when (and if) the net AI IQ exceeds the sum of all the human IQs by a factor of at least one billion. The human brain might have about 100 billion neurons and 100 trillion synapses. The nervous system of the nematode C. elegans might have about 300 neurons. Thus my definition is roughly when (and if) the net AI IQ makes the net human IQ seem like a nematode level. (Bad news if it happens)
http://www.singularityweblog.com/17-definitions-of-the-technological-singularity/
http://en.wikipedia.org/wiki/Google_driverless_car
http://www.ted.com/talks/sebastian_thrun_google_s_driverless_car.html According to Wikipedia, Nevada, Florida, and California have already licensed the testing of Google's driverless automobiles on public streets. One such car has driven across the Golden Gate Bridge in ordinary traffic.
Anon re: Dwave - sorry - somehow blogger ate my original response to you and I didn't notice. Short version: I basically agree w/ Scott Aaronson on most things. Dwave's approach to publicity doesn't sit terribly well with me, though I understand it. Just because Google and LMCO spend money on something doesn't automatically make it correct, and both companies are willing to look at long-shot investment as a form of insurance.
Anon@4:25pm - There is a natural symbiosis between theorists and experimentalists. The best experimentalists understand a lot of theory, and know when to worry about quantitative precision vs. qualitative scaling; the best theorists understand quite a bit about relevant experiments.
Anon 8:57-9:13:
1) No, I don't think there's a deep connection between the Gibbs phase rule and geometric solids, though I'm prepared to learn if there is something deep there.
2) Don't know what you're asking here. The "laws" you mention (except Newton) are all related to linear-order solutions to transport-like equations. I suppose you could say that a "rigid body" is a super-transporter of momentum, but I don't see how that's a useful way to think about things.
3) Magnetic units are awful, but if you properly treat magnetic work self-consistently you can find appropriate thermodynamic relations. See here: http://cdn.intechopen.com/pdfs/33437/InTech-Thermodynamics_of_electric_and_magnetic_systems.pdf
Yes, Pitzer was president at Rice, before my time.
4) Wikipedia is your friend: http://en.wikipedia.org/wiki/Pyrite
5) Again, not sure what you're asking, though this comment sounds like Uncle Al a bit.
Anon 9:01 - Be careful. Science is pretty sloppy with the word "Law". When I teach freshman physics, I call Ohm's Law "Ohm's observation that was approximately true for materials of interest in 1827". Ohm also has a law of acoustics that is widely held to be untrue.
Do you have a personal list of the top 10 outstanding theoretical problems to consider in condensed matter?
Are there any in-depth books on the subtle issues of experimentation at the microscopic and mesoscopic scales?
Doug- Thanks for your response to my question about D-wave. Knowing how fragile these nanoscale circuits can be, I wonder what will happen when one of the Josephson junctions of their fancy computer fails, and the $10M machine stops "annealing". I hope they have a good warranty!
ReplyDeleteDear Prof. Natelson,
Could you please comment on those recent experiments on MoS2 films? I read that MoS2 may be better than graphene for electronics applications because it has a large bandgap, while graphene does not.
Does MoS2 have drawbacks that might make its large-scale use problematic? Do you think one could achieve current densities in planar MoS2 that could compete with Si FinFETs?
And, lastly, is it an interesting material from the physicist's point of view?
Best regards,
Anon@4:40 - Perhaps that's a better question to ask a theorist. I think there are plenty of problems out there (e.g., general mechanism of high-Tc; computationally better ways to deal w/ exchange/correlations in DFT; strong coupling problems; quantum systems out of equilibrium).
Anon@12:58, I don't know of a book that really takes the experimental perspective like that. There are a couple of very nice books about low temperature experiments (Pobell; Richardson and Smith), but if there's a meso/nano one, I don't know it. Just yesterday a colleague of mine was bemoaning even the lack of a really definitive work about the experimental details of scanning tunneling spectroscopy.
Curious, MoS2 and other related dichalcogenides are very interesting. Spin orbit physics, correlation effects like charge density waves, the ability to intercalate things between the layers - all of these make these materials exciting. As for large scale use, I've said before there is an enormous economic inertia behind silicon. Given that growth of high quality, large-area MoS2 and the technology for making complementary devices on it are both rather immature, it seems unlikely that it will be a major player in conventional electronics applications for a long, long time. Still, it's early days yet.
Cynical question: let's say I'm an amateur who has studied all the volumes of the Course of Theoretical Physics:
http://en.wikipedia.org/wiki/Course_of_Theoretical_Physics
and wish to study cell biology. What am I actually missing, in regards to a proper grounding in physics, due to recent research?
Anon@3:24, well, the last revisions to Landau and Lifshitz are about fifty years in the past as far as I know. That means that it omits many things that have become very useful in modern experimental biology. That being said, LL is a tremendous set of books, and if you've really mastered it, you should be in great shape for a quantitative approach to many disciplines.
You have addressed molecular electronics so I will ask 'what is the state of superconductors?'
ReplyDeletehttp://en.wikipedia.org/wiki/Quantum_biology
The physics department should get funds for this?