I've posted about this topic before, but Gordon Watts' recent post on the subject of long-term research makes me want to throw this out there again. That, and the disturbing news I heard at the APS March Meeting about a round of layoffs of some of the few remaining physical sciences researchers at Bell Labs. It's terribly depressing: since my time in high school, long-term industrial R&D has been gutted in this country (and in most of the world). "Long-term" now means two years. Companies are under so much pressure to have year-over-year quarterly revenue increases that they blanch at the idea of spending money on something risky that may not lead to a big revenue stream quickly. Maybe that's always been true to some extent, and places like Bell Labs and IBM Research (and RCA and GE Research and GM and Ford Scientific and Westinghouse Research) were all effectively accidental monopolies or near-monopolies when they had major research labs. It's demonstrably much worse now.
More distressing to me is the tacit assumption, mentioned by Gordon, that university research will somehow pick up the slack. That is, federal dollars are more appropriate for this kind of basic work, and companies can always fund university labs to do work for them, too. Anyone who knows how university research actually works can tell you many reasons why this is a bad idea. Apart from low-level practical considerations (publish vs. patent? foreign vs. domestic students? export controls?), the big killer here is simply one of resources. Back when I was at Bell, if they wanted to, they could have put a dozen condensed matter PhDs to work on a problem, along with technical support staff. Given how universities work, with teaching commitments, administrative tasks, student timescales, etc., no university can achieve that kind of critical mass.