Arthur C. Clarke's most famous epigram is that "Any sufficiently advanced technology is indistinguishable from magic." A question that I've heard debated in recent years is, have we gone far enough down that road that it's adversely affecting the science and engineering education pipeline? There was a time when young people interested in technology could rip things apart and actually get a moderately good sense of how those gadgets worked. This learning-through-disassembly approach is still encouraged, but the scope is much more limited.
For example, when I was a kid (back in the dim mists of time known as the 1970s and early 80s), I ripped apart transistor radios and at least one old, busted TV. Inside the radios, I saw how the AM tuner worked by sliding a metal contact along a wire solenoid - I learned later that this was tuning an inductor-capacitor resonator, and that the then-mysterious diodes in there (the only parts on the circuit board with some kind of polarity stamped on them, aside from the electrolytic capacitors on the power supply side) were somehow important in getting the signal out. Inside the TV, I saw that there was a whopping big transformer, some electromagnets, and that the screen was actually the front face of a big (13 inch diagonal!) vacuum tube. My dad explained to me that the electromagnets helped raster an electron beam back and forth in there, which smacked into phosphors on the inside of the screen. Putting a big permanent magnet up against the front of the screen distorted the picture and warped the colors in a cool way that depended strongly on the distance between the magnet and the screen, and on the magnet's orientation, thanks to the magnet screwing with the electron beam's trajectory.
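For the curious, the physics behind that sliding contact is the standard LC resonance formula, f = 1/(2π√(LC)): moving the tap changes the coil's effective inductance, which sweeps the resonant frequency across the AM band. Here's a minimal sketch in Python with illustrative component values (a hypothetical fixed 100 pF capacitor, not the actual parts from any particular radio):

```python
import math

def resonant_freq_hz(L_henries, C_farads):
    """Resonant frequency of an ideal LC circuit: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henries * C_farads))

# Illustrative values only: a fixed 100 pF capacitor, with the coil's
# effective inductance varied by the position of the sliding contact.
C = 100e-12  # farads
for L_microhenries in (88, 250, 900):
    f = resonant_freq_hz(L_microhenries * 1e-6, C)
    print(f"L = {L_microhenries:>3} uH -> f = {f / 1e3:.0f} kHz")
```

With these made-up values, varying the inductance from roughly 90 µH to 900 µH sweeps the resonance from about 1700 kHz down to about 530 kHz - conveniently, the span of the AM broadcast band.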
Now, a kid opening up an iPod or little portable radio will find undocumented integrated circuits that do the digital tuning. Flat-screen LCD TVs are also much more black-box-like (though the light source is obvious), again containing lots of integrated circuits. Touch screens, the accelerometers that determine which way to orient the image on a cell phone's screen, the chip that actually takes the pictures in a cell phone camera - all of these seem almost magical, and they are either packaged monolithically (and inscrutably), or all the really cool bits are too small to see without a high-power optical microscope. Even automobiles are harder to figure out, with lots of sensors, solid-state electronics, and an architecture that often actively hampers investigation.
I fully realize that I'm verging on sounding like a grumpy old man with an onion on his belt (non-US readers: see transcript here). Still, the fact that understanding of everyday technology is becoming increasingly inaccessible, disconnected from common sense and daily experience, does seem like a cause for concern. Chemistry sets, electronics sets, Arduinos and Raspberry Pis - these are all ways to fight this trend, and their use should be encouraged!