Sunday, May 4, 2008

Memristors: Rewriting Electronics Theory

[by Mr.Hengist]

Researchers at Hewlett-Packard’s Information and Quantum Systems Lab have built the memristor, the last of the four fundamental circuit elements to be physically realized, joining the resistor, the capacitor, and the inductor. First postulated by Leon Chua in 1971, it had until now existed only in theory. It’s big news in the electrical engineering community: it completes the set and, with development and refinement, should lead to significant new capabilities in electronics, not the least of which is a reduction in the leakage current that plagues modern computer chips.

Leakage current in chips is akin to the problem of a leaky aqueduct: the more it leaks, the less efficiently it moves water – and the soggier the ground around the leaks. Leakage current is why chips get hot when they run; in personal computers, modern CPUs leak so much electricity that they need, at a minimum, fan-assisted heatsinks to keep from self-destructing under load, and motherboards now use cross-connected heatpipes and heatsinks on support chips to keep them from burning themselves out. It’s a problem that has gotten bad and is getting worse: the main driver of increased chip speed for the last two decades has been feature shrink – smaller circuitry – which naturally makes the circuitry faster, but which also increases leakage current. The writing on the wall has been clear for years: if this problem is not mitigated, it will become the limiting factor on what has been the most effective means of making computers faster and cheaper.
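To put rough numbers on it (the split and the wattage below are illustrative assumptions, not measurements of any particular chip), a processor’s power draw has a dynamic part that does the actual switching and a static part that is pure leakage:

$$P_{\text{total}} = P_{\text{dynamic}} + P_{\text{leakage}}, \qquad P_{\text{dynamic}} \approx \alpha\, C\, V_{dd}^{2}\, f, \qquad P_{\text{leakage}} = V_{dd}\, I_{\text{leak}}$$

Here α is the switching activity, C the switched capacitance, V_dd the supply voltage, and f the clock frequency. Feature shrink lowers C per transistor and lets f climb, but the thinner gate oxides and lower threshold voltages that come with it drive I_leak up steeply, so on a chip drawing, say, 100 watts, several tens of watts can end up as leakage that does no computing at all and only makes heat.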

Memristors may or may not be a big part of the solution to the leakage current problem in semiconductors, but this discovery is not just adding another class of widget to the toolbox of electrical engineers: it’s going to rewrite their textbooks on electronics. As this article in EETimes explains:

The hold-up over the last 37 years, according to professor Chua, has been a misconception that has pervaded electronic circuit theory. That misconception is that the fundamental relationship in passive circuitry is between voltage and charge. What the researchers contend is that the fundamental relationship is actually between changes-in-voltage, or flux, and charge. Such is the insight that enabled HP to invent the memristor, according to Chua and Williams.
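To spell out what that means (a standard summary of Chua’s 1971 framework, not part of the EETimes quote): each of the four passive elements is defined by a relationship between two of the four basic circuit variables – charge q, current i, voltage v, and flux φ:

$$dv = R\,di \ \text{(resistor)}, \qquad dq = C\,dv \ \text{(capacitor)}, \qquad d\varphi = L\,di \ \text{(inductor)}, \qquad d\varphi = M\,dq \ \text{(memristor)}$$

The memristor supplies the one pairing that was missing, flux and charge. Since flux is the time integral of voltage and charge is the time integral of current, the memristance M acts like a resistance whose value depends on the history of the current that has passed through the device, which is exactly what gives it its memory.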


What astonishes me is that such changes are possible even today. Just as the discoveries of dark energy and dark matter turned our understanding of the composition of the universe upside-down, so too are there discoveries being made which overturn our understanding of basic principles in well-studied fields. How did we get so far while still not knowing this? The tragedy of scientific understanding is that it is usually wrong, but it is only through self-correction that we know it to be so, and in that self-correction we see the strength of science.

2 comments:

Noocyte said...

How very extraordinary! The potential resolution of the leakage current problem is itself Big News, if it pans out; oh, for a day when the whirring of fans does not disturb the quiet passages of music played on my system, and my laptop doesn't burn the top of my lap. The removal of a size barrier on electronic components also offers many tantalizing possibilities, as does the SF buff's dream of a computer that retains and processes and applies 'experience' on an ongoing basis (SKYNET fantasies notwithstanding). If nothing else, the chance to edit out the various rituals I perform to make use of the time waiting for my system to boot up is worth the price of admission!

Your comments about the scientific process are well-met. I always saw the intrinsic tentativeness of even cherished scientific models as a feature, not a bug. I find it tremendously exciting that so much of what we know could potentially be turned on its ear by subsequent discoveries (or recombinations of existing discoveries). It is so much more dynamic and organic a world-view than one which rests on a set of certitudes and fixed vantage points. This has always been one of the things which so chaps my posterior about those who can look at evolution and smugly toss out the barb that it's "Just A Theory." As though saying that gravitation is "just" a theory will suddenly make it possible to hoist a piano with a flick of the wrist.

For some time now I have subscribed to the theory (there's that word again) of evolutionary epistemology, the notion that ideas themselves compete in an evolutionary manner, based on their fitness with respect to other ideas and on their ability to correspond to observations and predictions of the natural world. That a particular meme should emerge which alters the topography of a host of other meme complexes, resulting in a whole new ecosystem of knowledge, is a tremendously hopeful development for me. There is always something new under the sun!

I am looking forward to the adventures of the memristor in the jungle of knowledge [circuits], with hopes that it will spawn all manner of interesting and useful progeny.

Mr.Hengist said...

TC> The removal of a size barrier on electronic components also offers many tantalizing possibilities [...]

I should clarify what I wrote about leakage current, and the waste heat it creates, being a limiting factor in the reduction of semiconductor feature size. In order to do a process shrink there are always known and unknown issues to overcome. The known ones have to do with problems like the necessity of using a shorter wavelength in the photolithographic process (because only shorter wavelengths can be focused to the higher resolutions necessary for smaller feature sizes), whereas an unknown problem might be something like the elimination of spurious artifacts which only become apparent when using that shorter wavelength at the higher resolution it enables (i.e., a problem that only manifests itself when pushing the envelope).
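To make the wavelength constraint concrete, the usual lithography rule of thumb (the particular numbers here are illustrative, not a description of anyone's actual process) is:

$$\text{minimum feature size} \approx k_1 \frac{\lambda}{\mathrm{NA}}$$

where λ is the wavelength of the light, NA is the numerical aperture of the lens, and k1 is a process-dependent factor, typically somewhere below 0.5. With the 193 nm light in common use today, getting features down into the few-tens-of-nanometers range already takes tricks like immersion lenses to raise NA, and each further shrink makes the optics problem harder.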

What Andy Grove was pointing out back in 2002 was that leakage current was going to be the showstopper for process shrinks before engineers could even get to solving the usual limitations. That is to say, engineers had been concentrating primarily on overcoming the technical barriers to smaller feature sizes while mitigating the increasingly deleterious effects of leakage current, but the known approaches would not keep leakage in check at the feature sizes anticipated in coming years, then projected to be about five to ten years away.

It's sort of like this: if I ask you, "How far can your car go on a tank of gas?" you would answer in terms of fuel: "About 500 miles." The analogous problem would be expressed as, "Your wheels will fall off before you get halfway there." The wheels, not the fuel, would be the limiting factor on how far you can go.

TC> [...] as does the SF buff's dream of a computer that retains and processes and applies 'experience' on an on-going basis (SKYNET fantasies notwithstanding).

Hit the brakes, Speed Racer, you've gone a bit too far. The retention of memory does not imply the above. The problem with current memory is that it requires constant refreshing; in effect, the memory chips must constantly "rewrite" their contents or those contents rapidly decay. Flash chips work differently and can retain their contents for up to a decade or so without having to be refreshed, but they have a variety of problems which make them unsuitable for use as the main memory of personal computers. The hope is that memristors will provide a means to design a memory chip that has the benefits of both worlds without their respective limitations.
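Here's a toy sketch of the volatile-versus-nonvolatile difference (the cell classes and the retention window are invented for illustration; this is not a model of real DRAM, flash, or memristor hardware):

class VolatileCell:
    """Holds a bit only if it is refreshed within its retention window."""
    def __init__(self, retention=64):
        self.retention = retention   # how long the charge lasts, in arbitrary time units
        self.value = None
        self.last_refresh = 0

    def write(self, value, now):
        self.value = value
        self.last_refresh = now

    def read(self, now):
        # If too much time has passed since the last write/refresh, the charge
        # has leaked away and the stored bit is gone.
        if self.value is not None and now - self.last_refresh <= self.retention:
            return self.value
        self.value = None
        return None


class NonVolatileCell:
    """Keeps its state indefinitely with no refresh, like an ideal memristor."""
    def __init__(self):
        self.value = None

    def write(self, value, now=None):
        self.value = value

    def read(self, now=None):
        return self.value


dram_like = VolatileCell()
memristor_like = NonVolatileCell()
dram_like.write(1, now=0)
memristor_like.write(1, now=0)
print(dram_like.read(now=1000))       # None: the bit decayed without a refresh
print(memristor_like.read(now=1000))  # 1: still there, as if the power had been off

The real engineering is in making a nonvolatile cell that is also as fast, dense, and durable as the volatile kind; that combination is what memristors are hoped to deliver.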

TC> If nothing else, the chance to edit out the various rituals I perform to make use of the time waiting for my system to boot up is worth the price of admission!

There we go! Back in the days of the original IBM PC, my father would start it up and then go make himself a cup of tea; by the time the tea was ready, the computer would be booted. Boot times are shorter now but still considerable; memory that retains its contents with the power off would enable an "instant-on" feature, so you could turn on the machine and have it ready to use immediately.

TC> Your comments about the scientific process are well-met. I always saw the intrinsic tentativeness of even cherished scientific models as a feature, not a bug. I find it tremendously exciting that so much of what we know could potentially be turned on its ear by subsequent discoveries (or recombinations of existing discoveries).

I agree that it's a feature, not a bug, but I'm not entirely comfortable when well-established scientific models are turned on their ear. While "Colonel Mustard in the Library with the Candlestick" may be good enough for a game of "Clue", it's another thing entirely when we're talking about basic laws of physics. Truth must win out in the end - that's not a concession, it's a requirement. Yet when basic laws are overturned, it strengthens my confidence in the process of science while undercutting my belief that we've got it right. What it always comes down to is that we hope we're closer, and that we have the argument and evidence to demonstrate that we're closer now than we were before. That's too fuzzy-wuzzy for my temperament, but it's the best we can do.

TC> For some time now I have subscribed to the theory (there's that word again) of evolutionary epistemology, the notion that ideas themselves compete in an evolutionary manner, based on their fitness with respect to other ideas and to their ability to correspond to observations and predictions of the natural world.

"Evolutionary Epistemology" - thanks! Now I have a term that describes my understanding of that aspect of the process of science. In my blogpost I was going to take a stab at describing the competition of ideas as being analogous to the process of natural selection, but I decided against it as it seemed to be a bit self-referential, i.e., using a scientific theory (natural selection) to describe the process of science itself seemed almost to beg the question. Of course, knowing me, I'll probably forget it by the end of the day.