Sunday, December 15, 2013

Where's My Quantum Computer?

Actually, the title is a bit of a tease. I know exactly where they are. But people can't seem to get their head around the coding. And that's holding back demand.

Do you think transistors (and therefore computers) are Boolean devices? Would you be uncomfortable if I told you that's wrong, and easily proved? Which means they are not the logical, deterministic machines that always carry out the same series of instructions given the same program, like they told you. Sorry.

Sure, logically they do. But physically, things are more complicated. I'm sure you guessed that already. Present a transistor with a gate voltage that happens to sit at a critical 'in-between' level, and it will not switch into a state representing either zero or one, on or off, but instead goes gray.

The technical term is "metastable", because often it doesn't just sit maddeningly in-between - it balances there and generates white noise, blasting out randomness into the rest of the circuit.

This ambiguous value can propagate, if it lands in the metastable zone of the next transistor. And the next.
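
If you want to feel this rather than take my word for it, here's a toy sketch in Python. It models a latch as a ball balanced on a hill between two valleys; the gain and noise numbers are invented for illustration, so treat it as a cartoon, not SPICE:

    import random

    def settle(v, gain=5.0, noise=0.01, steps=200):
        """Let a latch resolve from an initial voltage v, normalised to 0..1."""
        for _ in range(steps):
            v += gain * (v - 0.5) * 0.01   # positive feedback pushes away from the midpoint
            v += random.gauss(0.0, noise)  # thermal noise from the atoms in the wires
            v = min(max(v, 0.0), 1.0)      # the supply rails clamp the swing
        return round(v)                    # what the next stage reads: 0 or 1

    print([settle(0.2) for _ in range(10)])  # a clean input: ten zeros, every time
    print([settle(0.5) for _ in range(10)])  # the critical voltage: ten coin flips

Start anywhere near a rail and the feedback wins. Start at the balance point and the noise decides.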

Why don't you experience this all the time? Because pretty much every aspect of digital logic design at the physical level is intended to hide it. Big fat specifications with timing diagrams which say you should never present such indeterminate voltages, and if you do then it's your own fault. We have bistable latches, Schmitt triggers, lockup protection, thin films designed to decrease metastability, and design features so core we take them for granted...

Such as the clock. The entire point of a central propagated clock (and all the resources it requires) is to create a moment where all the transistors shout "change places!" and move through their metastable zone before any measurements get made. It is why "clockless logic" has devolved to just redefining what "clock" means.
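
Here's a hypothetical sketch of that contract, with invented delays: a stage's output only means anything a fixed settle time after the clock edge, and sampling early reads the gray zone.

    import random

    SETTLE_TIME = 3  # worst-case propagation delay, in arbitrary ticks

    def read_stage(input_bit, ticks_after_edge):
        """What a logic stage's output looks like some ticks after its input changed."""
        if ticks_after_edge >= SETTLE_TIME:
            return input_bit          # settled: a clean Boolean at last
        return random.choice([0, 1])  # mid-transition: effectively noise

    print([read_stage(1, 3) for _ in range(8)])  # clocked correctly: all ones
    print([read_stage(1, 1) for _ in range(8)])  # overclocked: garbage

Overclocking is betting that ticks_after_edge is still big enough. Lose the bet and you get the second line.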

And if any of these elaborate protections guarding each and every Boolean 'bit' (actually made from rushing flows of billions of electrons sloshing from potential to potential through the bulk silicon) fails for just the tiniest nanosecond, your computer crashes.

That's why they crash. That's what a "hardware fault" is, and why they're so frustratingly random.

It is the underlying reality of your machine asserting itself, in contradiction of your Bool-ish fantasies. You can't get rid of "noise" entirely, because much of it comes from the atoms within the wires your computer is made from. Just ask Benoit Mandelbrot.

The fact we can construct near-perfect self-correcting boolean simulation machines out of piles of reheated sand is really nothing short of a technological miracle. And is taken entirely for granted.

Students of Object Oriented Programming are taught the tenets of the faith: "Encapsulation, Abstraction, Polymorphism". But they think these are virtues, rather than necessary evils. Abstraction in particular gets totally out of control, with over-generalized interfaces that map well to human concepts (as defined by committees) but bear no relation whatsoever to how a real machine would actually perform the task.

It's why "shader" programmers for game engines are a special breed. They have to smash classic linear algorithms into parallel pieces which will fit into the graphics card pipeline. Abstraction doesn't help one whit. It is the enemy of performance.
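
To make that concrete, here is the shape of the transformation as a Python sketch, with a one-dimensional blur standing in for a real shader (the example is mine, not from any particular engine):

    import numpy as np

    signal = np.random.rand(1024)

    # The classic linear version: one loop, visiting elements in order.
    def blur_sequential(x):
        out = x.copy()
        for i in range(1, len(x) - 1):
            out[i] = (x[i - 1] + x[i] + x[i + 1]) / 3.0
        return out

    # The "shader" version: every output element is an independent function
    # of its neighbourhood, so all 1024 of them can run at once.
    def blur_parallel(x):
        return (np.roll(x, 1) + x + np.roll(x, -1)) / 3.0

    # Same numbers (away from the edges), utterly different machines.
    print(np.allclose(blur_sequential(signal)[1:-1], blur_parallel(signal)[1:-1]))

Same arithmetic; but only the second version maps onto a thousand cores, and no abstract Image interface will find it for you.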

There's an equivalent in relational database design: "Fourth Normal Form". (Or even Fifth!) Students are taught how to normalize their database designs to make them more logical, and are graded on it. Then you get to work on real high-performance transactional systems and quickly rip all the DB designs back down to second (or first) normal form, because otherwise the system is too slow for words and the users get angry.
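
In Python rather than SQL, the trade looks roughly like this: a made-up order system, with dicts standing in for tables and lookups standing in for joins.

    # Normalized on paper: every fact lives in exactly one place...
    customers = {1: {"name": "Ada"}}
    products = {7: {"title": "Widget", "price": 9.99}}
    orders = {42: {"customer_id": 1, "product_id": 7}}

    def invoice_normalized(order_id):
        o = orders[order_id]  # three lookups: three joins, in SQL terms
        return customers[o["customer_id"]]["name"], products[o["product_id"]]["title"]

    # ...versus the denormalized row the transactional system actually ships:
    orders_flat = {42: {"customer_name": "Ada", "product_title": "Widget"}}

    def invoice_flat(order_id):
        o = orders_flat[order_id]  # one lookup, no joins
        return o["customer_name"], o["product_title"]

    print(invoice_normalized(42) == invoice_flat(42))  # same answer, different cost

Every fact in one place, or every read in one step: pick the one the users will forgive.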

If you are using abstraction to hide the details of a problem rather than reveal them, you are using it the wrong way around. Encapsulate the code, not the problem. You can't generalize from a sample of one.

This obsessive need to abstract away and deny the underlying machine is why we're very bad at quantum programming, which pretty much by definition is a sneaky way of arranging the dominoes of reality to fall in a certain way. And while reality is playing quantum dominoes, we keep designing programs as if the game is billiard balls.

And when you ask why, the answer is essentially "because it's easier for people to reason by analogy about billiards".

The assumption here is that the point of computer science is to create nice and easy structures for humans to comprehend.

Um... OK.

And that's why you can't have a quantum computer. Because the only metaphor or abstraction that has any value currently looks like this:

[image from the original post, not preserved]
Sure, we have names for some of these concepts: "superposition" and "entanglement" and so forth. But they have common characteristics and behaviours for which we have yet to find well-rounded words that everyone intrinsically understands. Unless you count "timey wimey".

So forget trying to understand Quantum Information Theory in terms of something else. There isn't anything else.

Just bang the wavefunctions together!
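
Here is what banging them together looks like, as a toy two-qubit state vector in Python with numpy. A sketch, not a quantum stack:

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard: manufactures superposition
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],                # entangles the two qubits
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1.0, 0, 0, 0])        # |00>
    state = CNOT @ (np.kron(H, I) @ state)  # Bell state: (|00> + |11>) / sqrt(2)

    # Measure both qubits, many times. Each shot is random, yet the two bits
    # always agree. There is no billiard-ball version of this story.
    shots = np.random.choice(4, size=10, p=state ** 2)
    print([format(int(s), "02b") for s in shots])  # only '00' and '11', ever

The Hadamard and the CNOT are not metaphors for anything. They are the thing itself.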

4 comments:

  1. I thought I was reading "annals from the GOTO programmer" there for a little while.

    But I want to say "abstraction" is measurement. We love lines. We divide and multiply them, and square them.

    Despite models of physical measures, the world I experience is far, far from the world that is represented by mathematical entities. And also far, far closer than I can appreciate when I think.

    Abstraction is not something which you can easily reduce away!

  2. Interjection! Adjective pronoun adverb verb noun past-plu-perfect subphrase. Imperative conjunction, participle verb personal pronoun!

  3. "The fact we can construct near-perfect self-correcting boolean simulation machines out of piles of reheated sand is really nothing short of a technological miracle. And is taken entirely for granted." - so very true.

    I know very little about computing and programming, Jeremy, but with the way you write about it I understand the concepts and issues in general terms. That's quite a gift you have!
