There is no precise distinction between analog and digital computing, but, in general, digital computing deals with integers, binary sequences, and time that is idealized into discrete increments, while analog computing deals with real numbers and continuous variables, including time as it appears to exist in the real world. The past sixty years have brought such advances in digital computing that it may seem anachronistic to view analog computing as an important scientific concept, but, more than ever, it is.
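The contrast can be made concrete in a few lines. The sketch below (Python is my choice here; the function names, the toy dynamics dx/dt = -x, and the step sizes are all illustrative assumptions, not anything from the text) contrasts a continuous solution, standing in for the analog view, with a digital simulation that idealizes time into discrete increments. Shrinking the increment brings the digital answer closer to the analog one without ever making the two coincide.

```python
import math

# "Analog" view: x(t) = exp(-t) solves dx/dt = -x exactly, with time
# treated as a real-valued, continuously flowing quantity.
def analog_solution(t: float) -> float:
    return math.exp(-t)

# "Digital" view: the same dynamics idealized into fixed increments dt,
# advanced step by step in finite-precision arithmetic.
def digital_simulation(t_end: float, dt: float) -> float:
    x = 1.0
    steps = round(t_end / dt)   # time idealized into discrete increments
    for _ in range(steps):
        x += dt * (-x)          # forward-Euler update, one increment at a time
    return x

for dt in (0.1, 0.01, 0.001):
    approx = digital_simulation(1.0, dt)
    exact = analog_solution(1.0)
    print(f"dt={dt:<6} digital={approx:.6f} analog={exact:.6f} "
          f"error={abs(approx - exact):.2e}")
```

Running the sketch shows the error falling roughly in proportion to dt, which is the sense in which discrete time is an idealization: the digital machine approximates the continuum by taking ever-smaller steps through it.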