Bounded, Unbounded...
Is infinity boundless? Well, by definition it is. The frankly inconceivable concept of endlessness is a thing. But is it one thing? Therein lies the rub - and a lot of mathematicians, logicians and philosophers. Two things, as per usual, that I've read in the last couple of days: one regarding infinity as considered by mathematicians, logicians and maybe philosophers; and a piece on the resurgence of analog [sic] computing as the way out of the log-jam of large-scale digital modelling and its ridiculous environmental impact.
Taking the second first: super-large-scale data modelling involves, obviously, very, very large data sets that have to reside, in their discretely packaged bits and bytes, within digital memory chips - lots and lots of them - which require constant power to maintain and access that data. At the highest levels of this kind of enterprise, that means eye-watering amounts of electricity, with a significant impact on our planet. These mega data sets are typically at the heart of digital currencies, such as Bitcoin, and of AI: artificial intelligence, which requires enormous matrices of real-world data from which to learn, all of which has to be stored and operated on.
The energy requirements are huge, as the process of digitizing amorphous analogue real-world data is truly Sisyphean: chopping up the continuous into discrete chunks to analyse takes time and energy, and it scales badly - geometrically badly. Re-enter The Dragon: analog(ue) computing. Once thought consigned to history forever by the ever-expanding Moore's Law world of the digital age, analog computing is once more finding its voice in surprising new ways. Analog computers measure stuff in an entirely continuous fashion, using voltages to model the, well, continuous processes of the real world, in real time.
They were traditionally built for a single dedicated modelling job, and they were inherently non-programmable: they modelled the problem they were designed for, and no other. But what they lacked in digital "precision" and adaptability - no rounding out into momentary, approximate, discrete values here - they made up for in speed and efficiency. They modelled the fuzzy analogue world in its own image, without fuss and with minimal effort in the form of energy expenditure. Now, if you could make them programmable, and therefore general-purpose computing machines, that would be a thing indeed.
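To get a feel for how such a machine works, here's a toy digital sketch of the classic analog-computer patch: two integrators wired in a feedback loop to solve x'' = -x, which produces a sine wave. On a real analog machine the state below would be capacitor voltages changing continuously; here we have to imitate that by stepping time in tiny discrete chunks - which is rather the point of the whole article. The function name and step size are my own assumptions, not anything from the piece I read.

```python
import math

def analog_sine(t_end=math.pi / 2, dt=1e-5):
    """Imitate two analog integrators in a loop solving x'' = -x."""
    x, v = 0.0, 1.0          # "voltages": position and velocity
    t = 0.0
    while t < t_end:
        v += -x * dt         # integrator 1: feeds -x back into velocity
        x += v * dt          # integrator 2: integrates velocity into x
        t += dt
    return x

print(round(analog_sine(), 3))   # close to sin(pi/2) = 1.0
```

Notice how much bookkeeping the digital imitation needs - tens of thousands of tiny steps - where the analog original just lets the voltages flow.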
Turns out, that little nut is being cracked as I write. Purpose-built hybrid chips that can be programmed to model all manner of continuous real-world simulations are in development right now. This is potentially huge in so many areas: climatology and meteorology are the first two that spring to mind, and from both ends of the use case - a deeper understanding of the environment and our impact on it, without huge climate-damaging data-centres to drive the research? No-brainer.
OK - the first, second. Infinity. It's a thing, right? Turns out that mathematicians have very different ideas from you and me on the subject. Infinity as a concept is enough in itself to turn most people to drink or psychedelics, but the math-heads have been chewing this one over for a very long time, and have come to some conclusions - or not, maybe - of their own on the subject of this inconceivable conceivable. There's not one infinity, but maybe an infinite number [is that in itself a logically inconsistent use of the word number? Mmmm...] of infinities. This all stems from some fairly basic concepts in set theory. I think.
At the heart of the thing is the difference between the set of natural numbers and the set of real numbers. First, the naturals - the counting numbers: 1, 2, 3, 4... etc., on and on and on to infinity: endless, innit? Just keep adding one, and you will keep on going forever. Then the fractions: the endless subdivisions we can chop the real world into: 1, 1.1, 1.11, 1.2... etc., on and on and on to infinity: endless, innit? Consider the fact that in between any two integers, say one and two, there is an infinite number of fractions of that interval, in a continuum of increasingly tiny divisions, each of which can be similarly subdivided.
So surely there are "more" fractions than whole numbers? Happily, though no less mind-bogglingly, no: the fractions can be matched up one-for-one with the counting numbers - list them cleverly enough and you never run out of positions - so the two sets are, mathematically, exactly the same size. The full set of real numbers, though, which sweeps in the never-ending, never-repeating decimals like pi and the square root of two, genuinely is a bigger infinity: Cantor showed that no list, however clever, can ever contain them all. As to explaining that properly, I'm way out of my depth already, so Google the source material if you're brave enough.
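The pairing that makes the fractions "the same size" as the counting numbers can actually be sketched in a few lines: walk the grid of numerator/denominator pairs diagonal by diagonal (Cantor's famous zig-zag), skipping duplicates like 2/4, and every positive fraction gets a position in one long list. A minimal sketch - the function name is my own:

```python
from fractions import Fraction

def count_the_fractions(n):
    """Return the first n positive rationals in diagonal (Cantor) order."""
    seen, out, diag = set(), [], 2
    while len(out) < n:
        for num in range(1, diag):       # one diagonal: num + den == diag
            q = Fraction(num, diag - num)
            if q not in seen:            # skip repeats like 2/4 (= 1/2)
                seen.add(q)
                out.append(q)
                if len(out) == n:
                    break
        diag += 1
    return out

print(count_the_fractions(7))
# [Fraction(1, 1), Fraction(1, 2), Fraction(2, 1), Fraction(1, 3),
#  Fraction(3, 1), Fraction(1, 4), Fraction(2, 3)]
```

Position in the list is the matching counting number, so no fraction is ever left out - which is the whole trick. No such listing exists for the reals, and that's precisely what makes their infinity bigger.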
In conclusion(?), though, I do feel there is a deep and significant connection between these two topics. Leibniz and Isaac Newton both hit on the idea of the calculus at the same moment in the history of human thought: the notion that you can model and approximate the continuously changing real world on paper. An infinite continuum of change expressed as concisely as possible, with the least effort and overhead. Analogue, anyone?