Infinite Computing: Bah, Humbug!

At Autodesk University, Autodesk CEO Carl Bass introduced the term “Infinite Computing” in an attempt to define Autodesk’s perspective on “the cloud” from a unique angle. I think the term is a brilliant and effective piece of terminology: it brings an otherwise nebulous concept into focus, and it radiates a sense of real and immediate purpose.

Infinite computing is not really infinite, of course, and it’s certainly not infinitely accessible. However, the metaphor is apt: like the physical universe, the virtual universe is essentially infinite as long as it keeps expanding. [I can’t resist having some fun and taking the analogy a little further: at some point, Moore’s law will encounter relativistic effects, and we’ll realize that every transistor warps the virtual space-time continuum in proportion to the square of its clock speed.]

So why am I bearish on the prospect of infinite computing?

Let’s say you buy a computer with multiple processors to run, say, AutoCAD. Two processors can produce a nice performance boost, because AutoCAD can utilize 100% of one processor while the operating system uses the other. But what happens if you quadruple your capacity to eight processors? Unless you’re running independent programs that can use the extra processors, they offer very little benefit and are essentially wasted.
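
To make the diminishing returns concrete, here is a minimal sketch of Amdahl’s law (my own illustration, not anything from Autodesk), assuming hypothetically that only half of the workload can run in parallel:

```python
def amdahl_speedup(parallel_fraction, processors):
    """Ideal speedup when only part of a workload can run in parallel (Amdahl's law)."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / processors)

# Hypothetical workload: 50% of it parallelizes, 50% is stuck on one processor.
for n in (1, 2, 4, 8):
    print(f"{n} processors: {amdahl_speedup(0.5, n):.2f}x speedup")
# Prints 1.00x, 1.33x, 1.60x, 1.78x -- quadrupling from two to eight
# processors buys barely a third more speed.
```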

The moral of the story is this: an infinite computer is ineffective and inefficient unless it has an infinite number of simultaneous tasks to perform. It costs computing power to manage parallel tasks, so the practical limitations of “infinite” computing make it unrealistic for all but highly specialized workloads. Even under a more accurate name like “massively parallel computing”, such a system is hardly “sustainable” (to use another modern term of art), due to its inherent inefficiencies.
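
As a back-of-the-envelope illustration of that management cost, here is a deliberately crude toy model (my own assumption of a fixed per-task overhead, not a measurement of any real system) showing that past some point, more parallelism makes the job slower:

```python
def speedup_with_overhead(parallel_fraction, tasks, overhead_per_task=0.02):
    """Amdahl-style speedup with a fixed coordination cost charged per parallel task."""
    serial_fraction = 1.0 - parallel_fraction
    runtime = serial_fraction + parallel_fraction / tasks + overhead_per_task * tasks
    return 1.0 / runtime

# Even a highly parallel (95%) workload peaks early; then the overhead dominates.
for n in (1, 8, 64, 512):
    print(f"{n} tasks: {speedup_with_overhead(0.95, n):.2f}x speedup")
```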

A compromise is necessary. There are new ways to look at old problems that enable a more parallel approach, and I have no doubt that many engineering problems can be restated in ways that make them amenable to parallel processing, but that’s hardly a revolutionary concept, and it certainly does not require an infinite computer for its implementation.
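
For what such a restatement can look like, here is a minimal, self-contained sketch of an embarrassingly parallel problem: a Monte Carlo estimate split into independent batches (a generic illustration in Python; nothing here is specific to AutoCAD or any Autodesk product):

```python
import random
from multiprocessing import Pool

def count_hits(samples):
    """Count random points in the unit square that land inside the quarter-circle."""
    hits = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

if __name__ == "__main__":
    batches, per_batch = 8, 250_000
    # Each batch is independent, so the work maps cleanly onto separate processors.
    with Pool() as pool:
        hits = sum(pool.map(count_hits, [per_batch] * batches))
    print("pi is approximately", 4.0 * hits / (batches * per_batch))
```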

In the final analysis, “the cloud” is going to be about individuals connecting to each other and to their data seamlessly and in a location-agnostic way, and the “infinite computer” will be what they use to do it. Nothing more, nothing less.

One thought on “Infinite Computing: Bah, Humbug!”

  1. Marketing varnish & wax.

    Try this:
    http://en.wikipedia.org/wiki/Supercomputer

    > The fastest cluster, Folding@home, reported 9.739 petaflops of processing power as of mid January 2011. Of this, 7.1 petaflops are contributed by clients running on various GPUs, 1.8 petaflops come from PlayStation 3 systems, and the rest from various computer systems.[11]
    >
    > Another distributed computing project is the BOINC platform, which hosts a number of distributed computing projects. As of April 2010, BOINC recorded a processing power of over 5 petaflops through over 580,000 active computers on the network.[12] The most active project (measured by computational power), MilkyWay@home, reports processing power of over 1.4 petaflops through over 30,000 active computers.[13]

    Software must always lag hardware, anyway, for obvious reasons.

    If you have a late-model Intel Core i7-2600, your software is not taking advantage of its new vector instructions (AVX), etc.

    But you do have the processing power of a 1980s supercomputer on your desktop, running at room temperature, consuming a few hundred watts, for less than $1000 (without a graphics processor).

    There is always spare capacity available, somewhere.

    None of which has anything to do with AutoCAD, or any Autodesk product.
