In my last post I covered the concept of fabric computing and why it matters in the world of cloud computing. With a “fabric” approach to creating a cloud application, we include the virtual compute, storage and network components inside a fully software-based model of the service. This is distinctly different from a more traditional approach, where the various resources are added and configured one by one.
In response to a comment, I also suggested that this new approach could be compared to a modern espresso machine. Such a machine delivers a complete service (coffee!) – in an integrated fashion. No need to worry about the temperature of the water, grinding the beans, or any of the other steps and equipment required to make it happen.
In a cloud computing context, the fabric is integrated “out of the box,” like the espresso machine. It doesn’t require provisioning, managing, integrating and monitoring lots of VMs and appliances individually. Most cloud solutions are built by automating those individual steps, typically through scripting, but that approach has distinct drawbacks. I included a short demo of the fabric approach below, but to understand these drawbacks, consider another analogy:
When did you last see an accountant with a calculator? In fact, automated calculators never really took off. Spreadsheets (fabrics) are simply the better way, because they capture the whole model rather than automating individual calculation steps. Have a look at the demo below to see how the same applies to a fabric cloud application (including “save as,” creating multiple versions, and sending it to someone else to run).
Seeing is believing: the epiphany of a demo
When I personally saw my first demo of this concept in action, it reminded me of two earlier occasions where a demo later reshaped IT as I knew it. The first was after I installed Windows 1.0 (all 12 floppies). Sure, back then it was still monochrome, there were no applications, and the performance was not great, but it did make me think: “Boy, if they ever get this to work, it will really change how we use desktop computers.”
The second “epiphany” was my first experience with x86 virtualization. After having confiscated the biggest machine in the office with the most memory, and after quite some tinkering, I saw an actual x86 machine boot inside a window (of course this was not an actual machine – it was a virtual one). After it booted, it could not do much, and running two of them brought the whole machine to a grinding halt. Yet it did make me think: “Wow, if this ever scales, it can completely change how we handle our machines.” And (admittedly somewhat to my surprise), about a decade later, around 2009, x86 virtualization did actually start to scale, and it developed into the billion-dollar industry that is changing the way we manage our servers.
Just like Windows profoundly changed the way we use desktops and virtualization is changing the way we manage servers, this new software-based, virtual fabric approach will, in my view, change the way we manage data centers. Now, I was certainly not the first person to realize this. Nicholas Carr had already acknowledged the power of this approach in his book “The Big Switch.” In an interview with eCommerce Times, he subsequently said:
“In 3Tera’s AppLogic, you can see the broad potential of virtualization to reshape how corporate IT systems are built and managed…”
How is that so? Well, my marketing colleagues here created quite an entertaining video, but the original – now vintage – 5-minute demo that shows how to define an application as an integrated fabric is also still out there (and it shows the potential much better than my ramblings above).
CA 3Tera AppLogic software essentially enables you to do three things:
1) First, you set it up on commodity x86 servers, creating a single fabric for the storage, network and compute capabilities on those servers.
2) Then, using an integrated modeling tool, you take your application or service – including all its components, such as data, networking, load balancing, security, etc. – and create a 100% software-based model (using a Visio-like drawing tool – check out this InformationWeek demo from Cloud Connect to see this in action).
3) Next, you deploy the model onto the fabric, with the software allocating resources based on the model and providing automated scaling, metering and failover capabilities.
You can also move the service or application to another data center very simply, even to one in another country or at another provider. Or, you can copy it and provide the same service to another department or customer (nearly instantly).
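To make the idea of a 100% software-based model a bit more concrete, here is a minimal Python sketch of the workflow described above: declare the components of a service, wire them together, ask the model what resources it needs, and “save as” a copy for another customer. This is purely a hypothetical illustration of the modeling concept – it is not AppLogic’s actual API, and every class and method name here is invented for the example.

```python
import copy
from dataclasses import dataclass, field


@dataclass
class Component:
    """One building block of the service (hypothetical)."""
    name: str
    kind: str
    cpu: int
    ram_gb: int


@dataclass
class FabricModel:
    """A software-based model of a whole service (hypothetical)."""
    name: str
    components: list = field(default_factory=list)
    links: list = field(default_factory=list)

    def add(self, component):
        self.components.append(component)
        return component

    def connect(self, a, b):
        # Record the wiring between two components by name.
        self.links.append((a.name, b.name))

    def required_resources(self):
        # The fabric can compute what to allocate from the model itself.
        return {
            "cpu": sum(c.cpu for c in self.components),
            "ram_gb": sum(c.ram_gb for c in self.components),
        }

    def save_as(self, new_name):
        # "Save as": clone the entire service definition.
        clone = copy.deepcopy(self)
        clone.name = new_name
        return clone


# Step 2: model the application as components plus their wiring.
model = FabricModel("web-shop")
lb = model.add(Component("lb", "load_balancer", cpu=1, ram_gb=1))
web = model.add(Component("web", "app_server", cpu=2, ram_gb=4))
db = model.add(Component("db", "database", cpu=4, ram_gb=8))
model.connect(lb, web)
model.connect(web, db)

# Step 3: "deploying" means the fabric allocates what the model requires.
print(model.required_resources())  # {'cpu': 7, 'ram_gb': 13}

# Copy the whole service for another department or customer.
staging = model.save_as("web-shop-staging")
```

The point of the sketch is that the model, not a pile of scripts, is the unit you deploy, copy and move – which is exactly why “save as” and versioning become natural operations on a whole running service.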
For a long time, CA 3Tera AppLogic software was something of an industry insider secret. Several analysts and writers – like Nicholas Carr – were aware of it, discussed it and listed it in their publications. But today there are many case studies and real-life success stories of both small and large implementations out there. This may be a good time for you to have a closer look, if only as an interesting implementation of these new fabric computing trends and principles in action.