How many CEOs does it take to screw in a lightbulb?

Is it just me or has the CEO title recently fallen prey to title inflation? More and more companies seem to find economies of scale in appointing CEOs by the dozen: one per division, one per continent or even one per country. It used to be pretty clear that if you had a meeting with the CEO, you spoke to the guy in charge (back then in most cases it actually was a guy, although recently – especially in tech – we have seen more and more gals in the top CEO position). The CEO was the one person where all the different reporting lines of modern matrix management came together and who could actually make decisions on their own (not saying the good ones did, but they could). Famous is the business card saying “I’m CEO, B….” featured in the movie The Social Network. Meetings with “the CEO” were different too. First of all, they generally tended to happen on the top floor (of the office, the hotel or whatever the location of the meeting was), and the road to the meeting room was lined with rather nervous staff (nowadays we would call it an entourage) ushering people in and out of the inner sanctum and giving pointers like “Best not to run over, as we have to get him or her to the (typically private) plane right after”.

The CEO title – at least in Europe – was also deemed appropriate only if you were at the head of a fairly large and preferably multinational organization. Sure, startups of just two people sometimes democratically divided all available CxO titles among themselves (“Hi, I am the CT/F/MO and this is Bart, who is the COO and CISO”). But since “founder” sounds so much cooler, we nowadays see less and less of that. Founder also has the distinct advantage that you get to keep the title as you grow (have a look at the movie ”Jobs” to get an idea of the process I mean). Soon the CEO title may go the way of the VP title, which was rapidly enhanced by adding terms like Senior, Corporate, Executive and of course Senior-Corporate-Executive VP. Let me know if you get the first card that says Sr. or Corporate CEO.

Now I am not implying or even assuming that adding multiple CEOs is just about cosmetics and egos. In many cases there is a need for units that are more agile, more aggressive and more focused than the typical large corporate multinational. And if you, as headquarters, actually manage the unit as if you were merely a shareholder – meaning you sit on the non-executive board of this CEO and decide on hiring, firing and paying him, but do not get involved in running the unit yourself in any way, shape or form – then fine. We used to have the term “general manager” to describe that role, but within the aforementioned modern matrix organization general managers often cannot even decide when to change the aforementioned lightbulb (as facility management does so on a global basis), or how to organize sales (as corporate product units and line-of-business leaders appoint product and sales leads into their unit).

Personally I am a big believer in organizing using cell structures. The late Eckart Wintzen – founder of Origin, later part of Atos Origin – wrote a great little book about this called “Eckart’s Notes” (in Dutch and strangely enough never translated, but for a summary in English see http://reinout.vanrees.org/weblog/2011/01/23/eckarts-notes.html). The cell approach – where you split cells once they get too big to be managed by one person – was pioneered earlier and is still in use today by other IT service companies. The general idea is that within a unit of between 50 and 100 people you don’t need an HR department (as the general manager knows each of his people – and their strengths and weaknesses), you don’t need facility management (as you can tell people to clean up their own mess) and you don’t need a purchasing department (as the GM does large purchases himself and leaves individual purchases to the (empowered) individuals). It’s a very entrepreneurial approach; some practitioners even incorporate each cell as a separate company, meaning that cells can go out of business if not managed well.

The drawback is that you miss out on the synergies of being big – on artificial, paper synergies like having the same type of coffee machine in all your 2,000 offices worldwide (sure, global facility management shows a saving of 12.5 million over 5 years, but that is 1 cent per cup or 0.0021% of your overall staff cost, so who cares!), but also on some real synergies like shared systems, a shared accounting function or shared computer capacity. But wait a minute! With cloud computing I can have those synergies (and more), without having to be organized like a 19th-century industrial estate.

Tell you what. Why don’t you have as many CEOs as you like (but do give them full P&L responsibility and operational, tactical and strategic authority for their cell, unit or division), but stick to one CCO (yes, one Chief Cloud Officer) who coordinates the internal cloud services (that your employees consume) and the external cloud services (that you provide to your customers). Something to ponder in 2014: is cloud really more about enabling cells to thrive and prosper than about centralizing everything into a large grey monotony?

Building (or buying) a better mousetrap

Most people by now agree that “Build a better mousetrap and the world will beat a path to your door” is not a recipe for commercial success in technology innovation. In many cases it is not the quality of the technology that determines the winner. It is about timing, branding and addressing the right problem with the right audience using a fairly adequate solution.

The history of IT is full of examples of technologies that were not necessarily superior, but that turned out to become winners. Who would have expected the fairly random and uncontrolled TCP/IP to win over Token Ring or other more robust technologies? Not to mention the classic battle of Windows versus OS/2. The question is whether the cloud race will run along significantly different paths.

All who lived through trying to implement serious enterprise and business solutions on top of these historic “winners” remember how hard this actually was. Not to say it was impossible, but it did require some serious high-wire acrobatics and advanced juggling. Think of the tools that companies had to deploy or develop to manage the infamous DLL hell, the advanced acrobatics needed to manage memory space, or the database tricks needed to live with page-level locking.

Luckily, winning is not the end state for technology innovations. It is merely the beginning of an ongoing race to become better, faster and more robust. But for customers, moving from 8 or 16 to 32 or 64 bit was far from a walk in the park. It required hard work and meant leaving some casualties behind (mainly in the form of applications not being able to make the transition).

The cloud race will likely be subtly – but not radically – different from these historic technology rides. Aspiring providers are frantically working on building better mousetraps, while established providers (but how established can one be in such a young and rapidly growing market?) are aggressively expanding or even reinventing their offerings.

Companies that tried to run or build enterprise solutions on Windows 3.1 now agree that in retrospect it was more bleeding than leading edge (although it did establish good starting competitive positions for some). In retrospect it was always hard to predict the inflection point: at exactly what point did the technology reach a level where it became feasible as a mass solution? Time will tell whether we will look back at today’s cloud efforts as brave (but a bit foolish) or as brilliant (and a major step forward).

Cloud, Are we there yet?

The Washington Post recently ran an article by Andrea Peterson on RIM (now BlackBerry), with a chart they called ”The decline of BlackBerry in one chart“. But more than the story of BlackBerry itself, the chart brought home for me the enormous dynamics of a relatively new industry.

As the chart showed, the four mobile vendors that together had about a hundred percent market share in 2005 barely managed to hold on to 20% by 2013. In only eight years they went from hero to zero, and were replaced by platforms that were introduced in 2007 (Apple) and 2009 (Android). I don’t cover mobile platforms, so I see this data mainly as a consumer, but it did make me wonder about the cloud market.

The mobile market in 2005 was by no stretch of the imagination a startup market. I was on my third cellphone, after having enjoyed a car-bound phone (car-bound because it took up about half the boot) for about four years. The vendors were established, and companies were handing out cellphones to most of their road warriors. Something that actually started in Europe – my US colleagues initially were juggling company-provided calling cards and dialing codes – but by 2005 this was pretty much a global movement. A movement that felt at least as mature, established and business as usual as today’s cloud market does.

And see below what happened then. So a good question to ask (and an interesting debate to have) is where the cloud market is today. And the cloud market, of course, is not a homogeneous market. It becomes even more interesting if we ask the question separately for SaaS, for IaaS and for PaaS. What is the probability of today’s leading cloud vendors becoming tomorrow’s cloud market gorillas?

Yes, the end of the year is slowly nearing (with fall upon us and the shortest day already behind us), so time to start reflecting on the future. Have a look at the graph, but do remember that the chart shows relative share. If the chart showed absolute market size, it would have the shape of a cone and the leaders of 2005 would be mere rounding errors by 2013 (just as total cloud spend today is only about 5% of today’s overall enterprise IT market).
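To see why relative share hides the real story, here is a quick back-of-the-envelope sketch. All unit numbers below are made up purely for illustration; they are not actual shipment data.

```python
# Illustrative arithmetic only: a vendor that keeps shipping the same absolute
# volume while the total market explodes shrinks from leader to rounding error.
market_2005_units = 800      # total handsets shipped in 2005 (made-up figure)
market_2013_units = 18_000   # total smartphones shipped in 2013 (made-up figure)
vendor_units = 250           # a 2005 leader still shipping the same volume in 2013

print(f"2005 share: {vendor_units / market_2005_units:.0%}")   # ~31% -> clear leader
print(f"2013 share: {vendor_units / market_2013_units:.1%}")   # ~1.4% -> rounding error
```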

I am interested in your thoughts, so please let me know via the comments.

PS For a behind-the-scenes view on BlackBerry, see this long-form article from the Canadian Globe and Mail: “Inside the Fall of BlackBerry”.

Connecting The Dots in Enterprise Cloud

Conventional wisdom is that the fastest connection between two points – for example, between today and tomorrow – is a straight line. But just like in aviation, this is not necessarily true in cloud computing. First, because cloud computing is not one thing (not one dot on the map): it is a conglomerate of many different types of services (IaaS, PaaS, SaaS, BPaaS), each with its own characteristics and following its own timeline.

This makes it very difficult (if at all useful) to get organisations to agree on a cloud strategy. A colleague of mine once compared it to leading five blindfolded people each to a separate part of an elephant and then afterwards asking them to agree on what they just “saw” and what action to take.

Trying to extrapolate these many views into the future and agree on a possible path or strategy in such a diverse environment is even harder. That, however, should not stop us from trying. The illustration on the right actually comes from a research note just published* on the topic that identifies three factors that will significantly impact cloud adoption in the enterprise space.

As Gartner made the note – in anticipation of the upcoming Symposium season and the Outsourcing Summits in London and Orlando – generally available via this press release, I won’t try to give an even shorter summary here. Suffice to say that some established technology marketing truths – like the ones Geoffrey Moore described over twenty years ago in his classic “Crossing the Chasm” – still hold true, even today.

Why the cloud may require you to learn multiple words for snow

Cloud is at the center of a convergence trend that is impacting people across all of ICT. This convergence is breaking down the walls that separated the traditional silos of IT, networking, storage and security. But with this breaking down of the walls, we also need to understand the subtleties of each other’s domains in more detail.

A famous urban legend is that Eskimos have many words for snow, as it makes sense – if you spend your whole day in snow – to distinguish the subtle and not-so-subtle differences.

It is similar in IT: where others simply refer to IT as IT, the people living in IT tend to distinguish between operations, development, support (helpdesk), testing, portfolio management, information and master data management, etc.

And the same is true for networking: where others see the network (or even the internet) as a homogeneous blob, the people running and managing the network distinguish many parts and layers, separate it into core and non-core, and recognize many other subtleties.

The cloud (together with its enabling peer, software-defined functionality) is rapidly changing that. Anyone who has tried to set up a simple IT function like cloud compute at one of the many cloud IaaS providers will have noticed how many of the configuration questions are network related (and not trivial to answer).
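As a minimal sketch of the point – assuming AWS EC2 via the boto3 library purely as an example, with hypothetical placeholder IDs; other IaaS providers ask very similar questions – notice how many of the parameters are about networking rather than compute:

```python
# Minimal sketch: launching a single VM, using AWS EC2 via boto3 as an illustration.
# The IDs (ami-..., subnet-..., sg-...) and the address are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

response = ec2.run_instances(
    ImageId="ami-12345678",            # compute: which OS image
    InstanceType="t2.micro",           # compute: how big a VM
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-12345678",        # network: which subnet/VPC does it live in?
    SecurityGroupIds=["sg-12345678"],  # network: which ports and protocols are allowed?
    PrivateIpAddress="10.0.1.25",      # network: fixed internal address or not?
)
print(response["Instances"][0]["InstanceId"])
```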

The challenge for network folk is slightly different, as more and more of their traditional network functions are no longer implemented in or on dedicated network kit (their kit) but run as software on general-purpose compute infrastructure. SDN (software-defined networking) and NFV (network function virtualisation) are two of the main drivers in this area.

But an even bigger driver to learn and understand each other’s languages is the fact that the cloud’s inherent “as a service” model is driving a move from being organized along horizontal or functional layers (network, storage, compute, applications, support) to being organized around services (CRM, collaboration, supply chain, etc.). These services (typically implemented as SaaS) include their own implementation of all the underlying layers inside the service.

In theory (and often in practice) these services hide the underlying complexities from their end users, but often not from the teams supporting or offering these services (for example in the case of “private SaaS”). These teams will need a more holistic and less siloed view of the whole stack than they had in the past.

And that will mean we will all need to speak and understand the languages and words that are used in the layers that used to be foreign to our own areas of expertise.

At some point we may even develop a simplified high-level language that goes across the domains. A bit like Esperanto, or like Fanagalo, the language that miners used in South Africa: a mix of words from Dutch, English and the roughly 500 local languages, with a simplified grammar, no distinction between past, present and other tenses, and a vocabulary of only about 2,000 words. It could be learned quite rapidly and allowed people to work in cross-functional and cross-national teams within a very short time.

In the cloud we could have such a “digital” language (or skill set) for people working on the cloud (whose tasks will be more technical than traditional business tasks), but still a lot less technical (and less specialised) than the tasks of the people working behind the cloud (engineering the very complex cloud engines and networks that power the cloud).

Cloud for Business

Earlier this month The Times ran its regular “Cloud for Business” insert, in which I was asked to write a column on “Cloud for Digital Business” (see full text here). For reference I have included below the piece on “The future of Cloud Computing” that it ran in an earlier edition. An item called:

Cloud Spotting is the Shape of Things to Come

When asked to write a column with an ambitious title like “The future of the cloud”, it is a good idea to look first at where cloud is at the moment and to realise that it is still very early days. Today, of the $2.7 trillion that global business spends annually on IT, just 4.8 per cent is spent on cloud computing.

The cloud’s penetration of the world of business is considerably less than its penetration of our daily lives. As consumers we get most of our news, information and increasingly entertainment through cloud services. In fact, the consumer industry has become the major driver of IT innovation. Many years ago, IT innovation was largely driven by defence spending, later by space exploration and then, for many years, the best place to see new innovative products in action, such as colour laptops, projectors and colour printers, was inside the offices of large enterprises.

How different things are now. The largest screens with the highest resolutions, the fastest network connections, “premium” tablets and the best connected mobiles are probably the ones you use at home. Even in large data centres, innovation is now largely driven by consumer-oriented organisations, such as Facebook, with its open computer server design initiative or Google with its MapReduce approach to making sense of very large sets of information. Many enterprises are still getting used to the idea that they are no longer in the driving seat. Where in the past they typically expected vendors to come running to deliver exactly what they asked for, they now need to figure out how standard offerings based on consumer-facing innovations can be used inside their business.

The cloud is not the only consumer-driven innovation businesses need to make sense of. It is joined by a nexus of new digital forces, including the social, mobile and information, or big data, forces. These offer truly new capabilities when combined. For example, businesses can now connect more directly with their customers using social and mobile technology. This provides huge amounts of customer information that can be analysed to enable businesses to communicate in a more relevant and useful way – a change facilitated by the use of software and IT infrastructure in the cloud.

Another element shaping the future of the cloud is the increasing importance of sharing content and knowledge across organisations. The value of using cloud services, such as Google Maps and LinkedIn, comes from sharing more information (on roads, traffic, people, skills) than individual companies could ever hope to collect or keep up to date in their own internal systems. For many years businesses sought efficiencies and gained competitive advantage by integrating and co-ordinating their internal processes better. Now we see increasingly that, driven by big data, future efficiencies lie in optimising processes across multiple organisations. This leads to “social collaboration networks” between businesses, similar to what a cloud application like Facebook did for individuals.

However, by saying that the cloud is especially valuable for doing things that were not possible before, or at least not practicable or affordable, we make predicting the future of the cloud a lot harder. Ironically, uncertainty about the future is also one of the main reasons for using the cloud. When a business is uncertain whether one thousand or one million customers will watch its latest webcast or play its latest game, the elasticity of the cloud makes it an ideal platform because a business can rapidly scale up its IT capacity to meet its needs at the time and then scale it back when it is no longer required.

Similarly, when organisations are uncertain whether a new software implementation will be a tremendous success (leading to all divisions in all countries using the whole platform on a daily basis) or a modest failure (leading to a few divisions using some parts of the software in a couple of countries), it is better not to invest up-front but to “pay as you go”. An example of a company using this is game provider Zynga, which uses cloud computing extensively when launching new games, but then brings established games back in-house when many of the uncertainties about usage and capacity requirements have been resolved. It is this agility, the ability to “turn on a sixpence”, not cost-savings, that is the main driver for the cloud’s use by businesses today and it is likely to remain an important driver in future.
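A hypothetical sketch of that elasticity logic is shown below; the viewers-per-server ratio and the provision/release steps are illustrative assumptions, not any specific provider’s API.

```python
# Hypothetical sketch of elastic capacity: size the fleet to demand and
# release servers again when demand drops, so you only pay for what you use.
VIEWERS_PER_SERVER = 500  # illustrative assumption

def servers_needed(concurrent_viewers: int) -> int:
    # Round up: even one viewer needs at least one server.
    return max(1, -(-concurrent_viewers // VIEWERS_PER_SERVER))

def rescale(current_servers: int, concurrent_viewers: int) -> int:
    target = servers_needed(concurrent_viewers)
    if target > current_servers:
        print(f"provisioning {target - current_servers} extra servers")
    elif target < current_servers:
        print(f"releasing {current_servers - target} servers (stop paying for them)")
    return target

fleet = rescale(2, 1_000)          # a small audience: two servers suffice
fleet = rescale(fleet, 1_000_000)  # the webcast goes viral: scale out fast
fleet = rescale(fleet, 5_000)      # afterwards: scale back in and stop paying
```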

Using the cloud should not be a goal in itself: it should be a means of achieving specific benefits. As a result, business cloud strategies, today and in the future, must be driven by higher-level objectives, such as reducing lead times or improving customer experience by delivering core services through the cloud. Cloud adoption should not be driven by artificial IT metrics, such as aiming to have a certain percentage of applications running in the cloud by 2015. Selecting projects potentially suitable for the cloud is a question of weighing the benefits against the risks. If the risks of using the cloud are very high and the benefits relatively low – for instance, when migrating stable and predictable yet crucial workloads that already run efficiently internally – it may be something to avoid.

Conversely, if benefits are high and risks low, as for example in big data modelling and analytics where tens of thousands of processors run for short periods to simulate or predict customer behaviour, using the cloud may be a “no-brainer”, allowing a business to pay only for what it needs, and to avoid overheads and maintenance during times when this capability is not required.
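A deliberately simplistic sketch of the weighing exercise described in the previous two paragraphs follows; the scoring scale, thresholds and example workloads are assumptions for illustration only, not a formal methodology.

```python
# Toy benefit-versus-risk screen for cloud candidacy. Scores run from 1 (low)
# to 5 (high); the cut-off values are arbitrary illustrations.
def cloud_candidacy(benefit: int, risk: int) -> str:
    if benefit >= 4 and risk <= 2:
        return "no-brainer: move to the cloud"
    if benefit <= 2 and risk >= 4:
        return "avoid: keep running internally"
    return "evaluate case by case"

workloads = {
    "burst analytics / big data modelling": (5, 2),
    "stable, crucial, already-efficient core system": (2, 5),
    "departmental collaboration tool": (3, 3),
}
for name, (benefit, risk) in workloads.items():
    print(f"{name}: {cloud_candidacy(benefit, risk)}")
```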

Risk, however, is not a constant. Just as Japanese manufacturers pursuing just-in-time and Lean manufacturing concepts refused to accept that the changeover time between products on a production line was a constant and so actively worked to reduce it, consumers and cloud providers should work on reducing the risks, both real and perceived, associated with cloud computing. There are many ways to do this.

Concerns about the security of information held in the cloud can be allayed by applying technologies such as encryption, or by agreeing to more transparency about the physical location in which data is stored. It is also important to realise that confidence in the cloud will grow naturally as users and providers gain more experience. Passengers of the first airlines needed to be quite brave as there was no track record that warranted trust in pilots and their equipment.

Trust increased as the industry learned from initial disasters and actively improved service – you can also expect to see both disasters and subsequent improvements with the cloud. Not all the cloud will be as regulated as the airline business, but some cloud services could well be more regulated than they are today, for example in privacy-sensitive areas like healthcare.

The cloud changes attitudes to IT more than it changes IT itself. The concept of delivery “as a service”, so central to cloud computing, will lead to the IT organisation acting as an intermediary within the business, where managing the consumption of external cloud services will be as important as managing the delivery of internal services – leading to a future that is facilitated by, but not determined by, the cloud.

Author: G. Petri. Source: Raconteur – Times insert

Did the US just give a bigger stimulus towards local European Cloud activities than the EU ever could?

Unless you have been living under a rock for the last week, it was impossible not to notice the uproar regarding the Guardian’s story on an alleged information collection program, allegedly called PRISM, that – again allegedly – involved several major cloud service providers. The most detailed and nuanced piece so far – but it is only Sunday as I write this – is this one from the Washington Post.

As at this stage many things are unclear and some reports may be incorrect, I – for one – have not decided whether I will move my personal information from the many US-based providers that I use in my personal life to local alternatives. But in this blog I do want to share my (strictly personal) views and thinking on the topic and explore potential alternatives. As usual I will stay far away from any politics in my blogs (something that must be doable given that the public reactions from different political sides are so varied and diverse).

Until today, individuals – like myself – often took a relaxed view towards protection of their privacy, using phrases like: “Well, nothing I do here is secret or illegal, so if they wanna peek, no problem”. But illegal in an international context is a relative term. Think of copyright law, where what is legal in one country (for example downloading copyrighted materials for personal use) leads to several years of incarceration in other countries, or think of controversies around travel by people carrying a certain disease or – maybe in the future – a certain gene, or by people of a certain origin. Currently the – already controversial – access to this data is only permitted for anti-terrorism and not for fraud-related or other criminal investigations. But we need to take into account that regimes may change and that, as a result, this applicability can change too (for example, the detailed and accurate paper-based administration systems of local government entities in my country led to significant, unforeseen and unintended harm following the regime change during WWII).

The increase of control that comes with massive centralized data processing always carries some drawbacks (as Nicholas Carr – again with remarkable timing – argued in a piece republished just before all this hit the press), and the use of alternatives may to some extent be similar to the now famous statement about democracy: democracy is far from ideal, but it sure is better than any of the alternatives tried so far. For those individuals who want to try an alternative, here are some thoughts on cloud services to replace the ones currently under scrutiny or discussion.

  • Email: Most of the providers listed as part of the program deliver (free) email services. Many European individuals started using these because they delivered convenient webmail that did not tie email addresses to a particular ISP (and thus allowed changing internet provider without being locked in to their proprietary email domain name). Maybe it is time to reconsider ISP-provided email, but at the same time investigate the use of your own domain name (which makes your email a lot more portable). Make sure, however, that the mail provider you choose is not just owned by a European company, but that it runs under European jurisdiction (for example, a European-owned mail alternative I looked at turned out to be “a corporation organized and existing under the laws of the State of Delaware”).
  • VoIP Calls: Although leading consumer VoIP provider Skype started from Luxembourg, it is now part of a US-headquartered corporation. Most alternative voice and video calling solutions also come from US-based companies (with some even limiting their services to US-based consumers only). Although European telcos have been talking about offering VoIP-based alternatives to their regular mobile and fixed voice services, only very few have gone to market yet (check your local providers for possibilities) and even fewer offer it as a cost-effective alternative for international calling.
  • Social Networks: Up to a few years ago most leading social networks in Europe were national providers, but today Facebook is very much the name of the game. If it were not for editorial independence, a media corporation like the Telegraaf group might consider leveraging the current media-driven FUD to drive local consumers back to the recently acquired (and formerly leading) social network Hyves. However, moving to a new social network all by yourself is not a very social thing to do (and kind of defeats the purpose of a social network), so some group orchestration may be required.
  • Short Message Services: So far the reporting did not mention any short message services, such as WhatsApp, Instamessage, Viber etc. Nor did it include other new web destinations becoming popular with the under-twenties (as their parents took over on Facebook). Many of those, however, such as Instagram and Tumblr, have recently been acquired by the named providers. Twitter is a chapter by itself, as most activities on Twitter are public by nature (and unlike some other providers they have put up a brave fight to keep their private services private).
  • Professional Networks: Professional networks like LinkedIn have also not been explicitly mentioned so far (likely because the job market for the type of activities under investigation does not rely on these types of services), but here some local alternatives do still exist. Unfortunately the alternatives are often very local (limited to one language area) and do not help much in an increasingly pan-European or intercontinental job market.
  • Dropbox: I could have used the more neutral term file replication here, but Dropbox has – in a remarkably short time – pulled a Xerox on the market and made its brand name the generic name for these types of service. Alternatives do exist – from independent European companies as well as from telcos and ISPs and even from providers of networked hard disks. Maybe this is a good time for companies – who so far largely turned a blind eye towards the (shadow) use of such services – to offer internal, but just as convenient, alternatives to their employees.
  • Cloud IaaS/PaaS Providers: These too have not yet been explicitly mentioned. Maybe because the typical consumer does not use these providers to build their own personal photo or file storage and sharing facility (mainly because higher-level alternatives like Flickr and Dropbox are so much more convenient for achieving the same result). These lower-level services also offer a lot more options for users to protect their own data, like using encryption (see the sketch after this list). Regardless of these considerations, this area is a domain where several local alternatives do exist, both at a national and a pan-European level. Some of these providers are even global, offering services from facilities they run in “neutral” – but latency-wise quite close by – locations like Canada or Switzerland.
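As a minimal sketch of that kind of self-protection – using the Python cryptography package’s Fernet recipe purely as an illustration; the file names are hypothetical and key management is deliberately left out of scope:

```python
# Minimal sketch: encrypt a file locally before handing it to any cloud storage
# service, so the provider only ever sees ciphertext. Uses the "cryptography"
# package's Fernet recipe as an illustration; key management is out of scope.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # keep this key outside the cloud provider
fernet = Fernet(key)

with open("holiday-photos.zip", "rb") as f:       # hypothetical local file
    ciphertext = fernet.encrypt(f.read())

with open("holiday-photos.zip.enc", "wb") as f:   # this is what gets uploaded
    f.write(ciphertext)

# Later, after downloading it back from the storage provider:
with open("holiday-photos.zip.enc", "rb") as f:
    plaintext = fernet.decrypt(f.read())
```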

So far most of the discussion has been about individuals and their data. The interesting thing is that the European Data Protection Directive defines the roles of Data Subject, Data Controller and Data Processor. For individuals (Data Subjects), the cloud service providers mentioned in the current media hype are in many cases both the Data Controller and the Data Processor. For companies using these same cloud service providers, the companies themselves remain the Data Controller, while their customers and employees are the Data Subjects and the cloud service providers are the Data Processors (which – according to my limited legal knowledge – can significantly change the applicable law and the entity eventually held responsible).

The list of concretely named European cloud service alternatives is not as long as I would like it to be (any suggestions are more than welcome, please submit them via the comments). That, however, was to be expected given the slower uptake of cloud in Europe described in existing Gartner research (see In a Diverse Europe, Cloud Adoption Will Be Slower and European Businesses Are Only Slowly Overcoming Their Reluctance to Transfer Personal Data to the Cloud). There are, however, several interesting European initiatives underway, as we described in Cool Vendors in the European Cloud Computing Market, 2013 (subscription required). Please note that blogs – like this one – do not constitute such research.

10 Years on, and it still matters?

If an article, 10 years after its initial publication date, is featured in several look-backs, reviews and Q&As and still gathers reactions and emotional analysis, it can be concluded that it must have struck a chord – or, in this case, more of an open nerve.

In May 2003*, the Harvard Business Review published “IT Doesn’t Matter”, an article by the then still largely unknown editor-at-large Nicholas Carr.

The premise of the article was that infrastructure has a diminishing impact on competitiveness and that IT is infrastructure (although Carr, in a recent Q&A, seems to indicate he meant that IT infrastructure is infrastructure, a far less controversial idea). Given all the recent analysis around, I only want to zoom in on one aspect.

What still amazes me after all these years is how the last decade of IT was impacted/hindered/predicted/paralleled (pick one based on your personal emotional state with regard to the article) by the three short recommendations that were included – almost as an afterthought – in a small breakout box on page 8 of the article.

The article gave the following three “new rules for IT management”:

  • Spend less: which arguably coincided with a decade of corporate IT anorexia?
  • Follow, don’t lead: today we know that consumer-play IT – and not corporate IT – leads most of IT innovation (think Facebook, Twitter, Google, Netflix), and web-scale IT is arguably about corporates following consumer plays?
  • Focus on vulnerabilities (as for any utility, the dependence on external providers increases): which ironically is today’s main argument for corporates’ preference for private (over public) clouds?

Is the relationship between the last 10 years of IT and the article a question of coincidence, a perfect crystal ball, extreme influence or simply good penmanship? Let me know what you think (in the comments or via @gregorpetri)

The spring of 2013 is off to a cool start

Even though today’s crowning ceremony in Amsterdam enjoyed some modest sunshine, the temperatures across Europe are at an all-time low. A more reliable indication that spring has started is the publication of the annual Cool Vendor reports.

For the first time this series includes a note dedicated to cloud activity in Europe. The “Cool Vendors in the European Cloud Computing Market, 2013” note describes four European vendors making a difference in the local and global cloud market. The report also points to several other European cool vendors featured in other notes, such as “Cool Vendors in Cloud Services Brokerage Enablers, 2013“, “Cool Vendors in Cloud Services Brokerages, 2013” and “Cool Vendors in Cloud Management, 2013“.

One of the featured cool vendors is directly involved in the Amsterdam crowning ceremony by hosting the social crowd-control app that gives the many visitors real-time insight into the movements and current volumes of people at the different venues. A nice example of a nexus application that gathers social information from mobile devices, performs real-time analysis of that big-data-type information in the cloud and makes it available again to the crowd via a mobile app.

Three Makes a Cloud

The ‘third’ in this cloud ménage à trois is the network, which is joining storage and compute as a software-defined resource that can be allocated on demand through a self-service API or portal. As a term, “software defined” is in a race to catch up with established but equally vague terms such as “on demand” and “as a service”, and is surfacing in all kinds of combinations.

The current frontrunner – software-defined networking (SDN) – might very well already be the most hyped term of 2013. All network technology providers are busy either building or acquiring SDN capabilities, but the largest acquisition to date (for more than one billion dollars) was done by a virtualization platform provider. Meanwhile most network providers are looking to leverage SDN to increase the agility and reduce the cost of the services they provide.

An important reason for the interest in SDN is the size of the market it is promising to disrupt. The total revenue for network and communication services makes up a large part of total worldwide IT spending. Of the total $3.7 trillion in IT spending in 2013, 46 percent will be spent on telecom services (next to 8 percent on software, 22 percent on hardware and 25 percent on IT services). Any development that influences such a large proportion of the total cost can count on great interest, not least from the telecom industry itself. At the SDN World Congress in Darmstadt, no fewer than 13 of the largest telecommunications companies announced a joint initiative to promote ‘Network Function Virtualization’. This initiative encourages network technology providers to enable their network functions to run on (clouds of) industry-standard servers instead of on proprietary hardware appliances.

The main advantage of a software-defined network – just like any other kind of software-defined infrastructure construct – is that it no longer consists of dedicated and proprietary boxes with names like firewall, load balancer, router, etc. If an organization tomorrow suddenly needs twice as many firewalls as load balancers (or vice versa), it can just provision different software on its existing hardware. In addition, everything that is controlled by software can be automated much more easily than something that is based on hardware. And this offers benefits not just to providers but also to end users, as it can reduce the time needed to reconfigure a network to changing needs from several days or weeks to a few hours, or even less. And as a result the network can become as dynamic a cloud resource as compute and storage already were.
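As a minimal sketch of what such software-defined provisioning could look like – the controller URL, endpoint and payload below are hypothetical stand-ins for whatever self-service portal or SDN controller an organization actually uses:

```python
# Hypothetical sketch: reallocating network functions through a self-service API
# instead of racking new boxes. The controller URL, endpoint and payload fields
# are illustrative assumptions, not any specific vendor's interface.
import requests

SDN_CONTROLLER = "https://sdn-controller.example.internal/api/v1"

def provision_network_function(kind: str, count: int) -> None:
    # The same pool of industry-standard servers can host firewalls today
    # and load balancers tomorrow; only the software image changes.
    resp = requests.post(
        f"{SDN_CONTROLLER}/network-functions",
        json={"type": kind, "instances": count, "placement": "any-available-host"},
        timeout=10,
    )
    resp.raise_for_status()
    print(f"requested {count} x {kind}: job {resp.json().get('job_id')}")

# Tomorrow the business suddenly needs twice as many firewalls as load balancers:
provision_network_function("firewall", 8)
provision_network_function("load-balancer", 4)
```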

But let’s not forget that for many organizations the transition to “as a service” and “on demand” began in the network area. Back in the eighties, companies started to give up their own in-house controlled and managed wide area networks in exchange for the use of shared packet-switched networks, often based on X.25 and commonly called the “public data network”. Maybe something to remember for those currently afraid of “public clouds”?

PS Gartner press release: Software Defined Networking Creates a New Approach to Delivering Business Agility