The Infrastructure Turk: Lessons in Services

#devops #cloud

If your goal is IT as a Service, then at some point you have to actually service-enable the policies that govern IT infrastructure.

My eldest shared the story of “The Turk” recently, and it was a fine example of how appearances can be deceiving – and of the power of abstraction. If you aren’t familiar with the story, let me briefly share it before we dive into how this relates to infrastructure and, specifically, IT as a Service.

The Turk, the Mechanical Turk or Automaton Chess Player was a fake chess-playing machine constructed in the late 18th century.

The Turk was in fact a mechanical illusion that allowed a human chess master hiding inside to operate the machine. With a skilled operator, the Turk won most of the games played during its demonstrations around Europe and the Americas for nearly 84 years, playing and defeating many challengers including statesmen such as Napoleon Bonaparte and Benjamin Franklin. Although many had suspected the hidden human operator, the hoax was initially revealed only in the 1820s by the Londoner Robert Willis.

-- Wikipedia, “The Turk”

The Automaton was actually automated in the sense that the operator was able to, via mechanical means, move the arm of the Automaton and thus give the impression the Automaton was moving pieces around the board. The operator could also make it nod and shake its head and offer rudimentary facial expressions. But the Automaton was not making decisions in any way, shape or form. The operator made the decisions, and did so quite well, defeating many a chess champion of the day.

[ You might also recall this theme appeared in the “Wizard of Oz”, wherein the Professor sat behind a “curtain” and “automated” what appeared to the inhabitants to be the great Wizard of Oz. ]

The Turk was never really automated in the sense that it could make decisions and actually play chess. Unlike Watson, the centuries-old Automaton was never imbued with the ability to dynamically determine what moves to make itself.

This is strikingly similar to modern “automation” and in particular the automation being enabled in modern data centers today. While automated configuration and setup of components and applications is becoming more and more common, the actual decisions and configurations are still handled by operators who push the necessary levers and turn the right knobs to enable infrastructure to react.


We need to change this model. We need to automate the Automaton in a way that enables automated provisioning initiated by the end user, i.e., the application owner. We need infrastructure and ultimately operational services not only to configure and manage infrastructure, but to provision it. More importantly, end users need to be able to provision the appropriate infrastructure services (policies) as well.

Right now, devops is doing a great job enabling deployment automation; that is, creating scripts and recipes that repeatably provision the infrastructure resources necessary to successfully deploy an application. But what we aren’t doing (yet) is enabling those as services. We’re currently the 18th-century version of the Automaton, when what we want is the 21st-century equivalent – automation from top to bottom (or from underneath, as the analogy would require).
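To make the gap concrete, here is a minimal sketch of the kind of repeatable “recipe” described above. All of the names and parameters (DeploymentRecipe, instance_count, bandwidth_mbps) are hypothetical illustrations, not any real tool’s API; the point is that the recipe replays a configuration reliably, while the configuration values themselves were still chosen manually by ops:

```python
# Hypothetical sketch of a devops-style provisioning recipe.
# Repeatable, yes -- but every value below was decided by an operator
# at script-writing time, not by the end user at provisioning time.
from dataclasses import dataclass, field

@dataclass
class DeploymentRecipe:
    """A repeatable recipe whose configuration is ops-authored."""
    app_name: str
    instance_count: int              # decided by ops, hard-coded here
    bandwidth_mbps: int              # likewise fixed when the recipe is written
    pool_members: list = field(default_factory=list)

    def provision(self):
        """Replicate the ops-authored configuration on demand."""
        self.pool_members = [
            f"{self.app_name}-node-{i}" for i in range(self.instance_count)
        ]
        return {
            "app": self.app_name,
            "pool": self.pool_members,
            "bandwidth_mbps": self.bandwidth_mbps,
        }

# Adding capacity is now a one-liner...
plan = DeploymentRecipe("storefront", instance_count=3, bandwidth_mbps=100).provision()
# ...but changing the *policy* behind these numbers still means an
# operator editing the recipe itself -- the hand inside the Automaton.
```

The recipe eliminates repetitive manual work, but notice that nothing in it is exposed as a service an application owner could invoke or tune on their own.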

What we’ve done thus far is put a veneer over what is still a very manual process. Ops still determines the configuration on a per-application basis and customizes the configurations before pushing out the script. Certainly that script reduces operational costs and time whenever additional capacity is required for that application, because it becomes possible to simply replicate the configuration, but it does not eliminate the need for manual configuration in the first place. Nor does it leave room for end users to tweak or otherwise alter the policies that govern myriad operational functions across network, storage, and server infrastructure – policies that have a direct impact, for good and for ill, on the performance, security, and stability of applications.
End users must still wait for the operator hidden inside the Automaton to make a move.

IT as a Service needs services. And not just services for devops, but services for end users, for the consumers of IT: the application owner, the business stakeholder, the admin. These services need to take into consideration not only the basic provisioning of the required resources, but also the policies that govern them. The intelligence behind the Automaton needs to be codified and encapsulated in a way that makes it as reusable as the basic provisionable resources. We need to provision not only resources – an IP address, network bandwidth, the pool of resources from which applications are served and scaled – but also the policies governing access, security, and even performance. These policies are at the heart of what IT provides for its consumers: the security that enables compliance and protects applications from intrusions and downtime, the dynamic adjustments required to keep applications performing within specified business requirements, the thresholds that determine the ebb and flow of compute capacity required to keep the application available.

These policies should be service-enabled and provisionable by the end-user, by the consumers of IT services.
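What might service-enabled, end-user-provisionable policies look like? The following is a minimal sketch under assumed, illustrative names (PolicyCatalog, publish, provision are invented for this example, not a real product API). Ops codifies a policy once as a template; the application owner then provisions it for their own application, adjusting the exposed knobs without waiting on an operator:

```python
# Hypothetical sketch: operational policies as provisionable services.
class PolicyCatalog:
    """Codified policies that consumers of IT can self-provision."""

    def __init__(self):
        self._templates = {}    # policy name -> default parameters (ops-authored)
        self._provisioned = {}  # app name -> {policy name: applied parameters}

    def publish(self, name, defaults):
        """Ops codifies a policy once, as a reusable template."""
        self._templates[name] = dict(defaults)

    def provision(self, app, name, **overrides):
        """An end user attaches a policy to their app, tuning allowed knobs."""
        if name not in self._templates:
            raise KeyError(f"no such policy: {name}")
        policy = {**self._templates[name], **overrides}
        self._provisioned.setdefault(app, {})[name] = policy
        return policy

catalog = PolicyCatalog()
# Ops encapsulates the intelligence once...
catalog.publish("autoscale", {"min_nodes": 2, "max_nodes": 10, "cpu_threshold": 0.75})
# ...and the application owner, not ops, provisions and tunes it:
applied = catalog.provision("storefront", "autoscale", max_nodes=20)
```

The design choice worth noting is the separation of roles: the template captures the codified operational intelligence, while provisioning it – with overrides – is a self-service operation available to the consumer of IT.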

The definitions of cloud computing, from wherever they originate, tend to focus on resources and lifecycle management of those resources. If one construes that to include applicable policies as well, then we are on the right track. But if we do not, then we need to consider from a more strategic point of view what is required of a successful application deployment. It is not just the provisioning of resources, but policies, as well, that make a deployment successful.

The Automaton is a great reminder of the power of automation, but it is just as powerful a reminder of the failure to encapsulate the intelligence and decision-making capabilities required. In the 18th century it was nearly impossible to imagine a mechanical system that could make intelligent, real-time decisions. That’s one of the reasons the Automaton was such a fascinating and popular exhibition. The unmasking of the Automaton was a disappointment because it revealed that, under the hood, the touted mechanical system still relied on manual and very human intelligence to function. If we do not heed this lesson, we run the risk of the dynamic data center one day being exposed as a hoax as well, still primarily dependent on manual and very human processes to function. Service-enablement of policy lifecycle management is a key component of liberating the data center and an integral part of enabling IT as a Service.


Published Sep 28, 2011
Version 1.0
