cloudware
Google Gmail: The Lawn Darts of the Internet
This blog on the inadvertent sharing of Google docs led to an intense micro-conversation in the comments regarding the inadvertent sharing of e-mail, sensitive financial data, and a wealth of other private data that remained, well, not so private through that [cue scary music] deadly combination that makes security folks race for their torches and pitchforks: Google Apps and Gmail. [pause for laughter on my part. I can't say that with a straight face] Here's part of the "issue" "discovered" by the author:

Closer examination of the spreadsheets, along with some online digging, indicated that a CNHI employee had most likely intended to share the reports and spreadsheets with an employee named Deirdre Gallagher. Instead, he or she typed in my Gmail address and handed me the keys to a chunk of CNHI’s Web kingdom, including the detailed financial terms for scores of Web advertising deals. [emphasis added]

Many comments indicated deep displeasure with Google's e-mail functionality in terms of how it handles e-mail addresses. Other comments just seemed to gripe about Google Apps and its integration in general. "Dan" brought sanity to a conversation composed primarily of technology finger-pointing, much of which blamed Google for people fat-fingering e-mail addresses, when he said: "The misuse of the technology can hardly be the fault of the providers." Thank you, Dan, whoever you are. How insightful. How true. And how sad that most people won't - and don't - see it that way.

Remember lawn darts? I do. If you're young enough you might not, because the company that manufactured and sold them stopped doing so in 1988 after it was sued when a child was tragically killed playing with them. But the truth is that a child was throwing the darts over the roof of a house at his playmate. He was misusing the lawn darts. The product was subsequently banned from sale in the US and Canada due to safety concerns, because it was used in a way it was never intended to be used. After all, consider the millions of other children (myself included) who managed to play the game properly without ever earning so much as a scratch. The mosquito bites were more dangerous than the lawn darts when used correctly and with the proper amount of attention paid to what we were doing.

This is not unlike the inadvertent sharing of Google docs. If I mistype an e-mail address, it's not the fault of my e-mail client when confidential launch plans find their way into an unintended recipient's inbox. That's my fault for not being more careful and paying attention to detail. In the aforementioned Google doc sharing escapade, Google's software did exactly what it was supposed to do: it e-mailed a copy of a Google doc to the e-mail address specified by the sender. Google has no way of knowing you meant to type "a" when you typed "o"; that's the responsibility of the individual, and it's ridiculous to hold Google and its developers responsible for the intentional or unintentional mistakes of its users.

Look, there are plenty of reasons to be concerned about storing sensitive corporate data on a remote server and/or in the cloud, whether Google's or anyone else's. There are a lot of reasons to be concerned about privacy and information leaks in SaaS (Software as a Service) and Web 2.0 applications. THIS IS NOT ONE OF THEM. Those same documents could easily have been e-mailed to the wrong person, accidentally or purposefully, from someone's desktop.
Mistyping e-mail addresses is not peculiar to Gmail, or Hotmail, or Yahoo mail, or any other cloudware e-mail service. It's peculiar to people. Remember them? The ultimate security risk? Rather than claim this is some huge security hole (it is not) or point the finger of blame at Google for sending that e-mail, remember what your mother said about pointing: when you point one finger at Google, three fingers are pointing back at you.

Bursting the Cloud
The cloud computing craze is leading to some interesting new terms. Cloudware and cloudbursting are two terms I particularly like for their ability to describe specific computing models based on cloud computing. Today we're going to look at cloudbursting, which is basically a new twist on an old concept.

Cloudbursting marries the traditional, safe enterprise computing model with cloud computing: in essence, bursting into the cloud when necessary, or using the cloud when additional compute resources are required temporarily. Jeff at the Amazon Web Services Blog talks about the inception of the term as applied to the latter, describing it in his blog post as a method used by Thomas Brox Røst to regenerate a number of dynamic pages in 5 hours rather than the 7 hours that would have been required had he attempted such a feat internally. His approach is further described on the High Scalability blog.

Cloudbursting can also be used to shoulder the burden of some of an application's processing. For example, basic application functionality could be provided from within the cloud while more critical (e.g. revenue-generating) applications continue to be served from within the controlled enterprise data center. This assumes that only a portion of consumers will actually be interacting with the data-driven side of a web site (customer management, process visibility, etc.) while the greater portion will simply be browsing around on the non-interactive, as it were, side of the site.

Bursting has traditionally been applied to resource allocation and automated provisioning/de-provisioning of resources, historically focused on bandwidth. Today, in the cloud, it is being applied to resources such as servers, application servers, application delivery systems, and other infrastructure required to provide on-demand computing environments that expand and contract as necessary, without manual intervention.

This requires the ability to automate the cloud's data center. Data center automation in a cloud computing environment, regardless of the opacity of the model, requires more than simple workflow systems. It requires on-demand control and management over all devices in the delivery chain, from the storage to the application and web servers to the load balancers and acceleration offerings that deliver the applications to end users. This is more akin to data center orchestration than to automation, as it requires that many moving parts and pieces be coordinated to perform a highly complex set of tasks seamlessly and with as little manual intervention as possible. This is one of the foundational requirements of a cloud computing infrastructure: on-demand, automated scalability.

Data center automation is nothing new. Hosting and service providers have long automated their data centers in order to reduce the cost of customer acquisition and management, and to improve the efficiency of provisioning and de-provisioning processes. These benefits can also be realized inside the data center, regardless of the model being employed. The same automation required for smooth, cost-effective management of a cloud computing data center can be utilized to achieve smooth, cost-effective management of an enterprise data center.

The hybrid application deployment model involving cloud computing requires additional intelligence on the part of the application delivery network.
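Before getting to that intelligence, it may help to make the "expand and contract as necessary" behavior described above concrete. What follows is a minimal, hypothetical sketch of a bursting trigger, not any particular product's API: the monitoring and provisioning functions are placeholders for whatever interfaces a given environment actually exposes, and the thresholds are arbitrary.

```python
# A minimal, hypothetical sketch of a cloudbursting trigger: watch local
# utilization and expand into (or retreat from) the cloud around thresholds.
# The monitoring/provisioning functions are placeholders, not a real API.
import time

BURST_THRESHOLD = 0.80    # burst into the cloud above 80% local utilization
RELEASE_THRESHOLD = 0.50  # release cloud capacity once load falls below 50%


def get_local_utilization() -> float:
    """Placeholder: return current utilization of the local pool (0.0 to 1.0)."""
    raise NotImplementedError


def provision_cloud_capacity(instances: int) -> None:
    """Placeholder: request additional instances from the cloud provider."""
    raise NotImplementedError


def release_cloud_capacity() -> None:
    """Placeholder: deprovision cloud instances that are no longer needed."""
    raise NotImplementedError


def cloudburst_loop(poll_seconds: int = 60) -> None:
    """Poll local utilization and burst/release automatically, without manual steps."""
    bursted = False
    while True:
        load = get_local_utilization()
        if load > BURST_THRESHOLD and not bursted:
            provision_cloud_capacity(instances=2)  # arbitrary example size
            bursted = True
        elif load < RELEASE_THRESHOLD and bursted:
            release_cloud_capacity()
            bursted = False
        time.sleep(poll_seconds)
```

Deciding where an individual request should go once capacity exists in both places is the routing half of the problem, and that is where the application delivery network's intelligence comes in.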
The application delivery network must be able to understand what is being requested and where it resides; it must be able to intelligently route requests. This, too, is a fundamental attribute of cloud computing infrastructure: intelligence. When distributing an application across multiple locations, whether local servers, remote data centers, or "the cloud", it becomes necessary for a controlling node to properly route those requests based on application data (a minimal sketch of this kind of request routing follows at the end of this post). In a less sophisticated model, global load balancing could be substituted as a means of directing requests to the appropriate site, a task for which global load balancers seem a perfect fit.

A hybrid approach like cloudbursting seems particularly appealing. Enterprises are reluctant to move business-critical applications into the cloud at this juncture, but they are likely more willing to assign an outsourced provider responsibility for less critical application functionality with variable volume requirements, which fits well with an on-demand resource-bursting model. Cloudbursting may be one solution that makes everyone happy.
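As a rough illustration of the application-data-driven routing described above, the sketch below chooses between a local (enterprise) pool and a cloud pool based on the request path. The path prefixes and pool addresses are invented for illustration; in practice this decision would live in the global load balancer or application delivery controller rather than in application code.

```python
# A minimal sketch of content-based routing for a cloudbursting hybrid:
# revenue-critical functionality stays in the enterprise data center,
# everything else is served from the cloud. All names/addresses are invented.

CRITICAL_PREFIXES = ("/checkout", "/account", "/admin")

ENTERPRISE_POOL = ["10.0.1.10", "10.0.1.11"]                  # local data center
CLOUD_POOL = ["cloud-a.example.com", "cloud-b.example.com"]   # cloud provider


def choose_pool(request_path: str) -> list:
    """Route critical application functionality to the enterprise pool and
    non-interactive browsing traffic out to the cloud."""
    if request_path.startswith(CRITICAL_PREFIXES):
        return ENTERPRISE_POOL
    return CLOUD_POOL


# Example: browsing goes to the cloud, checkout stays in the data center.
print(choose_pool("/catalog/widgets"))   # ['cloud-a.example.com', 'cloud-b.example.com']
print(choose_pool("/checkout/payment"))  # ['10.0.1.10', '10.0.1.11']
```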
Cloudware and information privacy: TANSTAAFL

Ars Technica is reporting on a recent Pew study on cloud computing and privacy, specifically concerning remote data storage and the kind of data mining performed on it by providers like Google. The study indicates that while consumers are concerned about the privacy of their data in the cloud, they still subject themselves to what many consider to be an invasion of privacy and misuse of data.

68 percent of respondents who said they'd used cloud services declared that they would be "very" concerned, and another 19 percent at least "somewhat" concerned, if their personal data were analyzed to provide targeted advertising. This, of course, is precisely what many Web mail services, such as Google's own Gmail, do—which implies that at least some of those who profess to be "very" concerned about the practice are probably nevertheless subjecting themselves to it.

One wonders why those who profess to be very concerned about the privacy and data-mining tactics used by cloudware providers continue to use those services. One answer might lie in the confusing legalese of the EULA (end user license agreement) presented by corporations.

It's necessary, of course, that the EULA be written using the language of the courts under which it will be enforced. But there are two problems with EULAs: first, nothing really requires that they be read, and second, even if they were read, they can't be easily understood by the vast majority of consumers. I'll be the first to admit I rarely read EULAs. They're long, filled with legalese, and they always come down to the same basic set of rules: it's our software, we don't make any guarantees, and oh, yeah, any rights not specifically listed (like the use of the data you use with our "stuff") are reserved for us. It's that last line that's the killer, by the way, because just about everything falls under that particular clause in the EULA. Caveat emptor truly applies in the world of cloudware and online services. Buyer beware! You may be agreeing to all sorts of things you didn't intend.

The argument against such privacy and security assurances for consumers is that they aren't paying for the service, and therefore the provider needs some way to generate revenue to continue providing it. That revenue is often generated by advertising and partnerships, but it's also largely provided by selling off personal information, either directly gleaned from users or mined from their data. Which is what Google does with Gmail.

Enterprises, at least, are not only aware of but thoroughly understand the ramifications of storing their data "in the cloud". SaaS (Software as a Service) providers have had to offer proof positive that the data stored in their systems is the property of the customer, that the data is not being used for data-mining or sharing purposes, and that security is in place to protect it from theft, viewing, and so on. But in between the consumer and the enterprise markets lies the SMB, the small-to-medium business. Not quite financially able to afford a full data center and IT staff of their own, SMBs often take advantage of cloudware services as a stop-gap measure.
But in doing so, they put their business and data at risk, because they aren't necessarily using cloudware designed with businesses in mind, at least not from a data security perspective, and that means they are often falling under the more liberal end-user license agreement. All bets are off on the sanctity of their data.

TANSTAAFL. There ain't no such thing as a free lunch, people, and that has never rung as true as it does in the world of cloudware and online services. If it's heralded as "free", that only means you aren't paying money for it; you are bartering for the service, exchanging your personal information and data for the privilege of using that online service.

In many cases folks weigh the value they receive from the "free" service against divulging personal information and data and make an informed choice to exchange that information for the service. When that's the case - the consumer or business is making an informed choice - it's all good. Everybody wins. Bartering is, after all, the oldest form of exchanging goods or services, and it's still used today. My grandmother paid her doctor three chickens for delivering my father, and that was in the mid-1900s, not that long ago at all. So exchanging personal information and access to your data for services is completely acceptable; just make sure you understand that's what you're doing - especially if you're a business.
The Three "Itys" of Cloud Computing

No matter where you deploy it, it's still your application

Related Reading: Cloud Computing Adoption Grows Despite Concerns; Cloudware and Information Privacy: TANSTAAFL; Bursting the Cloud

Everyone's talking about cloud computing and cloudware (applications in the cloud) services, and pointing to the hiccups of several major cloud providers already this year. Reliability, availability, and security are still major concerns, and yet some reports indicate these three "itys" aren't impeding adoption of cloud computing models at all. Applications, whether in the cloud or in the corporate data center, are still delivered via a network, and it is more often than not the network that is at the heart of both the successful and the unsuccessful deployment of applications.

The appeal of cloud computing is, in part, due to its obfuscated nature. The underlying delivery infrastructure is hidden in "the cloud" and can therefore largely be ignored by consumers - whether individuals or organizations. Or can it? In June, analyst firm Gartner published a research note ("You Can't Do Cloud Computing Without the Right Cloud (Network)", June 23, 2008, ID Number: G00158513) cautioning early adopters to be more aware of the infrastructure that is the core of cloud computing:

The promise of cloud computing is ubiquitous access to a broad set of applications and services, which are delivered over the Internet and related networks, to multiple customers. To deliver on that promise, the cloud must provide a rich set of network services to a broad set of applications and services. Not all applications are the same: some will only require the basic capabilities available on the public Internet, while others may require an overlay on top (the "augmented Internet"), or even a private, Internet Protocol network with application-specific capabilities. What may work for one subscriber of a cloud-based service may not be appropriate for another, so cloud computing providers need to understand network delivery issues and be prepared to deliver multiple cloud network options to their subscribers.

Options are not only a concern for providers; they should also be a concern for subscribers. It behooves the subscriber to be aware of the needs of their application(s) and to ensure that the cloud computing provider can meet those needs. While cloud computing can certainly lift the cost of maintaining an application delivery network from the shoulders of the organization, that cost is not erased; it is only shifted to the provider. But the responsibility for ensuring that applications are delivered and perform well still lies with the organization. After all, the allure of cloud computing is often that the consumer of applications does not care - nor needs to care - where the application physically resides. The application consumer cares only that the application is secure, fast, and available.

That means it is still important for organizations deploying applications in the cloud to ensure that the application delivery network over which a cloud computing provider will deliver applications has implemented an infrastructure that adequately addresses the three "itys" of cloud computing: reliability, availability, and security.

RELIABILITY

Reliability is generally a measure of how long a system remains available between failures. It is often reported in terms of "Mean Time Between Failures" (MTBF).
Reliability can be measured either on an individual infrastructure component basis or, more appropriately for cloud computing, on a system-wide basis.

Questions to ask: Ask your cloud computing provider about the reliability of the servers, routers, switches, and delivery network infrastructure. Ensure that the underlying infrastructure has been architected using solutions from reputable vendors. If you wouldn't deploy the solution(s) used in your own data center, you might want to consider other cloud computing provider options.

AVAILABILITY

Availability is a measure of application downtime, often presented as the number of "9s" of availability an application has achieved in a given time period (a short worked example of this arithmetic appears at the end of this post). Availability is often assured through the use of network infrastructure such as application delivery controllers, as these solutions are capable of reacting dynamically to changing conditions in the network and application infrastructure and can direct requests in such a way as to ensure they are responded to.

Questions to ask: Ask if load balancing and application delivery features are part of the solution. If they are, ask what kind of control you have over the configuration and options: are applications simply load balanced using standard algorithms, or are more advanced options in place to assure availability? Can the infrastructure react dynamically? Can it automatically redirect requests from slow or down servers to faster, more available ones?

SECURITY

Security is a broad topic covering the protection of data both in flight and at rest, as well as the real-time interactions of applications. The security of the entire application infrastructure stack - from layer 2 to layer 7 - should be evaluated. It will not do to secure the application against vulnerabilities if a simple layer 4 DoS attack can prevent the application from being used.

Questions to ask: Ask about the security in place for all layers of the application stack. Are common layer 2-3 attacks mitigated? How are layer 7 (application) vulnerabilities addressed? Investigate the three A's of security - authentication, authorization, and auditing - to understand what measures are in place to help ensure your cloud-based application is secure.

Note that despite the hype around virtualization and the way many people like to tie it to cloud computing like a tick on a dog, it doesn't end with "ity". What does that mean? It means that while virtualization is currently enabling many of the cool "tricks" associated with cloud computing, without the right infrastructure it doesn't matter one iota whether virtualization is involved. If you can't reliably access the applications running in that virtual image, does it really matter whether it's virtual or not?

Cloud computing isn't going away, and it's likely that you'll use it in one way or another to deploy an application. Because that application is "in the cloud", consumers of that application don't know - or care - where it's physically deployed. You are responsible for its performance and availability, and it's you who will field the calls if it fails or performs poorly. You can mitigate much of the potential risk by ensuring that you choose a cloud computing provider with a strong cloud computing infrastructure; one that has addressed the three "itys" of cloud computing.
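To make the MTBF and "9s" terminology above a bit more tangible, here is a small worked example using assumed figures (not measurements from any provider) that shows how mean time between failures and mean time to repair translate into an availability percentage, and how that percentage translates into allowed downtime per year.

```python
# Availability arithmetic with assumed figures: availability = MTBF / (MTBF + MTTR).

HOURS_PER_YEAR = 365 * 24  # 8760


def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Fraction of time a system is up, from mean time between failures
    and mean time to repair."""
    return mtbf_hours / (mtbf_hours + mttr_hours)


def downtime_hours_per_year(avail: float) -> float:
    """Hours of downtime per year implied by an availability fraction."""
    return (1.0 - avail) * HOURS_PER_YEAR


# Hypothetical component: fails roughly every 2000 hours, takes ~2 hours to repair.
a = availability(mtbf_hours=2000, mttr_hours=2)
print(f"Availability: {a:.4%}")                                       # ~99.90%
print(f"Downtime per year: {downtime_hours_per_year(a):.1f} hours")   # ~8.8 hours

# Translating "nines" into downtime per year.
for nines in (0.99, 0.999, 0.9999):
    print(f"{nines:.2%} availability -> {downtime_hours_per_year(nines):.2f} hours/year")
```

The same arithmetic applied system-wide, rather than to a single component, is what a provider's availability claims ultimately rest on.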