F5 Predicts: Education gets personal
The topic of education is taking centre stage today like never before. I think we can all agree that education has come a long way from the days when students and teachers were confined to a classroom with a chalkboard. Technology now underpins virtually every sector, and education is no exception. The Internet is now the principal enabling mechanism by which students assemble, spread ideas and sow economic opportunities. Education data has become a hot topic in the quest to transform the manner in which students learn. According to Steven Ross, a professor at the Centre for Research and Reform in Education at Johns Hopkins University, the use of data to customise education for students will be the key driver for learning in the future [1]. This technological revolution has resulted in a surge of online learning courses accessible to anyone with a smart device. A two-year assessment of the massive open online courses (MOOCs) created by HarvardX and MITx revealed that there were 1.7 million course entries in the 68 MOOCs [2]. This translates to about 1 million unique participants, who on average engage with 1.7 courses each. This equity of education is undoubtedly providing vast opportunities for students around the globe and improving their access to education. With more than half a million apps to choose from on platforms such as iOS and Android, both teachers and students can obtain digital resources on any subject. As education progresses in the digital era, here are some considerations for educational institutions:

Scale and security

The emergence of a smorgasbord of MOOC providers, such as Coursera and edX, has challenged the traditional geographical and technological boundaries of education today. Digital learning will continue to grow, driving the demand for seamless and user-friendly learning environments. In addition, technological advancements in education offer new opportunities for governments and enterprises.
They will be most effective if these organisations have the ability to rapidly scale and adapt to an all-new digital world – having information services easily available, accessible and secured. Many educational institutions have just as many users as large multinational corporations and face the same issue of scale when delivering applications. The aim is no longer just getting a fast connection for students, but how quickly content can be provisioned and served, and how seamless the user experience can be. Traditional methods can no longer provide our customers with the horizontal scaling they need; they require an intelligent and flexible framework to deploy and manage applications and resources. Hence, having an application-centric infrastructure in place to accelerate the roll-out of curriculum to the user base is critical, in addition to securing user access and traffic in the overall environment.

Ensuring connectivity

We live in a Gen-Y world that demands a high level of convenience and speed from practically everyone and everything. This demand for convenience has brought about reform and revolutionised the way education is delivered to students. Furthermore, the Internet of Things (IoT) has introduced a whole new raft of ways in which teachers can educate their students. Whether teaching and learning happens via connected devices such as a Smart Board or an iPad, seamless access to data and content has never been more pertinent than now. With the increasing reliance on Internet bandwidth, textbooks are no longer the primary means of educating, given that students are becoming more web oriented. The shift helps educational institutes better personalise the curriculum based on data garnered from students and their work.
Duty of care

As the cloud continues to test and transform the realms of education around the world, educational institutions are opting for a centralised services model, where they can easily select the services they want delivered to students to enhance their learning experience. Hence, educational institutions have a duty of care around the type of content accessed and how it is obtained by students. They can enforce acceptable use policies by only delivering content that is useful to the curriculum, with strong user identification and access policies in place. By securing the app, malware and viruses can be kept out of the institute's environment. From an outbound perspective, educators can be assured that students only get the content they are meant to access.

F5 has the answer

BIG-IP LTM acts as the bedrock for educational organisations to provision, optimise and deliver their services. It provides the ability to publish applications out to the Internet in a quick and timely manner within a controlled and secured environment. Crucially, F5 provides both the performance and the horizontal scaling required to meet the highest levels of throughput. At the same time, BIG-IP APM provides schools with the ability to leverage virtual desktop infrastructure (VDI) applications downstream, scaling up and down without having to install costly VDI gateways on site, whilst centralising the security decisions that come with it. As part of this, custom iApps can be developed to rapidly and consistently deliver, as well as reconfigure, the applications that are published out to the Internet in a secure, seamless and manageable way. BIG-IP Application Security Manager (ASM) provides application-layer security to protect vital educational assets, as well as the applications and content being continuously published. ASM allows educational institutes to tailor security profiles that wrap seamlessly around every application.
It also gives a level of assurance that all applications are delivered in a secure manner.

Education tomorrow

It is hard not to feel the profound impact that technology has on education. Technology in the digital era has created a new level of personalised learning. The time is ripe for the digitisation of education, but the integrity of the process demands that technology be at the forefront, so as to ensure the security, scalability and delivery of content and data. The equity of education that technology offers helps address factors such as access to education, language, affordability, distance and equality. Furthermore, it eliminates geographical boundaries by enabling the mass delivery of quality education with the right policies in place.

[1] http://www.wsj.com/articles/SB10001424052702304756104579451241225610478
[2] http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2586847

F5 Automated Backups - The Right Way
Hi all, Often I've been scouring the devcentral fora and codeshares to find that one piece of handiwork that would drastically simplify my automated backup needs on F5 devices. Based on the work of Jason Rahm in his post "Third Time's the Charm: BIG-IP Backups Simplified with iCall" of 26 June 2013, I went ahead and created my own iApp that pretty much provides the answers for all my backup needs. Here's a feature list of this iApp:

- It allows you to choose between UCS or SCF as the backup type (whilst providing ample warnings about SCF not being a very good restore option, due to its incompleteness in some cases)
- It allows you to provide a passphrase for the UCS archives (the standard GUI also does this, so the iApp should too)
- It allows you to exclude the private keys (same thing: the standard GUI does it, so the iApp does it too)
- It allows you to set a backup schedule of every X minutes/hours/days/weeks/months, or a custom selection of days of the week
- It allows you to set the exact time, minute of the hour, day of the week or day of the month when the backup should be performed (depending on the schedule type)
- It allows you to transfer the backup files to external devices using 4 different protocols, in addition to local storage on the device itself:
  - SCP (username/private key without password)
  - SFTP (username/private key without password)
  - FTP (username/password)
  - SMB (using smbclient, with username/password)
  - Local storage (/var/local/ucs or /var/local/scf)
- It stores all passwords and private keys in a secure fashion: encrypted with the master key of the unit (f5mku), rendering it safe to store the backups, including the credentials, off-box
- It has a configurable automatic pruning function for the local storage option, so the disk doesn't fill up (i.e. keep the last X backup files)
- It allows you to configure the filename using the date/time wildcards from the tcl [clock] command, as well as a variable to include the hostname
- It requires only the web GUI to establish the configuration you desire
- It allows you to disable the automated backup processes without removing the Application Service or losing any previously entered settings
- For the external shell scripts it automatically generates, the credentials are stored in encrypted form (using the master key)
- It means you are no longer required to make modifications on the Linux command line to get your automated backups running after an RMA or restore operation
- It cleans up after itself, which means there are no extraneous shell scripts or status files lingering around after the scripts execute

I wasn't able to upload the iApp template to this article, so I threw it on pastebin: http://pastebin.com/YbDj3eMN Enjoy! Thomas Schockaert

Getting Up And Running With F5 ARX Virtual Edition
ARX (Adaptive Resource Switch) is a file system virtualization switch, which provides intelligent file virtualization for storage environments. ARX can be used to tier files across multiple filers, provide disaster recovery services, and allow administrators to migrate files between backend filers without disruption of service. These are a few of the use cases for ARX, but by no means all of them. Over the next few months, we will be covering ARX heavily here on DevCentral. Let's get started by deploying ARX Virtual Edition in our ESX environment.

ARX Virtual Edition resource requirements

The resource requirements differ depending on the license you have for your ARX VE. If you have a trial license obtained from http://www.f5.com/trial the requirements are as follows:

- 1 CPU core, 64-bit architecture
- 2 GB of memory
- 1 virtual NIC (VNIC)
- 40 GB or more of hard drive space

If you have a production or evaluation license, the requirements are as follows:

- 2 CPU cores, 64-bit architecture
- 4 GB of memory
- 1 virtual NIC (VNIC)
- 40 GB or more of hard drive space

Prerequisites

- Download and install VMware ESXi
- Download the VMware vSphere Client from the ESXi host and install it on your local workstation
- Generate a registration key and download F5 ARX Virtual Edition (VE)

Installation

1. Connect to the ESXi host using the VMware vSphere Client
2. Select 'File', then 'Deploy OVF Template…'
3. Browse to the location of the OVF template (arxve-esx-trial-XXXXX.ova), select it, and click 'Next'
4. Check the 'OVF Template Details' and make sure everything looks correct, then click 'Next'
5. Name your Virtual Edition instance, click 'Next'
6. Select 'Trial' from the 'Configuration' drop-down, click 'Next'
7. Select 'Thick provisioned format' to allocate the disk space immediately (only select 'Thin provisioned format' if disk space is an issue and you fully understand its implications), click 'Next'
8. Select the network you would like to attach the ARX VE VNIC to, click 'Next'
9. Verify all of the deployment details and click 'Finish' if everything is correct

The OVF template will now be deployed onto the ESXi box. This process can take 5-10 minutes.

Configuration – command-line portion

1. Power on the ARX VE virtual machine
2. Open the console tab
3. Press 'Enter' to start the 'Switch Configuration Wizard'
4. Enter the following parameters (substitute your values where necessary):
   - Management port IP address: 10.0.0.150
   - Management port subnet mask: 255.255.255.0
   - Management port gateway IP address: 10.0.0.254
   - Switch private IP address: (accept default)
   - Switch private subnet mask: (accept default)
   - Chassis UUID: (accept default)
   - Crypto-officer username: admin
   - Crypto-officer password: (use your own)
   - System password: (use your own)
   - Master key: (accept default)
5. Once you have completed the form, type 'yes' to accept your changes
6. Wait for all the services to restart

Configuration – web interface portion

1. Open a web browser and navigate to the ARX management interface over HTTPS (https://10.0.0.150 in our case)
2. Log in with the crypto-officer username and password
3. From the 'Common Operations' leaf, select 'Initial Setup…'
4. Begin the initial setup wizard (these values will vary with your local environment):
   - Switch name: test-arx-ve
   - Management protocols: SSH, API Access – HTTPS (we may want this for some iControl examples later on)
   - Proxy IP addresses: 10.0.0.160 (1 is required for ARX VE)
   - NTP server: 10.0.0.254
   - Time zone region: North and South American continents
   - Time zone city: PST (-0800)/PDT (-0700) United States Pacific time
   - DNS domain name: arx-test-ve.f5-test.com
   - Primary DNS server: 10.0.0.254
   - DNS search domains: f5-test.com
   - Fully qualified domain name of e-mail server: smtp.f5-test.com
   - E-mail recipients: admin@f5-test.com
   - SNMP: (optional)
5. Confirm your configuration and click 'Finish' to finalize the initial setup

At this point, the VE instance should be fully configured, with the exception of licensing the unit.

Activating the license automatically

1. Locate the email you received containing the registration key and copy the key to your clipboard
2. Select 'Activate License…' from the web interface of ARX Manager
3. Paste the registration key into the field of the pop-up window and select 'Automatic' if your ARX has a route to the Internet, click 'Next'
4. Agree to the EULA terms and click 'Next'
5. If automatic activation worked correctly, you should be presented with a confirmation screen; click 'Finish'

Activating the license manually

1. Follow steps 1-2 from the previous section
2. Paste the registration key into the field of the pop-up window and select 'Manual'
3. Copy the dossier to your clipboard and click 'Click here to access F5 licensing server'
4. Paste the dossier into the text field and click 'Next'
5. Copy the license to your clipboard and paste it into the 'License' field of the pop-up window, click 'Next'
6. If manual activation worked correctly, you should be presented with a confirmation screen (it should look like the confirmation from the previous section); click 'Finish'

Conclusion

Now that you've got ARX VE up and running in your environment, you can begin playing with some of the functionality. We will be using a combination of commercial filer simulators in the examples that follow over the next few months. If you want to get ahead of the game, get your favorite filer up and running on your ESX box. See everyone next week!

The Challenges of SQL Load Balancing
#infosec #iam Load balancing databases is fraught with operational and business challenges. While cloud computing has brought to the forefront of our attention the ability to scale through duplication, i.e. horizontal scaling or "scale out" strategies, this strategy tends to run into challenges the deeper into the application architecture you go. While it works well at the web and application tiers, a duplicative strategy tends to fall on its face when applied to the database tier. Concerns over consistency abound, with many simply choosing to throw out the concept of consistency and adopting instead an "eventually consistent" stance, in which it is assumed that data in a distributed database system will eventually become consistent and cause minimal disruption to application and business processes. Some argue that eventual consistency is not "good enough" and cite additional concerns with respect to the failure of such strategies to adequately address failures. Thus there are a number of vendors, open source groups, and pundits who spend time attempting to address both concerns. The result is database load balancing solutions. For the most part such solutions are effective. They leverage master-slave deployments – typically used to address failure, and able to automatically replicate data between instances (with varying levels of success when distributed across the Internet) – and attempt to intelligently distribute SQL-bound queries across two or more database systems. The most successful of these architectures is the read-write separation strategy, in which all SQL transactions deemed "read-only" are routed to one database while all "write" focused transactions are distributed to another.
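The routing half of a read-write separation strategy can be sketched in a few lines. The sketch below is illustrative only: the two DSN strings and the `route()` helper are hypothetical names, and a real database proxy would also have to handle transactions, session state and prepared statements rather than looking at one statement at a time.

```python
# Illustrative sketch of read/write separation routing (not a production proxy).
# PRIMARY_DSN, REPLICA_DSN and route() are hypothetical examples.

READ_VERBS = ("select", "show", "describe", "explain")

PRIMARY_DSN = "db-primary.example.com"   # receives all write transactions
REPLICA_DSN = "db-replica.example.com"   # receives read-only queries

def route(sql: str) -> str:
    """Return the backend that should receive this SQL statement.

    Anything not clearly read-only goes to the primary, which keeps
    write operations strictly consistent on a single database."""
    verb = sql.lstrip().split(None, 1)[0].lower() if sql.strip() else ""
    return REPLICA_DSN if verb in READ_VERBS else PRIMARY_DSN

print(route("SELECT name FROM students"))     # routed to the replica
print(route("UPDATE grades SET score = 90"))  # routed to the primary
```

First-keyword inspection is deliberately conservative here: any statement the router does not recognise as read-only falls through to the primary, trading replica utilisation for correctness.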
Such foundational separation allows for higher-layer architectures to be implemented, such as geography-based read distribution, in which read-only transactions are further distributed across geographically dispersed database instances, all of which ultimately act as "slaves" to the single master database that processes all write-focused transactions. This results in an eventually consistent architecture, but one which manages to mitigate the disruptive aspects of eventual consistency by ensuring the most important transactions – write operations – are, in fact, consistent. Even so, there are issues, particularly with respect to security.

MEDIATION inside the APPLICATION TIERS

Generally speaking, mediating solutions are a good thing – when they're external to the application infrastructure itself, i.e. the traditional three tiers of an application. The problem with mediation inside the application tiers, particularly at the data layer, is the same for infrastructure as it is for software solutions: credential management. See, databases maintain their own set of users, roles, and permissions. Even as applications have been able to move toward a more shared set of identity stores, databases have not. This is in part due to the nature of data security and the need for granular permission structures, down to the cell in some cases, including transactional security that allows some users to update, delete, or insert while others are granted a different subset of permissions. But more difficult to overcome is the tight coupling of identity to connection for databases. With web protocols like HTTP, identity is carried along at the protocol level. This means it can be transient across connections, because it is often stuffed into an HTTP header via a cookie or stored server-side in a session – tied not to the connection but to identifying information. At the database layer, identity is tightly coupled to the connection.
The connection itself carries along the credentials with which it was opened. This gives rise to problems for mediating solutions – not just load balancers but also software solutions such as ESB (enterprise service bus) and EII (enterprise information integration) styled solutions. Any device or software which attempts to aggregate database access for any purpose eventually runs into the same problem: credential management. This is particularly challenging for load balancing when applied to databases.

LOAD BALANCING SQL

To understand the challenges with load balancing SQL you need to remember that there are essentially two models of load balancing: transport and application layer. At the transport layer, i.e. TCP, connections are only temporarily managed by the load balancing device. The initial connection is "caught" by the load balancer and a decision is made, based on transport layer variables, as to where it should be directed. Thereafter, for the most part, there is no interaction at the load balancer with the connection, other than to forward it on to the previously selected node. At the application layer the load balancing device terminates the connection and interacts with every exchange. This affords the load balancing device the opportunity to inspect the actual data or application layer protocol metadata in order to determine where the request should be sent. Load balancing SQL at the transport layer is less problematic than at the application layer, yet it is at the application layer that the most value is derived from database load balancing implementations. That's because it is at the application layer where distribution based on "read" or "write" operations can be made. But accomplishing this requires that the SQL be inline, that is, that the SQL being executed is actually included in the application code and then executed via a connection to the database. If your application uses stored procedures, then this method will not work for you.
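The stored-procedure limitation becomes obvious once you try to write the classifier: a procedure invocation reveals nothing about whether it reads or writes, so first-keyword inspection cannot route it safely. The `classify()` helper below is a hypothetical, deliberately naive sketch, not how any particular load balancer actually parses SQL:

```python
# Hypothetical, naive SQL classifier: it shows why inline SQL can be
# routed by keyword while stored-procedure calls (and some valid SQL,
# like CTEs) cannot.

def classify(sql: str) -> str:
    verb = sql.lstrip().split(None, 1)[0].upper() if sql.strip() else ""
    if verb in ("SELECT", "SHOW", "DESCRIBE", "EXPLAIN"):
        return "read"
    if verb in ("INSERT", "UPDATE", "DELETE", "REPLACE"):
        return "write"
    if verb in ("CALL", "EXEC", "EXECUTE"):
        # The statement's effect is hidden inside the procedure body,
        # so the only safe routing choice is the primary database.
        return "unknown"
    if verb == "WITH":
        # A CTE can prefix either a SELECT or a data-modifying statement,
        # so first-keyword inspection alone is insufficient here too.
        return "unknown"
    return "unknown"

print(classify("SELECT * FROM courses"))       # read
print(classify("CALL enroll_student(42, 7)"))  # unknown
```

Everything classified "unknown" would have to default to the write master, which is exactly why applications built on stored procedures gain little from this style of load balancing.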
It is important to note that many packaged enterprise applications rely upon stored procedures, and are thus not able to leverage load balancing as a scaling option. Your application, and how your organization has agreed to protect your data, will determine which of these methods is used to access your databases. The use of inline SQL affords the developer greater freedom, at the cost of security, increased programming (to prevent the inherent security risks), difficulty in optimizing data and indices to adapt to changes in data volume, and deployment burdens. However, there is lively debate on the values of both access methods and how to overcome the inherent risks. The OWASP group has identified injection attacks as among the easiest to exploit and the most damaging in impact. This approach also requires that the load balancing service parse MySQL or T-SQL (Microsoft's Transact-SQL). Databases, of course, are designed to parse these string-based commands and are optimized to do so. Load balancing services are generally not designed to parse these languages and, depending on the implementation of their underlying parsing capabilities, may actually incur significant performance penalties in doing so. Regardless of those issues, there is still an increasing number of organizations who view SQL load balancing as a means to achieve a more scalable data tier. Which brings us back to the challenge of managing credentials.

MANAGING CREDENTIALS

Many solutions attempt to address the issue of credential management by simply duplicating credentials locally; that is, they create a local identity store that can be used to authenticate requests against the database. Ostensibly the credentials match those in the database (or the identity store used by the database, such as can be configured for MSSQL) and are kept in sync. This obviously poses an operational challenge similar to that of any distributed system: synchronization and replication.
Such processes are not easily (if at all) automated, and rarely is the same level of security and permissions available in the local identity store as in the database. What you generally end up with is a very loose "allow/deny" set of permissions on the load balancing device that actually opens the door for exploitation, as well as caching of credentials that can lead to unauthorized access to the data source. This also leads to potential security risks from attempting to apply to SQL connections some of the same optimization techniques that application delivery solutions offer for TCP connections. For example, TCP multiplexing (sharing connections) is a common means of reusing web and application server connections to reduce latency (by eliminating the overhead associated with opening and closing TCP connections). Similar techniques at the database layer have been used by application servers for many years; connection pooling is not uncommon and is essentially duplicated at the application delivery tier through features like SQL multiplexing. Both connection pooling and SQL multiplexing incur security risks, as shared connections require shared credentials. So either every access to the database uses the same credentials (a significant negative when considering the loss of an audit trail) or we return to managing duplicate sets of credentials – one set at the application delivery tier and another at the database – which, as noted earlier, incurs additional management and security risks.

YOU CAN'T WIN FOR LOSING

Ultimately the decision to load balance SQL must be a combination of business and operational requirements. Many organizations successfully leverage load balancing of SQL as a means to achieve very high scale.
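The shared-credential trade-off of connection pooling described above can be made concrete with a toy sketch. Everything here is hypothetical: "svc_app" stands in for a real service account, and the pool holds stub objects where a real pool would hold driver connections. The point is that identity is fixed when each connection opens, so the database sees the service account no matter which application user issued the query — which is precisely how the audit trail is lost:

```python
# Toy connection pool illustrating the shared-credential problem.
# "svc_app" is a hypothetical service account; a real pool would hold
# open driver connections instead of these stub objects.

from collections import deque

class PooledConnection:
    def __init__(self, user: str):
        self.user = user  # identity is fixed when the connection is opened

class Pool:
    def __init__(self, size: int, user: str):
        self._idle = deque(PooledConnection(user) for _ in range(size))

    def execute(self, app_user: str, sql: str) -> str:
        conn = self._idle.popleft()  # reuse an already-open connection
        try:
            # The database sees conn.user, not app_user: whoever ran the
            # query, the audit log records only the shared service account.
            return f"{sql!r} executed as {conn.user} (requested by {app_user})"
        finally:
            self._idle.append(conn)

pool = Pool(size=2, user="svc_app")
print(pool.execute("alice", "SELECT 1"))
print(pool.execute("bob", "DELETE FROM t"))
# Both statements reach the database under the same "svc_app" identity.
```

The alternative — opening a dedicated connection per application user — restores the audit trail but gives up exactly the reuse that made pooling attractive.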
Generally speaking, the resulting solutions – such as those often touted by eBay – are based on sound architectural principles such as sharding, are designed as strategic solutions rather than tactical responses to operational failures, and rarely involve inspection of inline SQL commands. Rather, they are based on the ability to discern which database should be accessed given the function being invoked or the type of data being accessed, and then use a traditional database connection to connect to the appropriate database. This does not preclude the use of application delivery solutions as part of such an architecture, but rather indicates a need to collaborate across the various application delivery and infrastructure tiers to determine the strategy most likely to maintain high availability, scalability, and security across the entire architecture. Load balancing SQL can be an effective means of addressing database scalability, but it should be approached with an eye toward its potential impact on security and operational management.

Related reading:
- What are the pros and cons to keeping SQL in Stored Procs versus Code
- Mission Impossible: Stateful Cloud Failover
- Infrastructure Scalability Pattern: Sharding Streams
- The Real News is Not that Facebook Serves Up 1 Trillion Pages a Month…
- SQL injection – past, present and future
- True DDoS Stories: SSL Connection Flood
- Why Layer 7 Load Balancing Doesn't Suck
- Web App Performance: Think 1990s

F5 Predicts: Smart Application Network Trends – Education Goes Personal
This is a localised version of the original article here. Technology has transformed every industry, and education is no exception; the topic of education has never received as much attention as it does today. I think we can all agree that education has undergone major changes: students and teachers are no longer confined to a classroom with a blackboard, as they once were. The Internet is now the principal enabling mechanism by which students come together, spread ideas and uncover economic opportunities. Education data has become a hot topic, changing the way students learn. According to Steven Ross, a professor at the Centre for Research and Reform in Education at Johns Hopkins University, using data to build customised education will be the key driver of student learning in the future. This technological revolution has produced a large number of online learning courses that anyone can access through a smart device. A two-year assessment of the massive open online courses (MOOCs) created by Harvard and MIT found 1.7 million course registrations across 68 MOOCs [2], equivalent to about 1 million participants, each taking 1.7 courses on average. This equity of education is undoubtedly providing vast opportunities for students around the world and improving the way they obtain an education. With more than half a million apps on platforms such as iOS and Android, both teachers and students can easily obtain digital resources on any subject. As education evolves in the digital era, educational institutions should note the following:

Application-centric scale and security

The emergence of numerous MOOC providers, such as Coursera and edX, has changed the traditional geographical and technological boundaries of education today. Digital learning will continue to grow, and people increasingly need a friendly, seamless learning environment. Moreover, advances in education technology bring new opportunities for governments and enterprises. They will be most effective if these organisations can rapidly scale and adapt to an entirely new digital world – making information services easy to deliver and access while keeping them secure. Many educational institutions have user populations as large as those of big multinational corporations, and face the same scaling problems in application delivery. The goal is no longer simply how to give students a fast connection, but how quickly content can be provisioned and served, and how seamless the user experience can be. Traditional methods can no longer provide our customers with the horizontal scaling they need. They require an intelligent, flexible architecture to support the deployment and management of applications and resources. The key, therefore, is to build an application-centric infrastructure that lets curriculum be delivered to the user base quickly, while securing user access and traffic across the whole environment.

Ensuring seamless connectivity

We live in a Gen-Y world that demands a high degree of convenience and speed from everyone and everything. That demand for convenience also extends to the way education is delivered to students. Furthermore, the Internet of Things (IoT) has opened entirely new possibilities, giving teachers more ways to educate their students. Whether or not teaching and learning happens through connected devices such as a Smart Board or an iPad, what matters most is ensuring seamless access to data and content. We rely more and more on Internet bandwidth, and students make heavy use of the web, so textbooks are no longer the primary means of education. This shift lets educational institutions build more personalised curricula based on the data they gather about students and their coursework.

Duty of care and security obligations

Cloud technology continues to test and transform education worldwide. Educational institutions are choosing a centralised services model from which they can easily select the services they want delivered to students to enhance the learning experience. Institutions therefore bear a duty of care for the choice of content and the way students obtain services. Through policy, they can use strengthened user identification and access permissions to ensure the correct course content is provided to each enrolled student. By strengthening app security, educational institutions can effectively protect their environments against malware and virus threats, and educators can be confident that students obtain only the content they are actually meant to obtain.

F5 provides the solution

BIG-IP LTM gives educational organisations a solid foundation, supporting the provisioning, optimisation and delivery of services. It lets institutions publish applications to the Internet in a fast, timely manner within a controlled and secure environment. F5 solutions provide the performance and horizontal scalability needed to ensure the highest levels of throughput. At the same time, BIG-IP APM lets schools use virtual desktop infrastructure (VDI) applications, scaling up and down without installing costly VDI gateways, with centralised security. Furthermore, schools can develop custom iApps to deliver applications quickly and consistently, and to reconfigure applications already published to the Internet in a secure, seamless and manageable way. BIG-IP Application Security Manager (ASM) provides application-layer security, protecting vital educational assets and the applications and content being continuously published. ASM allows educational institutions to build tailor-made security policies that protect every application seamlessly, ensuring all applications are delivered securely.

Tomorrow's digitised education

It is hard not to notice the profound impact technology has had on education. Technology in the digital era has created a new level of personalised learning. The time is ripe for the digitisation of education, and technology must stand at the front line to ensure the security, scalability and delivery of content and data. The equity of education enabled by technology helps solve problems of access to education, language, affordability, distance and equality. Moreover, with sound policies in place, it breaks down geographical boundaries and enables the mass delivery of quality education.

On Cloud Nine: Lucky 7 questions you have about Singapore's journey to becoming a Smart Nation
The Infocomm Development Authority (IDA) wants to make Singapore the world's first Smart Nation, and this vision means connecting devices, things and people to provide a better quality of life in an era of mobility, urban density, an aging population and so on. IDA's executive deputy chairman, Steve Leonard, has said that when tackling difficult urban challenges in areas such as healthcare and energy, enterprises in Singapore need to capture and analyze massive amounts of data, and use that situational awareness to take meaningful actions (link). From a technology perspective, cloud has reached a tipping point in the enterprise. An exciting new era of cloud deployments is being ushered in, one characterised by high levels of flexibility, agility and innovation. Today, cloud is no longer just a buzzword, but an integral fabric of the modern enterprise. Conversations have now shifted from cloud deployments to optimizing those resources and thus improving the overall user experience.

1. Singapore's Smart Nation vision has entered the "build" phase. What gives?

The focus on infrastructure and services will serve as the nation's framework. There are three areas of innovation: Smart Logistics, Smart Nation Tech Challenges and Smart Health-Assist. The vision to connect devices, things and people is a grand one, and starts with ensuring the nation's framework is built on a strong foundation. Applications and connectivity are at the heart of this vision, and the technologies enabling the flow of information are increasingly cloud-based. Enterprises are fast adopting a hybrid-cloud infrastructure, so sensitive data can be stored in a private cloud while the public cloud is leveraged for computational resources to run less critical applications.

2. To cloud or not to cloud? – That is the question!

As early as 2013, 83% of Singaporean companies felt they had already experienced the financial advantages of cloud deployments.
This is 16% more than the global average (link). The journey of cloud adoption is aligned with the Smart Nation initiative, and Singapore is a significant investor in cloud adoption. State initiatives aside, a question to ask is "to cloud or not to cloud?" The many benefits of cloud adoption include quicker disaster recovery times and increased collaboration amongst employees, since they are able to sync up and work on documents and shared apps simultaneously. All of this results in a positive business impact as productivity goes up. More importantly, the cloud provides business agility, allowing companies to scale their information infrastructure up and down in a relatively short time frame, sometimes with the benefit of paying only for the capacity consumed. "Pay to use" versus "buy to depreciate" makes for a better financial argument, which generally goes over well with CFOs. As technologies such as IoT become mainstream and Singapore moves forward to becoming a Smart Nation, the right question to ask is "How do we effectively deploy and maximise the potential of cloud?"

3. Right… so, how can we fully maximise the potential of cloud and turn it into a positive business impact?

Business has reached the tipping point of cloud computing, with cloud utilised both inside and outside the enterprise. To fully maximize the potential of cloud, there are four notable considerations for an enterprise cloud strategy.

- Applications: Companies today run a remarkable number of workloads within their IT environments, with some enterprises running more than 100 concurrently. Most of these applications demand differing sets of requirements and characteristics. However, as cloud-based services start to demonstrate the capability and maturity to run core workloads, confidence in off-premise solutions is increasing. As a result, today's enterprises are gaining more confidence in migrating critical workloads to a cloud environment.
- Business decision-makers: The self-service nature of cloud solutions is starting to move the decision-making process away from IT alone, into one that involves multiple stakeholders and business leaders. More and more, departmental heads will play a major role in identifying needs and shortlisting cloud solutions. Compliance/risk directors then need to take the lead in evaluating solutions and managing risks, while the entire C-suite makes the final purchase decision.

- Customers: Cloud, and indeed IT in general, has traditionally focused on the internal enterprise and benefits such as cost savings, resource optimization and business agility. However, this ignores a key segment of the IT user pool – the customer! Forward-thinking businesses are now beginning to evaluate what cloud means to their customers and how they can leverage it to enhance the customer experience.

- Defence: Security and privacy of IT environments are perennial topics in any cloud discussion, whether it is about apps, business or customers. Security is often highlighted as the biggest impediment to adopting cloud services or choosing service providers. Security considerations should never be an afterthought in cloud migration planning, and should be considered and deliberated extensively prior to any move to the cloud. A "security-first" approach to a cloud strategy will ensure that the move to the cloud does not cause any major operational or internal policy issues, and will ensure a smooth customer experience. This should be complemented with a "follow the apps" defensive posture, where app security services front the application wherever it resides.

4. Where's the future of cloud headed?

From optimization to orchestration. Today, the primary use of cloud services is to optimize and streamline conventional business processes. This will change. Enterprises will next leverage cloud services to automate business processes and drive business transformation.
There will also be more collaborative decision-making in cloud service procurement. The role of the CIO is set to shift from information to innovation. With the inclusion of customers in the IT user pool, enhancing the customer experience through high availability and performance of business apps is crucial. Cloud will continue to evolve.

5. Is Cloud safe? As more applications and technologies become increasingly cloud-based, especially as we mature into a Smart Nation, how can we ensure that information transferred over the cloud is safe?

"Every battle is won before it is fought," says Sun Tzu. This is also the philosophy Singapore has adopted in its march towards being a Smart Nation. Security continues to be one of the largest barriers to cloud adoption. It is also a key consideration in a hyper-connected environment, and the prolific use of applications adds an additional layer of challenge. Organisations generally do a decent job of securing their infrastructure, but face challenges when securing applications, regardless of whether those applications are hosted in-house, in a cloud environment or both. The security strategy should encompass the network and infrastructure, applications and web assets, endpoints and devices, and user behaviour. Security is everyone's business, and it demands foresight.

How can Cloud benefit enterprises as Singapore moves to become a Smart Nation? Companies are investing in cloud and using it for competitive reasons. 77% of senior information technology executives place high importance on digital transformation and count it as a key factor in driving the business growth of their organisations (link). Improving operational excellence and the customer experience are among the reasons why cloud adoption is on the rise. Innovations in IoT are evolving and continue to shape how people use and interact with technology.
New devices will emerge, and technologies will evolve with them, which in turn will shape how information is delivered to users. In a Smart Nation where hyper-connectivity is at the heart of everything, accessing information and applications securely and seamlessly is key, and cloud will play a crucial part in its success.

OK, so cloud deployment and the Smart Nation are highly intertwined, but the investment outlay and management complexity remain key barriers. How can we manage this efficiently? That perception is commonly associated with building out a cloud strategy, but the key consideration is knowing how applications are consumed and the corresponding services they need. In reality, not all applications will be delivered from the cloud, owing to the nature or intent of the application, especially where data is highly sensitive or operations must be managed securely in-house. We will likely see the emergence of hybrid cloud architectures requiring seamless management and orchestration services, with a balanced security posture both in-house and in the cloud. This will be especially applicable to the delivery of citizen services in an aspiring smart nation like Singapore, where mobile and internet penetration rates are high and technology adoption is prevalent in every aspect of daily life. The expectation of service on demand will increase as IoT adoption becomes mainstream and interconnects with social platforms. Architecting the infrastructure from this perspective allows for better, more efficient management and reduces the cost of deploying cloud. Organisations and governments alike are already starting to build out their own cloud strategies in an attempt to drive business growth and national transformation.
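The hybrid approach described above, keeping sensitive workloads in-house while pushing others to the cloud, can be sketched as a simple per-application placement policy. The attributes, thresholds and example apps below are illustrative assumptions, not a prescribed methodology:

```python
# Illustrative sketch of a hybrid-cloud placement policy: decide per
# application whether it should run in-house or in the cloud, based on
# how it is consumed and the sensitivity of its data. The attributes
# and rules are hypothetical examples only.
from dataclasses import dataclass

@dataclass
class App:
    name: str
    data_sensitivity: str      # "low", "medium", or "high"
    needs_elastic_scale: bool  # bursty, internet-facing demand?
    regulated: bool            # subject to data-residency rules?

def placement(app: App) -> str:
    """Return 'in-house' or 'cloud' for a given application."""
    # Highly sensitive or regulated data stays under direct control.
    if app.data_sensitivity == "high" or app.regulated:
        return "in-house"
    # Bursty workloads benefit most from pay-per-use elasticity.
    if app.needs_elastic_scale:
        return "cloud"
    # Everything else defaults to cloud for cost efficiency.
    return "cloud"

portfolio = [
    App("citizen-portal", "medium", True, False),
    App("tax-records", "high", False, True),
    App("internal-wiki", "low", False, False),
]
for app in portfolio:
    print(f"{app.name}: {placement(app)}")
```

In practice the decision involves far more criteria (latency, licensing, integration dependencies), but making the policy explicit per application is what allows a hybrid estate to be managed and orchestrated consistently.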
In an increasingly connected world where mobility is driving the productivity and consumption of information, cloud adoption in a hyper-connected Smart Nation will, with the right considerations and strategy, spur productivity and improve customer satisfaction. At the end of the day, a cloud strategy is just one of many means to an end: the end of becoming a smart nation, a nation where citizens and corporations alike are empowered to harness technology to drive growth.

Cloud Security Concerns: Unfounded Suspicion?
This blog is adapted from the original post here. Enterprise adoption of cloud technology brings many undeniable benefits, including cost savings, business agility, and better productivity for employees who use multiple computing devices. According to IDC's fifth annual end-user survey, Asia-Pacific CIOs increased spending on cloud services and technologies by 50% in 2013, to US$7.5 billion. Moreover, they have become more selective about which cloud models to use and which workloads to run in the cloud. In a recent Vendor Spotlight report, IDC noted that this shift raises the level of complexity, particularly around application management: where applications should reside, and whether administrators can maintain appropriate security as the network grows.

Despite all these positives, one critical, all-encompassing negative is holding enterprises back from the cloud: security, which organisations globally and across Asia rank as their top priority.

Most people believe the cloud is less secure than the traditional data centre, or that no perfect solution yet exists for specific security concerns such as data breaches. Neither is true. What really unsettles end users is the loss of control.

Despite this hesitation about moving to cloud computing, the cloud in fact offers more layers of security than a traditional data centre. Cloud service providers have a strong incentive to deliver the best possible security, because their business and reputation depend on it. They typically invest in specialised technologies and dedicated staff to minimise security threats to the fullest extent possible. Furthermore, a growing body of cloud security and data protection legislation helps ease organisations' unease when unexpected attacks occur.

However, once CIOs move applications from their data centre to the cloud, they cede part of the control over their overall data protection. For this reason, beyond choosing a good cloud service provider to build confidence, CIOs need to strengthen security where they can still exert control, such as at the application layer.

Facing the new paradigm of mobility, cloud, and hybrid networks, how can enterprises solve the problems of network, application, and data access? With so many new mobile devices under only limited corporate control, and with applications and data scattered across networks, multiple clouds, and SaaS environments, how can enterprises ensure fast, appropriate, authenticated, and authorised access?

Identity is only the vanguard of access control. The context in which a user requests access, and the environment from which the request is made, are equally essential to securing that access. Properly governing the "who", "what", "when", "where", "why", and "how" makes it possible to secure, strengthen, and differentiate user access to networks, clouds, applications, and data, regardless of where those resources reside or how they are composed.

Sharing user identity efficiently and securely across networks, clouds, applications, and data, wherever users may be, is now a necessity. Yet there are many challenges: identity silos, on-premise identities for cloud and SaaS applications and data, and user password fatigue (leading to weak usernames and passwords) that is easily exploited. The solution is to build an identity bridge. Federation, through industry standards such as SAML, establishes a chain of trust between networks, clouds, and applications, eliminating the need for cumbersome identity directory replication and insertion. Identity and access are controlled by the enterprise and authenticated between the enterprise and its cloud and SaaS service providers. The enterprise can centrally manage user authentication and termination, and federation provides visibility into, and control over, access.

F5 uses the Security Assertion Markup Language (SAML) to exchange authentication and authorisation data between identity providers and service providers, helping organisations strengthen security and access policies at the application layer. They can then enforce policies consistently and ensure users can access critical services across applications and environments, making cloud deployments simpler and inherently more secure.

IT Security Is Not "One Size Fits All"
This is adapted from the original post by Matt Miller. Today's security landscape tends to be highly complex, largely because of the increasingly sophisticated nature of cyber attacks, especially from an administrator's point of view. For example, distributed denial-of-service (DDoS) attacks now reach speeds of 400 Gbps and target both the network and application layers. Clearly, attackers continue to evolve, developing new methods to bypass traditional defences such as firewalls.

For enterprises facing application-layer DDoS attacks, the challenge to overcome is distinguishing human traffic from bot traffic.

Moreover, the motives behind attacks are increasingly complex, particularly in political and economic terms. The NSA leaks, in which former contractor Edward Snowden disclosed classified information from the governments of the United States, the United Kingdom, Australia, Canada, and New Zealand, were a stark reminder that hacktivism must be taken seriously. In addition, one of the biggest threats to IT security today is organised cyber theft and fraud, as sophisticated criminals increasingly understand the substantial financial gains to be made from online crime.

Ensuring appropriate protection against cyber attacks has therefore become a critical issue for enterprises. An effective security strategy must cover every device, application, and network that employees access, across the enterprise's own infrastructure. Traditional approaches such as next-generation firewalls and reactive security measures can no longer effectively counter the new types of attacks. Security today places far more emphasis on protecting applications, and on encrypting and protecting user identities, than on the underlying network infrastructure. This is because the network infrastructure has become less static and has proven to be merely a vehicle for running complex applications.

What enterprises need is a flexible, comprehensive security strategy that combines DNS security and DDoS protection, network firewalling, access management, and application security with intelligent traffic management.

A Frost & Sullivan report (Frost Industry Quotient) points to a market trend of convergence between Web Application Firewall (WAF) and Application Delivery Controller (ADC) platforms, which prompted F5 to develop a new vision for the application delivery market called F5 Synthesis. This vision delivers a high-performance distributed network fabric that protects the basic elements of an application (network, DNS, SSL, HTTP) against sophisticated DDoS threats.

F5 Synthesis uses tested reference architectures to ensure application security and availability as customers move towards software-defined data centres (SDDC). Furthermore, F5's DDoS protection offers the most comprehensive attack protection on the market today. The average DDoS attack reaches 2.64 Gbps, and after upgrading to F5's BIG-IP platform, servers can handle attacks of up to 470 Gbps. This not only provides ample bandwidth to mitigate DDoS attacks; the extra capacity lets online companies maintain normal business operations even while under attack.

Security is no longer a "one size fits all" proposition. End users expect high-performance services, and enterprises must ensure that the security solutions they deploy do not become a bottleneck. We can expect to see multi-dimensional or "cocktail" attacks, in which DDoS is combined with application-layer attacks and SQL vulnerability exploits. Traditional firewalls are therefore no longer an effective defence; enterprises need to adopt a multi-stack security approach combined with internal governance processes. Against threats coming from different devices and multiple vectors, single-purpose security appliances will be replaced by highly capable multi-purpose devices.

Proud to win, Humbled to serve – F5 wins APAC Awards
In recent weeks, the award bells have been ringing for F5 in Asia Pacific. We've bagged not one but four awards: a clear testament to our continued market and technological leadership. F5 has always been committed to delivering a holistic customer experience, and we are humbled to be recognised by various industry experts and analysts for our work. These awards belong to every F5er in Asia Pacific. The dedication to empowering our partners and delivering solutions and services to our customers has taken F5 from humble beginnings to being a trusted partner in optimising and securing thousands of apps in enterprise hybrid environments.

This year, F5 bagged an award from Frost & Sullivan highlighting our leadership in Application Delivery Controllers (ADCs). It is a very special accolade for F5, as this marks the 7th year we have been named ADC Vendor of the Year at Frost & Sullivan's Asia Pacific ICT Awards: an achievement we hold dear and do not take lightly. The winners are determined via in-depth interviews, analysis and extensive secondary research conducted by analysts, and companies are evaluated on revenue, market share, capabilities and overall contribution to the industry. F5 has grown beyond our data centre roots to offer a robust hybrid portfolio spanning purpose-built hardware to as-a-service offerings.

At Network World Asia's Information Management Awards 2015, F5 was honoured to be named Application Delivery Controller of the Year. Apps are taking centre stage in most business transactions, and enterprises are challenged to extend management of their IT infrastructure to include apps hosted in the cloud. We are positioned in a sweet spot, more than ever, to reduce complexity and embrace hybrid infrastructure deployments without forsaking the benefits our customers have seen in the data centre. Significantly, this year we announced our entrance into the cloud service delivery space with the introduction of the F5 Silverline platform.
Security runs in F5's DNA. This year, our awards from Computerworld Malaysia and Network World Asia reflect our rock-solid technology in DDoS protection. Regional CIOs and IT heads of end-user organisations were invited to vote for both awards. They are the industry's stamp of approval on F5's capabilities and track record in keeping enterprises safe from vulnerabilities and threats in cyberspace. These awards recognise the care we take in each step of product creation, from the drawing board to the end-user experience and support. Once the furore is over, we will continue to partner with enterprises and help businesses achieve their objectives with best-in-class solutions. For now, we'll pop the champagne and raise a toast to our wins!

UAE Cybersecurity Threat Landscape Growing in Intensity and Complexity
Leading UAE IT decision-makers agree that cybersecurity threats are growing in intensity and scale across the region. According to a new survey commissioned by F5 Networks, 81% of surveyed IT decision-makers believed their organisation was more vulnerable than ever to cybersecurity threats. 82% ranked their organisation's vulnerability to cybercrime, hacking and "hacktivism" as "very" or "extremely" high, and 79% agreed that it is more difficult than ever to protect their organisations from the associated security threats. Worryingly, only 8% are completely confident that their organisation has consistent IT security measures across its entire IT network. 34% said their marketing and sales efforts were most vulnerable to attacks, 28% cited email, 27% employee data and 24% customer information.

Common cybersecurity threats include distributed denial-of-service (DDoS) attacks, phishing/spear-phishing emails, data theft, "zero-day" software assaults, web application exploits, and website defacement. The top cybersecurity challenges listed in F5 Networks' survey include changing motivations for hacking (33% of respondents), the virtualisation of server desktops and networks (31%), difficulty in managing a variety of security tools (29%), the increasing complexity of threats (29%), the shift from datacentre-focused infrastructure to the cloud (25%) and the move from traditional client-server applications to web-based applications (24%). In order to adapt and cope, 57% of decision-makers wanted a better understanding of the different types of security threats, 24% called for consolidated management of their different security tools, and 20% wanted a stronger focus on security issues from management.