The Internet of Sports
Did you see what the NFL is doing this year with sensors? Earlier this month they announced a partnership with Zebra Technologies, a company that provides RFID chips for applications from 'automotive assembly lines to dairy cows' milk production.' This season there will be sensors in the players' shoulder pads to track all their on-field movements, including acceleration rates, top speed, length of runs, and even the distance between a ball carrier and a defender. Next year they'll add sensors for breathing, temperature and heart rate. More stats than ever, and it could change the game forever. Imagine coaches being able to examine that data and instantly call a play based on it. Play by play. To me it somewhat takes away that 'feel' for the game flow, but having data to confirm or deny that feeling might make for exciting games. Maybe lots of 0-0 overtimes or a 70-0 blowout. Data vs. data. Oh, how I miss my old buzzing electric football game.

The yardsticks will have chips, along with the refs, and all that data is picked up by 20 RFID receivers placed throughout the stadium. Those, in turn, are wired to a hub and server which processes the data. Data is transmitted to the receivers 25 times a second, and the quarter-sized sensors run on a typical watch battery. The data goes to the NFL 'cloud' and is available in seconds. The only thing without a sensor is the ball, but that's probably coming soon since we already have the 94Fifty sensor basketball. We've had NASCAR's RACEf/x for years, and this year they are going to track every turn of the wrench with RFID tracking in the pits and sensors on the crew. Riddell has impact sensors in its helmets to analyze, transmit and alert if an impact exceeds a predetermined threshold.
They can measure the force of an NBA dunk; they can recognize a pitcher's grip and figure out the pitch; there's a bat sensor that can measure impact with the ball, the barrel angle of a swing, and how fast the hands are moving; and they are tracking soccer player movement in Germany. Heck, many ordinary people wear sensor-infused bracelets to track their activity. We've come a long way since John Madden sketched over a telestrator years ago, and with 300-plus-pound players running around with sensors, this is truly Big Data. It also confirms my notion that the IoT should really be the Internet of Nouns - the players, the stadiums and the yardsticks.

ps

Related:
Player-tracking system will let NFL fans go deeper than ever
Fantasy footballers and coaches rejoice—NFL players to wear RFID tags
More sensors are coming to professional sports, but research outpaces business models
Why This Nascar Team Is Putting RFID Sensors On Every Person In The Pit
Impact Sensors: Riddell InSite Impact Response System
Fastpitch Softball League Adds Swing Sensors to its Gear

Add a Data Collection Device to your BIG-IQ Cluster
Gathering and analyzing data helps organizations make intelligent decisions about their IT infrastructure. You may need a data collection device (DCD) to collect BIG-IP data so you can manage that device with BIG-IQ. BIG-IQ is a platform that manages your devices and the services they deliver. Let's look at how to discover and add a data collection device in BIG-IQ v5.2. You can add a new data collection device to your BIG-IQ cluster so that you can start managing it using the BIG-IP device data. In addition to Event and Alert Log data, you can view and manage statistical data for your devices. From licensing to policies, traffic to security, you'll see it all from a single pane of glass. But you need a DCD to do that.

So, we start by logging in to a BIG-IQ. Then, under the System tab, go to BIG-IQ Data Collection and, under that, click BIG-IQ Data Collection Devices. The current DCD screen shows no devices in this cluster. To add a DCD, click Add. This brings us to the DCD Properties screen. In the Management Address field, we add the management IP address of the BIG-IP/DCD we want to manage. We then add the Admin username and password for the device. For Data Collection IP Address, we enter the transport address - usually the internal Self-IP address of the DCD - and click Add. The process can take a little while as the BIG-IQ authenticates with the DCD and adds it to the BIG-IQ configuration, but once complete, you can see the device has been added successfully.

You'll notice that the DCD has been added but there are no Services at this point. To add Services, click Add Services. In this instance, we're managing a BIG-IP with multiple services, including Access Policies, so we're going to activate the Access services. The listener address is already populated with the management address of the DCD, so we simply click Activate. Once activated, you can see that it is Active.
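For readers who prefer scripting the same steps, here is a rough sketch of assembling the DCD properties the screen asks for as a REST payload. The endpoint path, property names and addresses below are illustrative assumptions, not the documented BIG-IQ API; check the BIG-IQ REST reference for the real resource names before using anything like this.

```python
# Sketch: building a DCD registration request for BIG-IQ.
# Property names and the endpoint are assumptions for illustration,
# not the documented BIG-IQ API.
import json

BIGIQ_ENDPOINT = "https://bigiq.example.com/mgmt/..."  # hypothetical

def build_dcd_registration(mgmt_ip, admin_user, admin_password, transport_ip):
    """Assemble the same properties the DCD Properties screen asks for."""
    return {
        "managementAddress": mgmt_ip,           # management IP of the BIG-IP/DCD
        "username": admin_user,                 # Admin credentials for the device
        "password": admin_password,
        "dataCollectionAddress": transport_ip,  # usually the internal Self-IP
    }

payload = build_dcd_registration("10.1.1.10", "admin", "secret", "10.2.2.10")
request_body = json.dumps(payload)
```

The point is only that the GUI walkthrough maps one-to-one onto four fields; whatever the real API calls them, those are the values you need on hand.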
When we go back to the Data Collection Devices page, we can see that the Access Services have been added and the activation worked. Congrats! You've added a Data Collection Device! You can also watch a video demo of How to Add a Data Collection Device to your BIG-IQ Cluster.

ps

Related:
Lightboard Lesson: What is BIG-IQ?

Interesting Data versus Useful Data
I'm sure there are few people who have escaped some form of reporting or metrics collection in their careers. Pondering my various roles and responsibilities for a moment, I think I've measured almost everything: SPAM emails caught, megabytes browsed, web-cache hits, IOPS, virtual machine density, tweets and re-tweets... In most cases, the value of certain metrics over others has been clear to me. As a contractor I've measured hours worked - that's how I got paid. As an engineer I've measured web usage as a means to justify organizational spend for campus access. In sales, I've worked with customers on creating metrics-based business cases to justify both investment and change in practice. However, I'll be honest with you, there were a few metrics that left me scratching my big bald head… This brings me on to 'Big Data', "an all-encompassing term for any collection of data sets so large and complex that it becomes difficult to process them using traditional data processing applications." Sounds interesting, right? But who's actually analyzing this data to drive business value? What are their use cases? Or, from a different angle, who's misusing the data to get the answers they want, instead of the guidance they need? Jason Spooner of Social Media Explorer grabbed my attention with his article titled, "BIG DATA IS USELESS: unless you have a plan for how to use it", highlighting that, "There is no inherent value to data. The value comes from the application of what the data says." My thoughts exactly! While I know I've come across a little on the skeptical side today, it's my endless questioning and curiosity that's kept me employable thus far. Consequently, I feel compelled to ask, is there a danger that Big Data is merely fueling the behavioral addicts among us? Is this an enterprise-grade obsessive compulsive disorder? Or maybe it's the skeptics like me that will inhibit future application. J. Edgar Hoover was heavily criticized over his fingerprint database… ok, not my finest example.

In my current role I'm focused heavily on Software-Defined Networking. Unfortunately, SDN has been largely driven by the desire to solve implementation issues - how quickly an organization can deploy a new network and improve its time to market for new applications and services. However, I believe that there is far more to gain from applying software-defined principles to solving post-deployment problems. Consider the benefits of a network topology driven by real-time data analysis. A network that can adapt based on its own awareness. Now that would be cool! I appreciate that the control-plane abstraction driven by SDN is step one: allowing us to break away from management silos and steer towards a policy-driven network. But that is only the beginning of what a software-defined approach can offer. I, for one, look forward to the day when we see these data center network policies being driven by data analysis, both historical and real-time, to deliver better business services. Dare I call it DC Agility 2.0…?

Play Ball!
...Oh Wait, Let Me Check the Stat-Cloud First! It is like an SAT question: the Cincinnati Reds' Billy Hamilton has a 10.83-foot lead off first base, can hit a top speed of 21.51 mph and clocked a jump of 0.49 seconds. If the Milwaukee Brewers catcher took 0.667 seconds to get the ball out of his glove to throw to second and the ball is travelling at 78.81 mph, is Hamilton safe or out?

A few weeks ago I wrote about the Internet of Sports, and I can't believe I missed this one. But with the MLB playoffs in full gear, I didn't want it to slip through the IoT cracks. Sports analytics has been around for a while, but never to this degree. Just like the NFL, Major League Baseball is equipping stadiums with technologies that can track moving players, flying baseballs and accurate throws. Beyond the RBIs, hits and stolen bases that appear on the back of trading cards, new technology (and software) also gathers stats like pop-fly pursuit range or average ground ball response time. Professional sports teams have always tracked their players' performance, and often such milestones are included in the player's contract: a bonus for so many games played, or home runs hit, or some other goal. With all this new detailed data, teams can adjust how they train, how they prepare for games and even a player's value for personnel moves like trades. For the 2014 season, only 3 stadiums (Mets, Brewers, Twins) had the new Field f/x (Sportvision Inc.) system, but the league plans to have all 30 parks equipped for the 2015 season. Field f/x can show data such as the angle of elevation of a batted ball, the highest point in its trajectory, and the distance covered and top speed attained by a player attempting to field a ball. Of course, all this can then be crunched for cool graphics during a replay. Cameras, sensors and software are all now part of the game. So are data centers, clouds and high-speed links. All this data needs to be crunched somewhere, and more often than not it is in a cloud environment.
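For fun, the stat-line riddle at the top can be sketched as a back-of-the-envelope physics problem. Everything beyond the quoted numbers is a simplifying assumption I've added: 90-foot base paths, a runner who is instantly at top speed, and a guessed ~1.4-second pitch delivery time (the runner's clock starts at the pitcher's first move, but the catcher's clock only starts when the ball arrives). This is a toy model, not real Field f/x math.

```python
# Back-of-the-envelope model of the Hamilton steal question.
# Quoted stats: 10.83 ft lead, 21.51 mph top speed, 0.49 s jump,
# 0.667 s catcher exchange ("pop"), 78.81 mph throw.
# Assumed (not from the article): instant acceleration, 90 ft bases,
# home-to-second = 90*sqrt(2) ft, ~1.4 s pitch delivery.
import math

MPH_TO_FPS = 5280 / 3600  # 1 mph ≈ 1.4667 ft/s

lead = 10.83                        # ft off first base
runner_speed = 21.51 * MPH_TO_FPS   # ft/s
jump = 0.49                         # s after the pitcher's first move

pitch_time = 1.4                    # s, assumed delivery to the plate
pop_time = 0.667                    # s, glove to release
throw_speed = 78.81 * MPH_TO_FPS    # ft/s
throw_dist = 90 * math.sqrt(2)      # home to second, ~127.3 ft

runner_time = jump + (90 - lead) / runner_speed
defense_time = pitch_time + pop_time + throw_dist / throw_speed

verdict = "safe" if runner_time < defense_time else "out"
```

With these assumptions the runner reaches second in about 3.0 seconds and the ball arrives in about 3.17, so he beats the tag by roughly 0.17 seconds - which happens to agree with the answer at the end of this post. Shrink the assumed pitch time below about 1.23 seconds and the call flips, which is exactly why teams want sensors measuring every one of these intervals.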
Add to that the connections to, and with, the fans at the stadium. Levi's Stadium, for instance, has 1200 access points and an app that allows you to order food, watch instant replays and know which bathroom line is the shortest. Our sports stadiums are becoming data centers. Announcer: Welcome to Exclusive Sponsor Data Center Field! Home of the Hypertext Transfer Protocols. Undefeated at home this year, the Prots look to extend their record and secure home field throughout the playoffs. And if you were wondering, Hamilton was safe.

ps

Related:
New Baseball Season Brings Tech to Track Player Skills
Major League Baseball brings new tech to the plate
Baseball All-Stars' Data Gets More Sophisticated With Field F/X
The Internet of Sports
Are You Ready For Some...Technology!!
Is IoT Hype For Real?

The Internet of Things and mobility driving HTTP and cloud
#IoT #bigdata #cloud The Internet of Things smells like opportunity for everyone. There is no industry that hasn't been touched by the notion of smart "things" enabling convenience or collaboration or control in every aspect of our lives. From healthcare to entertainment, from automotive to financials, the Internet of Things is changing the way we work, live and play. That's the view from the consumer side, from the perspective of someone using the technology made available by <insert vendor/provider here>. But before that consumer could get their hands on the technology - and the inevitable accompanying "app" that comes with it - the provider/vendor had a lot of work cut out for them. Whether it was building out licensing and activation servers, a remote control application, or a data exchange service, the provider of this "thing" and its supporting ecosystem of applications had to design and implement systems. One of those systems is inevitably related to storage and retrieval of data. That's because a key consideration of nearly all "things" is that while they pack quite the computing punch (compared, historically, to mobile devices), they are still constrained with respect to storage capabilities. These devices are generating more data than most people can fathom.

"Globally, smart devices represented 21 percent of the total mobile devices and connections in 2013, yet they accounted for 88 percent of the mobile data traffic. In 2013, on average, a smart device generated 29 times more traffic than a non-smart device. Globally, there were nearly 22 million wearable devices (a sub-segment of the M2M category) in 2013, generating 1.7 petabytes of monthly traffic." - Cisco Visual Networking Index

Not only are the things themselves incapable of storing the vast quantities of data generated in the long term, providers/vendors are unlikely to have the spare storage capacity readily available to manage it.
Even if they do, it's a rare organization that has in place the controls and multi-tenancy necessary to support storing such data with the privacy expected (and demanded) by consumers. Add to that the reality that these things are small and portable and often dropped into the most inconvenient of places by their owners (or their owners' children), and the result is a need to ensure off-thing storage of data. Just in case.

What that means is that the Internet of Things is driving the use of HTTP and solidifying its status as the "new TCP". Things communicate with applications to store and retrieve a variety of data. This communication is generally accomplished over the Internet - even though it may start out over a cellular network - and that means using the lingua franca of the web, HTTP. Additionally, HTTP is ubiquitous, the market for developers is saturated, and support for HTTP is built into most embedded systems today. So HTTP will be used to store and retrieve data for all these "things"; that seems a foregone conclusion. But what about storage and capacity?

Ready or Not

The question is whether the provider/vendor of the thing is going to take on the challenges of capacity and storage themselves or, as is increasingly the case, turn to the public cloud. The public cloud option has many benefits, particularly in that it's cheap, it already exists, and it's enabled with the APIs (accessible via HTTP) required to integrate it with a mobile app or thing. It's already multi-tenant and supportive of the level of privacy required by consumers, and it grows on-demand without requiring staff to spend time racking more compute and storage in the data center. It seems likely, then, that not only will things and mobility continue to drive the dominance of HTTP, but they will also increase use of public cloud services. Certainly there are industries and segments within the "things" and mobile app categories for which using public cloud is unacceptable.
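The store-and-retrieve pattern described above is, at its core, nothing more exotic than a well-formed HTTP request. Here is a minimal sketch of a "thing" preparing a telemetry POST; the endpoint URL, field names and token are all hypothetical stand-ins, since every real service defines its own API. The request is built but deliberately not sent.

```python
# Minimal sketch of thing-to-cloud telemetry over HTTP.
# The endpoint URL, field names and auth token are illustrative
# assumptions; any real service defines its own API.
import json
import urllib.request

ENDPOINT = "https://api.example-cloud.com/v1/devices/thermo-42/readings"

reading = {
    "ts": "2014-06-01T12:00:00Z",  # when the sample was taken
    "temperature_c": 21.5,         # the constrained device's payload
}

# Build (but don't send) the request: because plain HTTP is the
# lingua franca, any embedded HTTP stack can produce the same bytes.
req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(reading).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <device-token>",  # placeholder credential
    },
    method="POST",
)
```

That's the whole trick: the device keeps almost nothing locally, and off-thing storage is one ubiquitous, well-understood request away.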
My mobile banking and financial apps, for example, are not storing data anywhere but safely inside the (hopefully very secure) walls of their respective institutions. My Minecraft game on the Xbox 360, however, offers up "cloud" as a storage device, which means I can create new worlds 'til the cows come home. My smartpen synchronizes with Evernote. My iPhone is constantly nagging me to use iCloud because, well, it's available. With Google's acquisition of Nest, if its data and control applications weren't being run in Google's cloud, they probably will be in the future. The reality is that many organizations are not architecturally ready, from the network and operations perspective, to take on the challenges that will be encountered by a foray into the Internet of Things. But to let it pass them by is also not acceptable. That may very well drive organizations to the cloud to avoid missing these early days of opportunity.

The Internet of Things: The Only Winning Move is to Play
#IoT #BigData The lure of free and convenience has finally won

How many of you actually fill out the registration cards that come with your kid's toys? Anyone? Anyone? Bueller? Bueller? That's what I thought. Me neither. Not as a parent and certainly not as an adult. After all, they were just going to use it for marketing, right? Fast forward from our childhood to today, and the Internet of Things is rapidly changing everything. Right under our ... fingers. Take, for example, Construct Bots. Not literally, because my six-year-old will scream with rage, but as an example. You buy the toy, which is cool enough in and of itself, but then you get the free app for your <insert mobile platform of choice>. Now, when you buy the toy, you can scan a QR code (or enter a bunch of digits, your choice) in the app. That unlocks digital versions of pieces and parts, and your six-year-old is happy as a clam. And so is the company behind it. Because that's the modern version of a registration card. And it's part of what is contributing to the big data explosion, because that code has to be validated - most likely by an application sitting either in the cloud or at corporate headquarters. And that code is tied to an app that's tied to a device that's... well, you get the picture. For the price of developing an app and printing some codes on paper, the business gets to mine a wealth of usage and purchasing data that it could never have gotten back in the days of postcards and stamps and pens. There are hundreds - thousands - of examples of digital-physical convergence and new "things" connecting to the Internet across every industry and every vertical. Everyone is going to get in the game called the Internet of Things because, unlike the famous conclusion in War Games, the only winning move in this game is to play.

What's That Mean for You in the Data Center?
Most of the focus of the Internet of Things has been on the impact of an explosive amount of data and the pressure that's going to put on storage and bandwidth requirements, particularly on service providers. But one of the interesting things about all these wearables and Internet-enabled devices is that, for the most part, they're talking the lingua franca of the web. It may be transported via a mobile network or WiFi, but in the end the majority are talking HTTP. Which really means they're talking to an application. And if your success relies in part or wholly on the delivery of an application, you're going to need to deliver it. Securely, reliably and with attention to its performance requirements. The thing is that each of these applications is going to have a different set of requirements, sometimes for the same back-end application. That's the beauty of a service-oriented and API-based approach to applications (which is rapidly becoming known as microservices but hey, today's not the day to quibble about that): you can leverage the same service across multiple consumption models. Web, mobile, things. Doesn't matter; as long as they can speak HTTP they can communicate, share data, and engage consumers and employees. For a great read on microservices I highly recommend Martin Fowler's Microservices series. But an application used to collect critical health data from a wearable has different performance and reliability requirements than when it's used to generate a chart for the consumer. Yes, we always want it fast, but there's a marked difference between fast in terms of user experience and fast in terms of, "this data could save his life, man, hurry it up". The same application will have different performance and availability requirements based on its purpose. Not its location or its form factor, not its operating system or application version. Its purpose.
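As a toy illustration of purpose-differentiated delivery, here is a sketch in which one back-end service gets different policy knobs depending on declared purpose. The X-Request-Purpose header, the policy values and the purpose names are all invented for illustration; in practice traffic rarely labels itself this conveniently, which is part of the problem.

```python
# Toy sketch: one back-end service, different delivery policies by
# purpose. The X-Request-Purpose header is a made-up convention for
# illustration -- real requests rarely label themselves like this.

# Policy knobs per purpose: latency target (ms) and retry behavior.
POLICIES = {
    "critical-health-data": {"max_latency_ms": 50,  "retries": 3},
    "consumer-chart":       {"max_latency_ms": 500, "retries": 1},
}
DEFAULT = {"max_latency_ms": 1000, "retries": 0}

def select_policy(headers):
    """Pick a delivery policy from the declared purpose, if any."""
    purpose = headers.get("X-Request-Purpose", "").lower()
    return POLICIES.get(purpose, DEFAULT)

# A wearable streaming vitals vs. a browser drawing a chart,
# hitting the very same back-end application.
wearable = select_policy({"X-Request-Purpose": "critical-health-data"})
browser = select_policy({"User-Agent": "Mozilla/5.0"})
```

Note that the browser request, lacking any purpose signal, falls through to the default: exactly the gap the next paragraph describes, since real clients don't announce their purpose and something programmable has to infer it.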
And purpose isn't something that's easily discernible from simple HTTP headers or an application ID, and it certainly isn't extractable from ports and IP addresses. The L4-7 services responsible for ensuring the performance and reliability of, and access to, applications are going to need to be far more granular than they are today. They're going to have to match more than content type with an operating system to be able to determine how to route, optimize, enable access and secure the subsequent exchange of data. Programmability is going to play a much bigger role in a data center that wants not just to play this game but to win at it. Because it's only through the ability not only to extract data but to logically put 2 and 2 together and come up with the right policy to apply that we can possibly attain the level of granularity that's going to be necessary in the network to provide for these kinds of highly differentiated application policies. This world is coming, faster than you think. Every day there's a new wearable, a new toy, a new app and a new idea. As the footprint of computing power continues to shrink, we're going to see more and more and more of these "things" connecting us to the Internet. And a significant portion of what's going to make them successful is whether or not the network can keep them fast, secure and available.

F5 Friday: I am in UR HTTP Headers Sharing Geolocation Data
#DNS #bigdata #F5 #webperf How'd you like some geolocation data with that HTTP request?

Application developers are aware (you are aware, aren't you?) that when applications are scaled using most modern load balancing services, the IP address on incoming application requests actually belongs to the load balancing service. Application developers are further aware that this means they must somehow extract the actual client IP address from somewhere else, like the X-Forwarded-For HTTP header. Now, that's pretty much old news. Like I said, application developers are aware of this already. What's new (and why I'm writing today) is the rising use of geolocation to support localized (and personalized) content. To do this, application developers need access to the geographic location indicated by either GPS coordinates or the IP address. In most cases, application developers have to get this information themselves. This generally requires integration with some service that can provide it, despite the fact that infrastructure like BIG-IP, with its DNS services, already has it and has paid the price (in terms of response time) to get it. Which means, ultimately, that applications pay the performance tax for geolocation data twice - once on the BIG-IP and once in the application. Why, you are certainly wondering, can't the BIG-IP just forward that information in an HTTP header, just like it does the client IP address? Good question. The answer is that technically, there's no reason it can't. Licensing, however, is another story. BIG-IP includes, today, a database of IP addresses that locates clients, geographically, based on client IP address.
The F5 EULA, today, allows customers to use this information for a number of purposes, including GSLB load balancing decisions, access control decisions with location-based policies, identification of threats by country, location-based blocking of application requests, and redirection of traffic based on the client's geographic location. However, all decisions had to be made on the BIG-IP itself, and geographic information could not be shared or transmitted to any other device. A new agreement now gives customers the option to use the geolocation data outside of BIG-IP, subject to fees and certain restrictions. That means BIG-IP can pass on State, Province, or Region geographic data to applications using an easily accessible HTTP header. How does that work? Customers can now obtain a EULA waiver which permits certain off-box use cases. This allows customers to use the geolocation data included with BIG-IP in applications residing on a server or servers in an "off-box" fashion. For example, location information may be embedded into an HTTP header or similar and then sent on to the server for it to perform some geolocation-specific action. Customers (existing or new) can contact their F5 sales representative to start the process of obtaining the waiver necessary to enable the legal use of this data in an off-box fashion. All that's necessary from a technical perspective is to determine how you want to share the data with the application. For example, you (meaning you, the BIG-IP owner, and you, the application developer) will have to agree upon which HTTP header you'll use to share the data. Then voila! Developers have access to the data and can leverage it for existing or new applications to provide greater location-awareness and personalization. If your organization has a BIG-IP (and that's a lot of organizations out there), check into this opportunity to reduce the performance tax on your applications that comes from double-dipping into geolocation data.
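On the application side, consuming these headers is a few lines of parsing. X-Forwarded-For is the de facto standard named above; the X-Geo-Location header name below is only a stand-in for whatever name you and your BIG-IP owner agree on, and the region format is likewise an assumption.

```python
# Sketch: recovering client IP and geolocation from proxy-added headers.
# X-Forwarded-For is the de facto standard; "X-Geo-Location" is a
# stand-in for whatever header name both sides agree the BIG-IP
# will populate.

def client_ip(headers):
    """Leftmost X-Forwarded-For entry is the original client."""
    xff = headers.get("X-Forwarded-For", "")
    first = xff.split(",")[0].strip()
    return first or None

def client_region(headers):
    """Region injected upstream, e.g. 'US/WA' -- the format is up to you."""
    return headers.get("X-Geo-Location") or None

headers = {
    "X-Forwarded-For": "203.0.113.7, 10.0.0.1",  # client, then proxy
    "X-Geo-Location": "US/WA",
}
ip = client_ip(headers)
region = client_region(headers)
```

One caveat worth a comment in any real implementation: X-Forwarded-For is client-suppliable, so only trust values appended by your own proxy tier.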
Your users (especially your mobile users) will appreciate it.

Big issues for Big Data
It's one of those technologies that seem to divide opinion, but Big Data is beginning to make a real impact on the enterprise. Despite some experts saying that it's nothing more than hype, new figures from Gartner suggest that CIOs and IT decision makers are thinking very seriously about Big Data, how they can use it and what advantages it could generate for their business. The study from Gartner revealed that 64% of businesses around the world are investing in Big Data technology in 2013 or are planning to do so - a rise from 58% in 2012. The report also shows that 30% of businesses have already invested, while 19% say they plan to do so over the next 12 months and a further 15% say they plan to invest over the next two years. This shows that businesses are embracing Big Data to enhance their decision making. There is a huge amount of insight into a business, its workers and its customers just waiting to be discovered by capturing, storing and then analysing the huge volume of data that is being created these days.

But what we at F5 have noticed is that Big Data projects can sometimes fail because the underlying architecture isn't able to cope with all the data and the different devices attached to the network that want to access that data. Businesses must also have in place infrastructure that can scale as data and demand increase. The key to a successful Big Data project is enabling secure access to the data when it's needed. When I say secure, what I mean is that workers are only allowed to access certain applications: the ones they need to do their job. Controlling access to the data reduces potential security risks. To get the best out of Big Data it is also vital that it is able to travel around your network freely whenever it's needed.
That's a key point: getting the best decisions out of your data requires immediacy; making instant decisions and putting them into action is what can give a business the edge in this increasingly competitive world we operate in. It's no use if the relevant data cannot be accessed at the time it's needed, or if the network is so saturated that it takes too long to get where it needs to be. If the moment passes, so does the opportunity. This is what I mean when I talk about having the right infrastructure in place to enable Big Data... access, security and an intelligent network that can always guarantee that data can move around freely.

Privacy for a Price
A few weeks ago, I went to my usual haircut place and, after the trim, presented my loyalty card at the register. You know, the heavy paper ones that get stamped or hole-punched for each purchase; after a certain number of paid visits, you receive a free haircut. I presented the card, still in the early stages of completion, for validation, and the manager said I could convert the partially filled card to their new system. I just had to enter my email address (and some other info) in the little kiosk thingy. I declined, saying, 'Ah, no thanks, enough people have my email already and I don't need yet another daily digest.' He continued, 'Well, we are doing away with the cards and moving all electronic, so...' 'That's ok,' I replied, 'I'll pay for that extra/free haircut to keep my name off a mailing list.'

This event, of course, got me thinking about human nature and how we will often give up some privacy for either convenience or something free. Imagine a stranger walking up to you and asking for your name, address, email, birthday, income level, favorite color and shopping habits. Most of us would tell them to 'fill in the blank'-off. Yet, when a brand asks for the same info but includes something in return - free birthday dinner, discounted tickets, coupons, personalized service - we typically spill the beans. Infosys recently conducted a survey which showed that consumers worldwide will certainly share personal information to get better service from their doctors, banks and retailers; yet they are very sensitive about how they share. Today's digital consumers are complicated and sometimes suspicious about how institutions use their data, according to the global study of 5,000 digitally savvy consumers. They also created an infographic based on their findings.
Overall they found:

- 82 percent want data mining for fraud protection, and will even switch banks for more security
- 78 percent are more likely to buy from retailers with targeted ads, while only 16 percent will share their social profile
- 56 percent will share personal and family medical history with doctors

...and specific to retail:

- To know me is to sell to me: Three quarters of consumers worldwide believe retailers currently miss the mark in targeting them with ads on mobile apps, and 72 percent do not feel that online promotions or emails they receive resonate with their personal interests and needs
- To really know me is to sell me even more: A wide majority of consumers (78 percent) agree that they would be more likely to purchase from a retailer again if they provided offers targeted to their interests, wants or needs, and 71 percent feel similarly if offered incentives based on location
- Catch-22 for retailers? While in principle shoppers say they want to receive ads or promotions targeted to their interests, just 16 percent will share social media profile information. Lacking these details could make it difficult for retailers to deliver tailored digital offers

Your data is valuable and comes with a price. While many data miners are looking to capitalize on our unique info, you can always decline. Yes, it is still probably already gathered up somewhere else; yes, you will probably miss out on some free or discounted something; yes, you will probably see annoying pop-up ads on that free mobile app/game; and yes, you might feel out of the loop. But it was still fun to be in some control over my own info leaks.

ps

Related:
Path pledges to be ad-free: Will consumers pay for their privacy?
What Would You Pay for Privacy?
Paying for privacy: Why it's time for us to become customers again
Consumers Worldwide Will Allow Access To Personal Data For Clear Benefits, Says Infosys Study
Engaging with digital consumers: Insights from Infosys survey [Infographic]
Parking Ticket Privacy
Invasion of Privacy - Mobile App Infographic Style
'Radio Killed the Privacy Star' Music Video?

Big Data Getting Attention
According to IBM, we generate 2.5 quintillion (2.5 x 10^18) bytes of data every day. In the last two years alone, we've created about 90% of the data we have today. Almost everything that's 'connected' generates data. Our mobile devices, social media interactions, online purchases, GPS navigators, digital media, climate sensors and even this blog, to name a few, add to the pile of big data that needs to be processed, analyzed, managed and stored. And you think that saving all your movies, music and games is a challenge. This data growth conundrum is 3- (or 4-, depending on who you talk to) dimensional, with Volume (the ever-increasing amount of data), Velocity (the speed at which it moves back and forth) and Variety (all the different types, structured and unstructured). Veracity (trust and accuracy) is also included in some circles. With all this data churning, security and privacy only add to the concerns, and traditional tactics might not be adequate. Recently the Cloud Security Alliance (CSA) listed the top 10 security and privacy challenges big data poses to enterprises and what organizations can do about them. After interviewing CSA members and security practitioners to draft an initial list of high-priority security and privacy problems, studying the published solutions, and characterizing problems as challenges if the proposed solution(s) did not cover the problem scenarios, they arrived at the Top 10 Security & Privacy Challenges for Big Data.
They are:

1. Secure computations in distributed programming frameworks
2. Security best practices for non-relational data stores
3. Secure data storage and transactions logs
4. End-point input validation/filtering
5. Real-time security monitoring
6. Scalable and composable privacy-preserving data mining and analytics
7. Cryptographically enforced data-centric security
8. Granular access control
9. Granular audits
10. Data provenance

The Expanded Top 10 Big Data Challenges has evolved from that initial list into an expanded version that addresses three new distinct issues:

- Modeling: formalizing a threat model that covers most of the cyber-attack or data-leakage scenarios
- Analysis: finding tractable solutions based on the threat model
- Implementation: implementing the solution in existing infrastructures

The idea of highlighting these challenges is to bring renewed focus on fortifying big data infrastructures. The entire CSA Top 10 Big Data Security Challenges report can be downloaded here.

ps

Related:
CSA Lists Top 10 Security, Privacy Challenges of Big Data
CSA Releases the Expanded Top Ten Big Data Security & Privacy Challenges
Expanded Top Ten Big Data Security and Privacy Challenges (download link)
The Four V's of Big Data
Don't forget the network in this Big Data rush
When Big Data Meets Cloud Meets Infrastructure
How big data in the cloud can drive IT ops
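To give one of the CSA's challenges a concrete shape, here is a toy sketch of granular (field-level) access control: rather than an all-or-nothing grant, each role sees only the fields it is entitled to. The roles, fields and records are invented for illustration; real systems enforce this in the data store or a policy engine, not in application code.

```python
# Toy sketch of granular (field-level) access control, one of the
# CSA's Top 10 challenges. Roles, fields and records are invented
# for illustration only.

# Which fields each role may read from a patient record.
FIELD_GRANTS = {
    "physician": {"patient_id", "diagnosis", "medications"},
    "billing":   {"patient_id", "insurance"},
    "research":  {"diagnosis"},  # de-identified view
}

def redact(record, role):
    """Return only the fields the role is granted; drop the rest."""
    allowed = FIELD_GRANTS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {
    "patient_id": "P-1001",
    "diagnosis": "hypertension",
    "medications": ["lisinopril"],
    "insurance": "ACME-PPO",
}

research_view = redact(record, "research")
billing_view = redact(record, "billing")
```

The hard part at big data scale is not this filter, which is trivial, but doing it consistently across billions of records, many stores and many query paths - which is exactly why it makes the challenge list.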