…Or 6 years old in human time. When PCI DSS was born, it was actually five different programs, one from each of the major credit card brands: Visa, MasterCard, American Express, JCB and Discover. Each program was comparable in that it required merchants to meet minimum security requirements when handling (processing, transmitting, storing) cardholder data as a protection mechanism. The industry came together and formed the PCI Security Standards Council (SSC), which aligned the distinct policies and then released the Payment Card Industry Data Security Standard (PCI DSS) v1.0. Over the years there have been clarifications, slight revisions, wireless guidelines, the addition of PIN Entry devices and, of course, version updates – 1.1, 1.2 and 1.2.1. PCI DSS v2.0 was released on Oct 28, 2010 and went into effect January 1, 2011. Organizations have until New Year's Eve 2011 to implement and comply with the new changes, and can actually still validate compliance against v1.2 until the ball drops again in 360 days.
It's been an interesting ride for PCI, with supporters hailing its mission and others complaining that it's expensive, confusing and subjective. If nothing else, it's made businesses focus on data security and made consumers more aware of it, which is a good thing. PCI v2.0 does not have any extensive new requirements, but it does clarify some requirements for easier understanding and makes adoption simpler, especially for small merchants. One important update is the need for a comprehensive audit prior to assessment to understand where all the cardholder data resides within the infrastructure; knowing all the locations and flows of sensitive data can help in protecting those assets. An evolving requirement allows merchants to take a risk-based approach, grounded in their business circumstances, for ranking, addressing and prioritizing vulnerabilities. I've mentioned before that security is really about risk management, and while I'm not sure that merchants with limited IT security experience could determine whether they are more susceptible to Forceful Browsing, Hidden Field Manipulation, or SQL Injection, I do think it's a step in the right direction as an exercise. It encourages organizations to conduct a risk assessment and focus on the areas that are most vulnerable. This can help a smaller merchant target their limited resources to a specific area of concern.
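As a toy illustration of that ranking exercise, here is a minimal Python sketch. The vulnerability names come from the examples above, but the severity scores and asset weights are invented for illustration; a real program would pull in CVSS scores and its own business context:

```python
# Minimal sketch of risk-ranking vulnerabilities. Scores and asset
# weights below are hypothetical, purely for illustration.
vulns = [
    {"name": "SQL Injection", "cvss": 9.8, "asset_weight": 1.0},          # touches the cardholder DB
    {"name": "Hidden Field Manipulation", "cvss": 6.5, "asset_weight": 0.7},
    {"name": "Forceful Browsing", "cvss": 5.3, "asset_weight": 0.4},
]

def risk_score(v):
    """Combine technical severity with how critical the affected asset is."""
    return v["cvss"] * v["asset_weight"]

# Highest-risk items first, so limited resources go where they matter most.
ranked = sorted(vulns, key=risk_score, reverse=True)
for v in ranked:
    print(f'{v["name"]}: {risk_score(v):.2f}')
```

The point isn't the arithmetic; it's that even a crude, explicit scoring model forces a merchant to write down which assets matter and which flaws actually threaten them.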
Another evolving requirement is the need for more effective and centralized log management. Scouring logs from various systems looking for that one nasty IP address can be cumbersome, and the ability to centralize log management is important whether you're trying to be PCI compliant or not. Cloud computing comes to mind as a big beneficiary of centralized management. And speaking of the cloud, there is also some guidance on virtualization – not much, but some. For one, they've expanded the definition of system components to include virtual components. You can only implement one primary function per server, so functions like web, app, db, DNS and so forth should be running on separate virtual machines. They want to avoid situations where different functions that may have different security levels are cohabitating on the same server. Also, since VMs can move around, if even one of your VMs is handling cardholder data, then the entire virtualized environment falls within PCI scope.
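To make the log-centralization point concrete, here is a minimal Python sketch that merges per-system logs into one time-ordered stream and searches it for a suspicious address. The system names, log lines and formats are all hypothetical; real deployments would use syslog, a SIEM, or similar tooling:

```python
# Minimal sketch of centralized log search. Sources and log format are
# invented for illustration only.
import re
from datetime import datetime

logs = {
    "webserver": ["2011-01-15 10:02:11 203.0.113.9 GET /checkout",
                  "2011-01-15 10:02:14 198.51.100.4 GET /index"],
    "firewall":  ["2011-01-15 10:02:09 DENY 203.0.113.9 -> 10.0.0.5:3306"],
}

def centralize(sources):
    """Merge per-system log lines into one time-ordered stream, tagged by origin."""
    merged = []
    for system, lines in sources.items():
        for line in lines:
            ts = datetime.strptime(line[:19], "%Y-%m-%d %H:%M:%S")
            merged.append((ts, system, line))
    return sorted(merged)

def find_ip(stream, ip):
    """Return every event mentioning the given address, across all systems."""
    pattern = re.compile(re.escape(ip))
    return [(system, line) for _, system, line in stream if pattern.search(line)]

stream = centralize(logs)
for system, line in find_ip(stream, "203.0.113.9"):
    print(system, "|", line)
```

With the streams merged, the firewall deny and the web hit from the same address line up in time order automatically – the correlation you'd otherwise do by hand across separate boxes.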
And so begins the new three-year lifecycle for standards development, though minor revisions can be added, if necessary, during that time. While the temptation is to wait until you absolutely have to comply, or to just validate against the old standard, it's better to get going on two-dot-oh sooner rather than later. You really don't want to be worrying about implementing PCI updates when the 2011 holiday shopping season is in full swing or when staff is limited due to the holidays. If you need to comply, do yourself a favor and get it done early.