Security Sidebar: What's Real, and What's Fake?
Generative Adversarial Networks (GANs) are deep neural network architectures composed of two networks pitted against each other (hence the "adversarial"). These networks can learn to mimic any distribution of data, and they can take input from many different sources to create things that are extremely similar to real-world things: images, music, speech, prose, and more.

Fake Pictures

The website This Person Does Not Exist uses GANs that study thousands of human faces and then generate faces of people who do not exist. Do you know the girl shown below? No, you don't. She doesn't exist. The generative network works alongside a discriminative network that determines how authentic the picture actually is. In effect, the generative network "generates" the picture (based on real-life images), and the discriminative network provides feedback on whether the picture actually looks real or fake. Here's a cool picture of the process of how these GANs study real picture inputs and then generate fake pictures.

On one hand, this is cool and fascinating stuff. On the other, it can get pretty freaky pretty fast. It also makes me think about the picture my buddy showed me of his new "girlfriend"...I'm going to need to actually meet the girl to confirm she's a real person.

Fake Videos

Related to all this, new advancements are coming in the area of artificial intelligence and fake videos. While video manipulation has been around for a relatively long time, researchers at Samsung have recently been able to take a single picture and turn it into a fake video of that person. We all know the Mona Lisa, right? Well, have you ever seen her have a conversation? No, because video wasn't around back then. Well, now you can...
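The generator-versus-discriminator loop described above can be sketched in miniature. Below is a toy, hypothetical example in pure Python (nothing like a real image GAN): a two-parameter generator learns to mimic a "real" one-dimensional data distribution centered on 4.0, while a tiny logistic discriminator tries to tell real samples from fakes. All names, learning rates, and distributions here are illustrative assumptions.

```python
import math
import random

random.seed(0)

def sigmoid(s):
    s = max(-60.0, min(60.0, s))      # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-s))

# Generator G(z) = a*z + b; discriminator D(x) = sigmoid(w*x + c)
a, b = 1.0, 0.0
w, c = 0.0, 0.0
lr = 0.05

for _ in range(2000):
    real = 4.0 + random.gauss(0, 0.5)  # sample from the "real" distribution
    z = random.uniform(-1, 1)          # noise input for the generator
    fake = a * z + b

    # Discriminator: gradient ascent on log D(real) + log(1 - D(fake))
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    w += lr * ((1 - d_real) * real - d_fake * fake)
    c += lr * ((1 - d_real) - d_fake)

    # Generator: gradient ascent on log D(fake) -- try to fool D
    d_fake = sigmoid(w * fake + c)
    a += lr * (1 - d_fake) * w * z
    b += lr * (1 - d_fake) * w
```

After training, the generator's offset `b` has drifted from 0 toward the real data's mean of 4.0: the generator learned to produce samples the discriminator can no longer reliably reject, which is the same feedback loop that produces convincing fake faces at scale.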
When you add together the fake images from these GANs and the ability to turn a single picture into a video of that person, you get some crazy possibilities. Maybe the video evidence that has always been so trustworthy in a courtroom suddenly isn't. Maybe your favorite politician gives a private speech on a controversial topic...or maybe they don't? The possibilities get pretty extensive. In times like these, remember the fateful words of Abraham Lincoln (16th President of the United States): "Never believe everything you see on the Internet."

Security Sidebar: Improving Your SSL Labs Test Grade
Encrypt everything. That's what Google Chairman Eric Schmidt recently said. His comments were in response to various surveillance efforts that he considered government overreach and censorship. His rationale: if you are going to spy on everything I send across the Internet, then I'll simply encrypt it all so you can't read it. Other companies like Facebook, Twitter, and Yahoo have taken similar steps. In addition, Mark Nottingham (chairman of the group developing the new HTTP/2 protocol) said, "I believe the best way that we can meet the goal of increasing use of TLS on the Web is to encourage its use by only using HTTP/2 with https:// URIs." With all this encryption momentum from giants in the industry, the HTTPS path has been paved, and everyone who wants to stay relevant will have to get on board.

So, the world is moving to "encrypt everything" and you want to follow suit. Unfortunately, there are many different options to consider when implementing SSL on your web server. Wouldn't it be nice to just have a checkbox that said "click here for SSL implementation"? It's not that simple. Fortunately, there are many web-based tools that score the effectiveness of your web server's SSL implementation, and many of them provide recommendations on how to make it stronger and more efficient. Some of these include Wormly, SSL Shopper, DigiCert, and GlobalSign, to name a few. Some of these tools just give you basic certificate information, while others dig a little deeper into performance and known-vulnerability status. There's no magic formula or mandate that forces any of these tools to look at one thing over another, so they all test things a little bit differently. That said, the undisputed industry thought leader in this space is Qualys SSL Labs. Qualys does a great job of conducting a comprehensive inspection of the SSL implementation on your web server.
Some may question the need for a good grade on the SSL Labs test, but imagine a customer checking, for example, their bank's website and finding a bad grade for SSL implementation. If my bank had a failing grade, it would certainly get my attention, and it might make me think about moving my money and my business elsewhere. Even if an organization doesn't totally agree with the way Qualys approaches web server testing, it's still important to understand their testing methodology so as to align SSL implementation practices with their recommendations.

How does SSL Labs approach web server testing? They have a fairly short and easy-to-read SSL Server Rating Guide that outlines the exact methodology they use. Their approach consists of four steps:

1. Look at the certificate to verify that it's valid and trusted
2. Inspect server configuration in three categories: protocol support, key exchange support, and cipher support
3. Combine the category scores into an overall score (a score of zero in any category pushes the overall score to zero), then calculate an overall letter grade
4. Apply a series of rules to handle aspects of server configuration that cannot be expressed via numerical scoring

The final letter grade is based on the following overall numerical score:

  Numerical Score    Letter Grade
  >= 80              A
  >= 65              B
  >= 50              C
  >= 35              D
  >= 20              E
  < 20               F

Who knew you could get an "E" grade?!? I'm pretty sure I've received every other letter grade on that scale at some point in my life, but never an E. By the looks of where it fits on the scale, I don't want to start now. One other note about the grading scale: in certain situations the standard A-F grades are not quite applicable and are out of scope. To handle this, SSL Labs has introduced the "M" grade (certificate name mismatch) and the "T" grade (site certificate is not trusted).
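The grading scale above, including the special "M" and "T" results, can be sketched as a small function. This is a hypothetical helper of my own, not SSL Labs' actual code:

```python
# Sketch of the SSL Labs numeric-to-letter mapping described above.
# The "M" and "T" results bypass the numeric scale entirely.
def letter_grade(score, name_mismatch=False, untrusted=False):
    if name_mismatch:
        return "M"                    # certificate name mismatch
    if untrusted:
        return "T"                    # site certificate is not trusted
    for floor, grade in ((80, "A"), (65, "B"), (50, "C"), (35, "D"), (20, "E")):
        if score >= floor:
            return grade
    return "F"
```

For example, `letter_grade(94.75)` returns "A", while `letter_grade(94.75, untrusted=True)` returns "T" no matter how strong the configuration score is.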
So, when you are reviewing your score and you see an "M" or a "T," you don't have to wonder what happened with the scoring results. Anyway, let's quickly look at each of the four areas they test.

Certificate Inspection

Three certificate types are currently in use: domain-validated, organization-validated, and extended-validation (EV) certificates. SSL Labs only requires that a certificate be correct and does not go beyond that basic requirement. They do recommend EV certificates for higher-value web sites, but since they have no way of knowing the purpose of each site, they simply check that the site's certificate is valid and trusted. However, they do note certain certificate issues that will immediately result in a zero score:

- Domain name mismatch
- Certificate not yet valid
- Certificate expired
- Use of a self-signed certificate
- Use of a certificate that is not trusted (unknown CA or some other validation error)
- Use of a revoked certificate
- Insecure certificate signature (MD2 or MD5)
- Insecure key

Server Configuration

The three criteria used for server configuration are protocol support (30% of the grade), key exchange (30%), and cipher strength (40%). Protocol support is graded against the following criteria:

  Protocol    Score
  SSL 2.0     0%
  SSL 3.0     80%
  TLS 1.0     90%
  TLS 1.1     95%
  TLS 1.2     100%

They take the score of the best protocol supported by your web server, add the score of the worst protocol, and divide the total by 2. This doesn't account for any protocols in between the best and worst on your site, which is why it's important to understand how they calculate all this. For example, if your site supports SSL 3.0, TLS 1.1, and TLS 1.2, your score would be (100 + 80) / 2 = 90. How would you increase that score? Well, if you kept support for TLS 1.1 and TLS 1.2 and dropped support for SSL 3.0, your score would move up to (100 + 95) / 2 = 97.5.
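The best-plus-worst averaging described above is easy to sketch. The table values come from the rating guide; the function and variable names are my own:

```python
# Protocol-support score: average the scores of the best and worst
# protocols the server accepts, per the SSL Labs rating guide.
PROTOCOL_SCORES = {"SSL 2.0": 0, "SSL 3.0": 80, "TLS 1.0": 90,
                   "TLS 1.1": 95, "TLS 1.2": 100}

def protocol_score(supported):
    scores = [PROTOCOL_SCORES[p] for p in supported]
    return (max(scores) + min(scores)) / 2

before = protocol_score(["SSL 3.0", "TLS 1.1", "TLS 1.2"])  # 90.0
after = protocol_score(["TLS 1.1", "TLS 1.2"])              # 97.5
```

Note how dropping the single worst protocol moves the score far more than adding another strong one would: the formula only ever looks at the two extremes.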
Key exchange is graded against the following criteria:

  Key Exchange                                           Score
  Weak key (Debian OpenSSL flaw)                         0%
  Anonymous key exchange (no authentication)             0%
  Key or DH parameter strength < 512 bits                20%
  Exportable key exchange (limited to 512 bits)          40%
  Key or DH parameter strength < 1024 bits (e.g., 512)   40%
  Key or DH parameter strength < 2048 bits (e.g., 1024)  80%
  Key or DH parameter strength < 4096 bits (e.g., 2048)  90%
  Key or DH parameter strength >= 4096 bits              100%

Cipher strength is the final piece of the server configuration equation. Servers can support ciphers of varying strengths, so SSL Labs scores cipher strength the same way they do protocol strength: take the score of the strongest cipher, add the score of the weakest cipher, and divide by 2. The scores for each cipher are as follows:

  Cipher Strength              Score
  0 bits (no encryption)       0%
  < 128 bits (e.g., 40, 56)    20%
  < 256 bits (e.g., 128, 168)  80%
  >= 256 bits                  100%

Sample Web Server

Let's say your web server has the following configuration:

- A valid and trusted certificate
- Protocol support for TLS 1.0 and TLS 1.1
- An RSA key with 2048-bit strength
- An AES/CBC cipher with 256-bit strength

In this case, you would score 92.5 for protocol support, 90 for key exchange, and 100 for cipher strength. Protocol support accounts for 30% of the overall grade, key exchange for another 30%, and cipher strength for 40%. Using these values, you would score (92.5 * 30%) + (90 * 30%) + (100 * 40%) = 94.75. Converting this numerical score to a letter grade yields an overall "A." Congratulations!

Important Things to Consider...

SSL Labs periodically changes their grading criteria and methodology based on changes in technology.
Here are some changes they have published (updated February 2018):

- SSL 2.0 is not allowed (results in an automatic "F")
- Insecure renegotiation is not allowed (results in an automatic "F")
- Vulnerability to the BEAST attack caps the grade at "B"
- Vulnerability to the CRIME attack caps the grade at "C" (previously capped at "B," changed in the May 2015 test version)
- The test results no longer show the numerical score (0-100) because they realized the letter grade (A-F) is more useful (they still calculate the numerical score...they just don't show it to you)
- Server-side mitigation for the BEAST attack is no longer required
- Support for TLS 1.2 is now required to get an "A" grade; without it, the grade is capped at "B"
- If vulnerable to the Heartbleed attack, automatic "F" grade
- If vulnerable to the OpenSSL CVE-2014-0224 vulnerability, automatic "F" grade
- Keys below 2048 bits (e.g., 1024) are now considered weak, and the grade is capped at "B"
- Keys under 1024 bits are now considered insecure (results in an automatic "F")
- Warnings have been introduced as part of the rating criteria. In most cases, warnings cover issues that do not yet affect the grade but likely will in the future, so server administrators are advised to correct them as soon as possible. Some examples:
  - Warning: RC4 is used with TLS 1.1 or newer. Because RC4 is weak, the only reason to use it is to mitigate the BEAST attack; since TLS 1.1 and newer are not vulnerable to BEAST, there is no reason to use RC4 with them
  - Warning: No support for Forward Secrecy
  - Warning: Secure renegotiation is not supported
- Grade "A-" is introduced for servers with generally good configuration that have one or more warnings
- Grade "A+" is introduced for servers with exceptional configurations. At the moment, this grade is awarded to servers with good configuration, no warnings, and HTTP Strict Transport Security support with a max-age of at least six months
- MD5 certificate signatures are now considered insecure (results in an automatic "F")
- Clarified that insecure certificate signatures affect the certificate score (this has always been the case for MD2)
- Clarified that the strength of DHE and ECDHE parameters affects key exchange scoring (this has always been the case, but previous revisions of the text were not clear about it)
- An "A+" score is not awarded to servers that use SHA1 certificates
- Overall grade is capped at "C" if vulnerable to the POODLE attack
- An "A+" score is not awarded to servers that don't support TLS_FALLBACK_SCSV
- Overall grade is capped at "B" if SSL 3.0 is supported
- Overall grade is capped at "B" if RC4 is supported
- Overall grade is capped at "B" if the certificate chain is incomplete
- Servers whose best protocol is SSL 3.0 automatically get an "F"
- If using weak DH parameters (less than 1024 bits), grade is automatically set to "F"
- If using weak DH parameters (less than 2048 bits), grade is capped at "B"
- If using export cipher suites, grade is automatically set to "F"
- Cap grade at "C" if RC4 is used with TLS 1.1+
- Cap grade at "C" if TLS 1.2 is not supported
- Fail servers that support only RC4 suites
- Detect when RSA exponent 1 is used; this is insecure and gets an automatic "F"
- Hosts that have HPKP issues can't get an "A+" grade
- Servers vulnerable to the DROWN attack get an automatic "F" grade
- If vulnerable to CVE-2016-2107 (padding oracle in the AES-NI CBC MAC check), grade is an automatic "F"
- Grade capped at "C" for using 3DES (and other ciphers with 64-bit block sizes) with TLS 1.1+
- SHA1 certificates are no longer trusted; results in a "T" grade
- Introduced an explicit penalty for cipher suites weaker than 112 bits. This was necessary to address a flaw in the SSL Labs grading algorithm that didn't sufficiently penalize these weak suites
- WoSign/StartCom certificates are distrusted and will result in a "T" grade
- If vulnerable to Ticketbleed (CVE-2016-9244), the grade is an automatic "F"

In addition to these updates, SSL Labs is planning more criteria changes in March 2018. These include:

- Penalty for not using forward secrecy (grade capped at "B"). Not using forward secrecy is currently a warning, but will soon affect the actual grade of your web server. They will not penalize sites that use suites without forward secrecy, provided those suites are never negotiated with clients that can do better.
- Penalty for not using AEAD suites (grade capped at "B"). Your site should use secure cipher suites, and AEAD is the only encryption approach without any known weaknesses; the new TLS 1.3 protocol supports only AEAD suites. Under the new grading criteria, websites will be required to use AEAD suites to get an "A." However, as with forward secrecy, sites will not be penalized if they continue to use non-AEAD suites, provided AEAD suites are negotiated with clients that support them.
- Penalty for the Return Of Bleichenbacher's Oracle Threat (ROBOT) vulnerability (automatic "F" grade). ROBOT is an attack model based on Daniel Bleichenbacher's chosen-ciphertext attack. Bleichenbacher discovered an adaptive chosen-ciphertext attack against protocols using RSA and demonstrated the ability to perform RSA private-key operations. Researchers have since been able to exploit the same vulnerability with small variations of the attack. ROBOT was a warning in the past but will now be used in the grading algorithm. Note: F5 has provided mitigation steps for the ROBOT vulnerability in article K21905460: BIG-IP SSL vulnerability (ROBOT) CVE-2017-6168.
- Penalty for using Symantec certificates (grade of "T"). Starting March 1, 2018, SSL Labs will give a "T" grade for Symantec certificates issued before June 2016.

Hopefully you can start to see how your overall grade can change based on different options and configurations. As SSL Labs changes their grading criteria and testing methodology (will support for HTTP/2 be needed for an "A" grade in the future?), you should stay aware of what they are doing and how your web site is affected by their changes. It's important to check back periodically to see how your grade looks...your customers are certainly checking on you! After all, if you're gonna "encrypt everything," you might as well encrypt it correctly. Knowing all this, you can more easily configure your web server to go from this grade... To this grade... Here's to great web site configurations, effective security, and A+ grades!

Security Sidebar: Did Quantum Computing Kill Encryption?
Google recently published results from its newest quantum computing chip, called "Sycamore," and the results are pretty impressive. Classical computer operations rely on bits that are either 1 or 0. Quantum computers use qubits, which can exist in a superposition of 1 and 0 at the same time, greatly increasing computing speed and power for certain problems. Of course, this quantum computing thing is not easy. Giant companies like Google, IBM, and others have been working hard with large budgets for a long time to figure it out.

Google's Sycamore Chip

In its public release, Google showed that the Sycamore chip could execute calculations that are not practical on classical computers. The specific calculations Sycamore performed were related to complex random number generation, and the chip completed them in about 200 seconds. To show how significant this is, the team also ran a simpler version of the same test on the world's fastest (classical) supercomputer, at Oak Ridge National Laboratory. After the supercomputer completed the simpler task, the team extrapolated how long it would have taken to complete the more complex task that Sycamore finished. They suggested it would have taken the supercomputer about 10,000 years to complete the same task that Sycamore completed in 200 seconds!

Google's Quantum Computer

To be fair, the task of verifying complex random number generation doesn't necessarily have wide application in today's world. But that was never really the point of this experiment. The point was to show the potential that quantum computing can have in our world as the technology matures.
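To give a flavor of what "1 and 0 at the same time" means, here is a toy single-qubit simulation in Python. It is purely illustrative and has nothing to do with Sycamore's actual hardware: a qubit's state is a pair of amplitudes, and a Hadamard gate turns the definite state |0> into an equal superposition of |0> and |1>.

```python
import math

# A qubit is represented by two amplitudes (amp0 for |0>, amp1 for |1>);
# the squared magnitudes give the probabilities of measuring 0 or 1.
def hadamard(amp0, amp1):
    s = 1 / math.sqrt(2)
    return s * (amp0 + amp1), s * (amp0 - amp1)

amp0, amp1 = 1.0, 0.0               # start in the classical state |0>
amp0, amp1 = hadamard(amp0, amp1)   # put the qubit into superposition

p0 = abs(amp0) ** 2                 # probability of measuring 0
p1 = abs(amp1) ** 2                 # probability of measuring 1
```

After the gate, `p0` and `p1` are both 0.5: until it is measured, the qubit genuinely carries both outcomes at once, which is the property quantum algorithms exploit.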
Some experts have compared this breakthrough to Sputnik or the Wright brothers' first airplane flight: while those events arguably didn't have super-impressive immediate results, they certainly paved the way for very significant technology. So, we will see where quantum computing takes us as an industry, but it's certainly proving that computing power is getting stronger and faster.

Encryption

So, how does this affect encryption? Encryption is fundamental to Internet privacy and security. At its core, encryption requires a secret key that the sender and receiver both have in order to encrypt and decrypt the information they send back and forth. Most encryption algorithms used today are widely known, and their developers publish exactly how they work and how they were designed. While the security of encryption is certainly based on its design and mathematical strength, it also depends on the sender and receiver keeping their key secret. If an attacker steals the key, then it's game over.

The strength of the key is based on the mathematical likelihood that someone (or something) could figure it out. If you have followed computer encryption for any length of time, you've no doubt noticed that certain encryption key strengths are no longer recommended. This doesn't automatically mean the encryption algorithm is bad; it just means the key size needs to be larger so that a computer takes longer to figure out the key. As computer processing power has grown over the years, so has the need for larger key sizes. For example, the RSA encryption algorithm (used for server authentication and key exchange) has been tested over the years to see how long it would take a computer to crack the secret key.
As you may know, RSA is built on the foundation of prime number factoring, where two large prime numbers are multiplied together to produce a value that is shared between the client and server. If a computer could take this large number and figure out the two primes that were multiplied together, it could derive the secret key. So, the whole foundation of RSA's security is the idea that it is very difficult to factor that big shared value. The larger the two primes are, the harder they are to figure out.

Many people have tested RSA over the years, and one group of researchers has discussed results from their tests. Several years ago, this team took a 155-digit number and worked to factor it down; it took them nine years to find the factors (and thus the secret key). More recently, they tested a 200-digit number with more modern computing power, and it took them about 18 months to crack it. A while later (with still faster computers), they tried a 307-digit number and factored it down even faster. The point is, as modern computing power gets faster, the time it takes to crack an encryption key gets shorter.

A typical RSA implementation today uses a 2048-bit key. Some applications use even larger key sizes, but the larger the key, the more load it puts on the client and server, and the slower the web application becomes. So there's a tension between strong (large) key sizes and application speed. Now that Google has shown the ability to use quantum computing to run calculations in 200 seconds that would take today's fastest supercomputers 10,000 years, it's not hard to imagine an encryption key like the one used in RSA being cracked in a matter of seconds.
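To make the factoring idea concrete, here is a deliberately tiny, hypothetical example in Python (3.8+ for the modular-inverse form of `pow`). Real RSA moduli are hundreds of digits long, which is exactly why naive trial division like this is hopeless against them on classical hardware:

```python
# Toy illustration (NOT real cryptography): recovering an RSA private key
# by factoring a deliberately tiny modulus with trial division.
def factor(n):
    """Return a factor pair (p, q) of n by trial division."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    raise ValueError("n is prime")

# Tiny RSA parameters: p and q are secret; (n, e) is the public key.
p, q = 61, 53
n, e = p * q, 17                  # n = 3233
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)               # private exponent (modular inverse)

msg = 42
cipher = pow(msg, e, n)           # encrypt with the public key

# An attacker who factors n can rebuild the private key from scratch:
p2, q2 = factor(n)
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
recovered = pow(cipher, d2, n)    # decrypts back to 42
```

The entire attack is the `factor` call: everything after it is routine arithmetic. Scale `n` up to 2048 bits and that one call becomes infeasible for classical computers, which is the whole security argument, and also why a sufficiently large quantum computer would be such a problem.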
If you know a mathematician who designs computer encryption algorithms, tell them that the Internet might be looking for some new stuff pretty soon...