21 December 2015
Feisty Duck’s Cryptography & Security Newsletter is a periodic dispatch bringing you commentary and news surrounding cryptography, security, privacy, SSL/TLS, and PKI. It's designed to keep you informed about the latest developments in this space. Enjoyed every month by more than 50,000 subscribers. Written by Hanno Böck.
It seemed that the fate of SHA1 certificate signatures had been decided, but this old and insecure signature algorithm is not going away quietly. On the one hand, major browser vendors had announced plans to deprecate support for certificates using this algorithm, and the CA/Browser Forum had decreed that SHA1 must not be used for certificate signatures past 2015. On the other hand, some large companies are complaining that giving up on SHA1 might mean loss of access for many users.
In December, CloudFlare and Facebook voiced their concerns. They still observe a significant number of connections from devices that are not capable of using the stronger SHA256 algorithm. These connections—up to 6% in some cases—come mostly from developing countries.
These companies currently deploy SHA1 and SHA256 certificates in parallel, serving SHA1 certificates only to clients that don’t support anything better. They believe that they can reliably detect such clients and that the detection mechanism can’t be exploited. To continue this dual deployment, Facebook and CloudFlare proposed a new mechanism for certificate validation called Legacy Validation, which would allow companies to keep using SHA1 certificates in the future provided they can guarantee that those certificates are served only to old browsers. (Up-to-date browsers will simply stop accepting SHA1, meaning such certificates, even if issued past 2015, just won’t work in them any more.)
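Neither company has published its exact detection heuristics, so the following is only a hypothetical sketch of the idea: the server looks at which hash algorithms a client advertises (for example, via the TLS signature_algorithms extension) and selects a certificate chain accordingly. The function and chain names here are illustrative, not taken from either deployment.

```python
# Hypothetical sketch of dual-certificate selection. Assumes the server
# can see the set of hash algorithms the client advertised during the
# TLS handshake; names are illustrative only.

def select_certificate(client_hash_algorithms):
    """Pick a certificate chain based on advertised client capabilities."""
    if "sha256" in client_hash_algorithms:
        # Modern client: serve the SHA256-signed chain.
        return "sha256-chain"
    # Legacy client: fall back to the SHA1-signed chain.
    return "sha1-chain"
```

The crucial (and debated) assumption is that a client that fails to advertise SHA256 support genuinely cannot use it, and that an attacker cannot abuse the fallback path.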
It is not entirely clear which devices account for the significant number of clients that don’t support SHA256. One known culprit is Windows XP without Service Pack 3; for those machines, installing the Service Pack solves the issue. Of course, that won’t close other security holes in Windows XP that were never fixed because the operating system reached its end of life. Alternative browsers like Firefox are also an option for users of those old systems.
CloudFlare’s blog post mentions old Android devices before version 2.3 as a reason, but it seems that Android has better support for SHA256 than previously believed. Even the older Android 2 versions are already capable of supporting SHA256 certificates. Facebook highlights Symbian devices as a major reason they see connection failures with SHA256 certificates.
The proposal has now been submitted to the CA/Browser Forum for discussion.
On December 17, Juniper disclosed the existence of unauthorized code in some of its products. This code could have been used by "knowledgeable attackers" to gain administrative access and decrypt VPN connections. Although we don't know the full details yet, it appears that there were at least two backdoors: one based on a special SSH password and another focused on traffic decryption using a modified Dual EC DRBG.
In November, Kazakhtelecom issued a press release to announce that its customers will have to install a national security certificate starting in January 2016. The text indicated that this would be needed to allow traffic analysis on encrypted HTTPS connections.
A few days later, the press release was removed from the web site, although it can still be read via the Internet Archive. It is currently not clear whether this means that the plans for the national security certificate have been cancelled.
Kazakhtelecom is a major Kazakhstan ISP. Requiring users to install a certificate that allows traffic analysis would be a unique and very drastic move. It could potentially endanger the security of all Internet users in the country if the certificate’s key gets leaked or abused by the people in charge.
The concept of installing a root certificate into a browser to allow traffic analysis is not new, although it’s controversial. Many security products use this technique to intercept and analyze traffic, especially in enterprise environments, but they are usually deployed only in companies’ private networks. Products that use similar TLS interception techniques have caused major security issues in the past; the most prominent example is Superfish, software that was found preinstalled on Lenovo laptops earlier this year.
Researchers from the University of Michigan created a search engine for certificates called Censys. In recent years, these researchers have used Internet-wide scans to collect information about certificates and TLS configurations, publishing much of their raw data via the scans.io web site. Now their extensive collection of certificates is searchable through Censys. A background paper on the search engine was published at the CCS conference.
The certificate authority Let’s Encrypt opened its service to the general public on December 3rd. Let’s Encrypt aims to make it both easier and cheaper for server operators to get valid TLS certificates.
Let’s Encrypt implements a new certificate issuance protocol called ACME. A lot of people were unhappy with the official Let’s Encrypt client software because its code is quite complicated and pulls in a large number of dependencies, but there are already alternative implementations of the ACME protocol for users looking for more lightweight solutions, such as acme-tiny (Python).
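As an illustration of what ACME clients do under the hood, here is a hedged sketch of the http-01 domain validation response: the client proves control of a domain by serving a key authorization string derived from the challenge token and the SHA-256 thumbprint (RFC 7638) of its account key. The helper names are mine, and the thumbprint computation assumes the JWK dictionary contains only its required members, so that sorting the keys yields the RFC 7638 canonical form.

```python
# Sketch of the ACME http-01 key authorization computation. Assumes the
# account key is available as a JWK dict containing only its required
# members (e.g. kty, n, e for RSA); helper names are illustrative.
import base64
import hashlib
import json

def jwk_thumbprint(jwk):
    """RFC 7638 thumbprint: SHA-256 over the canonical JSON of the JWK."""
    # sort_keys + compact separators gives the canonical form when only
    # the required members are present.
    canonical = json.dumps(jwk, sort_keys=True, separators=(",", ":"))
    digest = hashlib.sha256(canonical.encode("utf-8")).digest()
    # base64url without padding, as required by ACME
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

def key_authorization(token, jwk):
    """String the client serves at /.well-known/acme-challenge/<token>."""
    return token + "." + jwk_thumbprint(jwk)
```

The CA fetches the challenge URL and checks that the response matches the token plus the thumbprint of the account key that requested the certificate.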
A few days after the launch, a minor incident occurred: Let’s Encrypt issued six certificates for domains that used a technology called Certificate Authority Authorization (CAA) to forbid it. The idea behind CAA is that domain operators can restrict which certificate authorities may issue certificates for their domains by publishing a DNS record. Let’s Encrypt’s CAA check was not working properly, so it was possible to obtain certificates despite a CAA record disallowing it. The issue was fixed a few hours after it was reported.
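To illustrate the CAA concept, here is a simplified sketch of the decision a CA might make before issuance, assuming the relevant CAA records have already been fetched via DNS as (tag, value) pairs. It covers only the basic "issue" property from RFC 6844, not the full tree-climbing, wildcard, or critical-flag logic, and the function name is mine.

```python
# Simplified sketch of a CAA issuance check (RFC 6844 "issue" property
# only). caa_records is a list of (tag, value) pairs already obtained
# from DNS; ca_domain identifies the CA, e.g. "letsencrypt.org".

def ca_may_issue(caa_records, ca_domain):
    """Return True if the CA identified by ca_domain may issue."""
    if not caa_records:
        # No CAA records: issuance is unrestricted.
        return True
    issue_values = [value for (tag, value) in caa_records if tag == "issue"]
    if not issue_values:
        # CAA records exist but none carry an "issue" property.
        return True
    # An "issue" value may carry parameters after a semicolon; a value
    # of ";" alone forbids all issuance.
    return any(value.split(";")[0].strip() == ca_domain
               for value in issue_values)
```

A record like `issue "letsencrypt.org"` thus permits only Let’s Encrypt, and the December incident was a failure to enforce exactly this kind of restriction.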
OpenSSL fixed several security bugs with the release of versions 1.0.2e, 1.0.1q, 1.0.0t, and 0.9.8zh. One of them is a bug I discovered in the BN_mod_exp() function, which is part of OpenSSL’s bignum implementation: in some corner cases the function produces wrong results. The bug was found with the help of the fuzzing tool american fuzzy lop. A similar bug was found earlier this year in OpenSSL’s squaring function, BN_sqr.
OpenSSL announced that these were the last updates for the old 1.0.0 and 0.9.8 versions. Future security fixes won’t be backported to these old versions; everyone should upgrade to one of the newer version trees.
Now that HTTP/2 is gaining popularity, we’re starting to see configuration issues arising from the fact that HTTP/2 generally doesn’t allow the use of legacy cipher suites, even ones that are not outright insecure. In version 1.21.13, SSL Labs improved its handshake simulation to detect HTTP/2 negotiation and warn if the selected protocols and cipher suites are not appropriate. If this is something you care about, I recommend that you use the development server, which correctly simulates both NPN and ALPN negotiation. (NPN is a legacy negotiation mechanism that is being phased out; ALPN is the official way to negotiate HTTP/2.)
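For background, RFC 7540 requires HTTP/2 over TLS to use TLS 1.2 with ephemeral key exchange and an AEAD cipher, and its Appendix A lists the legacy suites that must not be used. The sketch below is a simplified, name-based approximation of that rule rather than the actual blacklist, and the function name is mine.

```python
# Simplified approximation of RFC 7540's cipher suite requirements for
# HTTP/2: ephemeral key exchange (ECDHE/DHE) plus an AEAD cipher
# (AES-GCM or ChaCha20-Poly1305). Not the literal Appendix A blacklist.

def http2_compatible(cipher_suite):
    """Heuristically decide whether an IANA-named suite suits HTTP/2."""
    ephemeral = cipher_suite.startswith(("TLS_ECDHE_", "TLS_DHE_"))
    aead = "_GCM_" in cipher_suite or "CHACHA20" in cipher_suite
    return ephemeral and aead
```

A server that prefers, say, TLS_RSA_WITH_AES_128_CBC_SHA will pass a plain HTTPS check but can break HTTP/2 clients, which is exactly the class of misconfiguration the improved SSL Labs simulation warns about.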
Designed by Ivan Ristić, the author of SSL Labs, Bulletproof TLS and PKI, and Hardenize, our course covers everything you need to know to deploy secure servers and encrypted web applications.
Remote and trainer-led, with small classes and a choice of timezones.
Join over 2,000 students who have benefited from more than a decade of deep TLS and PKI expertise.