SSL Labs Grading Update: Forward Secrecy, Authenticated Encryption and ROBOT

Bhushan Lokhande

Last updated on: December 16, 2022

Update March 1, 2018: The completion of these changes is documented under Version 1.31.0 in the SSL Labs Changelog.

We are giving advance notification of the following grading criteria changes, which apply from March 1, 2018: not using forward secrecy, not using AEAD suites, and vulnerability to ROBOT. Update: This release also includes a grading change for some Symantec certificates.

Penalty for not using forward secrecy (B)

Forward secrecy (FS), also known as perfect forward secrecy (PFS), is a property of secure communication protocols in which the compromise of long-term keys does not compromise past session keys. Forward secrecy protects past sessions against future compromise of the private key. The very popular RSA key exchange doesn’t provide forward secrecy. You need to support and prefer ECDHE suites in order to enable forward secrecy with modern web browsers.

SSL Labs will start penalizing servers that don’t support forward secrecy; their grades will be capped at B. We will not penalize sites that use suites without forward secrecy, provided those suites are never negotiated with clients that can do better.
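
To illustrate what “negotiated with clients that can do better” means in practice, here is a minimal sketch (not part of the original post) that uses Python’s standard ssl module to see which suite a modern client actually negotiates with a server; example.com is a placeholder hostname.

```python
import socket
import ssl

def negotiated_cipher(host: str, port: int = 443):
    """Handshake with default (modern) client settings and return the negotiated suite."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.cipher()  # (suite name, protocol version, secret bits)

if __name__ == "__main__":
    name, version, bits = negotiated_cipher("example.com")
    print(f"{version}: {name} ({bits} bits)")
    # TLS 1.3 always uses an ephemeral (EC)DHE key exchange; for TLS 1.2 and
    # below, look for ECDHE or DHE in the suite name.
    has_fs = version == "TLSv1.3" or "ECDHE" in name or "DHE" in name
    print("Forward secrecy with this client:", "yes" if has_fs else "no")
```

If a modern client negotiates an ECDHE suite here, any remaining non-FS suites kept around for legacy clients do not trigger the penalty.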

Penalty for not using AEAD suites (B)

Your site should use secure cipher suites. AEAD is the only encryption approach without known weaknesses; the alternative, CBC encryption, is susceptible to timing attacks (as implemented in TLS). AEAD suites provide strong authentication, key exchange, forward secrecy, and encryption of at least 128 bits. TLS 1.3 supports only AEAD suites. SSL Labs doesn’t currently reward the use of AEAD suites; with this grading criteria update, we will start requiring AEAD suites for an A.

The grade will be capped at B if AEAD suites are not supported. As with forward secrecy, we will not penalize sites that continue to use non-AEAD suites, provided AEAD suites are negotiated with clients that support them.

We have talked about these changes in Announcing SSL Labs Grading Changes for 2017.
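
As a rough, hedged sketch of what an AEAD-capable configuration looks like (not taken from the original post), the following constrains an OpenSSL-backed Python server context to ECDHE key exchange with AEAD ciphers for TLS 1.2; the certificate paths server.crt and server.key are placeholders. All TLS 1.3 suites are AEAD by definition and are configured separately by OpenSSL.

```python
import ssl

# Server context offering only ECDHE + AEAD (AES-GCM / ChaCha20-Poly1305)
# suites for TLS 1.2. TLS 1.3 ciphersuites are all AEAD and are not affected
# by set_ciphers().
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.minimum_version = ssl.TLSVersion.TLSv1_2
context.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20")

# Placeholder certificate and key paths.
context.load_cert_chain(certfile="server.crt", keyfile="server.key")

# List the TLS 1.2 suites this context will offer.
for suite in context.get_ciphers():
    print(suite["name"], suite["protocol"])
```

A similar cipher string can be used in web server configurations; the point of the grading change is only that AEAD suites must be present and preferred for capable clients, not that CBC suites must be removed outright.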

Penalty for ROBOT vulnerability (F)

ROBOT (Return Of Bleichenbacher’s Oracle Threat) is an attack based on Daniel Bleichenbacher’s chosen-ciphertext attack. Bleichenbacher discovered an adaptive chosen-ciphertext attack against protocols that use RSA PKCS#1 v1.5 encryption, demonstrating the ability to perform RSA private-key operations without access to the key. Researchers have since been able to exploit the same vulnerability with small variations of the Bleichenbacher attack.

SSL Labs will start giving an “F” grade to servers affected by the ROBOT vulnerability on March 1, 2018. Note: all changes described in this blog post go live on March 1.

SSL Labs has already started giving a warning if a site doesn’t support forward secrecy or AEAD suites, or if the site is vulnerable to ROBOT.
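
ROBOT only applies to servers that support RSA encryption-based (static RSA) key exchange. As a minimal, hedged sketch (this is only the precondition check, not the researchers’ actual oracle test), the following offers nothing but static-RSA suites and reports whether the server accepts the handshake; example.com and the function name are placeholders.

```python
import socket
import ssl

def accepts_rsa_key_exchange(host: str, port: int = 443) -> bool:
    """Offer only static-RSA key-exchange suites and report whether the
    server completes the handshake. This checks the ROBOT precondition only;
    use the SSL Labs scan or the ROBOT researchers' tooling for the real test."""
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    context.check_hostname = False
    context.verify_mode = ssl.CERT_NONE
    context.maximum_version = ssl.TLSVersion.TLSv1_2  # TLS 1.3 has no RSA key exchange
    context.set_ciphers("kRSA")  # OpenSSL alias for static-RSA key-exchange suites
    try:
        with socket.create_connection((host, port), timeout=10) as sock:
            with context.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):
        return False

if __name__ == "__main__":
    print("RSA key exchange accepted:", accepts_rsa_key_exchange("example.com"))
```

A server that refuses this handshake cannot be attacked via ROBOT; a server that accepts it may still be fine if its TLS stack is patched, which is exactly what the oracle test determines.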

Penalty for using Symantec Certificates (T)

Starting March 1, 2018, SSL Labs will give a “T” grade to Symantec certificates issued before June 2016.

See details in Google and Mozilla are Deprecating Existing Symantec Certificates.

Comments

  1. I consider this a major step forward, because it means the basic, well-known sources of long-standing, “inherited” weaknesses now get penalized across the board.
    Additionally, this prepares for, and practically calls for, TLS 1.3 being introduced soon as the most secure TLS standard proposed so far (because it excludes the former problems and suites).
    TLS 1.2 does not urgently need to be abolished first, as long as all required and recommended protocol mitigations are in place: HSTS, SCSV, no fallbacks, current certificate types and strengths, and only strong authenticated cipher suites using the latest available implementations. Configured that way, it also provides sufficient security under the conditions proposed in the latest PCI recommendation.
    Can anyone, by the way, push MS to finally implement ChaCha20-Poly1305 [RFC 7905] in Schannel for IE/Edge as well as for their servers? (And why not also the Camellia suites [RFC 3713], which are up to date and secure?) That is what I would really call a “versatile” HTTPS engine.

    1. Dear BerndP and bloggers,

      A big, big, fat +1 for your post, up until the Schannel/IE/Edge part.

      You must work for a large company (read: an MS shop), or not in website development.
      I do work for a company in website development.
      We consider IE/Edge legacy browsers and avoid spending much effort getting sites to work with them, meaning cosmetic errors are fixed on an as-time-permits basis.

      Schannel (on the Server 2012 side) had its compatibility issues with AES-GCM (is anyone doing testing there?), which were “fixed” with a patch that disabled these ciphers altogether and never got fixed permanently, leaving AES-CBC as the strongest option. Not great.

      Customers can upgrade to Server 2016 to get what they deserved with Server 2012 in the first place (AES-GCM): this smells like a sales trick of the dirty kind.

      Long story short: avoid MS on the client and server side as much as you can, or, if you are stuck with a 2012 server (or earlier), place NGINX as a proxy in front and you can even end up with TLS 1.3 draft 18 on top of all the other RFC goodies. Don’t waste your time with MS and Schannel (I’ll mute from here as a start).

      1. With the current alpha of OpenSSL 1.1.1 you can use TLS 1.3 draft-23, which is what is now being deployed in web browsers in place of draft-18.

        I do believe you need either the Beta or Dev branch of Chrome to use it currently.

  2. There is growing interest in knowing the following.

    1) From an SSL Labs perspective – if the appropriate patch is applied and the TLS/RSA cipher suites are not removed, will SSL Labs still trigger a failing grade?

    2) From a Qualys VM perspective – if the patches are applied and, again, the TLS/RSA cipher suites are not removed from the asset, will the Qualys VM program trigger a vulnerability based on the remaining TLS/RSA suites?

  3. SSL Labs also detects it now, and has indicated that all sites that have it will be given the lowest grade possible, ‘F’, as of next month. Can you provide a rationale that you can share?

    In comparison, Qualys Enterprise rates the vulnerability as CVSS 4.3, so not as high as an F would suggest. When will these failing grades take effect, Feb. 28 or March 1st?

  4. Dear bloggers,

    To be honest, I find the cap at B for sites that have nothing better than ‘TLS_RSA_’ rather a double standard, and weak-kneed. Weak knees, and their friends in mitigation, got the SSL/TLS stack into the current mess in the first place.
    After all, if 1024-bit DHE (usually also with a common parameter file) is in use, your grade is capped at C, and for all the good reasons.
    Also, and even more important, the fine ROBOT researchers made it very clear that passing the ROBOT vulnerability test does not necessarily mean you are in safe waters.
    I would go even a step further and promote a cap at D for RSA-key-exchange-only sites.

    On a side note: I have big trouble digesting a site that gets an A+ while having a fallback to ‘TLS_RSA_WITH_3DES_EDE_CBC_SHA (0xa)’. ‘wired.com’ is such an example, and it avoids the C cap penalty for legacy DH by falling back to the RSA key exchange. In fact, from a previous blog post like this one: “To that end, we’ll be modifying our grading criteria to penalise sites that negotiate 3DES with TLS 1.1 and newer protocols. Such sites will have their score capped at C”.
    Disclaimer: It’s all about fairness, and nothing against ‘wired.com’ and its sub-domains.

  5. I still have concerns about this test. We have two domains configured with exactly the same SSL settings and ciphers, yet the ROBOT test is failing on one and passing on the other. We have verified this with our vendor.

    1. Dear ‘general’ and others,

      As a matter of exercise and learning, I’m running sites found vulnerable to ROBOT by SSL Labs through standalone scanners.
      Currently these scanners are TLS-Attacker 2.4 and SSLyze (the latest as of writing).
      The results of the ROBOT-only rescans are not very consistent: TLS-Attacker disagrees with SSL Labs and SSLyze in over 35% of the cases.
      SSLyze always agrees with SSL Labs, as far as my ~15 random rescans are concerned.

      To make a long story short: is there a forum where the results of multiple ROBOT scanners are discussed, so the developers of those scanners can improve their products for the greater good and consistency?
      This might be the root cause of ‘general’s’ issue with the SSL Labs ROBOT scan of his two sites.

  6. I’m currently researching and validating very high encryption standards for our internal and external network services, and only want to cater for the endpoint/client technology we currently have active inside our organisation’s network. For example, we don’t have any Windows 7 or Android 4/5/6 on our corporate LAN, yet the configurations I’m testing are getting quashed by the “Penalty for not using forward secrecy (B)” because of reference browsers we’re not going to support.

    We currently have 100% for Certificate, Protocol Support and Cipher Strength on Windows Server 2008 R2, 2012, 2012 R2, and 2016. However, Key Exchange is limited to 70% due to old clients which we’re not going to support inside our corporate network.

    Is there an option to disable testing, or displaying results, for systems we’re not interested in supporting? This is also important for some of our external web applications where we want to phase out TLS 1.0 / SHA-1 and move to more secure services for our Internet-facing customers.

    1. Dear Miles and others,

      Sorry for guessing, but for good reasons you’re not publishing your SSL Labs results.

      If we could disable tests ourselves, then we would all end up with an A+.
      The tests are aimed at generic public users. This clearly isn’t a perfect fit with your corporate user community.
      If the result of this test is critical, you have no choice but to change your config to make PFS work for these, to you, unimportant reference browsers/user agents.
      If it is not critical, you’re already done, because you verified with the browsers in use.

      On a side note: why wait to disable SHA-1 and TLS 1.0 (and TLS 1.1) if you only have top-notch users?
      Who knows, you may end up with an A+ because the legacy reference browsers are excluded.
      Take a look at other sites and their results. I’ve seen an A+ because of TLS 1.2 only (sorry, no URL, and it wasn’t me/my company). The list of excluded browsers was huge, but I’m sure none of those affects your users. Or should not affect your customers.
      I say, take the lead and push your customers to use top-notch browsers only. Take the Google vs. Symantec SSL business as an example. Weak knees are like weak doctors here.

      Keep up the good work, to make our world a safer place.

      1. Hi HanC,

        I think you misunderstand why people might want to suppress tests / results.

        For example, if a domain is fully tested using SSL Labs, it will get an overall score (that should not change)… in our case a “B”, due to not supporting older reference browsers inside our corporate environment. However, because the results are capped at “B”, I can’t see whether there are any other underlying issues or configuration I might be able to improve in other areas. I.e., if I could suppress the “B” cap caused by old reference browsers, I could see whether the overall rating goes up or down. This would have no bearing on the published overall rating, but admins could drill down into other problem areas which can’t be seen due to result caps.

        I don’t think anyone wants to change their overall score by omitting tests/results which should be included, but suppressing areas which you know you’re not going to support would give admins a better understanding of how they can improve their overall security in other areas, by seeing what their next test result might be if they could vary the results they see at any one time.

        Regards.

  7. When do you plan on supporting testing for TLS 1.3 draft-23? Both Cloudflare and OpenSSL 1.1.1-pre2 use this variant of TLS 1.3 now.
    I do know my OpenSSL 1.1.1-pre2 server supports it, as Chrome Dev is indicating a TLS 1.3 connection.

          1. Are you suggesting Qualys may be planning on marketing the information collected via the sign-up form?

  8. When will the grading be adjusted to implement a penalty for TLS 1.0? It was already formally deprecated on June 30, 2018.

  9. LATE grading changes proposal:
    RSA now suffers from a new attack vector, cache timings, which also affects connections starting from TLS 1.3 when they are downgraded in some scenarios (although this shouldn’t be the case, or at least should not work under normal circumstances, given that “secure renegotiation” becomes necessary or SCSV is in place; and with HSTS deployed, no downgrade attempts should be allowed at all within the established session!).
    Under TLS 1.3 no plain RSA handshake is allowed, so only “crippled” TLS 1.3 sessions, or sloppy implementations, seem to suffer from this problem, when they allow themselves to somehow be downgraded to TLS 1.2, where the attack can be used.
    That is to be seen as a protocol flaw if it still works somewhere, no matter by which means.
    So, once again, plain RSA has been proven weak for the handshake, and it provides no FS.
    Maybe we should finally ban it from any TLS connection asap, and from the standards, regardless of the level and version it was used with.
    It makes no sense to drag such a damaged, weakness-prone feature around much longer. No more A for servers that still offer RSA key exchange with any protocol; deprioritizing it is no excuse either. So we should only have B for it, the bad RSA guy, and at some later point: C.

  10. Hi, while some site audits show A-grade SSL certificates, ssllabs.com is showing a ‘B’ grade.

    My site is hosted on Azure (Ubuntu). Can you please help me understand why there is this discrepancy?