Bad SSL: security awareness in interesting times (2)

(part 1)

The reason I started collecting third-party data

Through my numerous visits to the SSL Test web page, in that particular list on the lower right of the page, I have seen ...attack ships on fire off the shoulder of Orion.

I became intrigued by the idea of assessing the density of technologically advanced organisations (some of them professionally active in security), state departments, healthcare institutions, military hosts, service providers, remote-access portals for teleworkers, and hardware/software management dashboards that are so badly protected as to achieve such a negative evaluation.

It is interesting that, although SSL Labs offers the option not to publish the results of a test (which might otherwise inconveniently appear in the worst-scores list), many sensitive sites still show up in the list: is it paranoid to think that many of these tests are executed not only by the organisations themselves (who would probably use the privacy option), but also by their users or employees?

I find that worst-scores box so intriguing that a couple of months ago I wrote a script to periodically check that particular top 10 and store the entries in a database; while storing an entry, IP-API is used to obtain the country information, and a timestamp is refreshed whenever an entry is seen in the list again.
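A minimal sketch of that loop (not the actual script) could look as follows, assuming the hostnames in the top 10 have already been scraped from the SSL Labs page (its HTML structure is undocumented, so the parsing step is omitted) and that the database is SQLite; the table and column names are illustrative only:

    import json
    import sqlite3
    import time
    import urllib.request

    DB = "badssl.sqlite"

    def country_for(host):
        # ip-api.com resolves the hostname and returns geolocation data as JSON.
        with urllib.request.urlopen("http://ip-api.com/json/" + host) as resp:
            return json.load(resp).get("country", "unknown")

    def store(hosts):
        con = sqlite3.connect(DB)
        con.execute("""CREATE TABLE IF NOT EXISTS entries (
                           host       TEXT PRIMARY KEY,
                           country    TEXT,
                           first_seen INTEGER,
                           last_seen  INTEGER)""")
        now = int(time.time())
        for host in hosts:
            # If the entry is already cached, only refresh its timestamp;
            # otherwise insert it with the country looked up via IP-API.
            cur = con.execute("UPDATE entries SET last_seen = ? WHERE host = ?",
                              (now, host))
            if cur.rowcount == 0:
                con.execute("INSERT INTO entries VALUES (?, ?, ?, ?)",
                            (host, country_for(host), now, now))
        con.commit()
        con.close()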

The data is periodically summarised in a chart showing the number of websites achieving the worst score at the SSL Labs test, sorted by country, and a search box is available (for partial matches on the website's domain name).
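Both the chart and the search box boil down to two simple queries over that table; a sketch, reusing the hypothetical schema from the previous snippet:

    import sqlite3

    con = sqlite3.connect("badssl.sqlite")

    # Chart data: number of worst-scoring websites per country.
    by_country = con.execute(
        "SELECT country, COUNT(*) AS n FROM entries "
        "GROUP BY country ORDER BY n DESC").fetchall()

    # Search box: partial match on the domain name.
    def lookup(fragment):
        return con.execute(
            "SELECT host, country, first_seen, last_seen FROM entries "
            "WHERE host LIKE ?", ("%" + fragment + "%",)).fetchall()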

[Image: the badssl project's ranking-by-country view (home page)]


~disclaimer

It is important to highlight that an entry in the cache might meanwhile have been fixed and might sport an A+ when checked later (which is much more desirable than verifying that it still achieves an F six weeks after its first addition to the database).

The cache is not meant to be a wall of shame, but to freeze certain snapshots in time: it would be nice to see the majority of the entries achieve a B or higher when tested later.

There is no specific commitment regarding the completeness of the information acquired: the acquisition frequency is deliberately lower than what would probably be needed to capture every entry entering and leaving the (first-in, first-out) list; the idea is not to collect every entry ever, but a significant sample.

The title Geographic Distribution of insecure HTTPS is used in the knowledge that it is, to say the least, a rather liberal approximation: the information acquired via SSL Labs is based on (and limited to) live results generated by the servers that are actively tested, as opposed to information from mass-scanning projects (whose data are not live and/or do not contain important details).

What is already in the data, at this point

Even at this early stage of the data collection, the ability to perform arbitrary lookups on specific strings provides evidence that at several (even sensitive) websites, security has, in the long term, been limited to obtaining the friendly padlock icon displayed in the browser rather than actually protecting information and users.

It may be disturbing to see how many secure, login, vpn, intranet, mail, bank, .gov and .mil hosts have been captured in just a few weeks: numerous entries in the database show long-term issues. (Many interesting strings are worth looking up in one's own language.)
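For example, hunting for one of those strings with the hypothetical lookup() helper sketched earlier:

    # List cached worst-scoring hosts whose name suggests a remote-access portal.
    for host, country, first_seen, last_seen in lookup("vpn"):
        print(host, country)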

In my opinion, a website that still achieves an F two days after it was first tested indicates that no responsible function is even aware of the fact; this becomes especially worrying when it affects resources and institutions that claim ambitious visions and missions and are publicly funded, often at astronomic expense.


Technical note

Everyone is invited to try the lookup function but, please, keep in mind that the current system might not keep up with sustained load. Effort has been made to generate compliant HTML and CSS content and to optimise the experience on desktop as well as on mobile platforms; smartphone and tablet users can easily install the site as a web app through their browser.

I have not yet determined to what extent, or for how long, this project will be meaningful, hence I admit I have not yet arranged for availability, redundancy or (auto)scaling: at some point I might make a periodic dump of the database available for possible heavy users.
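Should that happen, a plain SQL dump would probably be the simplest format; a sketch under the same SQLite assumption as in the earlier snippets:

    import sqlite3

    # iterdump() yields the SQL statements needed to recreate the database.
    con = sqlite3.connect("badssl.sqlite")
    with open("badssl_dump.sql", "w") as f:
        for statement in con.iterdump():
            f.write(statement + "\n")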
