How to bulk test for compatibility with HTTPS inspection?

Most of my users have been exempted from HTTPS inspection because of a series of problems.   I am trying to decide whether to turn it back on, and I would like to minimize the risk of doing so by analyzing compatibility in advance.

I can parse my web filter logs to obtain a list of several thousand FQDNs that we have accessed over HTTPS without inspection.   I would like to feed the list into a shell script that tests whether the UTM can connect successfully, using its version of OpenSSL and its cipher suite configuration.   Successful connections would go to one log file and failed connections to another.  I could then use the reject list to build an exception list before activating the HTTPS inspection feature, rather than enabling the feature and waiting for people to complain.

I have become familiar with the "openssl s_client -connect host:port" command, but have not figured out how to drive it from a script.  After a successful connection, it waits for input, so I have to disconnect manually.  I have not tried wget at all.
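One common way to script s_client is to close its stdin (for example by piping from `echo`), which makes it exit right after the handshake. A sketch along those lines, with `hostlist.txt`, `tls_ok.log`, and `tls_fail.log` as example file names:

```shell
# check_tls HOST [PORT] -- exit 0 if the TLS handshake succeeds.
# Piping from `echo` closes s_client's stdin, so it exits right after
# the handshake instead of waiting for interactive commands.
check_tls() {
  host=$1
  port=${2:-443}
  echo | openssl s_client -connect "$host:$port" -servername "$host" >/dev/null 2>&1
}

# Split a one-FQDN-per-line list into success/failure logs.
if [ -f hostlist.txt ]; then
  while read -r host; do
    if check_tls "$host"; then
      echo "$host" >> tls_ok.log
    else
      echo "$host" >> tls_fail.log
    fi
  done < hostlist.txt
fi
```

Note this only exercises the TLS handshake itself, not certificate validation policy or higher-level application behavior.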

Can anyone suggest some script magic to get this done?

  • On the client I suggest using:

    wget -i filename


    The file then has one URL per line.


    For this type of thing, the full command I use is the one below.  It makes a single attempt, with no redirects and a low timeout, and --spider causes it to issue a HEAD rather than a GET, since we don't actually need to retrieve a body.

    wget --no-check-certificate -i filename --max-redirect 0  --connect-timeout=1 -T 1 -t 1 -O /dev/null --spider


    If done from a client going through the proxy, this will confirm whether the proxy has certificate or TLS problems.  That being said, the issues that cause people to skip HTTPS inspection for particular sites are usually at a higher application level.
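    Wrapping that command in a loop gives the accept/reject split the original question asked for. A sketch assuming one FQDN per line; `hostlist.txt`, `https_ok.log`, and `https_fail.log` are example names:

```shell
# probe_hosts FILE -- probe each host once over HTTPS with wget;
# hosts that answer go to https_ok.log, failures to https_fail.log.
probe_hosts() {
  while read -r host; do
    if wget --no-check-certificate --spider --max-redirect 0 \
            --connect-timeout=1 -T 1 -t 1 -q "https://$host/"; then
      echo "$host" >> https_ok.log
    else
      echo "$host" >> https_fail.log
    fi
  done < "$1"
}

# Example: hostlist.txt holds one FQDN per line.
if [ -f hostlist.txt ]; then
  probe_hosts hostlist.txt
fi
```

    The reject log can then be reviewed and turned into the inspection exception list before the feature is switched back on.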

  • In reply to Michael Dunn:

    Thank you, Michael!

    Since I am a Windows shop, I needed to find a wget equivalent for Windows.   There is one available on GitHub, but a web search revealed equivalent options using PowerShell and JavaScript.   I have had bad experiences with PowerShell in the past - since it is not procedural, a large source file has to be loaded into memory before any of it can be evaluated.   So I went with the JavaScript version.

    My selection was unique HTTPS sites that had been visited without HTTPS inspection and had returned success (StatusCode < 400).   For the time period with processed logs, I obtained a list of slightly fewer than 50,000 host names.

    After some tweaking of the example, this is my script.   It runs a bit slowly, so I do not have complete results yet.   Based on interim results, the number of problems is a much lower percentage of the total than I had expected. 

    // WSH JScript: probe each host over HTTPS and log success/failure.
    var WinHttpReq = new ActiveXObject("WinHttp.WinHttpRequest.5.1");
    var fs = new ActiveXObject("Scripting.FileSystemObject");
    var infile = fs.OpenTextFile("hostlist1b.txt", 1, 0);
    var outfile = fs.CreateTextFile("results1b.txt", true);
    var s = 0;    // successes
    var u = 0;    // failures
    while (infile.AtEndOfStream == false) {
        var m = /\S+/.exec(infile.ReadLine());
        if (m === null) continue;      // skip blank lines
        var q = m[0];
        var rslt;
        try {
            WinHttpReq.Open("GET", 'https://' + q, /*async=*/false);
            WinHttpReq.Send();         // the request is not issued until Send()
            var x = WinHttpReq.ResponseText;
            s = s + 1;
            rslt = 0;
        } catch (err) {
            u = u + 1;
            rslt = 1;
        }
        outfile.WriteLine(rslt + ',' + q);
    }
    outfile.Close();
    WScript.Echo('Success: ' + s + ' Fail: ' + u);

  • In reply to DouglasFoster:

    I run a combination of Windows and Linux.  90% of the time when I want to do a Linux-ish thing on Windows, I rely on MobaXterm.  The free license is also legal to use in offices.  It is primarily an SSH client (like PuTTY), but with tabs, drag-and-drop SCP/FTP, and a host of other features.  It also gives you a bash-like command line with a significant number of Linux tools, including wget.


    An older, less complete, option that I used to use (especially because there was no installation, just copy the files over) is

    A complete and extensive set of Linux tools on Windows is available by installing Cygwin.