
HTTPS Decrypt and Scan "unable to get local issuer certificate"

I have implemented transparent HTTPS decrypt and scan. I am using the default generated signing certificate, which I have imported into the machines being scanned. This is mostly working fine, and I can confirm that HTTPS traffic is being inspected successfully. However, I am seeing quite a few sites blocked in the logs with error="Failed to verify server certificate". I can confirm this by browsing to the reported URL, where the firewall splash page comes back reporting "unable to get local issuer certificate". Here is an example site:

https://www.propertyandlandtitles.vic.gov.au

I have been progressively resolving the issue by getting each intermediate issuing certificate and importing it as a "Local Verification CA", including those listed in Sophos article 122257. I have done this now for about ten intermediates from all manner of providers, such as the following to name a few:

DigiCert Inc DigiCert SHA2 Secure Server CA

Symantec Corporation Symantec Class 3 Secure Server CA - G4

VeriSign, Inc. VeriSign Class 3 Public Primary Certification Authority

After importing these as Local Verification CAs, the problem for those sites is resolved. I am also aware that I can put in exceptions to skip certificate checking, but I would rather avoid that, and I would certainly rather avoid doing it for all URLs, as I have seen some people suggest.
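
For reference, this is how I have been confirming whether a site actually serves its intermediates, using plain OpenSSL from any workstation (a minimal sketch with the example site above; a correctly configured server lists its intermediate(s) between the leaf and the root):

    # Show every certificate the server sends in the TLS handshake.
    # The problem sites send only the leaf, with no intermediate.
    openssl s_client -connect www.propertyandlandtitles.vic.gov.au:443 \
        -servername www.propertyandlandtitles.vic.gov.au -showcerts </dev/null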

My question is why I have to keep installing these intermediate CAs; that seems like a task that will never end. I raised a case with Sophos, but to be honest it wasn't overly helpful: they did a remote session and basically told me to keep doing what I was doing.

I am also aware of a release note for 9.5:

  • NUTM-6732 [Web] Certificate issue with transparent Web Proxy - "unable to get local issuer certificate"

I am running 9.503-4. Maybe the problem really hasn't been resolved, or it has recurred.

For all of the affected sites, I have visited them in vanilla Firefox to confirm there are no certificate validation errors. My understanding is that the UTM's built-in root verification CAs mirror those of Firefox.

I have used DigiCert's SSL checker on a few of the affected sites, and it has reported some issues, almost certainly due to server admins incorrectly installing their certificate chains. However, from my perspective, if Firefox can display a page without error and without my having to install anything manually, I would expect the UTM to be able to deal with it as well.

It almost feels like the root cause is that the firewall is unable to follow the AIA (Authority Information Access) links in the certificate to retrieve any missing intermediates?
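
For what it's worth, you can see the pointer the firewall would need to follow by pulling the leaf certificate and printing its AIA extension (a rough sketch using the example site above; the "CA Issuers" URI is where a browser would fetch the intermediate from):

    # Fetch the leaf certificate and print its Authority Information
    # Access extension; the "CA Issuers" URI points at the missing
    # intermediate certificate.
    openssl s_client -connect www.propertyandlandtitles.vic.gov.au:443 \
        -servername www.propertyandlandtitles.vic.gov.au </dev/null 2>/dev/null \
      | openssl x509 -noout -text | grep -A 3 "Authority Information Access"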

I've been reading the forums, and this seems like it might be a recurring issue over the years. Does this make the solution not viable? Does anyone have any feedback on whether you are having this issue, and any possible remedies? I don't really feel like spending the rest of my days hunting down and installing verification CAs. Any feedback would be greatly appreciated.



  • I would like to add some information to this. I have upgraded to 9.505-4 and the behaviour is still the same. I still get the same problem appearing, but I seem to have resolved a fair number of the sites by installing the pictured intermediate CAs. The majority of the remaining sites that I can see regularly failing seem to be advertising and other semi-questionable sites, and I'm less inclined to fix those.

  • Hi, Michael, and welcome to the UTM Community!

    I think there's a feature request for automatic fetching of intermediate CAs in Ideas.  For now, what you've done is what's needed.  If you have a paid license, please open a ticket with Sophos Support so that they know they should push these certs out to everyone in an Update.

    Cheers - Bob

     
    Sophos UTM Community Moderator
    Sophos Certified Architect - UTM
    Sophos Certified Engineer - XG
    Gold Solution Partner since 2005
    MediaSoft, Inc. USA
  • I have concluded from my testing that "Cannot get local issuer certificate" is a non-problem reported in lieu of the actual problem.   I believe the message is saying that you do not have a personal certificate that can be used to identify yourself to the remote webserver for mutual authentication.   Openssl s_client will log this result on a normal, healthy connection test.
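
    As a quick illustration (a sketch; exact output depends on the local OpenSSL build and trust store):

        # s_client will often print this verify line even on a
        # connection that a browser would consider healthy:
        openssl s_client -connect www.propertyandlandtitles.vic.gov.au:443 </dev/null
        #   verify error:num=20:unable to get local issuer certificate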

    Supporting user certificates within HTTPS inspection is obviously tricky.  Last I have seen, UTM did not support user certificates, but product management had plans to make it work sometime soon.  Not a feature I need anyway.

    Your actual problem is technically a configuration error on the remote server.  They should deliver the intermediate certificate along with the server certificate, but this omission is a common error.  Most browsers use AIA fetching to find missing certificates: they follow the "CA Issuers" link in the certificate's Authority Information Access extension to download the missing intermediate from a trusted location.  UTM lacks this feature, so please vote for the idea.

    Intermediate certificates are used so that a large batch of certificates can be untrusted quickly in a crisis.  Root certificates are trusted until they are removed or expire, so there is a slight future risk in declaring an intermediate certificate to be a root.

    Finding and installing the intermediate certificate is a good workaround with a finite project scope.
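
    If it helps, a candidate intermediate can be sanity-checked before importing it into UTM (a sketch; assumes the site's leaf certificate has been saved as server.pem and the downloaded intermediate as intermediate.pem, both hypothetical file names):

        # Verify the leaf against the default trust store, supplying the
        # intermediate as an untrusted chain certificate; "OK" means the
        # chain completes once the intermediate is known.
        openssl verify -untrusted intermediate.pem server.pem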

    You should also know that US DOD roots are missing by default from both Windows and UTM.   If your users will access their sites, you will need to download and install those roots.  You can find them with a web search.

  • Thanks for your replies. Regarding the fetching of intermediate certs via AIA: yes, I understand the actual issue is website admins not properly including the certificate chain, and that most modern browsers have mitigated the problem by following AIA to retrieve the intermediates. I can see in my Firefox browser the default root certificates, which mirror the Sophos UTM default roots; in the browser they are classified as "Built In Object Token", while all the ones where it has gone off and grabbed intermediates are classified as "Software Security Device", and there are about 60 of those.

    It does appear to me now, from this discussion, that the UTM does not replicate the browser functionality of attempting to download intermediates. This is a surprise to me and isn't mentioned in any documentation I have managed to find. This seems like a pretty fundamental design flaw preventing an easy implementation of HTTPS inspection in general; at minimum I think it should be mentioned in the implementation notes. Apologies if it is and I missed it.

    Given that this "issue" appears to be by design, I am now very interested in how people are mitigating the problem. At least initially, before manually loading the intermediates, we were seeing on the order of hundreds of sites per day that users were unable to access. This is at a site with approximately 300 users. I cannot imagine people are turning on inspection and then ignoring this problem, because users would have to be complaining. I'm also very interested in the 9.5 release notes' reference to a "bug fix" for what appears to be this exact issue; I'd like to know what was done, because maybe it didn't really fix anything?
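
    For scoping the problem, I have been counting failures straight out of the web filter log (a rough sketch; /var/log/http.log and the url= field are assumptions based on my install, so adjust the path and error string to match yours):

        # Count the distinct URLs failing certificate verification,
        # worst offenders first.
        grep 'Failed to verify server certificate' /var/log/http.log \
          | grep -o 'url="[^"]*"' | sort | uniq -c | sort -rn | head -20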

    I must admit I am new to HTTPS inspection, so my perspective is somewhat that of an outsider, never having implemented it in a production capacity. I can't comment on the implementations of other vendors' firewalls, or on industry "standards" or conventional wisdom for how HTTPS inspection is done.

    However, having implemented it, I can already attest to its benefits: it has already prevented numerous attempts to download malicious .exe files and other nasty things from drive-bys, people clicking email links, and so on. I would say this is now virtually essential if you want the UTM to do its job properly. If I were a malware author, there's no way I would bother distributing over HTTP anymore, knowing that decryption of HTTPS is still not the default position of many people.

  • The logs are also much less usable with HTTPS inspection off.

    V9.4 also rejects sites that transmit a root certificate, which is another common configuration error.  This should be fixed in 9.5, subject to verification.

    Chrome started rejecting the UTM-generated certificates when Chrome v58 was released earlier this year, because the Common Name field was being used instead of a SAN.   It was fixed by changing the certificates generated by UTM.  I suspect this is what was fixed in the release notes that you read.
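
    If anyone wants to check which of the two problems they are hitting, the SAN issue is easy to spot from a client behind the proxy (a sketch; www.example.com is a placeholder for the host being tested):

        # Inspect the certificate the UTM actually presents to the client;
        # Chrome 58+ requires the hostname in the Subject Alternative Name
        # extension, not just the Common Name.
        openssl s_client -connect www.example.com:443 -servername www.example.com \
            </dev/null 2>/dev/null | openssl x509 -noout -text \
          | grep -A 1 "Subject Alternative Name"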