Upload Certificate using API

Hi folks,
 
I've started having a play around with XG. I have a PowerShell script for generating a new Let's Encrypt certificate and updating my various components that use it, and wanted to integrate this with XG Home. It looks like the obvious way of achieving this should be the API, but I'm struggling a little with the certificate upload.

I've taken a look at the documentation and fired off a few API calls successfully, but the upload operation is a little different as it references data in a multipart request.
I'm using the v16.5 documentation because the v17 docs appear to be missing this particular page (it appears in the menu but the link is broken because the AddCertificate&UpdateCertificate.html file is missing).
I am not a developer but I did a bit of Googling and have come up with what I think is a multipart request. A Fiddler trace shows this when I fire off my request (PFX file data redacted and passwords changed).
 
POST sophos:4444/.../APIController
User-Agent: Mozilla/5.0 (Windows NT; Windows NT 10.0; en-GB) WindowsPowerShell/5.1.16299.251
Content-Type: multipart/mixed; boundary=Certificate_File_Upload
Host: sophos:4444
Content-Length: 2712
Connection: Keep-Alive
 
--Certificate_File_Upload
Content-Disposition: attachment; filename="test.pfx"
Content-Type: application/x-pkcs12
 
<<<<Encoded PFX data here>>>>
--Certificate_File_Upload--

Tidying that up for readability, the decoded XML in the URL is:
<Request>
  <Login>
    <Username>admin</Username>
    <Password>MyPassword</Password>
  </Login>
  <Set operation="add">
    <Certificate>
      <Action>UploadCertificate</Action>
      <Name>Test</Name>
      <Password>PfxPassword</Password>
      <CertificateFormat>pkcs12</CertificateFormat>
      <CertificateFile>test.pfx</CertificateFile>
      <PrivateKeyFile></PrivateKeyFile>
    </Certificate>
  </Set>
</Request>
 
The response I get from the API is: <Status code="510">Operation failed. Deleting entity referred by another entity.</Status>
Going by the documentation this means "Certificate could not be uploaded due to invalid private key or passphrase. Choose a proper key".
 
I've also tried a version with the certificate and key in PEM format rather than PFX and get: <Status code="500">Operation could not be performed on Entity.</Status>
Going by the documentation this means "Certificate could not be updated".
 
I tried a multipart/form-data request initially, but the API didn't provide any feedback - I got an HTTP 200 response, but no XML in the body. The multipart/mixed version at least responds with some XML so I'm assuming that's what it wants.
 
I'm assuming there's something wrong with the way I'm uploading the PFX file - perhaps I've misunderstood what the multipart request should look like? I couldn't find an example in the docs. The encoded PFX file data looks correct though, because it appears to be the same in Fiddler when I upload the PFX via the web UI (successfully). I was hoping one of you might have tried this and could point me in the right direction, please?
 
Thanks,
Andrew
  • I have attempted the API call using cURL instead, with the same XML. This gave exactly the same response, which suggests the problem isn't related to the fact that I'm using a PowerShell script to create the multipart upload.
     
    Has anyone successfully uploaded a certificate to XG using the API?
    If anyone has a working process or an example (as long as it uses the API that actual method doesn't matter) I'd appreciate knowing how they did it.
     
    Thanks,
    Andrew
  • Hi Andrew,

    Looks like you are close to a working solution. Like you, I create a multipart/form-data object (System.Net.Http.MultipartFormDataContent).

     

    My request looks something like the following (the *** placeholders mark the values I believe you need to adjust):

    --6332ebe1-9b29-49f6-a4f2-1728b60e4131
    Content-Type: application/xml; charset=utf-8
    Content-Disposition: form-data; name=reqxml
    <Request>
        <Login>
            <Username>*** USER ***</Username>
            <Password passwordform="encrypt">*** HASHED PASSWORD - remove encrypt to use non-hashed ***</Password>
        </Login>
       
        <Set operation="add">
            <Certificate>
                <Name>*** CERT NAME ***</Name>
                <Action>UploadCertificate</Action>
                <CertificateFormat>pkcs12</CertificateFormat>
                <Password>*** PASSWORD ***</Password>
                <CertificateFile>*** CERT NAME ***.pfx</CertificateFile>
            </Certificate>
        </Set>
    </Request>
    --6332ebe1-9b29-49f6-a4f2-1728b60e4131
    Content-Disposition: form-data; filename=*** CERT NAME ***.pfx; name=*** CERT NAME ***
    Content-Type: application/octet-stream
    *** BINARY DATA ***
    --6332ebe1-9b29-49f6-a4f2-1728b60e4131--
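    For illustration, the layout of that request body can be sketched with the Python standard library (this is my own sketch, not Trond's code - the part name "file" and all values are placeholder assumptions):

```python
# Sketch only: build a two-part multipart/form-data body like the one
# above. The part name "file" and all values are placeholder assumptions.
import uuid

def build_multipart(reqxml: str, file_name: str, file_bytes: bytes):
    boundary = str(uuid.uuid4())
    crlf = "\r\n"
    head = (
        f"--{boundary}{crlf}"
        f"Content-Type: application/xml; charset=utf-8{crlf}"
        f"Content-Disposition: form-data; name=reqxml{crlf}{crlf}"
        f"{reqxml}{crlf}"
        f"--{boundary}{crlf}"
        f'Content-Disposition: form-data; filename="{file_name}"; name="file"{crlf}'
        f"Content-Type: application/octet-stream{crlf}{crlf}"
    ).encode("utf-8")
    tail = f"{crlf}--{boundary}--{crlf}".encode("utf-8")
    content_type = f"multipart/form-data; boundary={boundary}"
    return content_type, head + file_bytes + tail

ctype, body = build_multipart("<Request>...</Request>", "test.pfx", b"\x30\x82")
print(ctype.startswith("multipart/form-data; boundary="))  # True
```

    The important details are that the request XML goes in its own form-data part named "reqxml", the certificate bytes go in a second part with a filename, and each part's headers are separated from its body by a blank line.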

     

    With this approach I get the following response:

    <?xml version="1.0" encoding="UTF-8"?>
    <Response APIVersion="1700.1">
      <Login>
        <status>Authentication Successful</status>
      </Login>
      <Certificate transactionid="">
        <Status code="200">Configuration applied successfully.</Status>
      </Certificate>
    </Response>

     

    Also remember that you cannot upload a certificate with a name that already exists. You need to create a new unique name each time (then update all references to point to the new certificate, and delete the old one).

    A final note: firewall rules that have path-specific routing seem to lose some of their configuration on update (in my case the Allowed Client Networks). I haven't fully investigated what I'm doing wrong here, but it looks to me like a bug in the API, as on update I'm sending back the same configuration the API gave me...

     

    Hope this helps!

    Trond

  • In reply to Trond Aarskog:

    That's great. Thanks Trond!

    The missing link was putting the XML in the request body as well as the certificate, rather than just in the URL. All working now.

    Thanks for the help. :)

  • In reply to Andrew Quinn:

    Any chance that either of you could do a complete write-up on doing this? (I would prefer something that runs from a UNIX host. :)) I am assuming that Let's Encrypt certificates are being generated using --check-dns, so it would only require one central system to make the various requests for each site/service that needs a certificate.

    What is the impact of having to add a new certificate and delete the old one for the selected sites/services?

    It would be awesome if this could all be done for the Web and Mail services.

    What would be even better is if Sophos would just include Let's Encrypt support in XG directly, so we wouldn't have to create these crazy workarounds.

     

    Good work to you two guys.

    -Ron

  • In reply to rrosson:

    Hi Ron,
     
    I'm using the ACMESharp PowerShell library to generate a Let's Encrypt certificate (with multiple SANs) on an IIS server. Once it has been installed in IIS, I export it and copy it to the different locations where it is used.
    The IIS server is my "central" system for generating the certs... however, in my case this is a home install, so it's all behind a single public IP. The easiest validation method for me with a single IP was to forward HTTP traffic hitting my IP to a validation site on the same IIS server and let ACMESharp take care of the validations using the HTTP method.

    With Trond's help my PowerShell script is now able to upload the new certificate to XG, so ideally I'd have it also replace the old certificate on the various XG components. I'm not using XG much yet, but I intend to have a play around with the email and web publishing when I get some time. If I figure that out I'll post something back here in case it helps. Whilst I do have a CentOS box here, I'm not doing anything clever with it - just SFTPing the exported (IIS) certificate over the top of the old one and restarting Apache.
     
    Thanks,
    Andrew
  • In reply to Andrew Quinn:

    I thought I'd post back with a bit of progress. At the risk of being unfair to a product I'm not hugely familiar with, the API seems to be a bit of a mess - with incorrect/incomplete documentation, and buggy behaviour. Perhaps my experience will help someone else.
     
    What I've tried to do: I have configured the email MTA, and some business (web) application rules. I am attempting to use the API to upload a new (Let's Encrypt) certificate and replace the existing one in each place it is used.

    So far, I've had some success with the website publishing, but not with email.
     
    The process I'm attempting to use is:
    1. Get existing config using API "Get" request
    2. Replace references to the old certificate with the new one
    3. Push the updated config to XG using API "Set" request
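    Step 2 can be sketched in Python (my own illustration, not the script itself - the <Certificate> element name inside the rule XML is an assumption based on the "Get" output):

```python
# Sketch of step 2: swap references to the old certificate for the new
# name in the XML returned by a "Get" request. Element names here are
# assumptions for illustration.
import xml.etree.ElementTree as ET

def swap_certificate(rule_xml: str, old_name: str, new_name: str) -> str:
    root = ET.fromstring(rule_xml)
    for cert in root.iter("Certificate"):
        if cert.text == old_name:
            cert.text = new_name
    return ET.tostring(root, encoding="unicode")

sample = ("<SecurityPolicy><Name>WebServer</Name>"
          "<Certificate>LE_2018_10</Certificate></SecurityPolicy>")
print(swap_certificate(sample, "LE_2018_10", "LE_2018_11"))
```

    The rewritten XML would then be wrapped in a <Set operation="update"> request and pushed back in step 3.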
     

    With website publishing I've had a few challenges...
     
    The first minor gotcha is that while it's possible to retrieve the existing config by putting the request XML in the URL, it isn't possible to push the config back using the URL - I believe the URL becomes too long with all of the config in it.
    This works to retrieve the firewall rules: https://sophos:4444/webconsole/APIController?reqxml=<Request><Login><Username>MyUsername</Username><Password>MyPassword</Password></Login><Get><SecurityPolicy></SecurityPolicy></Get></Request>
    To push the config back to XG, putting the XML in the body of a multipart request works (see Trond's post above).
     
    The next challenge is that for some reason the API only seems to understand rules with path-specific routing enabled. If you retrieve the config for a rule without path-specific routing, it actually converts it to an equivalent rule with path-specific routing enabled (i.e. with one path "/"). This is not expected behaviour, but functionally it's equivalent so it's not a major issue. It is a problem if you try to use a "Set" request though, because although the documentation provides the syntax for a rule without path-specific routing, in practice I couldn't get this to work. Pushing the equivalent rule with path-specific routing enabled works though (sort of - see next).
     
    The biggest problem is the same one Trond mentioned - when you push the config to XG, it seems to lose the "Any IPv4" configuration from the "Allowed Client Networks" field. I, too, am convinced this is a bug in the API. It's quite a bad one considering it breaks the firewall rule by effectively converting it to a block rule for your web application. From what I can tell (with the limited functions I tested) the problem is limited to the "Any IPv4" setting. Other than that setting, the "Allowed Client Networks" field only allows you to select individual IP addresses or subnets in CIDR notation. As a workaround I have created "IP Host" definitions for the following networks. Combined, these subnets are more-or-less the equivalent of "Any IPv4". If I add these to "Allowed Client Networks" instead of "Any IPv4" the API doesn't lose the config and the rule works.
    1.0.0.0/8
    2.0.0.0/7
    4.0.0.0/6
    8.0.0.0/5
    16.0.0.0/4
    32.0.0.0/3
    64.0.0.0/2
    128.0.0.0/1
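    As a sanity check (mine, not part of the original workaround), Python's ipaddress module confirms those eight subnets cover every IPv4 address except 0.0.0.0/8 - which is why they are only "more-or-less" the equivalent of "Any IPv4":

```python
# Verify the eight workaround subnets cover all of IPv4 except 0.0.0.0/8,
# and that none of them overlap.
import ipaddress

subnets = [ipaddress.ip_network(s) for s in (
    "1.0.0.0/8", "2.0.0.0/7", "4.0.0.0/6", "8.0.0.0/5",
    "16.0.0.0/4", "32.0.0.0/3", "64.0.0.0/2", "128.0.0.0/1")]

covered = sum(n.num_addresses for n in subnets)
print(covered == 2**32 - 2**24)           # True: everything except 0.0.0.0/8
print(any(n.overlaps(m) for i, n in enumerate(subnets)
          for m in subnets[i + 1:]))      # False: no overlaps
```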
     
    In summary: Use path-specific routing, replace "Any IPv4" with IP Hosts representing those subnets, and you can get an equivalent rule that the API doesn't mangle. Put the XML in the body of a POST request instead of the URL, and success!
     

    Unfortunately, I've not had any luck with the email side of things...

    The "Get" request for "EmailConfiguration" returns "You do not have permission for the requested entity". This is authenticating successfully with the default "admin" account.
    "Set" requests don't work either with a more generic "Operation could not be performed on Entity".

    If anyone has managed to update the email MTA configuration using the API... feel free to share!
     
    Thanks,
    Andrew
  • In reply to Andrew Quinn:

    It seems that I didn't manage to work around all of the web problems - I've noticed the API not only loses the "Any IPv4" setting, but also any intrusion prevention policy that is set on the rule (and according to another post on this forum the traffic shaping policy too).
     
    The intrusion prevention policy configuration appears to be missing from the XML returned by the API's "Get" request for business application rules, but even if I add it back in manually before posting a "Set" request, the API still drops the configuration.
     
    I'm getting the feeling the API isn't finished... is it a beta feature?
  • In reply to Andrew Quinn:

    Andrew, Thank you for continuing the work and reporting your findings. Hopefully someone from Sophos will pick this up and enlighten us.   :)

     

    -Ron

  • In reply to Andrew Quinn:

    I'm gonna kick this one once more. I'm also trying to update the certificate on my Sophos XG instance using a PowerShell script. I've managed to get the multipart request exactly the same as in the post above, but for some reason Sophos XG only returns a 200 OK without any body. So there must be something wrong, but I've gone over the request body 10 times now and I'm fairly sure that it's OK. If someone could share their PowerShell script for reference, that would be awesome.

    Update: I've managed to get to the point where it now responds (newlines in the XML are a no-no, it seems), but the response is always "510 Deleting entity referred by another entity". Going to debug some more...

  • In reply to Trond Aarskog:

    Hi Trond and Andrew,

    Firstly, thanks for the detailed question, it has helped me get closer to uploading the cert. I think.

    This is the PowerShell script that I have so far (based largely on this page: http://blog.majcica.com/2016/01/13/powershell-tips-and-tricks-multipartform-data-requests/)


    $ContentType = "application/octet-stream"
    $certFile = "C:\temp\LE_cert.pfx"
    $fileName = Split-Path $certFile -leaf
    $boundary = [guid]::NewGuid().ToString()
    $fileBin = [System.IO.File]::ReadAllBytes($certFile)

    $enc = [System.Text.Encoding]::GetEncoding("iso-8859-1")

    $template = @'
       --{0}
       Content-Type: application/xml; charset=utf-8
       Content-Disposition: form-data; name=reqxml
       <Request>
          <Login>
             <Username>apiUSER</Username>
             <Password>apiPWD</Password>
          </Login>
          <Set operation="add">
             <Certificate>
                <Name>LE_tcxapi_20181105_cert</Name>
                <Action>UploadCertificate</Action>
                <CertificateFormat>pkcs12</CertificateFormat>
                <Password>pwd</Password>
                <CertificateFile>{1}</CertificateFile>
             </Certificate>
          </Set>
       </Request>
       --{0}
       Content-Disposition: form-data; filename="{1}"; name="Unique"
       Content-Type: {2}
       {3}
       --{0}--
    '@

    $body = $template -f $boundary, $fileName, $ContentType, $enc.GetString($fileBin)
     

    $response = Invoke-WebRequest -UseBasicParsing -Headers $headers -Uri 'fw-exg-00:4444/.../APIController operation="add"><Certificate><Name>MHA_TEST_Cert</Name><Action>UploadCertificate</Action><CertificateFormat>pkcs12</CertificateFormat><Password>pwd</Password><CertificateFile>LE_cert.pfx</CertificateFile></Certificate></Set></Request>' -Method POST -body $body

    This gives me the following request body:

    --db611ddb-ea8a-450f-aad8-90bee1c6a889
    Content-Type: application/xml; charset=utf-8
    Content-Disposition: form-data; name=reqxml
    <Request>
       <Login>
          <Username>apiUSER</Username>
          <Password>apiPWD</Password>
       </Login>
       <Set operation="add">
          <Certificate>
             <Name>LE_cert</Name>
             <Action>UploadCertificate</Action>
             <CertificateFormat>pkcs12</CertificateFormat>
             <Password>password1</Password>
             <CertificateFile>LE_cert.pfx</CertificateFile>
          </Certificate>
       </Set>
    </Request>
    --db611ddb-ea8a-450f-aad8-90bee1c6a889
    Content-Disposition: form-data; filename="LE_cert.pfx"; name="LE_cert"
    Content-Type: application/octet-stream
    <<--redactedEncodedTXT-->>
    --db611ddb-ea8a-450f-aad8-90bee1c6a889--

    I think I have got my request in the same format as Trond's example but I am still getting "<Status code="500">Operation could not be performed on Entity.</Status>"

    I was wondering how you were encoding your certificate. I have tried the above iso-8859-1, but also UTF-8 and Base64, and none of them have got the certificate uploaded.

    I think I have got this right, but I just can't get the certificate uploaded, so any help would be greatly appreciated.

    Cheers,

    Tim

     

  • curl -k  -F "reqxml=<uploadTest.txt" -F file=@Certificates.p12 "https://<DNS or IP here>:4444/webconsole/APIController?"

    What mattered was using the @ and not < for the "file=" section.

    From the Curl man page:
    The difference between @ and < is then that @ makes a file get attached in the post as a file upload, while the < makes a text field and just get the contents for that text field from a file.

     

    The rest of this post is for those still learning/trying to figure things out:

    ---Contents of the file called uploadTest.txt---
    <Request>
      <Login>
        <Username>username</Username>
        <Password>password</Password>
      </Login>

      <Set operation="add">
        <Certificate>
          <Action>UploadCertificate</Action>
          <Name>friendly name for us humans goes here</Name>
          <Password>password to decrypt the certificate file</Password>
          <CertificateFormat>pkcs12</CertificateFormat>
          <CertificateFile>Certificates.p12</CertificateFile>
          <PrivateKeyFile></PrivateKeyFile>
        </Certificate>
      </Set>
    </Request>
    ---End content of file uploadTest.txt---

    **In the example above, Certificates.p12 is a placeholder for the filename of your certificate, and if you don't have a separate private key file you can omit the PrivateKeyFile line.

    I created a .p12 file for my certificate and private key simply because it was less to type in the file above and on the command line when testing. Valid formats are listed in the API documentation.

  • In reply to JamieBah1:

    If you have trouble uploading the certificate (or any file) to XG via the API, use a test page to check what you are actually uploading to the server/XG.

    http://ptsv2.com/s/howitworks.html

    Most likely your file is not being uploaded properly, or is in the wrong format, etc.

    So verify what you are uploading, because XG does not keep the file if the request was invalid - it simply flushes the upload after parsing the command.

  • Thanks everyone who contributed to this thread. It has been valuable to me figuring out how to upload the Let's Encrypt certificates from my Synology NAS to Sophos XG. Here's how it is done.

    Enable API (optionally create a special API Administration user) as described here: https://community.sophos.com/kb/en-us/132560

    On your Synology NAS, follow the instructions for Let's Encrypt here and include your firewall's FQDN as a subject alternative name: https://www.synology.com/en-global/knowledgebase/DSM/help/DSM/AdminCenter/connection_certificate

    Then create this XML file, e.g. in your home directory:

    <?xml version="1.0" encoding="UTF-8"?>
    <Request APIVersion="1702.1">

      <!-- API Authentication -->
      <Login>
        <Username>apiuser</Username>
        <Password>randompw</Password>
      </Login>

      <Set operation="add">
        <Certificate>
          <Action>UploadCertificate</Action>
          <Name>yourdomain</Name>
          <CertificateFormat>pem</CertificateFormat>
          <CertificateFile>yourdomain.pem</CertificateFile>
          <PrivateKeyFile>yourdomain.key</PrivateKeyFile>
        </Certificate>
      </Set>
    </Request>

    Under Control Panel, Task Scheduler, create the following user-defined script as a scheduled task that runs as the user root.

    /bin/curl -F "reqxml=</var/services/homes/youruser/updatecertificate.xml"  -F "file=@/usr/syno/etc/certificate/system/default/cert.pem;filename=yourdomain.pem" -F "file=@/usr/syno/etc/certificate/system/default/privkey.pem;filename=yourdomain.key" -k https://yourfirewall:4443/webconsole/APIController

    Click "Run" to test and run it once. You should now see your Synology certificate and private key under SYSTEM, Certificates. If that worked, then make the following change in the XML file: <Set operation="update">

    That should be it. From now on your firewall should be certified by Let's Encrypt and updated promptly with renewed certificates. I run the task weekly on Sunday morning.

    Cheers!

  • In reply to DunRon:

    I have finally got this set up. The add worked properly, but when the update executes every Sunday I get this output:

    <?xml version="1.0" encoding="UTF-8"?>
    <Response APIVersion="1702.1" IPS_CAT_VER="1">
      <Login>
        <status>Authentication Successful</status>
      </Login>
      <Certificate transactionid="">
        <Status code="500">Operation could not be performed on Entity.</Status>
      </Certificate>
    </Response>



    Any ideas on what I am doing wrong? I have confirmed that I have changed the operation from add to update.

  • In reply to rrosson:

    So basically your script can add a certificate, but cannot update it?

    As far as I know, you cannot "overwrite" a certificate that is in use, because it is loaded in different places by XG.

    What you would have to do is re-upload the certificate with a different name, and then change each place where it is used to reference the new one.
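    That rotation (upload under a new name, repoint references, remove the old certificate) can be sketched as a sequence of request payloads - a hedged illustration only; the <Remove> syntax is an assumption modelled on the XG XML API's usual pattern, and all names and passwords are placeholders:

```python
# Hedged sketch: the reqxml payloads for rotating a certificate.
# The <Remove> element is an assumption based on the XG XML API's usual
# pattern; all names, passwords and file names are placeholders.
LOGIN = ("<Login><Username>apiuser</Username>"
         "<Password>secret</Password></Login>")

def wrap(body: str) -> str:
    return f"<Request>{LOGIN}{body}</Request>"

upload_new = wrap(
    '<Set operation="add"><Certificate>'
    "<Action>UploadCertificate</Action><Name>LE_2018_12</Name>"
    "<Password>pfxpass</Password>"
    "<CertificateFormat>pkcs12</CertificateFormat>"
    "<CertificateFile>LE_2018_12.pfx</CertificateFile>"
    "</Certificate></Set>")

# In between these two calls, the rules that reference the old name would
# be re-pushed with <Set operation="update"> (see the earlier posts).
remove_old = wrap(
    "<Remove><Certificate><Name>LE_2018_11</Name></Certificate></Remove>")

print(upload_new)
print(remove_old)
```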