This discussion has been locked.
You can no longer post new replies to this discussion. If you have a question you can start a new discussion

Bug - Improper headers for blocked content

Dear all,

While toying around with cURL and Sophos Anti-Virus, I noticed that the Web Protection service:

▸ Returns a 403 Forbidden response when the domain's reputation is deemed unsuitable

▸ Returns a 200 OK response when the content itself is blocked

Compare, for example:

▸ Loading the Sophos malware-blocking test page

$ /usr/bin/curl --compressed "http://sophostest.com/malware/" -sI
    HTTP/1.1 403 Forbidden
    Content-Length: 6865
    Content-Type: text/html; charset=UTF-8
    Cache-Control: no-cache
    Connection: close
    Proxy-Connection: close

▸ Loading the Eicar.org pseudo-virus

 
$ /usr/bin/curl --compressed "www.eicar.org/.../eicar.com.txt" -sI
    HTTP/1.1 200 OK
    Date: Fri, 03 Jul 2015 12:20:55 GMT
    Server: Apache
    Content-disposition: attachment; filename="eicar.com.txt"
    Cache-control: private
    Content-length: 68
    Content-Type: application/octet-stream
 
This seems to be a bug, as both attempts result in a Sophos-generated error page being displayed.


This thread was automatically locked due to age.
  • Hello Serra,

    As the go-to admin for an ever-growing circle of friends and family, I often whip up small scripts to help in troubleshooting and debugging network issues. These scripts perform a few dig and curl operations, compare the results against what would be expected in a known-good setup, and attempt to perform what changes can be done automatically.

    One of my steps is to test whether Sophos Web Protection is active. I actively recommend SAV 9.x to friends, but I find they sometimes change the configuration. (People with slow DNS will find Sophos Web Protection to be all but unusable and to have a paralysing effect on the machine while it is totally transparent to others. For example, OpenDNS Umbrella + Sophos = Total lockdown.)

    Testing whether reputation filtering is in place is as simple as:

    /usr/bin/curl "http://sophostest.com/malware/" -sI | grep -q "HTTP/1\.1 403 Forbidden"
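That one-liner can be wrapped in a small function that reads the response headers from stdin, so the fragile status-line matching lives in one place. A sketch — `has_reputation_block` is my own name, not anything Sophos ships, and the live invocation in the comment needs network access plus an active Sophos Web Protection install:

```shell
#!/bin/sh
# has_reputation_block: read HTTP response headers on stdin and
# succeed (exit 0) if the status line is the 403 that reputation
# filtering returns. tr strips the trailing CR so the end anchor
# matches regardless of the grep flavour in use.
has_reputation_block() {
    head -n 1 | tr -d '\r' | grep -q '^HTTP/1\.1 403 Forbidden$'
}

# Live use (needs network and Sophos Web Protection active):
#   /usr/bin/curl -sI "http://sophostest.com/malware/" | has_reputation_block
```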

    However, testing for content filtering is very messy:

    /usr/bin/curl "www.eicar.org/.../eicar.com.txt" -s | grep -q "Sophos Block Page"

    The command is just as short, but we must download the entire page instead of just the headers, which takes longer and uses more resources. We must also use grep to parse the resulting [physique, build, form, frame, figure, anatomy, skeleton, trunk, torso]*, which is prone to failure, especially should Sophos ever decide to change the page format. The headers preceding the content appear to be those of the server being blocked. For example, in the case of the EICAR page above, I see the following, which looks suspiciously like what EICAR themselves send back. (I doubt that Sophos runs a full Apache install locally, or does it?)

    HTTP/1.1 200 OK
    Date: Tue, 07 Jul 2015 05:53:17 GMT
    Server: Apache
    Content-disposition: attachment; filename="eicar.com.txt"
    Cache-control: private
    Content-length: 68
    Content-Type: application/octet-stream
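So long as the body grep remains necessary, it can at least be isolated, leaving a single place to update should Sophos change the page format. A sketch — `is_sophos_block_page` is my own name, and the marker string is simply the one I grep for above:

```shell
#!/bin/sh
# is_sophos_block_page: read a downloaded page body on stdin and
# succeed (exit 0) if it contains the marker seen on the
# Sophos-generated block page. Brittle by nature: the marker
# breaks if Sophos ever redesigns the page.
is_sophos_block_page() {
    grep -q 'Sophos Block Page'
}
```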

    Both commands result in error pages. From a purely logical standpoint, it seems the headers should always match the content. When the proxy serves a block page, it "forbids" loading the requested content just as much as when reputation filtering activates. The fact that the underlying mechanisms are different changes little to the actual result. Furthermore, since no browser on OS X offers "Custom Error Pages" by default (à la Internet Explorer of yore), to my knowledge at least, I feel that it would be quite safe to use the proper response codes.
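If the proxy answered 403 for content blocks as it already does for reputation blocks — the change this post asks for, not current behaviour — a single header-only classifier would cover both cases. A sketch under that assumption, with `classify_response` being my own name:

```shell
#!/bin/sh
# classify_response: read HTTP response headers on stdin and print
# "blocked" for a 403 status line, "allowed" otherwise.
# Assumes the requested fix: the proxy returns 403 for every block,
# whether triggered by reputation or by content filtering.
classify_response() {
    if head -n 1 | tr -d '\r' | grep -q '^HTTP/1\.1 403 '; then
        echo blocked
    else
        echo allowed
    fi
}

# Hypothetical use, once headers match the block page:
#   /usr/bin/curl -sI "www.eicar.org/.../eicar.com.txt" | classify_response
```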

    I realise that curl and scripts are edge cases, but I am certainly not the only Sophos user who noodles around in Terminal…

    * — Well, the you-guess-what anyway. The forum's profanity filter is unhappy about that word.
