
Blacklisting a webpage does not work.

Hello all,

Due to increased spam traffic with malicious links, we need to block certain URLs dynamically. With the Policy Helpdesk I can easily track which filter action and policy applied for a particular user. I am convinced I blacklisted the webpage in the right place, and yet the user can still access it. The screenshots below show the outcome for the particular policy, filter action, and Policy Helpdesk test. Of course I had to cover text that could de-anonymize the firewall the screenshots were taken on (hence the white rectangles in parts of the screenshots). Images attached directly to the post came out in poor quality (you cannot read the settings), so I used an external image-hosting site instead.

What am I doing wrong?

 

Filter Action:

pasteboard.co/IDAeAfu.png

Policy test:

pasteboard.co/IDAeYbr.png

Profile:

pasteboard.co/IDAf4D2.png

 

Thank you in advance, take care.



  • Hi,

    Would you please try this regex: ^https?://([A-Za-z0-9.-]*\.)?rahmatravel\.com/ ? Also uncheck the option "Perform matching on these domains only" and test again.
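    To sanity-check what that pattern matches on its own, here is a minimal Python sketch (plain re, outside the UTM; it assumes the UTM applies the regex to the full URL):

        import re

        # The suggested pattern: anchored to http/https and the rahmatravel.com host.
        pattern = re.compile(r"^https?://([A-Za-z0-9.-]*\.)?rahmatravel\.com/")

        for url in [
            "http://rahmatravel.com/",                # match
            "https://www.rahmatravel.com/",           # match
            "http://deep.sub.rahmatravel.com/",       # match (multi-level subdomain)
            "https://rahmatravel.com.evil.example/",  # no match (look-alike host)
        ]:
            print(url, "->", bool(pattern.search(url)))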

    Regards

    Jaydeep

  • Hi, unfortunately it did not have any effect. :-(

  • I am sure I am tuning the right filter action; it looks to me like the Sophos UTM does not care what entries are in the blocked websites list. We are using a transparent proxy, and the user shown as the test user in the screenshots has two exceptions applied (based on group membership) that are completely unrelated to blacklisting pages. Not sure if that helps; I am clueless.

  • Try a blacklist entry under Edit Filter Action / Block These Websites: add rahmatravel.com under Domains. This should work.

     

    Good Luck

  • Hello, unfortunately not. Policy Helpdesk is still giving me an "Allowed" result, and I am 100% certain the policy uses the particular filter action I have just edited. It looks to me like there is a checkbox somewhere saying "ignore blacklisted URLs" (I know, absurd, but I cannot explain this behavior :-( ).

    Below is a screenshot after the suggested change:

    Sounds like I am gonna engage support.

    This is annoying.

    Thanks for the effort, though.

  • Even without blacklisting, the UTM blocked it and reported it as "Malicious" and "Phishing".

  • First, let me say that web filtering works very well; I have exercised it thoroughly. Nonetheless, I cannot explain your symptoms, so this is exactly the type of situation where Support is likely to be better and faster than this forum (even at level 1 support).

    You appear to have multiple filter actions.   Assuming that you want this example site blocked for everyone, you are making things harder on yourself by blocking within the Filter Action.  Instead, you should use an exception object.

    But you have also highlighted the problem with regex: it is a very subtle technology, and it is easy to be disappointed:

    • "^http://" does not block an https link
    • "^https?://" blocks both http and https, but not ftp
    • To block all three protocols, you need something like "(https?|ftp)://", but I would not trust the assertion without some testing.
    • "http://" will match a url with http protcol, but may also block some URLs with an http string embeddd in the querystring (since the ^ is omitted).

    The difference between your second regex and the one that was counterproposed is that one blocks only *.baddomain, while the other blocks recursively, catching *.*.baddomain as well. The sketch below illustrates both the protocol and the subdomain-depth pitfalls.
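    To make those pitfalls concrete, a minimal Python sketch (plain re, outside the UTM; the "one level" pattern is a hypothetical example of a regex that matches only a single subdomain level, not necessarily the one from the screenshot):

        import re

        urls = [
            "http://rahmatravel.com/",
            "https://rahmatravel.com/",
            "ftp://rahmatravel.com/",
            "https://www.rahmatravel.com/",
            "https://a.b.rahmatravel.com/",
            "https://safe.example/?redirect=http://rahmatravel.com/",
        ]

        patterns = {
            "^http://":         r"^http://",          # misses the https link
            "^https?://":       r"^https?://",        # http and https, but not ftp
            "^(https?|ftp)://": r"^(https?|ftp)://",  # all three (test before trusting)
            "http:// (no ^)":   r"http://",           # also fires inside a querystring
            "one level":  r"^https?://([A-Za-z0-9-]+\.)?rahmatravel\.com/",
            "recursive":  r"^https?://([A-Za-z0-9.-]*\.)?rahmatravel\.com/",
        }

        for name, pat in patterns.items():
            hits = [u for u in urls if re.search(pat, u)]
            print(f"{name:18} matches {len(hits)} of {len(urls)}: {hits}")

    Note that "one level" misses a.b.rahmatravel.com while "recursive" catches it, and neither is fooled by the look-alike querystring once the ^ anchor is present.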

    I have come to loathe and fear regex mistakes, so I avoid them.   Instead I use website override objects.   

    • For this situation, I would simply override the reputation.   
    • For items that are blocked as uncategorized, the website override specifies the category.   
    • For partial or full proxy bypass, the website override applies a tag, and then an Exception object defines the features that are disabled, and is applied to "websites with this tag".

    The website object can apply to a single host (www.badguys.com), or to an organization by specifying badguys.com with the option to "include subdomains". Unfortunately, the subdomain match only applies to one level: it covers *.badguys.com, but not *.*.badguys.com. In most of my cases, a one-level wildcard is sufficient. The upside is that I omit the protocol, so all three protocols are blocked. The sketch below shows the one-level semantics.
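    A minimal sketch of the one-level "include subdomains" semantics described above (an illustration only, not Sophos UTM's actual implementation):

        def one_level_wildcard_match(host: str, domain: str) -> bool:
            """True for the domain itself or exactly one label below it."""
            if host == domain:
                return True
            if host.endswith("." + domain):
                prefix = host[: -(len(domain) + 1)]
                return "." not in prefix   # exactly one extra label
            return False

        for host in ["badguys.com", "www.badguys.com", "a.b.badguys.com"]:
            print(host, "->", one_level_wildcard_match(host, "badguys.com"))
        # badguys.com     -> True
        # www.badguys.com -> True
        # a.b.badguys.com -> False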

    You need to look closely at the exceptions as reported in the policy helpdesk and the log files.   There may be an exception that is causing your problem.

  • Hello Douglas, hello all,

    First of all, I would like to thank you for sharing your ideas and valuable insights. They helped me slow down and think more thoroughly. It was a stupid mistake I made earlier, and I found it with a fresh, rested brain: one of my exceptions had OR operators instead of AND operators. The exception was meant to work like this: if the user is in that group AND is accessing this domain, skip the URL filter. In reality, because of the OR operators, the URL filter was skipped based on group membership alone, and the target domain was not taken into consideration. A sketch of the difference is below.
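    For anyone who hits the same thing, a minimal sketch of the difference (hypothetical names; the real evaluation happens inside the UTM's exception engine):

        def skip_url_filter_or(user_in_group: bool, domain_matches: bool) -> bool:
            return user_in_group or domain_matches    # the bug: group alone is enough

        def skip_url_filter_and(user_in_group: bool, domain_matches: bool) -> bool:
            return user_in_group and domain_matches   # the intended behavior

        # Test user is in the group but visits an unrelated blacklisted domain:
        print(skip_url_filter_or(True, False))    # True  -> URL filter skipped, site allowed
        print(skip_url_filter_and(True, False))   # False -> URL filter applies, site blocked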

    I am sorry for blaming the box for my mistake.

    Thank you everyone for your ideas, much appreciated!

    Request solved.