
Created an Elastic snort pipeline by editing Sophos integration

Not certain where this should be, so if it's wrong, please move it to the correct group.

For anyone interested, specifically in getting Sophos UTM logs ingested into Elasticsearch: my first step was installing the Sophos integration into Kibana v8.11.0. On inspection, it turned out that the ingest parsing and pipelines are only set up for the dhcp, dns, http, and packetfilter logs. I wanted to extend the parsing to cover other logs such as snort, exim, etc.

Any chance of completing the rest of the pipelines in the Elastic integration for Sophos UTM? That question is for the team responsible for the Elastic integration. I've been adding them myself for the logs I need, but it would be much easier if the other logs were already covered, and then my changes wouldn't all be reverted when I upgrade to the next integration version.

Starting with the "snort" logs, I modified the logs-sophos.utm-3.8.1 pipeline to pick up events where event.provider is 'snort' and route them to a newly created pipeline named "logs-sophos.utm-vers-snort". The addition to logs-sophos.utm-3.8.1, next to the other pipeline processors, looks like this:

{ "pipeline": { "name": "logs-sophos.utm-vers-snort", "if": "ctx.event?.provider == 'snort'", "tag": "pipeline_snort" } }

Then I created the logs-sophos.utm-vers-snort pipeline and put this into the contents:

[ { "grok": { "field": "event.original", "patterns": [ "^%{GREEDYDATA} id=\"(%{NUMBER:event.id:int})\" severity=\"(%{WORD:sophos.utm.severity})\" sys=\"(%{WORD:sophos.utm.sys})\" sub=\"(%{WORD:sophos.utm.sub})\" name=\"(%{DATA:sophos.utm.name})\" action=\"(%{WORD:event.action})\" reason=\"(%{DATA:sophos.utm.reason})\" group=\"(%{NUMBER:group.id:int})\" srcip=\"(%{IP:source.ip})\" dstip=\"(%{IP:destination.ip})\" proto=\"(%{NUMBER:network.iana_number})\" srcport=\"(%{NUMBER:source.port})\" dstport=\"(%{NUMBER:destination.port})\" sid=\"(%{NUMBER:sid})\" class=\"(%{DATA:threat.class})\" priority=\"(%{NUMBER:threat.priority:int})\"" ], "ignore_missing": true, "ignore_failure": true } }, { "geoip": { "field": "source.ip", "target_field": "source.geo", "ignore_missing": true, "tag": "geo_source_ip" } }, { "geoip": { "field": "destination.ip", "target_field": "destination.geo", "ignore_missing": true, "tag": "geo_destination_ip" } }, { "geoip": { "field": "source.ip", "target_field": "source.as", "database_file": "GeoLite2-ASN.mmdb", "properties": [ "asn", "organization_name" ], "ignore_missing": true, "tag": "geo_source_as" } }, { "geoip": { "field": "destination.ip", "target_field": "destination.as", "database_file": "GeoLite2-ASN.mmdb", "properties": [ "asn", "organization_name" ], "ignore_missing": true, "tag": "geo_destination_as" } } ]

Works wonders, and it just needs to be repeated for any other applicable logs, with the grok pattern edited to match the log being parsed. Seems simple enough for now.
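One footnote on the upgrade-revert problem mentioned above: recent stack versions give every integration dataset an optional @custom ingest pipeline. The managed pipeline calls it as its last processor with ignore_missing_pipeline set, so it's a no-op until you create it, and Fleet never overwrites it on upgrade. Assuming event.provider is already populated by the time that hook runs, the routing processor could live there instead of inside the versioned pipeline, e.g.:

PUT _ingest/pipeline/logs-sophos.utm@custom
{
  "processors": [
    {
      "pipeline": {
        "name": "logs-sophos.utm-vers-snort",
        "if": "ctx.event?.provider == 'snort'",
        "tag": "pipeline_snort"
      }
    }
  ]
}

The logs-sophos.utm-vers-snort pipeline isn't managed by Fleet either, so it wouldn't be touched; with this arrangement, only the versioned pipeline gets replaced when the integration updates, and nothing is lost.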


