Sensei on OPNsense - Application based filtering

Started by mb, August 25, 2018, 03:38:14 AM

Previous topic - Next topic
May 09, 2020, 09:21:10 PM #885 Last Edit: May 09, 2020, 09:35:13 PM by guyp2k
Disregard, I was able to address the issue.


Installed Sensei and subscribed, but I'm stuck at "waiting for database service to come up." Any suggestions? Everything I have tried has been without success.

I reinstalled elasticsearch5 without success.

Thanks

Quote from: mb on May 08, 2020, 02:45:45 AM
Quote from: packetmangler on May 04, 2020, 04:45:25 PM
EDIT: I'm doing forward and reverse lookups on the firewall for all addresses on my local network and it appears that the graphs are indeed populating with host names where IP addresses were earlier.  So now the question is how often should that run?

Hi @packetmangler,

With release 1.5, the cache time-to-live is 8 hours (it was higher with 1.4). So running it every 6 hours would replenish the cache in time.

Thanks mb.  I have my simple one-liner running every 4 hours for the time being and it seems like it's doing what it needs.

Enjoying 1.5!
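The poster's "one-liner" isn't shown in the thread, but the idea is straightforward: do a reverse lookup for every local address from cron so the firewall's resolver cache stays warm and Sensei reports show hostnames instead of IPs. Below is a minimal Python sketch of that idea; the subnet is a placeholder and the helper names are my own, not anything shipped with Sensei.

```python
# Hedged sketch: warm the firewall's DNS cache so reports show hostnames.
# LOCAL_SUBNET is an assumption -- adjust to your own LAN.
import ipaddress
import socket

LOCAL_SUBNET = "192.168.1.0/24"

def reverse_lookup(ip: str):
    """Return the PTR hostname for ip, or None if it does not resolve."""
    try:
        return socket.gethostbyaddr(ip)[0]
    except OSError:
        return None

def warm_dns_cache(subnet: str) -> dict:
    """Resolve every host address in subnet. Each query populates the
    resolver cache; with an 8-hour TTL (Sensei 1.5), a cron interval of
    4-6 hours is enough to keep entries fresh."""
    return {str(ip): reverse_lookup(str(ip))
            for ip in ipaddress.ip_network(subnet).hosts()}

# Example (run from cron, e.g. `0 */4 * * *`):
# for ip, name in warm_dns_cache(LOCAL_SUBNET).items():
#     if name:
#         print(ip, name)
```

On the firewall itself the same effect can be had with a shell loop over `host` or `drill`; the Python version is just easier to read and extend.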

Quote from: guyp2k on May 09, 2020, 09:21:10 PM
Disregard, I was able to address the issue.


Installed Sensei and subscribed, but I'm stuck at "waiting for database service to come up." Any suggestions? Everything I have tried has been without success.

I reinstalled elasticsearch5 without success.

Thanks

There should be a next button at the bottom...
In any case, try to uninstall (via web or console) and reinstall, then run the wizard again.

Policies & Filtering

Where is this in 1.5?
Is it available in the free edition?
Also, can we now upload a list instead?

Hi,

quick question here regarding the different plans.

I'm already paying for Home Premium, and I'd like to know whether features like "Stream Reporting Data to External Elasticsearch" or the upcoming "SSL proxy/inspection" are included in this tier or only in Premium (the highest, enterprise plan).

Quote from: mb on May 08, 2020, 02:59:21 AM
Sensei runs on inner-facing interfaces and determines the "remote" / "local" properties in terms of where the connection is initiated. If it comes from the LAN side, then the src IP address is considered local and the dst IP address is regarded as "remote".

So if a connection is from a local host behind network A to a host behind local network B, sensei will consider the host on local network B as "remote", since for the context of the connection, it was the "remote end".

Obviously this is creating a bit of confusion. Let us give this some thought.
Yep, that makes complete sense. I should have given it more thought before posting. I guess you could build in an option to "define local hosts", whereby we could add all addresses or subnets that we class as local and use them in the report. You couldn't just use RFC1918, as you'd end up with IPs at the far end of VPNs being considered local. Easier said than done, no doubt, but if you were to use Logstash (you don't, I know, but since you're using an Elastic backend the reference is somewhat valid), you could have all submitted "local addresses" kept in a key/value pair file and import them into an array to use in the filter.

You'd loop through that array on each update of the report, and if the source address existed in the KV pairs, add a field called "local", then use that field as a key in the report for local connections. I've given this very little thought until now, so there are probably many reasons why it wouldn't work. Overhead, for one thing... :-)
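The enrichment being described can be sketched in a few lines. This is an illustrative Python mock-up of the idea, not Sensei's actual schema or pipeline: an operator-maintained list of subnets (deliberately not bare RFC1918, so VPN-tunnelled private ranges can be left out) decides whether an endpoint is tagged "local" before the record is indexed.

```python
# Hedged sketch of the "define local hosts" idea: a user-entered subnet
# list, not RFC1918 as a whole, classifies report endpoints as local.
# Field names and the record layout are illustrative assumptions.
import ipaddress

# Assumption: the operator enters these, so RFC1918 ranges reached only
# over VPN are simply omitted from the list.
LOCAL_NETWORKS = [ipaddress.ip_network(n) for n in
                  ("10.1.0.0/16", "192.168.1.0/24")]

def is_local(ip: str) -> bool:
    """True if ip falls inside any operator-defined local network."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in LOCAL_NETWORKS)

def enrich(record: dict) -> dict:
    """Mimic the Logstash-style KV enrichment: tag the record with a
    'local' field that reports can then key on."""
    record["local"] = is_local(record["ip_dst_saddr"])
    return record

print(enrich({"ip_dst_saddr": "192.168.1.20"})["local"])  # True
print(enrich({"ip_dst_saddr": "10.99.0.5"})["local"])     # False: VPN-side range not listed
```

Per-packet subnet matching like this is cheap for a short list; the overhead concern in the post really only bites if the list grows large, at which point a radix/trie lookup would be the usual fix.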

Hi @packetmangler, hi @guyp2k, glad to hear that.

Hi @tong2x, policy-based filtering is available in all paid subscriptions.

Hi @Mitheor, you can stream logs to Elasticsearch with the Free Edition, but please note that this offloads the database role entirely to the remote system.
Starting with the SOHO subscription tier, you can have a local Elastic/Mongo database and at the same time stream reporting data to another remote database.

TLS decryption will appear on the highest plan (Premium, which will be re-named to Enterprise soon).

For the complete list of features and how they appear in free/paid subscriptions, please refer to this page:

https://sunnyvalley.io/plans

Hi @Callahan,

Thanks for the suggestion. I agree, RFC1918 alone wouldn't do the job.

Indeed, we evaluated having this on the packet engine itself, since it can already do much more complex data enrichment.

We weren't sure whether people would want to enter such a list manually. Thinking about it again, the list wouldn't be that long anyway.

Hi,

do you also see a large increase in memory consumption (~20%) since the last update?

br

Hi @Mks,

1.5 does not normally include any changes that should induce increased memory usage. Let's see if other people also experience this, and we can analyze further.

What we've observed, however, is that with one of the OPNsense 20.1.x updates, the operating system's swappiness behavior changed in a way that makes it more likely to swap even when there is a decent amount of free memory in the system. This is why we've introduced the SWAP warning threshold configuration parameter. This might be unrelated to your case, though.


Hi, I have an external Elasticsearch container (7.7.0) and it is complaining a lot about invalid UTF-8 bytes from Sensei, eg :


{"type": "server", "timestamp": "2020-05-16T00:28:33,695Z", "level": "DEBUG", "component": "o.e.a.b.TransportShardBulkAction", "cluster.name": "docker-cluster", "node.name": "da8d9957dfaf", "message": "[conn-200516][0] failed to execute bulk item (index) index {[conn_write][_doc][_9_hGnIBvp4cvgKY7pYd], source[{\"transport_proto\":\"UDP\",\"policyid\":\"0\",\"interface\":\"vtnet0\",\"vlanid\":\"0\",\"conn_uuid\":\"12a6680a-5ce0-4a7c-ae38-1a27c85ff66d\",\"src_hostname\":\"librarian.local\",\"src_username\":\"\",\"ip_src_saddr\":\"10.1.1.10\",\"ip_src_port\":65062,\"src_dir\":\"EGRESS\",\"dst_hostname\":\"81.0.84.116\",\"dst_username\":\"\",\"ip_dst_saddr\":\"81.0.84.116\",\"ip_dst_port\":57997,\"dst_dir\":\"INGRESS\",\"input\":1,\"output\":1,\"src_npackets\":1,\"src_nbytes\":0,\"src_pbytes\":104,\"dst_npackets\":2,\"dst_nbytes\":345,\"dst_pbytes\":317,\"src tcp_flags\":\"\",\"dst tcp_flags\":\"\",\"start_time\":1589588789000,\"end_time\":1589588911000,\"encryption\":\"TLS\",\"app_id\":16,\"app_proto\":\"QUIC\",\"app_name\":\"Quic UDP Connection\",\"app_category\":\"Streaming\",\"tags\":\"Encrypted,SSL,QUIC\",\"src_geoip\":{\"timezone\":\"\",\"continent_code\":\"\",\"city_name\":\"\",\"country_name\":\"\",\"country_code2\":\"\",\"country_code3\":\"\",\"dma_code\":\"0\",\"region_name\":\"\",\"region_code\":\"\",\"postal_code\":\"\",\"area\":\"0\",\"metro\":\"0\",\"asn\":\"0\",\"latitude\":0.0,\"longitude\":0.0,\"location\":{\"lat\":0.0,\"lon\":0.0}},\"dst_geoip\":{\"timezone\":\"\",\"continent_code\":\"\",\"city_name\":\"Duna�jv�ros\",\"country_name\":\"HU\",\"country_code2\":\"\",\"country_code3\":\"\",\"dma_code\":\"0\",\"region_name\":\"\",\"region_code\":\"\",\"postal_code\":\"\",\"area\":\"0\",\"metro\":\"0\",\"asn\":\"0\",\"latitude\":46.983299255371097,\"longitude\":18.933300018310548,\"location\":{\"lat\":46.983299255371097,\"lon\":18.933300018310548}}}]}", "cluster.uuid": "3zoVrbvRRfmZcZZHbXwCZw", "node.id": "5MoI-6jVTFGAfVm-XSZ4TA" ,
"stacktrace": ["org.elasticsearch.index.mapper.MapperParsingException: failed to parse field [dst_geoip.city_name] of type [text] in document with id '_9_hGnIBvp4cvgKY7pYd'. Preview of field's value: ''",
"Caused by: com.fasterxml.jackson.core.JsonParseException: Invalid UTF-8 middle byte 0x72",
" at [Source: (org.elasticsearch.common.bytes.AbstractBytesReference$MarkSupportingStreamInputWrapper); line: 1, column: 1108]",
"at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1840) ~[jackson-core-2.10.4.jar:2.10.4]",
"at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:712) ~[jackson-core-2.10.4.jar:2.10.4]",
"at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._reportInvalidOther(UTF8StreamJsonParser.java:3574) ~[jackson-core-2.10.4.jar:2.10.4]",
"at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._reportInvalidOther(UTF8StreamJsonParser.java:3581) ~[jackson-core-2.10.4.jar:2.10.4]",
"at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._decodeUtf8_3fast(UTF8StreamJsonParser.java:3386) ~[jackson-core-2.10.4.jar:2.10.4]",
"at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._finishString2(UTF8StreamJsonParser.java:2490) ~[jackson-core-2.10.4.jar:2.10.4]",
"at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._finishAndReturnString(UTF8StreamJsonParser.java:2438) ~[jackson-core-2.10.4.jar:2.10.4]",
"at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.getText(UTF8StreamJsonParser.java:294) ~[jackson-core-2.10.4.jar:2.10.4]",
"at org.elasticsearch.common.xcontent.json.JsonXContentParser.text(JsonXContentParser.java:83) ~[elasticsearch-x-content-7.7.0.jar:7.7.0]",
"at org.elasticsearch.common.xcontent.support.AbstractXContentParser.textOrNull(AbstractXContentParser.java:253) ~[elasticsearch-x-content-7.7.0.jar:7.7.0]",
"at org.elasticsearch.index.mapper.TextFieldMapper.parseCreateField(TextFieldMapper.java:823) ~[elasticsearch-7.7.0.jar:7.7.0]",
"at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:284) ~[elasticsearch-7.7.0.jar:7.7.0]",


And so on. The OPNsense install is from the DVD ISO in Proxmox 6.2; Elasticsearch is in a Docker container on an adjacent host. Any ideas?
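The mangled city name in the log ("Duna�jv�ros", with Jackson complaining about an invalid UTF-8 middle byte) is characteristic of a GeoIP string encoded in a legacy 8-bit codepage being embedded in a payload declared as UTF-8. The sketch below reproduces that failure class in Python; the exact codepage Sensei's GeoIP source used is an assumption (Latin-1 here, chosen because it produces the same replacement pattern seen in the log).

```python
# Hedged reproduction of the failure class in the stack trace: a city
# name ("Dunaújváros", Hungary) encoded as Latin-1 bytes inside a
# document that a strict JSON parser decodes as UTF-8.
raw = "Dunaújváros".encode("latin-1")   # legacy bytes, not valid UTF-8

try:
    raw.decode("utf-8")                  # what a strict parser (Jackson) does
except UnicodeDecodeError as exc:
    print("strict decode fails:", exc.reason)

# Sender-side fix option 1: replace undecodable bytes before indexing.
safe = raw.decode("utf-8", errors="replace")
print(safe)                              # Duna�jv�ros -- same pattern as the log

# Sender-side fix option 2: decode with the real source codepage if known.
correct = raw.decode("latin-1")
print(correct)                           # Dunaújváros
```

Either way the fix belongs on the producing side (which mb's 1.5.1 patch presumably addresses): once valid UTF-8 leaves the engine, Elasticsearch stops rejecting the bulk items.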

Hi @Parallax,

Thanks for the heads-up. Remote Elastic support is quite fresh. There might still be some bugs left.

Can you reach out to the team via "Report Bug" so that we can follow up?

EDIT: Spotted and fixed this. Fix is shipping with 1.5.1 scheduled for this week(end).

super stupid question:
Once I enable app control, there are a few websites I can't access anymore.
Is there a way to whitelist these as exceptions?

I'm using engine version 1.5 and App & Rules DB Version: 1.5.20200501062917
DEC750 Deciso

Quote from: nikkon on May 18, 2020, 12:26:28 PM
super stupid question:
Once I enable app control, there are a few websites I can't access anymore.
Is there a way to whitelist these as exceptions?

I'm using engine version 1.5 and App & Rules DB Version: 1.5.20200501062917


Reports -> Blocks -> Live Blocked Session Explorer -> find Session -> Click green ✅ and allow host

Quote from: nikkon on May 18, 2020, 12:26:28 PM
super stupid question:
Once I enable app control, there are a few websites I can't access anymore.
Is there a way to whitelist these as exceptions?

I'm using engine version 1.5 and App & Rules DB Version: 1.5.20200501062917

It's either the way binaryanomaly described, or via Policies / Web Control / Whitelist.