Hello everyone,
I'm running 20.1.5 and Maltrail is using a lot of memory.
It looks like it is the sensor.
root@OPNsense:~ # ps aux | grep malt
root 92801 4.0 12.2 2042480 2010324 - S 19:06 55:43.65 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 60547 3.8 12.2 2042480 2010376 - S 19:05 51:04.24 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 63032 2.1 12.2 2044660 2011212 - S 19:07 45:40.06 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 16601 1.6 12.2 2043252 2011920 - S 19:07 19:29.16 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 26204 1.6 12.2 2043252 2011876 - S 19:07 19:30.77 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 52144 1.6 12.2 2043252 2011920 - S 19:07 19:27.29 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 57799 1.6 12.2 2043252 2011912 - S 19:07 19:31.01 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 71282 1.6 12.2 2042996 2011316 - S 19:07 19:27.99 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 37702 1.5 12.2 2043252 2011896 - S 19:07 19:32.95 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 84110 1.5 12.2 2043252 2011876 - S 19:07 19:28.63 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 2856 0.6 8.7 2040048 1440720 - S 18:13 12:49.65 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 18245 0.0 12.2 2040048 2009608 - I 19:07 0:00.06 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 18402 0.0 12.2 2042484 2010140 - I 19:06 0:00.89 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 31085 0.0 12.2 2040048 2009604 - I 19:07 0:00.06 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 38651 0.0 12.2 2042484 2010144 - I 19:06 0:00.91 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 38910 0.0 12.2 2040048 2009820 - I 19:06 0:00.74 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 43528 0.0 12.2 2042484 2010144 - I 19:06 0:00.89 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 45590 0.0 0.2 47104 28312 - S 18:12 0:06.38 python3 /usr/local/share/maltrail/server.py (python3.7)
root 59845 0.0 12.2 2040048 2009604 - I 19:07 0:00.06 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 68733 0.0 12.2 2042484 2010144 - I 19:06 0:00.90 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 72123 0.0 12.2 2040048 2009600 - I 19:07 0:00.06 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 73946 0.0 12.2 2040048 2009608 - I 19:07 0:00.06 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 77175 0.0 12.2 2040048 2009604 - I 19:07 0:00.06 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 78432 0.0 12.2 2040048 2009612 - I 19:07 0:00.06 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 86148 0.0 12.2 2042484 2010144 - I 19:06 0:00.90 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 90587 0.0 12.2 2042228 2009920 - I 19:06 0:00.89 python3 /usr/local/share/maltrail/sensor.py (python3.7)
root 7953 0.0 0.0 1058012 2424 0 S+ 09:29 0:00.00 grep malt
root@OPNsense:~ #
It is set to capture all, but changing that does not make a difference.
I am now having the same issue. The sensor is using up over 4 GB. There must be a memory leak, no?
I cannot find any issues posted on GitHub for Maltrail: https://github.com/stamparm/MalTrail/issues
EDIT: There is a discussion about a memory leak with Python 3, including a patch: https://github.com/stamparm/maltrail/issues/162 (towards the bottom). But the problem seems to be further upstream in a library. Does the plugin not have the patch?
EDIT2: Just saw today that there is an update for Maltrail to 0.22. It takes some time for the memory leak to manifest itself, so I won't know for a while whether the patch is included.
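In case anyone else wants to verify which version is actually installed, something like the following should work from the shell; this assumes the plugin package is named os-maltrail and that Maltrail keeps its version string in core/settings.py, which may differ on your install:
pkg info -x maltrail                                               # installed plugin/package version
grep -m1 '^VERSION' /usr/local/share/maltrail/core/settings.py     # bundled Maltrail version (assumed location)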
Quote from: ascii on April 28, 2020, 09:43:31 AM
Hello everyone,
I'm running 20.1.5 and Maltrail is using a lot of memory.
It looks like it is the sensor.
[...]
It is set to capture all, but changing that does not make a difference.
Too many processes. When you update to 20.1.9 you'll get a fresh version, and the reboot will kill the zombies.
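If you don't want to wait for the upgrade and reboot, the leftover sensors can probably be cleaned up by hand along these lines; the rc.d service name maltrail below is an assumption, so check what the plugin actually installs:
# kill every stale sensor process, then start a single fresh one
pkill -f 'maltrail/sensor.py'
sleep 2
service maltrail start    # assumed service name, may differ on your install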
Quote from: jds on July 24, 2020, 04:49:00 PM
I am now having the same issue. The sensor is using up over 4 GB. There must be a memory leak, no?
I cannot find any issues posted on GitHub for Maltrail: https://github.com/stamparm/MalTrail/issues
EDIT: There is a discussion about a memory leak with Python 3, including a patch: https://github.com/stamparm/maltrail/issues/162 (towards the bottom). But the problem seems to be further upstream in a library. Does the plugin not have the patch?
EDIT2: Just saw today that there is an update for Maltrail to 0.22. It takes some time for the memory leak to manifest itself, so I won't know for a while whether the patch is included.
0.22 already shipped with the latest version, should be OK.
Dear all,
I can confirm there is a big memory issue with Maltrail.
Just one week ago I increased the swap space of my OPNsense system to almost 6 GB, and my RAM is 4 GB; that means the system has about 10 GB of memory available.
But Maltrail eats it all: every day it needs more than 1 GB of additional memory, and in the end I had to reboot my machine twice a week!
Now, after disabling Maltrail, swap usage has been zero for days and memory consumption is only 24%.
So there must be a big memory bug in this software.
I was having the same issue; I am now on 21.7.1 and the issue still happens.
It has been stable for the past couple of hours after I did the following:
1. Set a maximum capture buffer size value in the admin UI for the sensor; by default it takes 10% of available memory.
2. Cloned the latest Maltrail from upstream and am using that, keeping my config file (roughly as sketched below).
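For step 2 the rough idea was something like the following; the paths and the backup location are assumptions on my side, so check where your plugin actually keeps its files (and git may need to be installed first with pkg install git):
# back up the existing config, pull current upstream Maltrail over the shipped tree, restore the config
cp /usr/local/share/maltrail/maltrail.conf /root/maltrail.conf.bak
git clone --depth 1 https://github.com/stamparm/maltrail.git /tmp/maltrail-upstream
cp -R /tmp/maltrail-upstream/ /usr/local/share/maltrail/              # trailing slash: copy contents over the shipped tree
cp /root/maltrail.conf.bak /usr/local/share/maltrail/maltrail.conf    # restore the saved config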
It's not so surprising.
If you look at the highlighted area, you'll notice it is set for RAM usage.
You can leave it blank and it will use 10% of your RAM. If your system has little RAM to go around, that might be a problem, but if you have more than enough, it's not such a big deal.
The value is given in MB, so set it accordingly, between 10 and 1000 MB, or leave it blank.
Also, every time you make changes to Maltrail in its general, sensor and server sections, you need to restart the Maltrail server service. There's no button for it other than the ones on the main page or under System --> Diagnostics --> Services.
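From the shell you can check the corresponding setting and bounce the services without the GUI, roughly like this; CAPTURE_BUFFER is the option name in stock Maltrail's maltrail.conf, and the two rc.d service names are assumptions, so verify them on your box:
# show the current capture buffer setting in the Maltrail config
grep -i 'CAPTURE_BUFFER' /usr/local/share/maltrail/maltrail.conf
# restart sensor and server from the shell (assumed service names)
service maltrail restart
service maltrailserver restart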
I've started getting this over the past few days on 21.7.1; nothing much has changed, yet suddenly I'm getting alerted to excessive memory usage.
I've now limited it to 500 MB to see how it goes, but for me it's not the server itself but the sensor, and annoyingly that does not get restarted when restarting the server. You have to go into the sensor settings and either make a change, e.g. the memory value, and click save, or disable/enable it and again click save.
Quote from: Taomyn on August 17, 2021, 09:37:06 PM
I've started getting this over the past few days on 21.7.1; nothing much has changed, yet suddenly I'm getting alerted to excessive memory usage.
I've now limited it to 500 MB to see how it goes, but for me it's not the server itself but the sensor, and annoyingly that does not get restarted when restarting the server. You have to go into the sensor settings and either make a change, e.g. the memory value, and click save, or disable/enable it and again click save.
I seem to have had the same problem with Maltrail for a while now; how do you limit it to 500 MB?
Quote from: sp33dy on August 27, 2021, 08:48:49 AM
I seem to have had the same problem with Maltrail for a while now; how do you limit it to 500 MB?
It's a setting for the sensor, but it made no difference, so I removed Maltrail completely and cannot show you.
Quote from: Taomyn on August 27, 2021, 10:16:46 AM
Quote from: sp33dy on August 27, 2021, 08:48:49 AM
I seem to have had the same problem with Maltrail for a while now; how do you limit it to 500 MB?
It's a setting for the sensor, but it made no difference, so I removed Maltrail completely and cannot show you.
Oh, I see it now. You are referring to "Capture Buffer Size", I assume.
Yeah, likewise here, it just keeps eating up my 16 GB; I need to disable it for now.
I have investigated the issue with the "endlessly spawning" sensor processes and came up with a workaround.
It is documented at the end of my German blog post – feel free to share it:
https://andersgood.de/blog/oeffentliche-ip-adressen-in-opnsense-mit-maltrail-absichern
The involved Monit scripts can be found here:
- https://codeberg.org/SWEETGOOD/shell-scripts/src/branch/main/OPNsense/check-maltrail-sensor-processes.sh
- https://codeberg.org/SWEETGOOD/shell-scripts/src/branch/main/OPNsense/kill-maltrail-sensors-and-restart.sh
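For anyone who does not read German: the gist is a Monit check that counts sensor processes and, once too many have piled up, kills them all and restarts the service. A minimal sketch of that idea, combining both scripts into one (not the actual code, and the service name is an assumption):
#!/bin/sh
# exit 0 while the number of sensor processes is sane, otherwise clean up and restart
MAX=4
COUNT=$(pgrep -f 'maltrail/sensor.py' | wc -l | tr -d ' ')
[ "$COUNT" -le "$MAX" ] && exit 0
pkill -f 'maltrail/sensor.py'     # wipe all leftover sensor processes
sleep 2
service maltrail start            # assumed rc.d service name
exit 1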
So there is a bug in the rc script that should correctly stop the process. I'll have a look at it. Thanks for the research.