Summary
Mattermost Team Edition v7.8.0 running on Docker is killed daily by the OS OOM killer. It looks like a memory leak, but I don't have the experience to investigate or prove this.
Steps to reproduce
- Unsure how to reproduce this.
- I am running Mattermost Team Edition v7.8.0 with Postgres via Docker on a Linux box. The machine is a home server in my garage, hosting a small number of web applications through Docker.
- I can provide the docker-compose files on request; a simplified sketch of the setup follows this list.
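For context, here is a simplified sketch of the kind of compose setup in use. It is illustrative only: service names, the Postgres tag, credentials, and volume paths are assumptions rather than my exact files; only the Mattermost image and version match the report.

```yaml
version: "3.7"
services:
  postgres:
    image: postgres:13-alpine            # assumed tag
    restart: unless-stopped
    environment:
      POSTGRES_USER: mmuser
      POSTGRES_PASSWORD: example         # placeholder
      POSTGRES_DB: mattermost
    volumes:
      - ./volumes/db:/var/lib/postgresql/data

  mattermost:
    image: mattermost/mattermost-team-edition:7.8.0
    restart: unless-stopped
    depends_on:
      - postgres
    environment:
      MM_SQLSETTINGS_DRIVERNAME: postgres
      MM_SQLSETTINGS_DATASOURCE: postgres://mmuser:example@postgres:5432/mattermost?sslmode=disable
    ports:
      - "8065:8065"
    volumes:
      - ./volumes/app/data:/mattermost/data
```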
Expected behavior
No memory leaks: memory usage stays bounded and the OS never has to kill the Mattermost process.
Observed behavior
- Installation is running fine from a user perspective.
- Every now and then, the entire server becomes completely unresponsive: web requests time out, and even running SSH sessions are disconnected.
- After 15-30 minutes, the system becomes responsive again.
- `dmesg -T | grep ill` reveals that a Mattermost process was killed by the OS due to out of memory.
- `top` reveals that the CPU load is normally below 0.5 but peaks at over 170 (!!) while the memory issue is present.
- This kill situation happens at least daily, even when no users are active on the Mattermost installation. (A monitoring sketch I intend to run is shown below, before the example output.)
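Since I haven't yet captured the memory growth leading up to a kill, here is a minimal sketch of what I plan to run, assuming `docker stats` is available on the host; the log path and interval are arbitrary choices:

```sh
#!/bin/sh
# Append a timestamped per-container memory/CPU snapshot once a minute,
# so the growth curve leading up to an OOM kill gets recorded.
while true; do
    {
        date -Iseconds
        docker stats --no-stream --format '{{.Name}} {{.MemUsage}} {{.CPUPerc}}'
    } >> /var/log/docker-mem.log
    sleep 60
done
```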
Example output of `dmesg -T | grep ill`:
```
[Sa Mär 4 19:35:27 2023] [ 3858] 1000 3858 113617 244 106496 6 0 gsd-rfkill
[Sa Mär 4 19:35:27 2023] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/docker/66ecc2904f86807abf8e5a3e5fb9310028a03c19d09f6b8973eb2c6801c943f7,task=mattermost,pid=1664039,uid=2000
[Sa Mär 4 19:35:27 2023] Out of memory: Killed process 1664039 (mattermost) total-vm:119171600kB, anon-rss:4219024kB, file-rss:0kB, shmem-rss:0kB, UID:2000 pgtables:9300kB oom_score_adj:0
[So Mär 5 10:05:14 2023] nxnode.bin invoked oom-killer: gfp_mask=0x1100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
[So Mär 5 10:05:14 2023] oom_kill_process.cold+0xb/0x10
[So Mär 5 10:05:14 2023] [ 3858] 1000 3858 113617 185 106496 65 0 gsd-rfkill
[So Mär 5 10:05:14 2023] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/docker/6ceafe411e36e8c1ceb396ae0fe9e0772399b4d981f0c3b9f108c4b637300d6f,task=bundle,pid=2706941,uid=998
[So Mär 5 10:05:14 2023] Out of memory: Killed process 2706941 (bundle) total-vm:1425464kB, anon-rss:756816kB, file-rss:0kB, shmem-rss:204kB, UID:998 pgtables:2444kB oom_score_adj:0
[So Mär 5 14:42:06 2023] apport invoked oom-killer: gfp_mask=0x1100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
[So Mär 5 14:42:06 2023] oom_kill_process.cold+0xb/0x10
[So Mär 5 14:42:06 2023] [ 3858] 1000 3858 113617 185 106496 65 0 gsd-rfkill
[So Mär 5 14:42:06 2023] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/docker/6ceafe411e36e8c1ceb396ae0fe9e0772399b4d981f0c3b9f108c4b637300d6f,task=bundle,pid=3491975,uid=998
[So Mär 5 14:42:06 2023] Out of memory: Killed process 3491975 (bundle) total-vm:1716760kB, anon-rss:931788kB, file-rss:0kB, shmem-rss:1976kB, UID:998 pgtables:3164kB oom_score_adj:0
[So Mär 5 14:42:36 2023] gmain invoked oom-killer: gfp_mask=0x1100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
[So Mär 5 14:42:36 2023] oom_kill_process.cold+0xb/0x10
[So Mär 5 14:42:36 2023] [ 3858] 1000 3858 113617 185 106496 65 0 gsd-rfkill
[So Mär 5 14:42:36 2023] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/docker/66ecc2904f86807abf8e5a3e5fb9310028a03c19d09f6b8973eb2c6801c943f7,task=mattermost,pid=2051211,uid=2000
[So Mär 5 14:42:36 2023] Out of memory: Killed process 2051211 (mattermost) total-vm:142924812kB, anon-rss:4533848kB, file-rss:0kB, shmem-rss:0kB, UID:2000 pgtables:10168kB oom_score_adj:0
```
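The anon-rss values above sit around 4 GB while total-vm exceeds 100 GB, so the container apparently grows unbounded until the host starves. As a containment measure (not a fix), I am considering capping the container's memory so the kernel OOM-kills only Mattermost inside its own cgroup instead of stalling the whole machine for 15-30 minutes. The container name `mattermost` and the 4 GiB value below are assumptions; the real name would come from `docker ps`:

```sh
# Hard-cap the running container (placeholder name and size); with a
# cgroup limit in place, the OOM killer acts inside the container
# rather than against arbitrary host processes.
docker update --memory 4g --memory-swap 4g mattermost
```

With a `restart: unless-stopped` policy in the compose file, Mattermost would then come back automatically after such a kill, and the rest of the host should stay responsive.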