Large File import from Slack fails with i/o timeout

Summary
Attempting to import a large (1.1TB) export from Slack fails with the following error: Error: failed to upload data: Post "http://localhost:8065/api/v4/uploads/ptmt4qbx17nepgsdo4h4bn4qqo": write tcp 127.0.0.1:52408->127.0.0.1:8065: use of closed network connection

Steps to reproduce
Create two Ubuntu Server 20.04 VMs on ESXi
Install Mattermost Server 7.8 from the tarball on the app server
Install PostgreSQL 15 on the DB server
Note: I didn’t install nginx on the app server
Export data from Slack according to the documentation
Use mmctl to upload the file for import
Receive the error message posted above
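For reference, the upload step above was along these lines (the path is illustrative):

```shell
# Upload the Slack export through the Mattermost API; mmctl streams the
# file to the /api/v4/uploads endpoint shown in the error above
mmctl import upload /path/to/export-with-emails-and-attachments.zip
```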

Expected behavior
Import job is created successfully and the data can be imported into Mattermost

Observed behavior
The process runs for about 5-6 minutes according to the timestamps I see, then it fails.
Terminal output:

Error: failed to upload data: Post "http://localhost:8065/api/v4/uploads/ptmt4qbx17nepgsdo4h4bn4qqo": write tcp 127.0.0.1:52408->127.0.0.1:8065: use of closed network connection

Log file:

Mar 21 03:11:32 appserver mattermost[962]: {"timestamp":"2023-03-21 03:11:32.848 Z","level":"error","msg":"Unable to write the file.","caller":"web/context.go:117","path":"/api/v4/uploads/ptmt4qbx17nepgsdo4h4bn4qqo","request_id":"81k1qe84rt8i7q6acfbzq6mf9h","ip_addr":"127.0.0.1","user_id":"4jdanbgb6tnz7gjt4da3muj4jy","method":"POST","err_where":"WriteFile","http_code":500,"error":"WriteFile: Unable to write the file., unable write the data in the file data/import/ptmt4qbx17nepgsdo4h4bn4qqo_export-with-emails-and-attachments.zip.tmp: read tcp 127.0.0.1:8065->127.0.0.1:52408: i/o timeout"}
Mar 21 03:17:24 appserver mattermost[962]: {"timestamp":"2023-03-21 03:17:24.309 Z","level":"info","msg":"SimpleWorker: Job is complete","caller":"jobs/base_workers.go:96","worker":"ExpiryNotify","job_id":"3qj31odufpn7xdg6e3rh1odrhw"}

I assume this is related to a timeout caused by the large file size. The server has enough storage available to hold 2.5 copies of the import file, so I’m hoping it’s not a disk-space issue.
If there is a way to increase the timeout of the Mattermost listener, that might resolve my problem, but this is my first Mattermost deployment so I am woefully inexperienced.

Would anyone be kind enough to point me in the correct direction?

Hi @enryoku ,

Not sure which timeout you hit exactly here, but you can work around it by placing the file manually in the location where it belongs; you don’t need to go through the webserver to upload it.
I assume that uploading small files works? If so, you can find them in your Mattermost data directory, which is by default ./data relative to your Mattermost installation (by default /opt/mattermost), so the import files are stored in /opt/mattermost/data/import. You do not need to manually add the generated ID to the file name; it only ensures that uploaded files are unique if you upload the same file multiple times. The only thing you need to take care of is that the file is writable by the mattermost application user after you copy or move it there:

cp /path/to/source.zip /opt/mattermost/data/import/source.zip   # or mv
chown mattermost: /opt/mattermost/data/import/source.zip

You should then see this file in the output of mmctl import list available and should be able to proceed with the import process itself.
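Once the file is in place, the mmctl side of the workflow looks roughly like this (the import name is hypothetical; mmctl must already be authenticated against your server):

```shell
# Confirm the server now sees the manually placed archive
mmctl import list available

# Start the import job for that archive and watch its progress
mmctl import process source.zip
mmctl import job list
```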


Hello @agriesser ,
Thanks for the speedy reply!

I was able to import a 36GB file without issue.

Thanks for the tip! I did not know that at all.

I moved the file and chown’d it, and now I’m getting a different error, which might be unrelated:

  ID: naj1j9adkbg6pq63isk66n5jdo
  Status: error
  Created: 2023-03-21 04:34:32 +0000 UTC
  Started: 2023-03-21 04:34:39 +0000 UTC
  Data: map[error:Error during job execution. — ImportProcessWorker: Unable to process import: JSONL file is missing., jsonFile was nil import_file:export-with-emails-and-attachments.zip]

This makes me wish I could break down the export from slack into smaller date ranges so it would be easier to manage.

Can you show the file structure of the zip file (unzip -l <filename>)? I need to make sure it contains the correct folder structure.
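One way to keep the output short (file name taken from the earlier posts):

```shell
# A converted bulk-import archive contains a top-level .jsonl entry;
# a raw Slack export does not. Filter for that plus the uploads folder:
unzip -l export-with-emails-and-attachments.zip | grep -E '\.jsonl|__uploads/' | head
```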


The output is rather long. Is there a specific pattern I could filter for?
It looks something like this:

Archive:  export-with-emails-and-attachments.zip
  Length      Date    Time    Name
---------  ---------- -----   ----
        0  1980-00-00 00:00   D03AGEVMJE8/
     4009  1980-00-00 00:00   D03AGEVMJE8/2022-06-16.json
     6164  1980-00-00 00:00   D03AGEVMJE8/2022-07-11.json
     4422  1980-00-00 00:00   D03AGEVMJE8/2022-07-13.json
     3622  1980-00-00 00:00   D03AGEVMJE8/2022-04-06.json
     4623  1980-00-00 00:00   D03AGEVMJE8/2022-07-12.json
        0  1980-00-00 00:00   D04K1HVALFR/
     8934  1980-00-00 00:00   D04K1HVALFR/2023-01-18.json
  1692899  1980-00-00 00:00   __uploads/F04KJJ3HE12/20230118_085712.jpg
        0  1980-00-00 00:00   D02TCTE05C3/
--- snip ---
        0  1980-00-00 00:00   D03C96M6NBU/
    12056  1980-00-00 00:00   D03C96M6NBU/2022-04-20.json
        0  1980-00-00 00:00   D035LC76TF1/
     5081  1980-00-00 00:00   D035LC76TF1/2022-03-07.json
   859217  1980-00-00 00:00   __uploads/F036DJ5BFPT/Image from iOS.jpg
    11875  1980-00-00 00:00   D035LC76TF1/2022-05-16.json
     5763  1980-00-00 00:00   D035LC76TF1/2022-05-11.json
---------                     -------
1280566870760                     1751092 files

Again, thanks for your response.

I see what I did wrong, @agriesser. I didn’t finish the prep stage, which generates the jsonl file and adds it to the import; I skipped straight to the end. I’m going to finish that process and hope for the best.

I only made it to step 3 here.

I’ll leave this open in case something goes wrong during this process, but I’m feeling slightly more optimistic now that I understand what I did wrong.

Thanks again for your assistance!

Yep, that’s what I expected, which is why I wanted to look at the file contents. After the conversion by the bulk import tool, your zip file should only contain one jsonl file and the data directory, if I’m not mistaken.
Let me know how it goes.


Thanks, it’s still going. I hope it completes successfully.

Well, it finished, but it seems I have another error after attempting an import:

  ID: a7cgr3qz3t86idqancuuwm64xh
  Status: error
  Created: 2023-03-24 04:03:04 +0000 UTC
  Started: 2023-03-24 04:03:12 +0000 UTC
  Data: map[error:Error during job execution. — BulkImport: Channel type is invalid. import_file:mattermost-bulk-import.zip line_number:1621]

I’m going to try breaking it down into smaller pieces so it’s easier to troubleshoot.

Hi again,

what’s the channel type on this line of your jsonl? Can you open it, post the line here, and maybe also compare it with another channel that was created earlier (somewhere before line 1621)?


@enryoku - There is an mmctl import validate command you can run to validate and fix any errors in your import file. That might be more helpful in your case than uploading and fixing one error at a time.
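Usage is along these lines (file name taken from the job output above):

```shell
# Validate the bulk-import archive locally; errors are reported per
# jsonl line, so everything can be fixed before another upload attempt
mmctl import validate mattermost-bulk-import.zip
```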


Thanks for the suggestion!

I deleted the large file and am now importing smaller files based on yearly exports. I will see if the issue arises again and attempt the suggested comparison at that time.

So after creating year-by-year exports instead of a single large export, I was able to make some progress. I am now troubleshooting an import I attempted a few days ago; it’s only about 268GB.

[
  {
    "id": "qiyu1dozofbs8kcod8zcnse5hy",
    "type": "import_process",
    "priority": 0,
    "create_at": 1679782648807,
    "start_at": 1679782649509,
    "last_activity_at": 1679782649509,
    "status": "in_progress",
    "progress": 0,
    "data": {
      "import_file": "bulk_import_01022021_01012022.zip"
    }
  }
]

After a bit of research, it looks like the job has most likely crashed.

I did run “mmctl import validate” against the zip file, which returned a very long list of errors.
Many of them are similar to the following:

import validation error in bulk_import_01022021_01012022.zip->mattermost_import.jsonl:63597 field "post": BulkImport: app.import.validate_direct_post_import_data.message_length.error

Looking at that line of the jsonl file, I can see the post in question is about 11,444 characters long, which is not an uncommon length for a post here. Is this expected behavior when importing data into Mattermost? I can’t find a config option that defines the message_length value it’s complaining about.

Any suggestions on how to proceed?

Edit: I did attempt, and succeed, to create posts in a new channel in the existing Mattermost environment that duplicated the content length the import validate tool was complaining about.

The max message length is the length of the message field in the posts table.

SELECT
    COALESCE(character_maximum_length, 0)
FROM
    information_schema.columns
WHERE
    table_name = 'posts'
    AND column_name = 'message';

By default, it’s 65535. So 11,444 should be fine. Note that we check utf8.RuneCountInString, so depending on unicode characters, the length can be greater.

But the length validation code in the importer and while posting a message is exactly the same. If a message can be posted in MM, then the same message should be able to be imported.

If you can create a dummy post for me to test, I can look into it.


This would be an example from the message field:

```BO_ 1615 TSTTestStatus                          : 8 TST\n MP_ TSTTestSeverity                            : 0|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ MBMSTestSeverity                           : 3|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ TestHiAppDatSpd                            : 6|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxApplejacksData                      : 7|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiDuckTagsr                            : 8|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxDataoverVnc                         : 9|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxCurroverVnc                         : 10|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxEDataOilPump                        : 11|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxABABData                            : 12|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxABABNonData                         : 13|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxAADataShrtTermVnc                   : 14|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxDuckTest                           : 15|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxAADataLeak                          : 16|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxAADataData                          : 17|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxESCM                                : 18|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxINVShrtAbaVnc                       : 19|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxINVPeakAbaVnc                       : 20|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxINVData                             : 21|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxINVOp                               : 22|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiBase                                 : 23|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiAAIL                                 : 24|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiPTData                               : 25|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ 
TestHiEstop                                : 26|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiABABEnFlt                            : 27|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiABABData                             : 28|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiABABNonData                          : 29|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiDCFCFlt                              : 30|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiDataState                            : 31|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiTmmData                              : 32|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiDataContVnc                          : 33|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiTmmEData                             : 34|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiINVLngTermVnc                        : 35|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestHiINVState                             : 36|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ Data08TestStat                             : 37|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ Data07TestStat                             : 40|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ Data06TestStat                             : 43|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ Data05TestStat                             : 46|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ Data04TestStat                             : 49|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ Data03TestStat                             : 52|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ Data02TestStat                             : 55|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ Data01TestStat                             : 58|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ Data00TestStat                             : 61|3@1+ (1,0) [0|7] \"\" Access__NNN\n\nBO_ 1616 TSTTestHi                              : 8 TST\n MP_ TSTAppTest                                 : 0|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestDatSpd                                 : 1|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ 
TestINV1State                              : 2|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV2State                              : 3|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV1LongAbaVnc                         : 4|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV2LongAbaVnc                         : 5|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ FaulltTMMApplejacksOp                       : 6|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestTMMApplejacksVnc                       : 7|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV1Data                               : 8|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV2Data                               : 9|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestTMMDataOp                              : 10|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestTMMDataVnc                             : 11|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr1MaxData                      : 12|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr2MaxData                      : 13|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr3MaxData                      : 14|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr1MinData                      : 15|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr2MinData                      : 16|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr3MinData                      : 17|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB1Op                                : 18|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB1Duck                              : 19|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB1Internal                          : 20|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB2Op                                : 21|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB2Duck                              : 22|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB2Internal                          : 23|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB3Op                                : 24|1@1+ (1,0) [0|1] \"\" 
Access__NNN\n MP_ TestABAB3Duck                              : 25|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB3Internal                          : 26|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB1Data                              : 27|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB2Data                              : 28|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestABAB3Data                              : 29|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestDataStrMaxData                         : 30|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestDataStrMinData                         : 31|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINVMaxData                             : 32|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestMtrMaxData                             : 33|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseINV1LinkData                       : 34|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseINV2LinkData                       : 35|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseData0Iso                           : 36|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseData1Iso                           : 37|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseData2Iso                           : 38|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseData3Iso                           : 39|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseData4Iso                           : 40|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseData5Iso                           : 41|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseData6Iso                           : 42|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseData7Iso                           : 43|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseData8Iso                           : 44|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ LVDataTagsLvl                               : 45|3@1+ (1,0) [0|7] \"\" Access__NNN\n MP_ TestBaseABAB1LinkData                      : 48|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseABAB2LinkData                      : 
49|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseABAB3LinkData                      : 50|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestBaseAADataIsoResLow                    : 51|1@1+ (1,0) [0|1] \"\" Access__NNN\n\nBO_ 1617 TSTTestQtx                             : 8 TST\n MP_ TestINV1Op                                 : 0|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV2Op                                 : 1|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV1PeakAbaVnc                         : 2|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV2PeakAbaVnc                         : 3|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV1ShrtAbaVnc                         : 4|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV2ShrtAbaVnc                         : 5|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestESCMOp                                 : 6|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestESCMData                               : 7|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAirProcSys                             : 8|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestESCMState                              : 9|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr1MaxData                      : 10|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr2MaxData                      : 11|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr3MaxData                      : 12|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr1MinData                      : 13|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr2MinData                      : 14|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr3MinData                      : 15|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr1LeakStat                     : 16|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr2LeakStat                     : 17|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataStr3LeakStat                     : 18|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestINV1Duck                               : 19|1@1+ (1,0) 
[0|1] \"\" Access__NNN\n MP_ TestINV2Duck                               : 20|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataDuck                             : 21|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestTMMDuck                                : 22|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestESCMDuck                               : 23|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestAADataShrtVnc                          : 24|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestEOPStErr1                              : 25|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestEOPStErr2                              : 26|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestEOPOilOverData                         : 27|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxDataStrMaxData                      : 28|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxDataStrMinData                      : 29|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxINVMaxData                          : 30|1@1+ (1,0) [0|1] \"\" Access__NNN\n MP_ TestQtxMtrMaxData                          : 31|1@1+ (1,0) [0|1] \"\" Access__NNN```

Any further insight you might be able to provide would be appreciated.

I did retry the import job and noticed the server crashed after importing 1,422 users (there are more users left to import), even though the currently running instance reports 2,292 users (all of whom are included in each export). After all of these imports, I am starting to wonder about the integrity of the install itself.

I also see an update has been released since I stood up the install, and I am now wondering if it would be wise to try the latest release.

So I tested dropping all users from the import, since in theory they should already be there from the previous import, and ran it again. This time I did get a different error, but I’m not sure what it’s referencing:

[
  {
    "id": "9zz3zaakgt8ibd3cbor7jhmnby",
    "type": "import_process",
    "priority": 0,
    "create_at": 1679965315731,
    "start_at": 1679965329711,
    "last_activity_at": 1679965385335,
    "status": "error",
    "progress": -1,
    "data": {
      "error": "Error during job execution. — BulkImport: Failed to create direct channel, createDirectChannelWithUser: Unable to save direct channel., default_channel_roles_select: pq: remaining connection slots are reserved for non-replication superuser connections",
      "import_file": "bulk_import_01022021_01012022.zip",
      "line_number": "18916"
    }
  }
]

The referenced line looks almost identical to the previous channel, which I assume was imported successfully; it just references different users. I might be hitting the same error I found with the monolithic import.

The latest error message means that the connections to your PostgreSQL database have been exhausted. I’m not sure what’s causing that, but you might be hitting limits there while the import is running. Please check your Mattermost server’s config.json file for the SqlSettings.Max*Conns settings and make sure they’re lower than the max_connections setting of your PostgreSQL instance (either lower them in config.json or increase max_connections in your postgresql.conf).
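A quick way to compare the two sides (paths assume a default install; run the second command on the DB server):

```shell
# Mattermost side: connection pool limits in config.json
jq '.SqlSettings | {MaxOpenConns, MaxIdleConns}' /opt/mattermost/config/config.json

# PostgreSQL side: the server-wide connection cap
sudo -u postgres psql -c 'SHOW max_connections;'
```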


When you say “server crashed”, is that a panic or an OOM or something else? Can you show us the logs of the crash?


I was only looking at the journalctl -u mattermost output and failed to see the oom-killer, but here is the output. I am going to look into keeping the oom-killer away from this process, since the server should have plenty of RAM (128GB).

Mar 27 18:20:59 appserver mattermost[16300]: {"timestamp":"2023-03-27 18:20:59.390 Z","level":"info","msg":"Importing user","caller":"app/import_functions.go:342","user_name":"first.last"}
Mar 27 18:21:04 appserver kernel: mattermost invoked oom-killer: gfp_mask=0x100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
Mar 27 18:21:04 appserver kernel: CPU: 25 PID: 16471 Comm: mattermost Not tainted 5.4.0-144-generic #161-Ubuntu
Mar 27 18:21:04 appserver kernel: Hardware name: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 11/12/2020
Mar 27 18:21:04 appserver kernel: Call Trace:
Mar 27 18:21:04 appserver kernel:  dump_stack+0x6d/0x8b
Mar 27 18:21:04 appserver kernel:  dump_header+0x4f/0x1eb
Mar 27 18:21:04 appserver kernel:  oom_kill_process.cold+0xb/0x10
Mar 27 18:21:04 appserver kernel:  out_of_memory+0x1cf/0x500
Mar 27 18:21:04 appserver kernel:  __alloc_pages_slowpath+0xdde/0xeb0
Mar 27 18:21:04 appserver kernel:  __alloc_pages_nodemask+0x2d0/0x320
Mar 27 18:21:04 appserver kernel:  alloc_pages_current+0x87/0xe0
Mar 27 18:21:04 appserver kernel:  __page_cache_alloc+0x72/0x90
Mar 27 18:21:04 appserver kernel:  pagecache_get_page+0xbf/0x300
Mar 27 18:21:04 appserver kernel:  filemap_fault+0x6b2/0xa50
Mar 27 18:21:04 appserver kernel:  ? unlock_page_memcg+0x12/0x20
Mar 27 18:21:04 appserver kernel:  ? page_add_file_rmap+0xff/0x1a0
Mar 27 18:21:04 appserver kernel:  ? xas_load+0xd/0x80
Mar 27 18:21:04 appserver kernel:  ? xas_find+0x17f/0x1c0
Mar 27 18:21:04 appserver kernel:  ? filemap_map_pages+0x24c/0x380
Mar 27 18:21:04 appserver kernel:  ext4_filemap_fault+0x32/0x50
Mar 27 18:21:04 appserver kernel:  __do_fault+0x3c/0x170
Mar 27 18:21:04 appserver kernel:  do_fault+0x24b/0x640
Mar 27 18:21:04 appserver kernel:  ? copy_fpstate_to_sigframe+0x2e2/0x370
Mar 27 18:21:04 appserver kernel:  __handle_mm_fault+0x4c5/0x7a0
Mar 27 18:21:04 appserver kernel:  handle_mm_fault+0xca/0x200
Mar 27 18:21:04 appserver kernel:  do_user_addr_fault+0x1f9/0x450
Mar 27 18:21:04 appserver kernel:  __do_page_fault+0x58/0x90
Mar 27 18:21:04 appserver kernel:  do_page_fault+0x2c/0xe0
Mar 27 18:21:04 appserver kernel:  page_fault+0x34/0x40
Mar 27 18:21:04 appserver kernel: RIP: 0033:0x458208
Mar 27 18:21:04 appserver kernel: Code: 48 01 f2 48 2b 91 a0 00 00 00 48 89 d6 48 c1 ea 0c 48 8d 14 92 48 c1 e2 02 48 03 91 98 00 00 00 81 e6 ff 0f 00 00 48 c1 ee 08 <8b> 3a 48 83 fe 10 73 75 0f b6 54 32 04 01 fa eb 03 44 89 c2 48 8b
Mar 27 18:21:04 appserver kernel: RSP: 002b:000000c00857b118 EFLAGS: 00010206
Mar 27 18:21:04 appserver kernel: RAX: 000000000006d61c RBX: 000000000046e61c RCX: 0000000004ae37c0
Mar 27 18:21:04 appserver kernel: RDX: 00000000035f9e44 RSI: 0000000000000006 RDI: 0000000000000001
Mar 27 18:21:04 appserver kernel: RBP: 000000c00857b128 R08: 00000000034c0010 R09: 000000000007c808
Mar 27 18:21:04 appserver kernel: R10: 00007fff45f49090 R11: 000000000d42a060 R12: 000000c00857b260
Mar 27 18:21:04 appserver kernel: R13: 0000000000000030 R14: 000000c008566000 R15: 00007fc89f789a1c
Mar 27 18:21:04 appserver kernel: Mem-Info:
Mar 27 18:21:04 appserver kernel: active_anon:30988042 inactive_anon:1677735 isolated_anon:4366
                                   active_file:228 inactive_file:325 isolated_file:0
                                   unevictable:4650 dirty:0 writeback:0 unstable:0
                                   slab_reclaimable:59762 slab_unreclaimable:69382
                                   mapped:2066 shmem:22 pagetables:68815 bounce:0
                                   free:72145 free_pcp:0 free_cma:0
Mar 27 18:21:04 appserver kernel: Node 0 active_anon:41335768kB inactive_anon:2296000kB active_file:540kB inactive_file:1120kB unevictable:17388kB isolated(anon):2284kB isolated(file):0kB mapped:6996kB dirty:0kB writeback:0kB shmem:80kB shmem_thp: 0kB shmem_pmdmapped: 0kB anon_thp: 12050432kB writeback_tmp:0kB unstable:0kB all_unreclaimable? no
Mar 27 18:21:04 appserver kernel: Node 1 active_anon:41218580kB inactive_anon:2295980kB active_file:196kB inactive_file:0kB unevictable:760kB isolated(anon):12492kB isolated(file):0kB mapped:756kB dirty:0kB writeback:0kB shmem:0kB shmem_thp: 0kB shmem_pmdmapped: 0kB anon_thp: 4026368kB writeback_tmp:0kB unstable:0kB all_unreclaimable? no
Mar 27 18:21:04 appserver kernel: Node 2 active_anon:41397820kB inactive_anon:2118960kB active_file:176kB inactive_file:196kB unevictable:452kB isolated(anon):2688kB isolated(file):0kB mapped:512kB dirty:0kB writeback:0kB shmem:8kB shmem_thp: 0kB shmem_pmdmapped: 0kB anon_thp: 3696640kB writeback_tmp:0kB unstable:0kB all_unreclaimable? no
Mar 27 18:21:04 appserver kernel: Node 0 DMA free:15908kB min:12kB low:24kB high:36kB active_anon:0kB inactive_anon:0kB active_file:0kB inactive_file:0kB unevictable:0kB writepending:0kB present:15992kB managed:15908kB mlocked:0kB kernel_stack:0kB pagetables:0kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:0kB
Mar 27 18:21:04 appserver kernel: lowmem_reserve[]: 0 2909 43142 43142 43142
Mar 27 18:21:04 appserver kernel: Node 0 DMA32 free:163424kB min:2540kB low:5516kB high:8492kB active_anon:2881704kB inactive_anon:9704kB active_file:0kB inactive_file:24kB unevictable:0kB writepending:0kB present:3129216kB managed:3063680kB mlocked:0kB kernel_stack:0kB pagetables:796kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:0kB
Mar 27 18:21:04 appserver kernel: lowmem_reserve[]: 0 0 40233 40233 40233
Mar 27 18:21:04 appserver kernel: Node 0 Normal free:34840kB min:35176kB low:76372kB high:117568kB active_anon:38454064kB inactive_anon:2286296kB active_file:560kB inactive_file:1096kB unevictable:17388kB writepending:0kB present:41943040kB managed:41206384kB mlocked:17388kB kernel_stack:5764kB pagetables:82304kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:0kB
Mar 27 18:21:04 appserver kernel: lowmem_reserve[]: 0 0 0 0 0
Mar 27 18:21:04 appserver kernel: Node 1 Normal free:37276kB min:37452kB low:81316kB high:125180kB active_anon:41218904kB inactive_anon:2296020kB active_file:196kB inactive_file:0kB unevictable:760kB writepending:0kB present:44564480kB managed:43865324kB mlocked:760kB kernel_stack:2676kB pagetables:103148kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:0kB
Mar 27 18:21:04 appserver kernel: lowmem_reserve[]: 0 0 0 0 0
Mar 27 18:21:04 appserver kernel: Node 2 Normal free:37132kB min:37452kB low:81316kB high:125180kB active_anon:41397820kB inactive_anon:2119020kB active_file:176kB inactive_file:196kB unevictable:452kB writepending:0kB present:44564480kB managed:43865124kB mlocked:452kB kernel_stack:4088kB pagetables:89012kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:0kB
Mar 27 18:21:04 appserver kernel: lowmem_reserve[]: 0 0 0 0 0
Mar 27 18:21:04 appserver kernel: lowmem_reserve[]: 0 0 0 0 0
Mar 27 18:21:04 appserver kernel: Node 0 DMA: 1*4kB (U) 0*8kB 0*16kB 1*32kB (U) 2*64kB (U) 1*128kB (U) 1*256kB (U) 0*512kB 1*1024kB (U) 1*2048kB (M) 3*4096kB (M) = 15908kB
Mar 27 18:21:04 appserver kernel: Node 0 DMA32: 148*4kB (UME) 157*8kB (UE) 404*16kB (UME) 357*32kB (UME) 288*64kB (UME) 197*128kB (UE) 121*256kB (UME) 39*512kB (UME) 48*1024kB (UME) 0*2048kB 0*4096kB = 163480kB
Mar 27 18:21:04 appserver kernel: Node 0 Normal: 158*4kB (UMEH) 298*8kB (UMEH) 289*16kB (UMEH) 332*32kB (UMEH) 107*64kB (UMEH) 44*128kB (UMEH) 8*256kB (ME) 0*512kB 2*1024kB (M) 0*2048kB 0*4096kB = 34840kB
Mar 27 18:21:04 appserver kernel: Node 1 Normal: 187*4kB (UME) 731*8kB (UME) 731*16kB (UME) 375*32kB (UME) 67*64kB (UME) 7*128kB (UM) 2*256kB (M) 1*512kB (M) 1*1024kB (M) 0*2048kB 0*4096kB = 37524kB
Mar 27 18:21:04 appserver kernel: Node 2 Normal: 241*4kB (MEH) 112*8kB (UMEH) 21*16kB (UEH) 1103*32kB (UMEH) 5*64kB (H) 3*128kB (H) 0*256kB 0*512kB 0*1024kB 0*2048kB 0*4096kB = 38196kB
Mar 27 18:21:04 appserver kernel: Node 0 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=1048576kB
Mar 27 18:21:04 appserver kernel: Node 0 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=2048kB
Mar 27 18:21:04 appserver kernel: Node 1 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=1048576kB
Mar 27 18:21:04 appserver kernel: Node 1 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=2048kB
Mar 27 18:21:04 appserver kernel: Node 2 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=1048576kB
Mar 27 18:21:04 appserver kernel: Node 2 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=2048kB
Mar 27 18:21:04 appserver kernel: 55312 total pagecache pages
Mar 27 18:21:04 appserver kernel: 52681 pages in swap cache
Mar 27 18:21:04 appserver kernel: Swap cache stats: add 5935559, delete 5879890, find 1717529/2414745
Mar 27 18:21:04 appserver kernel: Free swap  = 0kB
Mar 27 18:21:04 appserver kernel: Total swap = 4194300kB
Mar 27 18:21:04 appserver kernel: 33554302 pages RAM
Mar 27 18:21:04 appserver kernel: 0 pages HighMem/MovableOnly
Mar 27 18:21:04 appserver kernel: 550197 pages reserved
Mar 27 18:21:04 appserver kernel: 0 pages cma reserved
Mar 27 18:21:04 appserver kernel: 0 pages hwpoisoned
Mar 27 18:21:04 appserver kernel: Tasks state (memory values in pages):
Mar 27 18:21:04 appserver kernel: [  pid  ]   uid  tgid total_vm      rss pgtables_bytes swapents oom_score_adj name
Mar 27 18:21:04 appserver kernel: [    815]     0   815   134671      639  1056768      227          -250 systemd-journal
Mar 27 18:21:04 appserver kernel: [    851]     0   851     5788      273    73728      564         -1000 systemd-udevd
Mar 27 18:21:04 appserver kernel: [   1138]     0  1138    70066     4519    94208        0         -1000 multipathd
Mar 27 18:21:04 appserver kernel: [   1186]   102  1186    22720      615    86016      206             0 systemd-timesyn
Mar 27 18:21:04 appserver kernel: [   1196]     0  1196    11886      468    86016      359             0 VGAuthService
Mar 27 18:21:04 appserver kernel: [   1197]     0  1197    59261      466    86016      289             0 vmtoolsd
Mar 27 18:21:04 appserver kernel: [   1242]   100  1242     4767      776    77824      194             0 systemd-network
Mar 27 18:21:04 appserver kernel: [   1244]   101  1244     6136      779    90112      983             0 systemd-resolve
Mar 27 18:21:04 appserver kernel: [   1259]     0  1259    59861      644   110592      246             0 accounts-daemon
Mar 27 18:21:04 appserver kernel: [   1263]     0  1263     1704      481    49152       49             0 cron
Mar 27 18:21:04 appserver kernel: [   1264]   103  1264     1937      791    53248      171          -900 dbus-daemon
Mar 27 18:21:04 appserver kernel: [   1273]     0  1273    20513      701    61440       65             0 irqbalance
Mar 27 18:21:04 appserver kernel: [   1276]     0  1276     7352      842    98304     1967             0 networkd-dispat
Mar 27 18:21:04 appserver kernel: [   1277]     0  1277    59106      704    98304      229             0 polkitd
Mar 27 18:21:04 appserver kernel: [   1279]   104  1279    56086      361    86016      497             0 rsyslogd
Mar 27 18:21:04 appserver kernel: [   1283]     0  1283   791208      558   528384     4219          -900 snapd
Mar 27 18:21:04 appserver kernel: [   1289]     0  1289     4389      680    69632      221             0 systemd-logind
Mar 27 18:21:04 appserver kernel: [   1297]     0  1297    98896      864   126976      469             0 udisksd
Mar 27 18:21:04 appserver kernel: [   1298]     0  1298      949      482    45056       35             0 atd
Mar 27 18:21:04 appserver kernel: [   1318]     0  1318    79705      485   126976      433             0 ModemManager
Mar 27 18:21:04 appserver kernel: [   1324]     0  1324     3045      703    73728      228         -1000 sshd
Mar 27 18:21:04 appserver kernel: [   1345]     0  1345    26980      740   114688     1918             0 unattended-upgr
Mar 27 18:21:04 appserver kernel: [   2213]     0  2213    62385      568   102400      311             0 upowerd
Mar 27 18:21:04 appserver kernel: [   7511]     0  7511     1457      355    45056       28             0 agetty
Mar 27 18:21:04 appserver kernel: [   9516]     0  9516     1457      353    49152       32             0 agetty
Mar 27 18:21:04 appserver kernel: [   9571]  1000  9571     4789      751    73728      392             0 systemd
Mar 27 18:21:04 appserver kernel: [   9573]  1000  9573    42663      567   106496      589             0 (sd-pam)
Mar 27 18:21:04 appserver kernel: [  16300]   997 16300 48681167 32431128 274227200   847194             0 mattermost
Mar 27 18:21:04 appserver kernel: [  16351]   997 16351   182663     1355   147456      380             0 plugin-linux-am
Mar 27 18:21:04 appserver kernel: [  16366]   997 16366   642688      921   364544      618             0 plugin-linux-am
Mar 27 18:21:04 appserver kernel: [  16380]   997 16380   185093     2106   180224     1024             0 plugin-linux-am
Mar 27 18:21:04 appserver kernel: [  16401]   997 16401   183240     1619   172032     1036             0 plugin-linux-am
Mar 27 18:21:04 appserver kernel: [  16421]   997 16421   185189     2506   176128      720             0 plugin-linux-am
Mar 27 18:21:04 appserver kernel: [  16441]   997 16441   185211     2017   192512     1193             0 plugin-linux-am
Mar 27 18:21:04 appserver kernel: [  16494]  1000 16494     1871      292    53248       79             0 screen
Mar 27 18:21:04 appserver kernel: [  16495]  1000 16495     2139      832    61440      165             0 bash
Mar 27 18:21:04 appserver kernel: [  20416]  1000 20416    46333    40091   409600     5051             0 zip
Mar 27 18:21:04 appserver kernel: [  20672]     0 20672     3451      662    77824      365             0 sshd
Mar 27 18:21:04 appserver kernel: [  20757]  1000 20757     3485      440    77824      368             0 sshd
Mar 27 18:21:05 appserver kernel: [  20758]  1000 20758     2069      480    53248      410             0 bash
Mar 27 18:21:05 appserver kernel: [  20810]     0 20810     3450      665    61440      363             0 sshd
Mar 27 18:21:05 appserver kernel: [  20893]  1000 20893     3713      568    65536      550             0 sshd
Mar 27 18:21:05 appserver kernel: [  20894]  1000 20894     2069      476    61440      424             0 bash
Mar 27 18:21:05 appserver kernel: [  21093]  1000 21093   303426   126962  2469888   172513             0 vim
Mar 27 18:21:05 appserver kernel: [  21145]     0 21145     3450      644    65536      361             0 sshd
Mar 27 18:21:05 appserver kernel: [  21271]  1000 21271     3484      347    69632      327             0 sshd
Mar 27 18:21:05 appserver kernel: [  21272]  1000 21272     2069      591    57344      293             0 bash
Mar 27 18:21:05 appserver kernel: [  21288]  1000 21288     1705      378    49152       30             0 screen
Mar 27 18:21:05 appserver kernel: [  21289]  1000 21289     1846      392    53248       37             0 screen
Mar 27 18:21:05 appserver kernel: [  21290]  1000 21290     2073      856    53248       46             0 bash
Mar 27 18:21:05 appserver kernel: [  21297]     0 21297     3450      664    65536      355             0 sshd
Mar 27 18:21:05 appserver kernel: [  21382]  1000 21382     3484      427    61440      327             0 sshd
Mar 27 18:21:05 appserver kernel: [  21383]  1000 21383     2069      706    53248      207             0 bash
Mar 27 18:21:05 appserver kernel: [  21392]     0 21392     2358      554    57344      114             0 sudo
Mar 27 18:21:05 appserver kernel: [  21395]     0 21395    73980      838   331776       12             0 journalctl
Mar 27 18:21:05 appserver kernel: [  21445]  1000 21445     2540     1023    65536       57             0 htop
Mar 27 18:21:05 appserver kernel: [  21487]     0 21487     5788      341    65536      589             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: [  21488]     0 21488     5788      349    65536      570             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: [  21489]     0 21489     5788      308    65536      622             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: [  21490]     0 21490     5788      372    65536      539             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: [  21491]     0 21491     5788      367    65536      563             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: [  21492]     0 21492     5788      448    65536      482             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: [  21493]     0 21493     5788      407    65536      523             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: [  21494]     0 21494     5788      430    65536      501             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: [  21495]     0 21495     5788      483    65536      463             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: [  21496]     0 21496     5788      415    65536      515             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: [  21497]     0 21497     5788      450    65536      480             0 systemd-udevd
Mar 27 18:21:05 appserver kernel: oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0-2,global_oom,task_memcg=/system.slice/mattermost.service,task=mattermost,pid=16300,uid=997
Mar 27 18:21:05 appserver kernel: Out of memory: Killed process 16300 (mattermost) total-vm:194724668kB, anon-rss:129724512kB, file-rss:0kB, shmem-rss:0kB, UID:997 pgtables:267800kB oom_score_adj:0
Mar 27 18:21:05 appserver mattermost[16300]: {"timestamp":"2023-03-27 18:20:59.432 Z","level":"info","msg":"Validating user","caller":"app/import_functions.go:331","user_name":"jack.mester"}
Mar 27 18:21:05 appserver mattermost[16300]: {"timestamp":"2023-03-27 18:20:59.432 Z","level":"info","msg":"Importing user","caller":"app/import_functions.go:342","user_name":"jack.mester"}
Mar 27 18:21:05 appserver mattermost[16300]: {"timestamp":"2023-03-27 18:20:59.713 Z","level":"info","msg":"Validating user","caller":"app/import_functions.go:331","user_name":"andrea.tello"}
Mar 27 18:21:05 appserver mattermost[16300]: {"timestamp":"2023-03-27 18:20:59.829 Z","level":"info","msg":"Validating user","caller":"app/import_functions.go:331","user_name":"derek.hartzel"}
Mar 27 18:21:08 appserver kernel: oom_reaper: reaped process 16300 (mattermost), now anon-rss:0kB, file-rss:0kB, shmem-rss:0kB
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Main process exited, code=killed, status=9/KILL
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16351 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16366 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16380 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16401 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16421 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16441 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16369 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16371 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16372 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16376 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16388 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16392 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16393 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16394 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16408 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16410 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16412 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16422 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16428 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16430 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16434 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16438 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16439 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16440 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16442 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16443 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16444 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16445 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16448 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16449 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16450 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16451 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16454 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16455 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16456 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16458 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16459 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16460 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16469 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16472 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16475 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16609 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16672 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16691 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16692 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16694 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16695 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16696 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16704 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16709 (plugin-linux-am) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16773 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 16783 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 18007 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Killing process 18160 (n/a) with signal SIGKILL.
Mar 27 18:21:08 appserver systemd[1]: mattermost.service: Failed with result 'signal'.
Mar 27 18:21:18 appserver systemd[1]: mattermost.service: Scheduled restart job, restart counter is at 2.
Mar 27 18:21:18 appserver systemd[1]: Stopped Mattermost.
Mar 27 18:21:18 appserver systemd[1]: Starting Mattermost...
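
So based on the log above, the failure looks like an OOM kill rather than a pure network timeout: the kernel task dump lists pid 16300 (mattermost) with rss=32431128 pages, and the OOM-kill line reports anon-rss:129724512kB. As a quick sanity check that those two numbers agree (assuming 4 KiB pages, as the DMA32/Normal zone lines suggest):

```python
# Convert the RSS from the OOM task dump (counted in 4 KiB pages) to KiB/GiB.
rss_pages = 32431128                 # "rss" column for pid 16300 (mattermost)
rss_kib = rss_pages * 4              # 4 KiB per page on this system
print(rss_kib)                       # 129724512 -> matches "anon-rss:129724512kB"
print(round(rss_kib / 1024**2, 1))   # ~123.7 GiB resident when the kernel killed it
```

That's roughly 124 GiB resident with swap fully exhausted (`Free swap = 0kB`), so the server appears to be running out of memory while processing the import, and systemd then restarts the service.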