Setting up using Docker

OK, as I expected.

Your Docker container also still seems to be running. Please stop all Mattermost-related Docker containers (using the same docker compose command you used to start them); afterwards, docker ps should no longer show any running Mattermost containers (neither the database, nor nginx, nor the Mattermost application itself). Then stop the Docker container that is using port 80, since we will need to run the certificate creation again. Once that’s all done (please verify that no containers are running anymore), proceed with the following steps:

cd /home/jpzone282/mattermost
rm -r certs
./scripts/issue-certificate.sh -d mattermost.mysebsite.com -o ${PWD}/certs
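A quick way to double-check that nothing from the stack is left running, before and after these steps, is to filter the docker ps output (a sketch; the name patterns are an assumption based on the container names seen later in this thread):

```shell
# Helper: given `docker ps --format '{{.Names}}'` output on stdin, count
# containers that look like part of the Mattermost stack; 0 means all
# stopped. Adjust the patterns to your actual container names.
count_mm_containers() {
    grep -cE 'mattermost|postgres' || true
}
# On the server:
#   docker ps --format '{{.Names}}' | count_mm_containers   # expect 0
```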

If this all worked, please try to run the following command again:

CERT=$(awk -F= '$1~/^CERT_PATH/ { print $2 }' .env); find $CERT; file $CERT; ls -l $CERT; head -3 $CERT

The output should look different now and there should not be any error messages. On a working system, the output should be similar to this:

./volumes/web/cert/cert.pem
./volumes/web/cert/cert.pem: PEM certificate
-rw-r--r-- 1 2000 2000 5595 May 18 07:33 ./volumes/web/cert/cert.pem
-----BEGIN CERTIFICATE-----
MIIFJzCCBA+gAwIBAgISBGaFi17tXWztClOAzmqFkLPWMA0GCSqGSIb3DQEBCwUA
MDIxCzAJBgNVBAYTAlVTMRYwFAYDVQQKEw1MZXQncyBFbmNyeXB0MQswCQYDVQQD

If it matches what you see (except for the timestamps and the certificate data, of course), you can try to start the Mattermost containers using the docker compose command again.
Hopefully this is the last thing that was broken in your deployment.

There are no containers running.

user@systemname:~/mattermost$ sudo CERT=$(awk -F= '$1~/^CERT_PATH/ { print $2 }' .env); find $CERT; file $CERT; ls -l $CERT; head -3 $CERT
sudo: ./certs/etc/letsencrypt/live/${DOMAIN}/fullchain.pem: command not found
./volumes/web/cert/cert.pem
find: ‘./certs/etc/letsencrypt/live/${DOMAIN}/fullchain.pem’: Permission denied
./volumes/web/cert/cert.pem:                          directory
./certs/etc/letsencrypt/live/${DOMAIN}/fullchain.pem: cannot open `./certs/etc/letsencrypt/live/${DOMAIN}/fullchain.pem' (Permission denied)
ls: cannot access './certs/etc/letsencrypt/live/${DOMAIN}/fullchain.pem': Permission denied
./volumes/web/cert/cert.pem:
total 0
==> ./volumes/web/cert/cert.pem <==
head: error reading './volumes/web/cert/cert.pem': Is a directory
head: cannot open './certs/etc/letsencrypt/live/${DOMAIN}/fullchain.pem' for reading: Permission denied

I did not say anything about sudo… Please follow my instructions to the letter and do not add or change things.
Also, it looks like you did not run the command to remove the certs directory, because it's still there.

If you got an error message while doing so, I'd need to see it to accommodate.
If you got a permission error while removing the directory, please prefix the command with sudo; it could be that you once tried to start the containers as root and the folders are leftovers from that attempt.
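To confirm that the leftovers really are root-owned before reaching for sudo, a tiny helper around stat works (a sketch; uses GNU stat, on BSD/macOS the equivalent is stat -f '%Su'):

```shell
# Print the owning user of a path, e.g.:
#   owner_of certs    # prints "root" if a root-run container created it
owner_of() {
    stat -c '%U' "$1"
}
```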

Oh, sorry. I tried sudo the second time just in case, but it did the same thing.

Although I did sudo rm -r certs

Here is what happened before I used sudo:

user@systemname:~/mattermost$  rm -r certs
rm: descend into write-protected directory 'certs'?

try:

sudo rm -rf certs

then make sure the certs directory is gone and, if it is, run the ./scripts/issue-certificate.sh … again (without sudo)

Okay, the certs directory is gone.

I ran ./scripts/issue-certificate.sh -d mattermost.mysebsite.com -o ${PWD}/certs; it asked me to enter an email address and to agree to the T’s & C’s etc. Then it finished with "Successfully received certificate."

Certificate is saved at: /etc/letsencrypt/live/mattermost.mywebsite.com/fullchain.pem
Key is saved at:         /etc/letsencrypt/live/mattermost.mywebsite.com/privkey.pem

OK, good. In this comment I asked you to modify your .env file to point to the new certificate paths, but according to your output in this comment it looks as if you have both options active in the .env file. Please check the first-mentioned comment again and verify that the relevant section of your .env file looks similar. A # sign in front of a line means that the line is NOT active. You can only have one active line starting with CERT_PATH and one starting with KEY_PATH, and they should look like this:

CERT_PATH=./certs/etc/letsencrypt/live/${DOMAIN}/fullchain.pem
KEY_PATH=./certs/etc/letsencrypt/live/${DOMAIN}/privkey.pem

Then please run the following commands to make sure those files are actually there, are readable and in the right format:

DOMAIN=$(awk -F= '$1~/^DOMAIN/ { print $2 }' .env); CERT=$(awk -F= '$1~/^CERT_PATH/ { print $2 }' .env | sed 's/\${DOMAIN}/'$DOMAIN'/'); find $CERT; file $CERT; ls -l $CERT; head -3 $CERT
DOMAIN=$(awk -F= '$1~/^DOMAIN/ { print $2 }' .env); CERT=$(awk -F= '$1~/^KEY_PATH/ { print $2 }' .env | sed 's/\${DOMAIN}/'$DOMAIN'/'); find $CERT; file $CERT; ls -l $CERT; head -3 $CERT

The first line will validate the CERT_PATH, the second line will validate the KEY_PATH.
The output should look similar to this:

# for CERT_PATH
./certs/etc/letsencrypt/live/mattermost.mywebsite.com/fullchain.pem
./certs/etc/letsencrypt/live/mattermost.mywebsite.com/fullchain.pem: symbolic link to ../../archive/mattermost.mywebsite.com/fullchain1.pem
lrwxrwxrwx 1 root root 46 Aug 28 21:40 ./certs/etc/letsencrypt/live/mattermost.mywebsite.com/fullchain.pem -> ../../archive/mattermost.mywebsite.com/fullchain1.pem
-----BEGIN CERTIFICATE-----
MIIFKTCCBBGgAwIBAgISBHrIWlZDgcBKF1xVx1ZAyRvjMA0GCSqGSIb3DQEBCwUA
MDIxCzAJBgNVBAYTAlVTMRYwFAYDVQQKEw1MZXQncyBFbmNyeXB0MQswCQYDVQQD


# for KEY_PATH
./certs/etc/letsencrypt/live/mattermost.mywebsite.com/privkey.pem
./certs/etc/letsencrypt/live/mattermost.mywebsite.com/privkey.pem: symbolic link to ../../archive/mattermost.mywebsite.com/privkey1.pem
lrwxrwxrwx 1 root root 44 Aug 28 21:40 ./certs/etc/letsencrypt/live/mattermost.mywebsite.com/privkey.pem -> ../../archive/mattermost.mywebsite.com/privkey1.pem
-----BEGIN PRIVATE KEY-----
MIIEvwIBADANBgkqhkiG9w0BAQEFAASCBKkwggSlAgEAAoIBAQCi0owRzEjnm+1x
Dc0gZml7tZPDWi6vgLFz/jiyVzLumKNhRmvmg6UtL+jZiC0mbM9FZW64l8a5PlZl

If this is the case, you can try docker compose ... up again (without the -d flag) and let me know if there are any other error messages on the screen.
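For reference, the parsing those one-liners do can be wrapped in a small function, which is easier to rerun and debug (a sketch; assumes plain KEY=value lines in .env, as shown in this thread):

```shell
# Resolve a *_PATH variable from a compose-style .env file, substituting
# ${DOMAIN} with the DOMAIN value from the same file.
# Usage: resolve_env_path .env CERT_PATH
resolve_env_path() {
    env_file=$1; var=$2
    domain=$(awk -F= -v v=DOMAIN '$1 == v { print $2 }' "$env_file")
    # Commented-out lines (#CERT_PATH=...) are skipped because their first
    # field is "#CERT_PATH", which does not equal "CERT_PATH".
    awk -F= -v v="$var" '$1 == v { print $2 }' "$env_file" \
        | sed "s|\${DOMAIN}|$domain|"
}
```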

I have changed the following in the .env:

#CERT_PATH=./volumes/web/cert/cert.pem
#KEY_PATH=./volumes/web/cert/key-no-password.pem
#GITLAB_PKI_CHAIN_PATH=<path_to_your_gitlab_pki>/pki_chain.pem
CERT_PATH=./certs/etc/letsencrypt/live/${DOMAIN}/fullchain.pem
KEY_PATH=./certs/etc/letsencrypt/live/${DOMAIN}/privkey.pem

I ran the following:

user@system:~/mattermost$ DOMAIN=$(awk -F= '$1~/^DOMAIN/ { print $2 }' .env); CERT=$(awk -F= '$1~/^CERT_PATH/ { print $2 }' .env | sed 's/\${DOMAIN}/'$DOMAIN'/'); find $CERT; file $CERT; ls -l $CERT; head -3 $CERT
find: ‘./certs/etc/letsencrypt/live/mattermost.mysite.com/fullchain.pem’: Permission denied
./certs/etc/letsencrypt/live/mattermost.mysite.com/fullchain.pem: cannot open `./certs/etc/letsencrypt/live/mattermost.mysite.com/fullchain.pem' (Permission denied)
ls: cannot access './certs/etc/letsencrypt/live/mattermost.mysite.com/fullchain.pem': Permission denied
head: cannot open './certs/etc/letsencrypt/live/mattermost.mysite.com/fullchain.pem' for reading: Permission denied

user@system:~/mattermost$ DOMAIN=$(awk -F= '$1~/^DOMAIN/ { print $2 }' .env); CERT=$(awk -F= '$1~/^KEY_PATH/ { print $2 }' .env | sed 's/\${DOMAIN}/'$DOMAIN'/'); find $CERT; file $CERT; ls -l $CERT; head -3 $CERT
find: ‘./certs/etc/letsencrypt/live/mattermost.mysite.com/privkey.pem’: Permission denied
./certs/etc/letsencrypt/live/mattermost.mysite.com/privkey.pem: cannot open `./certs/etc/letsencrypt/live/mattermost.mysite.com/privkey.pem' (Permission denied)
ls: cannot access './certs/etc/letsencrypt/live/mattermost.mysite.com/privkey.pem': Permission denied
head: cannot open './certs/etc/letsencrypt/live/mattermost.mysite.com/privkey.pem' for reading: Permission denied

OK, this looks better. We have some permission problems because the files are owned by root (I didn’t think about that), but the nginx container is also running as root, so this should work.

Please start your containers now and let me know if that works.

docker compose -f docker-compose.yml -f docker-compose.nginx.yml up

(If you use docker-compose instead of docker compose, then please adjust the command accordingly).
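If nginx does turn out to be unable to read the root-owned files after all, one option is to open read permissions on the certs tree (a sketch; note that this makes privkey.pem world-readable too, which may not be acceptable, so review first):

```shell
# Make everything under a path readable (and directories traversable)
# without adding execute bits to regular files; X affects directories only
# when the file has no execute bit set anywhere.
make_readable() {
    chmod -R u+rwX,go+rX "$1"
}
# On the server you would run it as root, e.g.:
#   sudo chmod -R u+rwX,go+rX certs
```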

Getting a 502

Mattermost Log

{"timestamp":"2022-08-29 08:20:48.329 Z","level":"debug","msg":"Processing prepackaged plugin","caller":"app/plugin.go:967","path":"/mattermost/prepackaged_plugins/mattermost-plugin-gitlab-v1.3.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:48.330 Z","level":"debug","msg":"Processing prepackaged plugin","caller":"app/plugin.go:967","path":"/mattermost/prepackaged_plugins/mattermost-plugin-nps-v1.2.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:48.330 Z","level":"debug","msg":"Processing prepackaged plugin","caller":"app/plugin.go:967","path":"/mattermost/prepackaged_plugins/mattermost-plugin-autolink-v1.2.2-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:48.335 Z","level":"debug","msg":"Processing prepackaged plugin","caller":"app/plugin.go:967","path":"/mattermost/prepackaged_plugins/mattermost-plugin-calls-v0.7.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:48.340 Z","level":"debug","msg":"Processing prepackaged plugin","caller":"app/plugin.go:967","path":"/mattermost/prepackaged_plugins/mattermost-plugin-antivirus-v0.1.2-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:48.341 Z","level":"debug","msg":"Processing prepackaged plugin","caller":"app/plugin.go:967","path":"/mattermost/prepackaged_plugins/mattermost-plugin-aws-SNS-v1.2.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:48.341 Z","level":"debug","msg":"Processing prepackaged plugin","caller":"app/plugin.go:967","path":"/mattermost/prepackaged_plugins/mattermost-plugin-jenkins-v1.1.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:48.354 Z","level":"debug","msg":"Processing prepackaged plugin","caller":"app/plugin.go:967","path":"/mattermost/prepackaged_plugins/mattermost-plugin-playbooks-v1.29.1-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:48.369 Z","level":"debug","msg":"Processing prepackaged plugin","caller":"app/plugin.go:967","path":"/mattermost/prepackaged_plugins/mattermost-plugin-zoom-v1.6.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:48.373 Z","level":"debug","msg":"Processing prepackaged plugin","caller":"app/plugin.go:967","path":"/mattermost/prepackaged_plugins/mattermost-plugin-welcomebot-v1.2.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:50.197 Z","level":"debug","msg":"Installing prepackaged plugin","caller":"app/plugin.go:997","path":"/mattermost/prepackaged_plugins/mattermost-plugin-nps-v1.2.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:50.212 Z","level":"debug","msg":"Installing prepackaged plugin","caller":"app/plugin.go:997","path":"/mattermost/prepackaged_plugins/mattermost-plugin-calls-v0.7.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:50.359 Z","level":"debug","msg":"Installing prepackaged plugin","caller":"app/plugin.go:997","path":"/mattermost/prepackaged_plugins/mattermost-plugin-apps-v1.1.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:50.382 Z","level":"debug","msg":"starting plugin","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.calls","wrapped_extras":"pathplugins/com.mattermost.calls/server/dist/plugin-linux-amd64args[plugins/com.mattermost.calls/server/dist/plugin-linux-amd64]"}
{"timestamp":"2022-08-29 08:20:50.387 Z","level":"debug","msg":"plugin started","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.calls","wrapped_extras":"pathplugins/com.mattermost.calls/server/dist/plugin-linux-amd64pid25"}
{"timestamp":"2022-08-29 08:20:50.387 Z","level":"debug","msg":"waiting for RPC address","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.calls","wrapped_extras":"pathplugins/com.mattermost.calls/server/dist/plugin-linux-amd64"}
{"timestamp":"2022-08-29 08:20:50.533 Z","level":"debug","msg":"plugin address","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.calls","wrapped_extras":"address/tmp/plugin2530450391networkunixtimestamp2022-08-29T08:20:50.533Z"}
{"timestamp":"2022-08-29 08:20:50.534 Z","level":"debug","msg":"using plugin","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.calls","wrapped_extras":"version1"}
{"timestamp":"2022-08-29 08:20:50.539 Z","level":"debug","msg":"starting plugin","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.nps","wrapped_extras":"pathplugins/com.mattermost.nps/server/dist/plugin-linux-amd64args[plugins/com.mattermost.nps/server/dist/plugin-linux-amd64]"}
{"timestamp":"2022-08-29 08:20:50.552 Z","level":"debug","msg":"plugin started","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.nps","wrapped_extras":"pathplugins/com.mattermost.nps/server/dist/plugin-linux-amd64pid31"}
{"timestamp":"2022-08-29 08:20:50.552 Z","level":"debug","msg":"waiting for RPC address","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.nps","wrapped_extras":"pathplugins/com.mattermost.nps/server/dist/plugin-linux-amd64"}
{"timestamp":"2022-08-29 08:20:50.669 Z","level":"debug","msg":"Initializing telemetry","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.calls","origin":"main.(*Plugin).initTelemetry telemetry.go:45"}
{"timestamp":"2022-08-29 08:20:50.676 Z","level":"debug","msg":"activating","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.calls","origin":"main.(*Plugin).OnActivate activate.go:19"}
{"timestamp":"2022-08-29 08:20:50.683 Z","level":"debug","msg":"cleaning up calls state","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.calls","origin":"main.(*Plugin).cleanUpState channel_state.go:89"}
{"timestamp":"2022-08-29 08:20:50.816 Z","level":"debug","msg":"plugin address","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.nps","wrapped_extras":"address/tmp/plugin608780886networkunixtimestamp2022-08-29T08:20:50.816Z"}
{"timestamp":"2022-08-29 08:20:50.816 Z","level":"debug","msg":"using plugin","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.nps","wrapped_extras":"version1"}
{"timestamp":"2022-08-29 08:20:50.827 Z","level":"debug","msg":"Installing prepackaged plugin","caller":"app/plugin.go:997","path":"/mattermost/prepackaged_plugins/focalboard-v7.1.0-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:50.878 Z","level":"debug","msg":"Activating plugin","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.nps"}
{"timestamp":"2022-08-29 08:20:50.881 Z","level":"info","msg":"Ensuring Feedbackbot exists","caller":"app/plugin_api.go:937","plugin_id":"com.mattermost.nps"}
{"timestamp":"2022-08-29 08:20:50.909 Z","level":"debug","msg":"Found feedbackbot","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.nps"}
{"timestamp":"2022-08-29 08:20:50.914 Z","level":"debug","msg":"Plugin activated","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.nps"}
{"timestamp":"2022-08-29 08:20:51.116 Z","level":"debug","msg":"Installing prepackaged plugin","caller":"app/plugin.go:997","path":"/mattermost/prepackaged_plugins/mattermost-plugin-playbooks-v1.29.1-linux-amd64.tar.gz"}
{"timestamp":"2022-08-29 08:20:51.808 Z","level":"debug","msg":"starting plugin","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.apps","wrapped_extras":"pathplugins/com.mattermost.apps/server/dist/plugin-linux-amd64args[plugins/com.mattermost.apps/server/dist/plugin-linux-amd64]"}
{"timestamp":"2022-08-29 08:20:51.820 Z","level":"debug","msg":"plugin started","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.apps","wrapped_extras":"pathplugins/com.mattermost.apps/server/dist/plugin-linux-amd64pid43"}
{"timestamp":"2022-08-29 08:20:51.820 Z","level":"debug","msg":"waiting for RPC address","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.apps","wrapped_extras":"pathplugins/com.mattermost.apps/server/dist/plugin-linux-amd64"}
{"timestamp":"2022-08-29 08:20:52.548 Z","level":"debug","msg":"using plugin","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.apps","wrapped_extras":"version1"}
{"timestamp":"2022-08-29 08:20:52.549 Z","level":"debug","msg":"plugin address","caller":"plugin/hclog_adapter.go:52","plugin_id":"com.mattermost.apps","wrapped_extras":"address/tmp/plugin1911846446networkunixtimestamp2022-08-29T08:20:52.548Z"}
{"timestamp":"2022-08-29 08:20:53.042 Z","level":"debug","msg":"ensured bot","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.apps","username":"appsbot","id":"18ru3afo7jbg9j1x33qhb46opr"}
{"timestamp":"2022-08-29 08:20:53.113 Z","level":"debug","msg":"failed to fetch license twice. May incorrectly default to on-prem mode","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.apps"}
{"timestamp":"2022-08-29 08:20:53.113 Z","level":"debug","msg":"configured","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.apps","cloud_mode":"false","developer_mode":"false","allow_http_apps":"true","version":"1.1.0","commit":"0856d8f","build_date":"Tue 24 May 2022 03:36:58 PM UTC"}
{"timestamp":"2022-08-29 08:20:53.114 Z","level":"debug","msg":"created an AWS client","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.apps","secret":"<nil>","purpose":"Manifest store","region":"us-east-1","access":"<nil>"}
{"timestamp":"2022-08-29 08:20:53.121 Z","level":"debug","msg":"initialized API and persistent store","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.apps"}
{"timestamp":"2022-08-29 08:20:53.121 Z","level":"debug","msg":"available upstream","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.apps","type":"HTTP"}
{"timestamp":"2022-08-29 08:20:53.123 Z","level":"debug","msg":"Skipped upstream: AWS Lambda: not configured.","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.apps","error":"AWS credentials are not set: not found"}
{"timestamp":"2022-08-29 08:20:53.123 Z","level":"debug","msg":"available upstream","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.apps","type":"Mattermost Plugin"}
{"timestamp":"2022-08-29 08:20:53.124 Z","level":"debug","msg":"Skipped upstream: OpenFaaS: not configured.","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.apps","error":"OPENFAAS_URL environment variable must be defined: not found"}
{"timestamp":"2022-08-29 08:20:53.124 Z","level":"debug","msg":"initialized the app proxy","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.apps"}
{"timestamp":"2022-08-29 08:20:53.129 Z","level":"info","msg":"activated","caller":"app/plugin_api.go:937","plugin_id":"com.mattermost.apps"}
{"timestamp":"2022-08-29 08:20:53.760 Z","level":"debug","msg":"starting plugin","caller":"plugin/hclog_adapter.go:52","plugin_id":"playbooks","wrapped_extras":"pathplugins/playbooks/server/dist/plugin-linux-amd64args[plugins/playbooks/server/dist/plugin-linux-amd64]"}
{"timestamp":"2022-08-29 08:20:53.761 Z","level":"debug","msg":"plugin started","caller":"plugin/hclog_adapter.go:52","plugin_id":"playbooks","wrapped_extras":"pathplugins/playbooks/server/dist/plugin-linux-amd64pid52"}
{"timestamp":"2022-08-29 08:20:53.761 Z","level":"debug","msg":"waiting for RPC address","caller":"plugin/hclog_adapter.go:52","plugin_id":"playbooks","wrapped_extras":"pathplugins/playbooks/server/dist/plugin-linux-amd64"}
{"timestamp":"2022-08-29 08:20:53.784 Z","level":"debug","msg":"using plugin","caller":"plugin/hclog_adapter.go:52","plugin_id":"playbooks","wrapped_extras":"version1"}
{"timestamp":"2022-08-29 08:20:53.784 Z","level":"debug","msg":"plugin address","caller":"plugin/hclog_adapter.go:52","plugin_id":"playbooks","wrapped_extras":"address/tmp/plugin174305801networkunixtimestamp2022-08-29T08:20:53.784Z"}
{"timestamp":"2022-08-29 08:20:54.223 Z","level":"debug","msg":"starting plugin","caller":"plugin/hclog_adapter.go:52","plugin_id":"focalboard","wrapped_extras":"pathplugins/focalboard/server/dist/plugin-linux-amd64args[plugins/focalboard/server/dist/plugin-linux-amd64]"}
{"timestamp":"2022-08-29 08:20:54.223 Z","level":"debug","msg":"plugin started","caller":"plugin/hclog_adapter.go:52","plugin_id":"focalboard","wrapped_extras":"pathplugins/focalboard/server/dist/plugin-linux-amd64pid61"}
{"timestamp":"2022-08-29 08:20:54.224 Z","level":"debug","msg":"waiting for RPC address","caller":"plugin/hclog_adapter.go:52","plugin_id":"focalboard","wrapped_extras":"pathplugins/focalboard/server/dist/plugin-linux-amd64"}
{"timestamp":"2022-08-29 08:20:54.257 Z","level":"debug","msg":"using plugin","caller":"plugin/hclog_adapter.go:52","plugin_id":"focalboard","wrapped_extras":"version1"}
{"timestamp":"2022-08-29 08:20:54.257 Z","level":"debug","msg":"plugin address","caller":"plugin/hclog_adapter.go:52","plugin_id":"focalboard","wrapped_extras":"address/tmp/plugin968349881networkunixtimestamp2022-08-29T08:20:54.254Z"}
{"timestamp":"2022-08-29 08:20:54.293 Z","level":"info","msg":"connectDatabase","caller":"app/plugin_api.go:937","plugin_id":"focalboard","dbType":"postgres"}
{"timestamp":"2022-08-29 08:20:54.314 Z","level":"debug","msg":"Acquiring cluster lock for Unique IDs migration","caller":"app/plugin_api.go:934","plugin_id":"focalboard"}
{"timestamp":"2022-08-29 08:20:54.324 Z","level":"debug","msg":"Releasing cluster lock for Unique IDs migration","caller":"app/plugin_api.go:934","plugin_id":"focalboard"}
{"timestamp":"2022-08-29 08:20:54.341 Z","level":"debug","msg":"Mention listener added.","caller":"app/plugin_api.go:934","plugin_id":"focalboard","listener_count":"1"}
{"timestamp":"2022-08-29 08:20:54.344 Z","level":"info","msg":"Initialized notification backend","caller":"app/plugin_api.go:937","plugin_id":"focalboard","name":"notifyMentions"}
{"timestamp":"2022-08-29 08:20:54.348 Z","level":"debug","msg":"Starting subscriptions backend","caller":"app/plugin_api.go:934","plugin_id":"focalboard","freq_card":"120","freq_board":"86400"}
{"timestamp":"2022-08-29 08:20:54.350 Z","level":"info","msg":"Initialized notification backend","caller":"app/plugin_api.go:937","plugin_id":"focalboard","name":"notifySubscriptions"}
{"timestamp":"2022-08-29 08:20:54.351 Z","level":"info","msg":"Initialized notification backend","caller":"app/plugin_api.go:937","plugin_id":"focalboard","name":"notifyLogger"}
{"timestamp":"2022-08-29 08:20:54.353 Z","level":"debug","msg":"notify loop - no hints in queue","caller":"app/plugin_api.go:934","plugin_id":"focalboard","next_check":"\"2022-08-29 09:20:54.353 Z\""}
{"timestamp":"2022-08-29 08:20:54.357 Z","level":"debug","msg":"subscription notifier loop","caller":"app/plugin_api.go:934","plugin_id":"focalboard","next_notify":"\"2022-08-29 09:20:54.353 Z\""}
{"timestamp":"2022-08-29 08:20:54.374 Z","level":"debug","msg":"Fetched template blocks","caller":"app/plugin_api.go:934","plugin_id":"focalboard","count":"7"}
{"timestamp":"2022-08-29 08:20:54.387 Z","level":"debug","msg":"Template import not needed, skipping","caller":"app/plugin_api.go:934","plugin_id":"focalboard"}
{"timestamp":"2022-08-29 08:20:54.393 Z","level":"info","msg":"Server.Start","caller":"app/plugin_api.go:937","plugin_id":"focalboard"}
{"timestamp":"2022-08-29 08:20:54.402 Z","level":"debug","msg":"server not bind to any port","caller":"app/plugin_api.go:934","plugin_id":"focalboard"}
{"timestamp":"2022-08-29 08:20:55.871 Z","level":"info","msg":"got public IP address","caller":"app/plugin_api.go:937","plugin_id":"com.mattermost.calls","origin":"main.(*logger).Info log.go:84","addr":"123.123.123.123"}
{"timestamp":"2022-08-29 08:20:55.871 Z","level":"info","msg":"rtc: server is listening on udp 8443","caller":"app/plugin_api.go:937","plugin_id":"com.mattermost.calls","origin":"main.(*logger).Info log.go:84"}
{"timestamp":"2022-08-29 08:20:55.872 Z","level":"debug","msg":"rtc: udp buffers","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.calls","origin":"main.(*logger).Debug log.go:80","writeBufSize":"425984","readBufSize":"425984"}
{"timestamp":"2022-08-29 08:20:55.873 Z","level":"info","msg":"rtc: server is listening on udp 8443","caller":"app/plugin_api.go:937","plugin_id":"com.mattermost.calls","origin":"main.(*logger).Info log.go:84"}
{"timestamp":"2022-08-29 08:20:55.873 Z","level":"debug","msg":"rtc: udp buffers","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.calls","origin":"main.(*logger).Debug log.go:80","writeBufSize":"425984","readBufSize":"425984"}
{"timestamp":"2022-08-29 08:20:55.873 Z","level":"info","msg":"rtc: server is listening on udp 8443","caller":"app/plugin_api.go:937","plugin_id":"com.mattermost.calls","origin":"main.(*logger).Info log.go:84"}
{"timestamp":"2022-08-29 08:20:55.874 Z","level":"debug","msg":"rtc: udp buffers","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.calls","origin":"main.(*logger).Debug log.go:80","writeBufSize":"425984","readBufSize":"425984"}
{"timestamp":"2022-08-29 08:20:55.874 Z","level":"info","msg":"rtc: server is listening on udp 8443","caller":"app/plugin_api.go:937","plugin_id":"com.mattermost.calls","origin":"main.(*logger).Info log.go:84"}
{"timestamp":"2022-08-29 08:20:55.874 Z","level":"debug","msg":"rtc: udp buffers","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.calls","origin":"main.(*logger).Debug log.go:80","writeBufSize":"425984","readBufSize":"425984"}
{"timestamp":"2022-08-29 08:20:55.875 Z","level":"debug","msg":"activated","caller":"app/plugin_api.go:934","plugin_id":"com.mattermost.calls","origin":"main.(*Plugin).OnActivate activate.go:136","ClusterID":""}
{"timestamp":"2022-08-29 08:20:55.891 Z","level":"error","msg":"Mail server connection test failed","caller":"app/server.go:1216","error":"unable to connect: unable to connect to the SMTP server: dial tcp 127.0.0.1:10025: connect: connection refused"}
{"timestamp":"2022-08-29 08:20:55.891 Z","level":"debug","msg":"Able to write files to local storage.","caller":"filestore/localstore.go:78"}
{"timestamp":"2022-08-29 08:20:55.896 Z","level":"info","msg":"Starting Server...","caller":"app/server.go:1234"}
{"timestamp":"2022-08-29 08:20:55.900 Z","level":"info","msg":"Server is listening on [::]:8065","caller":"app/server.go:1307","address":"[::]:8065"}
{"timestamp":"2022-08-29 08:20:55.900 Z","level":"debug","msg":"No license provided; Remote Cluster services disabled","caller":"app/server.go:877"}
{"timestamp":"2022-08-29 08:21:16.046 Z","level":"debug","msg":"Received HTTP request","caller":"web/handlers.go:156","method":"GET","url":"/api/v4/system/ping","request_id":"kbwnkuhc1fb39fawy3sw3de8ee","status_code":"200"}
{"timestamp":"2022-08-29 08:21:46.217 Z","level":"debug","msg":"Received HTTP request","caller":"web/handlers.go:156","method":"GET","url":"/api/v4/system/ping","request_id":"ttit64dtsjr9dg31zimrihioic","status_code":"200"}
{"timestamp":"2022-08-29 08:21:47.659 Z","level":"debug","msg":"Scheduling Job","caller":"migrations/scheduler.go:45","scheduler":"migrations"}
{"timestamp":"2022-08-29 08:21:47.659 Z","level":"debug","msg":"All migrations are complete.","caller":"migrations/scheduler.go:84","scheduler":"migrations"}
{"timestamp":"2022-08-29 08:21:47.660 Z","level":"debug","msg":"Next run time for scheduler","caller":"jobs/schedulers.go:147","scheduler_name":"migrations","next_runtime":"<nil>"}
{"timestamp":"2022-08-29 08:22:16.357 Z","level":"debug","msg":"Received HTTP request","caller":"web/handlers.go:156","method":"GET","url":"/api/v4/system/ping","request_id":"fergao8fki8qmdfxaesz3pxigy","status_code":"200"}
{"timestamp":"2022-08-29 08:22:46.489 Z","level":"debug","msg":"Received HTTP request","caller":"web/handlers.go:156","method":"GET","url":"/api/v4/system/ping","request_id":"4ja6pd6ysbn5brejwnynqcrysc","status_code":"200"}
{"timestamp":"2022-08-29 08:23:16.637 Z","level":"debug","msg":"Received HTTP request","caller":"web/handlers.go:156","method":"GET","url":"/api/v4/system/ping","request_id":"szwuaf7m5jfqtpfx94ist8nxoy","status_code":"200"}
{"timestamp":"2022-08-29 08:23:46.772 Z","level":"debug","msg":"Received HTTP request","caller":"web/handlers.go:156","method":"GET","url":"/api/v4/system/ping","request_id":"9er6k1onjt8amg83hfteawoeeh","status_code":"200"}
{"timestamp":"2022-08-29 08:24:16.896 Z","level":"debug","msg":"Received HTTP request","caller":"web/handlers.go:156","method":"GET","url":"/api/v4/system/ping","request_id":"k56nbz4zhjbi9j5wh3qxw47a5h","status_code":"200"}
{"timestamp":"2022-08-29 08:24:47.034 Z","level":"debug","msg":"Received HTTP request","caller":"web/handlers.go:156","method":"GET","url":"/api/v4/system/ping","request_id":"kjqexfg9siyqbq3boy1uqnqi4a","status_code":"200"}
{"timestamp":"2022-08-29 08:25:17.148 Z","level":"debug","msg":"Received HTTP request","caller":"web/handlers.go:156","method":"GET","url":"/api/v4/system/ping","request_id":"r1pf1z6b7i8zbg6onan36cjzdy","status_code":"200"}
{"timestamp":"2022-08-29 08:25:47.300 Z","level":"debug","msg":"Received HTTP request","caller":"web/handlers.go:156","method":"GET","url":"/api/v4/system/ping","request_id":"7tmf16khuiy19d8136spqudurh","status_code":"200"}

Mattermost NGINX Log

/docker-entrypoint.sh: /docker-entrypoint.d/ is not empty, will attempt to perform configuration
/docker-entrypoint.sh: Looking for shell scripts in /docker-entrypoint.d/
/docker-entrypoint.sh: Launching /docker-entrypoint.d/10-listen-on-ipv6-by-default.sh
10-listen-on-ipv6-by-default.sh: info: can not modify /etc/nginx/conf.d/default.conf (read-only file system?)
/docker-entrypoint.sh: Launching /docker-entrypoint.d/20-envsubst-on-templates.sh
/docker-entrypoint.sh: Launching /docker-entrypoint.d/30-tune-worker-processes.sh
/docker-entrypoint.sh: Configuration complete; ready for start up
/docker-entrypoint.sh: /docker-entrypoint.d/ is not empty, will attempt to perform configuration
/docker-entrypoint.sh: Looking for shell scripts in /docker-entrypoint.d/
/docker-entrypoint.sh: Launching /docker-entrypoint.d/10-listen-on-ipv6-by-default.sh
10-listen-on-ipv6-by-default.sh: info: can not modify /etc/nginx/conf.d/default.conf (read-only file system?)
/docker-entrypoint.sh: Launching /docker-entrypoint.d/20-envsubst-on-templates.sh
/docker-entrypoint.sh: Launching /docker-entrypoint.d/30-tune-worker-processes.sh
/docker-entrypoint.sh: Configuration complete; ready for start up

Mattermost:

{"timestamp":"2022-08-29 08:20:55.900 Z","level":"info","msg":"Server is listening on [::]:8065","caller":"app/server.go:1307","address":"[::]:8065"}

So this looks good in my opinion. The 502 may simply mean you tried too soon; the first start can take up to a minute because the databases need to be populated.
If you still cannot access the Mattermost web interface, please provide the output of:

sudo docker ps

I think it is the proxy manager?

CONTAINER ID   IMAGE                                    COMMAND                  CREATED             STATUS                    PORTS                                                            NAMES
4cf5e1fda301   jc21/nginx-proxy-manager:latest          "/init"                  About an hour ago   Up About an hour          0.0.0.0:80->80/tcp, 0.0.0.0:443->443/tcp, 0.0.0.0:6741->81/tcp   nginx_proxy_manager
6b6def36b1e9   nginx                                    "/docker-entrypoint.…"   About an hour ago   Up About an hour          0.0.0.0:8080->80/tcp                                             website
fd61fd62e10b   portainer/portainer-ce                   "/portainer"             About an hour ago   Up About an hour          8000/tcp, 0.0.0.0:8000->9000/tcp                                 portainer
35daff9828a8   rootgg/plik:latest                       "/bin/sh -c ./plikd"     About an hour ago   Up About an hour          0.0.0.0:8091->8080/tcp                                           pilk_file_uploader
06cc780ccd00   nginx:alpine                             "/docker-entrypoint.…"   About an hour ago   Up 25 minutes             0.0.0.0:1180->80/tcp, 0.0.0.0:1443->443/tcp                      nginx_mattermost
427bd87ceb15   mattermost/mattermost-team-edition:7.1   "/entrypoint.sh matt…"   About an hour ago   Up 25 minutes (healthy)   8067/tcp, 0.0.0.0:8065->8065/tcp, 8074-8075/tcp                  docker-mattermost-1
b3a1f412201d   postgres:13-alpine                       "docker-entrypoint.s…"   About an hour ago   Up 25 minutes             5432/tcp                                                         docker-postgres-1

OK, this looks good - the Mattermost containers have been up for 25 minutes now, no respawning.
You’re getting the 502 when you try to access https://mattermost.yourwebsite.com:1443 with the port :1443 as configured, right?
If so, let’s have a look at the nginx error logs in the nginx container.

To do so, enter the following commands:

docker exec -ti 06cc780ccd00 /bin/sh
tail -f /var/log/nginx/*.log

The tail command will not finish on its own; it keeps running and waits for new log entries to appear.
While leaving it running, try to access your Mattermost installation using the URL https://mattermost.yourwebsite.com:1443, and if new lines appear in the console output of the tail command, please send them with your next message.

Below is me doing it through the domain.
However, if I do http://ip-address:1443 it works.

So I am guessing it’s my subdomain setup?

/ # tail -f /var/log/nginx/*.log
==> /var/log/nginx/access.log <==

==> /var/log/nginx/error.log <==
2022/08/29 09:09:27 [notice] 1#1: start worker process 23
2022/08/29 09:09:27 [notice] 1#1: start worker process 24
2022/08/29 09:09:27 [notice] 1#1: start worker process 25
2022/08/29 09:09:27 [notice] 1#1: start worker process 26
2022/08/29 09:09:27 [notice] 1#1: start cache manager process 27
2022/08/29 09:09:27 [notice] 1#1: start cache loader process 28
2022/08/29 09:10:27 [notice] 28#28: http file cache: /var/cache/nginx 0.000M, bsize: 4096
2022/08/29 09:10:27 [notice] 1#1: signal 17 (SIGCHLD) received from 28
2022/08/29 09:10:27 [notice] 1#1: cache loader process 28 exited with code 0
2022/08/29 09:10:27 [notice] 1#1: signal 29 (SIGIO) received

==> /var/log/nginx/mm.access.log <==

==> /var/log/nginx/mm.error.log <==

Yes, maybe the subdomain does not point to the same IP address or has not been set up correctly in your DNS. The requests are obviously not arriving at the Mattermost nginx instance.

Can you verify that the correct IPs are being returned on the client you’re using to access the Mattermost instance? If it is a Windows client, you can use the ping command in a CMD window, for example, to see what IP the name resolves to.
Sometimes your provider or your internet access point also caches DNS entries (and your browser does too), so if the DNS record is already pointing to the correct IP, it is very likely that a reboot of your client will fix this issue.
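If you prefer checking this from a shell instead of ping, something like the following works on most Linux systems. The `HOST` value is just a placeholder so the snippet runs anywhere; set it to your real Mattermost subdomain:

```shell
# Placeholder hostname -- replace with (or export HOST as) your actual
# Mattermost subdomain before running this on your system.
HOST="${HOST:-localhost}"

# Ask the system resolver. This uses the same lookup path your browser
# uses, so it will also surface stale entries from a local cache or an
# /etc/hosts override.
getent hosts "$HOST"
```

If `dig` is available on the machine, `dig +short yoursubdomain.example.com @1.1.1.1` queries a public resolver directly, which bypasses local caches and shows what the authoritative record currently says.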

Hm yeah DNS records are correct. Restarted my client just in case, cleared the browser cache tried a different browser and a different client device.

Hard to debug without knowing the unmasked configuration files.
If you want, you can send me the output of the following command via private message here in this forum, so your domain name will not be publicly posted here, and I’ll have a look then.

docker exec -ti 06cc780ccd00 /usr/sbin/nginx -T

Maybe I can find out what the problem is when seeing the real domain names and the actual configuration files.

OK, so another 40 private replies later, here’s what we also did, so everyone still following along can share in the happy ending :slight_smile:

After all, the problem was that the nginx proxy manager docker container was in a different docker network scope than the mattermost docker containers and so those two container groups could not communicate with each other.

After reverting all the ports to their defaults and removing the ports: configuration for the mattermost_nginx container in favor of an expose: section, we also found out that some container names were wrong: the nginx proxy manager was referring to a container name that did not exist. So we made sure to align these configurations as well.
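For reference, the ports-to-expose change in docker-compose.nginx.yml looked roughly like the sketch below. This is not the exact file from the thread; the service name and port values depend on your setup:

```yaml
services:
  nginx:
    # Before: the port was published on the host, e.g.
    #   ports:
    #     - "1443:443"
    # After: the port is only reachable from other containers on the same
    # docker network -- the proxy manager in front handles the host side.
    expose:
      - "443"
```

With expose: instead of ports:, nothing binds on the host anymore, so there can be no port conflict with the proxy manager, and the proxy reaches the container by its name on the shared network.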

Using the command docker network ls showed us a list of multiple bridge networks. To find out which networks the containers use, we used the following command, which iterates over all running docker containers, extracts the network-related configuration from each, and displays it in JSON format:

docker ps --format "{{.ID}}" | xargs docker inspect | jq .[].NetworkSettings.Networks

Here’s an example of what the output looked like:

# mattermost_nginx container
{
  "mattermost": {
    "IPAMConfig": null,
    "Links": null,
    "Aliases": [
      "nginx_mattermost",
      "nginx",
      "d338db70daac"
    ],
[...]

# nginx proxy manager
{
  "docker_default": {
    "IPAMConfig": null,
    "Links": null,
    "Aliases": [
      "nginx_proxy_manager",
      "nginx_proxy_manager",
      "4cf5e1fda301"
    ],
[...]

Here you can see that the nginx proxy manager is in the network docker_default, whereas the nginx from Mattermost is in the network mattermost.
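To make a mismatch like this easier to spot, you can reduce the inspect output to a simple container-to-network mapping. The snippet below demonstrates the idea on a hand-written, trimmed-down sample of docker inspect output (real output contains many more fields); on a live system you would pipe `docker ps --format "{{.ID}}" | xargs docker inspect` into the same jq filter instead of the heredoc:

```shell
# Trimmed-down, hand-written sample of `docker inspect` output, so the
# snippet runs standalone; the jq filter prints "container -> network(s)".
cat <<'EOF' | jq -r '.[] | .Name + " -> " + (.NetworkSettings.Networks | keys | join(","))'
[
  {"Name": "/nginx_mattermost",
   "NetworkSettings": {"Networks": {"mattermost": {}}}},
  {"Name": "/nginx_proxy_manager",
   "NetworkSettings": {"Networks": {"docker_default": {}}}}
]
EOF
```

Containers that need to talk to each other must share at least one network name in this listing; here the two nginx instances share none, which is exactly the problem described above.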

To fix that, stop the containers and, at the bottom of the docker-compose.nginx.yml file, change the networks option to match the desired target network (in this case we moved the Mattermost container group to the docker_default network):

networks:
  default:
    name: docker_default
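One caveat worth adding here (it did not come up in this thread, but can with other setups): depending on your Compose version, if the docker_default network was created outside the Compose project you are editing, Compose may refuse to attach to it and ask you to declare it as external, roughly like this:

```yaml
networks:
  default:
    name: docker_default
    external: true
```

With external: true, Compose will not try to create or manage the network itself; it just attaches the services to the existing one.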

After we started the containers, we were immediately able to establish the connection and @JPzone282 was up and running after a long journey :slight_smile:

I’m marking this thread as resolved now, I think it’s long enough already.


Thank you so much for the help @agriesser much appreciated!! I have learned a lot here!