Mattermost calls: Random "timeout waiting for peer connections" from WireGuard-connected clients

Hello,
so about a month ago we set up Mattermost for our team with Docker. It was working very well, including calls, for every member of the team.

Some of those members often use the Calls plugin while connected to the server via a WireGuard VPN, and they had no issues to report either…

Until 2 days ago, at least… when we started experiencing random “timeout waiting for peer connections” errors when someone tries to join or start a call from a client connected via the VPN. It is very inconsistent: sometimes the same person joins or creates a call without problems, sometimes they get the error but are fine if they try again… From the LAN everything works correctly.

I, for example, haven't managed to join a single call from my Android phone over the VPN, yet if I connect my notebook through a hotspot provided by that same phone, connected to THE SAME VPN, it does work!

Basically, even reproducing the error is random! It’s driving me mad! The connections are fine, the VPN is stable, and the devices and the server haven’t changed.

The only thing that has changed recently is Docker 23… Could this be the problem?
I thought I might need a TURN server… but why? If we needed something like that, or if some misconfiguration were in place, it shouldn’t have worked in the first place, rather than failing only randomly.

I suspect this may be a bug, but maybe someone can enlighten me about something I’m not considering.

Some other info:

  • Mattermost version 7.9.1, upgraded 2 weeks ago (I think we were on 7.8 a month ago… could this upgrade have introduced some bugs?)
  • Calls plugin version 0.14.1, upgraded from 0.14 while trying to fix the issue
  • RTCD port directly exposed and reachable (see the quick check after this list)
  • No STUN/TURN
  • ICE Host candidate set to server IP
  • Clients connected to VPN are allowed to reach the server
  • No network/VPN changes
  • Android app latest version
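
For completeness, this is roughly how a VPN-connected client could probe the RTCD/Calls UDP port (8443/UDP is just the usual default and 192.168.2.2 a placeholder, adjust both to your actual setup; also keep in mind UDP probes are never fully conclusive):

# Probe the Calls/RTCD UDP port from a client connected to the VPN
# (port and address here are assumptions -- use your real ones)
nc -vzu 192.168.2.2 8443

# Alternative with nmap; UDP ports that give no reply show up as open|filtered
sudo nmap -sU -p 8443 192.168.2.2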

Since I only experience the problem from Android, I couldn’t collect any errors from a browser console. I will try to get them from someone hitting this issue on desktop.
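
In the meantime, from a desktop browser it should be possible to watch the ICE negotiation directly during a failing join attempt. In Chrome/Chromium, opening this page in a second tab before joining shows the candidate pairs and connection state for the session:

chrome://webrtc-internals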

The only error I get from the server is:

"level":"error","msg":"callback failed: call state is missing from channel state","caller":"app/plugin_api.go:976","plugin_id":"com.mattermost.calls","origin":"main.(*Plugin).handleLeave websocket.go:447"}

Hi @steccas and welcome to the Mattermost forums!

This is a very specific problem you’re reporting here. This smells a bit like a dual-stack problem (IPv6/IPv4), but since you’ve set the ICEHostOverride to the IPv4 address (no hostname, right?) of the server, I think this can be ruled out.

Not sure how to further debug that - @streamer45 can you help out here?

Hello and thank you! Yes, I configured an IPv4 address in the ICE host directive.
But I can verify things about IPv4/IPv6 if needed.

I can tell that the network I’m using is IPv4, the NIC is on IPv4, and IPv6 is not enabled in general on the hosts (I can check with the ip command; no v6 addresses are assigned).
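
Concretely, these are the kinds of checks I mean:

# Any IPv6 addresses assigned to the host interfaces?
ip -6 addr show

# Any IPv6 routes?
ip -6 route show

# Is IPv6 disabled at the kernel level? (1 means disabled)
sysctl net.ipv6.conf.all.disable_ipv6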

The same goes for Docker. I’m on Ubuntu Server 22.04 and netplan is configured like this:

network:
  ethernets:
    enp7s0:
      dhcp4: yes
      addresses:
        - 192.168.2.2/28
      gateway4: 192.168.2.1
      nameservers:
        search: [mydomain]
        addresses: [192.168.2.5]
    enp5s0:
      dhcp4: yes
      dhcp4-overrides:
        use-dns: no
      addresses:
        - 192.168.150.3/28
      nameservers:
        addresses: [192.168.150.1]
  version: 2

I can see that various containers (not only Mattermost) throw warnings because they try to bind to IPv6, but that has never translated into real problems (except for some nginx containers that have a listen directive on both v4 and v6). Maybe I have to configure something on the host? (A quick check is sketched below.)
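
Just as a sketch, a loop like this reports whether any of the Docker networks actually has IPv6 enabled (it is off by default on the default bridge, so those warnings may simply be containers trying to bind :: inside their own namespace):

# Print the EnableIPv6 flag for every Docker network on the host
docker network ls --format '{{.Name}}' | while read -r net; do
  printf '%s: ' "$net"
  docker network inspect -f '{{.EnableIPv6}}' "$net"
done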

Update:
today we had numerous crashes during calls; in the logs I found:

mattermost-web-1       | {"timestamp":"2023-04-07 16:09:37.446 +02:00","level":"error","msg":"plugin process exited","caller":"plugin/hclog_adapter.go:79","plugin_id":"com.mattermost.calls","wrapped_extras":"pathplugins/com.mattermost.calls/server/dist/plugin-linux-amd64pid18335errorexit status 2"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:44.770 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, RPC call failed","caller":"plugin/client_rpc.go:423","plugin_id":"com.mattermost.calls","error":"connection is shut down"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:45.957 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, RPC call failed","caller":"plugin/client_rpc.go:423","plugin_id":"com.mattermost.calls","error":"connection is shut down"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:46.202 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, RPC call failed","caller":"plugin/client_rpc.go:423","plugin_id":"com.mattermost.calls","error":"connection is shut down"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:46.257 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, RPC call failed","caller":"plugin/client_rpc.go:423","plugin_id":"com.mattermost.calls","error":"connection is shut down"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:48.799 +02:00","level":"error","msg":"RPC call OnDeactivate to plugin failed.","caller":"plugin/client_rpc_generated.go:33","plugin_id":"com.mattermost.calls","error":"connection is shut down"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:49.771 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, muxBroker couldn't Accept request body connection","caller":"plugin/client_rpc.go:397","plugin_id":"com.mattermost.calls","error":"timeout waiting for accept"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:49.771 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, muxBroker couldn't accept connection","caller":"plugin/client_rpc.go:378","plugin_id":"com.mattermost.calls","serve_http_stream_id":49,"error":"timeout waiting for accept"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:50.958 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, muxBroker couldn't Accept request body connection","caller":"plugin/client_rpc.go:397","plugin_id":"com.mattermost.calls","error":"timeout waiting for accept"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:50.958 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, muxBroker couldn't accept connection","caller":"plugin/client_rpc.go:378","plugin_id":"com.mattermost.calls","serve_http_stream_id":51,"error":"timeout waiting for accept"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:51.203 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, muxBroker couldn't Accept request body connection","caller":"plugin/client_rpc.go:397","plugin_id":"com.mattermost.calls","error":"timeout waiting for accept"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:51.203 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, muxBroker couldn't accept connection","caller":"plugin/client_rpc.go:378","plugin_id":"com.mattermost.calls","serve_http_stream_id":53,"error":"timeout waiting for accept"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:51.257 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, muxBroker couldn't accept connection","caller":"plugin/client_rpc.go:378","plugin_id":"com.mattermost.calls","serve_http_stream_id":55,"error":"timeout waiting for accept"}
mattermost-web-1       | {"timestamp":"2023-04-07 16:09:51.257 +02:00","level":"error","msg":"Plugin failed to ServeHTTP, muxBroker couldn't Accept request body connection","caller":"plugin/client_rpc.go:397","plugin_id":"com.mattermost.calls","error":"timeout waiting for accept"}
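
If the plugin process is really crashing (as far as I know, exit status 2 is what a Go panic returns), there should be a stack trace somewhere around those lines; I’ll try to dig it out with something along these lines:

# Search the container logs around the crash for a Go panic / stack trace
docker logs mattermost-web-1 --since 1h 2>&1 | grep -i -B 5 -A 40 "panic"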