mattermost-plugin-ai - Integration with a local LLM

I have installed the mattermost/openops plugin ("Open source stack for applying AI to workflows in secure environments" on GitHub) alongside the Mattermost Docker image, intending to use it with LocalAI. However, I can't figure out how to send a custom payload. Could someone please help me with this?
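For context on what a "custom payload" might look like: LocalAI exposes an OpenAI-compatible REST API, so the payload is just the JSON body of a chat-completions request. Below is a minimal sketch; the URL, port, and model name are assumptions for illustration, not values from this thread, so adjust them to your deployment.

```python
import json
import urllib.request

# Hypothetical LocalAI endpoint -- adjust host/port to your deployment.
LOCALAI_URL = "http://localhost:8080/v1/chat/completions"  # assumption


def build_payload(model: str, prompt: str, temperature: float = 0.2) -> dict:
    """Build an OpenAI-style chat-completion payload for LocalAI."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }


def send_payload(payload: dict) -> dict:
    """POST the payload to LocalAI and return the parsed JSON response."""
    req = urllib.request.Request(
        LOCALAI_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Build (but don't send) an example payload; "ggml-gpt4all-j" is a
# placeholder model name -- use whichever model your LocalAI serves.
payload = build_payload("ggml-gpt4all-j", "Summarize this thread.")
print(json.dumps(payload, indent=2))
```

Sending it is then just `send_payload(payload)` once your LocalAI instance is reachable at the configured URL.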

Hi @pramod1803, could you provide more context on your issue? Are you receiving any errors at a particular step? I also recommend asking in our public AI Exchange channel if you still need help troubleshooting.