AI Link Drop - share your coolest finds!

LocalAI :houses:

LocalAI is a drop-in replacement REST API compatible with the OpenAI API specification for local inferencing. It lets you run LLMs (and other model types) locally or on-prem on consumer-grade hardware, and supports multiple model families compatible with the ggml format.
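Since it exposes the same endpoints as OpenAI, you can point an existing OpenAI client at it and swap nothing else. A minimal sketch, assuming a LocalAI server running on localhost:8080 (its common default port) and a hypothetical model name — adjust both to your own setup:

```python
# Sketch: use the standard OpenAI Python client against a LocalAI endpoint.
# The base URL and model name below are assumptions about your local setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # LocalAI endpoint instead of api.openai.com
    api_key="not-needed-for-local",       # LocalAI doesn't require a real key by default
)

response = client.chat.completions.create(
    model="ggml-gpt4all-j",  # hypothetical model name; use whichever model you've loaded
    messages=[{"role": "user", "content": "Say hello from a local model!"}],
)
print(response.choices[0].message.content)
```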

OpenOps uses LocalAI out of the box to integrate with local models.