How to set up AnythingLLM to use the Inference API (serverless)

#13
by Lbenj - opened

Troubleshooting

Rather than paying to deploy this with a hosted service, I would like to use this model through AnythingLLM for local use, via the serverless Inference API, if possible.
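
In case it helps while troubleshooting, here is a minimal sketch (not an official AnythingLLM recipe) for confirming that a Hugging Face access token and model work against the serverless Inference API using the `huggingface_hub` client. The model ID and token below are placeholders; swap in the model this discussion is about. Once this call succeeds, the same model ID and token are what you would enter in AnythingLLM's LLM provider settings (the exact field names may differ between AnythingLLM versions).

```python
# Sanity check against the serverless Inference API before wiring it
# into AnythingLLM. Assumptions: `huggingface_hub` is installed
# (pip install huggingface_hub), you have a valid Hugging Face access
# token, and the model ID below is a placeholder.
from huggingface_hub import InferenceClient

client = InferenceClient(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # placeholder model ID
    token="hf_xxx",  # your Hugging Face access token (keep it secret)
)

# text_generation calls the serverless Inference API, so a successful
# response confirms the token and model are usable without a paid
# dedicated deployment.
response = client.text_generation(
    "Explain what a serverless inference API is in one sentence.",
    max_new_tokens=64,
)
print(response)
```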
