
LiteLLM server now supported on Vercel

You can now deploy the LiteLLM server on Vercel, giving developers an OpenAI-compatible gateway to any supported LLM provider, including Vercel AI Gateway. To route a single model through Vercel AI Gateway, add the corresponding entry to `litellm_config.yaml`. Deploy LiteLLM on Vercel, or learn more in the documentation.
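The original post's configuration snippet is not reproduced here, but a minimal `litellm_config.yaml` routing one model through Vercel AI Gateway might look like the sketch below. The model name (`my-model`), the upstream model identifier, and the environment-variable name are illustrative assumptions, not values from the announcement:

```yaml
# Hypothetical litellm_config.yaml sketch (names and model id are assumptions)
model_list:
  - model_name: my-model            # alias clients send to the LiteLLM gateway
    litellm_params:
      model: vercel_ai_gateway/openai/gpt-4o   # provider-prefixed upstream model
      api_key: os.environ/VERCEL_AI_GATEWAY_API_KEY  # read key from the environment
```

With a config like this, clients call the LiteLLM endpoint with the OpenAI SDK, passing `model="my-model"`, and LiteLLM forwards the request to Vercel AI Gateway.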

Reported by Vercel Blog. Monitor for further developments.