LiteLLM (YC W23) reposted this
Excited to launch OAuth 2.0 on LiteLLM (YC W23) Gateway 🔐

🔐 Use OAuth 2.0 to authenticate /chat/completions, /completions, and /embeddings requests (h/t Nivid Dholakia) -> Start here: https://lnkd.in/dcnjqt2k
🪨 Support for returning the trace for Bedrock Guardrails requests (h/t Ivane Kelaptrishvili)
📖 Docs: added an example on setting up LiteLLM Gateway with SageMaker
🛠️ Support for setting temperature=0 for SageMaker requests on LiteLLM Gateway
🔐 PR: set tpm/rpm limits per model for a given API Key https://lnkd.in/d-t_4pR9
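For context on what the OAuth 2.0 flow against a gateway like this could look like: a minimal sketch below assumes a client-credentials grant, where the client first exchanges its credentials for an access token and then sends that token as a bearer header on each /chat/completions call. The `TOKEN_URL`, `GATEWAY_URL`, and model name are placeholders for illustration, not LiteLLM's actual configuration.

```python
# Hypothetical sketch of a client-credentials OAuth 2.0 flow against a gateway.
# TOKEN_URL, GATEWAY_URL, and the model name are placeholders (assumptions).
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://auth.example.com/oauth/token"        # assumed IdP token endpoint
GATEWAY_URL = "https://gateway.example.com/chat/completions"  # assumed gateway URL


def bearer_headers(access_token: str) -> dict:
    """Build the Authorization header sent on every gateway request."""
    return {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }


def fetch_token(client_id: str, client_secret: str) -> str:
    """Request 1: exchange client credentials for an access token."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    req = urllib.request.Request(TOKEN_URL, data=body)  # POST to the token endpoint
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]


def chat(access_token: str, prompt: str) -> dict:
    """Request 2: call the gateway's /chat/completions with the bearer token."""
    payload = json.dumps({
        "model": "gpt-4o",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    req = urllib.request.Request(
        GATEWAY_URL, data=payload, headers=bearer_headers(access_token)
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Tokens from a client-credentials grant typically expire, so a real client would cache the token and refresh it before expiry rather than fetching a new one per request.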
Principal Software Consultant
2mo
So someone wanting to make a backend fetch to an OpenAI endpoint through your proxy will essentially have to make 2 API requests and authenticate via OAuth two (or three?) times. If they use Azure, add one more request and auth. Unless I'm missing something?