Please, when possible, add Sambanova models to the list of litellm custom providers #1296
Labels: bug, enhancement, question
Request to litellm:

```python
litellm.completion(
    messages=[{'role': 'user', 'content': 'Extract: Jason is 25 years old'}],
    model='sambanova/Meta-Llama-3.1-70B-Instruct',
    tools=[{
        'type': 'function',
        'function': {
            'name': 'User',
            'description': 'Correctly extracted User with all the required parameters with correct types',
            'parameters': {
                'properties': {
                    'name': {'title': 'Name', 'type': 'string'},
                    'age': {'title': 'Age', 'type': 'integer'},
                },
                'required': ['age', 'name'],
                'type': 'object',
            },
        },
    }],
    tool_choice={'type': 'function', 'function': {'name': 'User'}},
)
```

Resulting traceback:

```
File "/home/umby/groq/lib/python3.11/site-packages/instructor/retry.py", line 181, in retry_sync
    raise InstructorRetryException(
instructor.exceptions.InstructorRetryException: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=sambanova/Meta-Llama-3.1-70B-Instruct
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..)
Learn more: https://docs.litellm.ai/docs/providers
```
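As a possible interim workaround until `sambanova/` is a recognized provider prefix, LiteLLM can route requests to OpenAI-compatible endpoints via the generic `openai/` model prefix plus an explicit `api_base`. This is a sketch, not a confirmed fix: the SambaNova base URL, the `SAMBANOVA_API_KEY` environment variable name, and the assumption that the endpoint is OpenAI-compatible are all unverified assumptions here.

```python
import os

def sambanova_kwargs(model_name: str, messages: list) -> dict:
    """Build kwargs for litellm.completion() that route through LiteLLM's
    generic OpenAI-compatible path instead of a named provider lookup.

    The api_base URL and the env-var name below are assumptions, not
    verified SambaNova documentation.
    """
    return {
        # "openai/" prefix tells LiteLLM to treat this as an
        # OpenAI-compatible endpoint rather than resolve a provider name.
        "model": f"openai/{model_name}",
        "api_base": "https://api.sambanova.ai/v1",  # assumed endpoint
        "api_key": os.environ.get("SAMBANOVA_API_KEY", ""),
        "messages": messages,
    }

# Usage (network call commented out; requires a valid key):
# import litellm
# resp = litellm.completion(**sambanova_kwargs(
#     "Meta-Llama-3.1-70B-Instruct",
#     [{"role": "user", "content": "Extract: Jason is 25 years old"}],
# ))
```

If this routing works for plain completions, tool calling may still need testing separately, since instructor's retry layer is what surfaced the original exception.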