-
When I use Azure OpenAI, I often encounter errors, though it occasionally succeeds. I'm not sure whether the current version of instructor supports the Azure OpenAI API. The function and the most frequent error message are below.
-
It does not support streaming, but everything else should work. It might be that `max_retries` is adding something.
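For reference, with the legacy (0.28.x) `openai` SDK, Azure needs module-level settings plus a per-call deployment name. Below is a minimal sketch of a hypothetical helper that collects both; the helper name, endpoint, and deployment name are placeholders, not part of instructor:

```python
import os

def azure_openai_settings(deployment, api_base, api_version="2023-09-01-preview"):
    """Return (module-level settings, per-call kwargs) for the legacy
    0.28.x openai SDK against Azure.  `deployment` is the name of YOUR
    Azure deployment; all values here are placeholders."""
    module_settings = {
        "api_type": "azure",
        "api_base": api_base,  # e.g. "https://<resource>.openai.azure.com/"
        "api_version": api_version,
        "api_key": os.environ.get("AZURE_OPENAI_API_KEY", ""),
    }
    # Azure routes requests by deployment, so each call needs `engine`
    # (or `deployment_id`) rather than `model`.
    call_kwargs = {"engine": deployment}
    return module_settings, call_kwargs
```

You would apply the first dict to the `openai` module (`openai.api_type = ...`, etc.) and splat the second into each `ChatCompletion.create` call.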
-
While trying the examples provided in the README against Azure OpenAI, I've encountered an issue with the following code:

```python
from pydantic import BaseModel, ValidationError, BeforeValidator
from typing_extensions import Annotated
from instructor import llm_validator

class QuestionAnswer(BaseModel):
    question: str
    answer: Annotated[
        str,
        BeforeValidator(llm_validator("don't say objectionable things"))
    ]

try:
    qa = QuestionAnswer(
        question="What is the meaning of life?",
        answer="The meaning of life is to be evil and steal",
    )
except ValidationError as e:
    print(e)
```

The error message is the following:
```
InvalidRequestError                       Traceback (most recent call last)
Cell 11 line 13
      7 answer: Annotated[
      8     str,
      9     BeforeValidator(llm_validator("don't say objectionable things"))
     10 ]
     12 try:
---> 13 qa = QuestionAnswer(
     14     question="What is the meaning of life?",
     15     answer="The meaning of life is to be evil and steal",
     16 )
     17 except ValidationError as e:
     18     print(e)

[... skipping hidden 1 frame]

File ~/miniconda3/envs/openai/lib/python3.10/site-packages/instructor/dsl/validators.py:67, in llm_validator.<locals>.llm(v)
     66 def llm(v):
---> 67     resp = openai.ChatCompletion.create(
     68         functions=[Validator.openai_schema],
     69         function_call={"name": Validator.openai_schema["name"]},
     70         messages=[
     71             {
     72                 "role": "system",
...
File ~/miniconda3/envs/openai/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py:91
     89 )
     90 else:
     91     if model is None and engine is None:

InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.chat_completion.ChatCompletion'>
```

I then checked the `llm` function in `validators.py`:

```python
def llm(v):
    resp = openai.ChatCompletion.create(
        functions=[Validator.openai_schema],
        function_call={"name": Validator.openai_schema["name"]},
        messages=[
            {
                "role": "system",
                "content": "You are a world class validation model. Capable to determine if the following value is valid for the statement, if it is not, explain why and suggest a new value.",
            },
            {
                "role": "user",
                "content": f"Does `{v}` follow the rules: {statement}",
            },
        ],
        model=model,
        temperature=temperature,
    )  # type: ignore
    resp = Validator.from_response(resp)
```

This code explains why I get the error: the call passes `model`, never the `engine`/`deployment_id` that Azure requires. How can I use Instructor with Azure OpenAI?
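One way around this without editing the library is to wrap `ChatCompletion.create` so every call carries the Azure deployment name. This is a sketch of a hypothetical generic wrapper; the deployment name in the usage comment is a placeholder:

```python
def with_default_engine(create_fn, engine):
    """Wrap an OpenAI-style `create` function so that calls which don't
    already specify an Azure deployment get a default `engine` kwarg."""
    def wrapped(*args, **kwargs):
        # Only inject the default if the caller didn't pass one explicitly.
        kwargs.setdefault("engine", engine)
        return create_fn(*args, **kwargs)
    return wrapped

# Hypothetical usage with the legacy 0.28.x SDK (deployment name is a placeholder):
# openai.ChatCompletion.create = with_default_engine(
#     openai.ChatCompletion.create, "gpt-35-turbo"
# )
```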
-
Oh interesting, I see: there's a hard-coded `model` parameter in there.
-
For now I'd just use this code: copy-paste it and run it locally.

````python
from typing import Optional
from pydantic import Field
import instructor
import openai


class Validator(instructor.OpenAISchema):
    """
    Validate if an attribute is correct and if not,
    return a new value with an error message.
    """

    is_valid: bool = Field(
        default=True,
        description="Whether the attribute is valid based on the requirements",
    )
    reason: Optional[str] = Field(
        default=None,
        description="The error message if the attribute is not valid, otherwise None",
    )
    fixed_value: Optional[str] = Field(
        default=None,
        description="If the attribute is not valid, suggest a new value for the attribute",
    )


def llm_validator(
    statement: str,
    allow_override: bool = False,
    engine: str = "gpt-35-turbo",
    temperature: float = 0,
):
    """
    Create a validator that uses the LLM to validate an attribute.

    ## Usage

    ```python
    from typing_extensions import Annotated
    from pydantic import BaseModel, BeforeValidator, Field, ValidationError
    from instructor import llm_validator

    class User(BaseModel):
        name: Annotated[str, BeforeValidator(llm_validator("The name must be a full name all lowercase"))]
        age: int = Field(description="The age of the person")

    try:
        user = User(name="Jason Liu", age=20)
    except ValidationError as e:
        print(e)
    ```

    ```
    1 validation error for User
    name
      The name is valid but not all lowercase (type=value_error.llm_validator)
    ```

    Note that the error message is written by the LLM, and the error type is
    `value_error.llm_validator`.

    Parameters:
        statement (str): The statement to validate
        engine (str): The Azure deployment to use for validation (default: "gpt-35-turbo")
        temperature (float): The temperature to use for the LLM (default: 0)
    """

    def llm(v):
        resp = openai.ChatCompletion.create(
            functions=[Validator.openai_schema],
            function_call={"name": Validator.openai_schema["name"]},
            messages=[
                {
                    "role": "system",
                    "content": "You are a world class validation model. Capable to determine if the following value is valid for the statement, if it is not, explain why and suggest a new value.",
                },
                {
                    "role": "user",
                    "content": f"Does `{v}` follow the rules: {statement}",
                },
            ],
            engine=engine,
            temperature=temperature,
        )  # type: ignore
        resp = Validator.from_response(resp)

        # If the value is not valid but override is allowed, return the fixed value.
        if allow_override and not resp.is_valid and resp.fixed_value is not None:
            return resp.fixed_value

        # Otherwise surface the reason; this could be used in the future to
        # generate a better response via a reasking mechanism.
        assert resp.is_valid, resp.reason
        return v

    return llm
````
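To sanity-check the `Annotated`/`BeforeValidator` wiring without any Azure credentials, a plain function can stand in for the LLM-backed validator. The rule below is a toy assumption for illustration, not what `llm_validator` does internally:

```python
from typing import Annotated
from pydantic import BaseModel, BeforeValidator, ValidationError

def no_objectionable(v: str) -> str:
    # Toy stand-in for the LLM validator: reject values containing "evil".
    if "evil" in v.lower():
        raise ValueError("don't say objectionable things")
    return v

class QuestionAnswer(BaseModel):
    question: str
    answer: Annotated[str, BeforeValidator(no_objectionable)]
```

Once this passes locally, swapping `no_objectionable` for the Azure-aware `llm_validator(...)` is the only change needed.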
-
Does now!
-
Just as an FYI to others, I ran into this error while using instructor v0.3.5 with Azure:

```
openai.NotFoundError: Error code: 404 - {'error': {'message': 'Unrecognized request argument supplied: functions', 'type': 'invalid_request_error', 'param': None, 'code': None}}
```

The fix was to switch away from the `2023-05-15` version of the Azure API. Version `2023-09-01-preview` did the trick, though I haven't checked whether there are others that also work.
-
@jxnl I'm not sure if this is an Azure-specific problem, but when the function call fails validation the first time, instructor passes the previous response back to OpenAI to get it fixed. However, if that previous response does not have a "content" key, the second inference call fails with a 400 error.
This is happening with the Azure OpenAI service (2023-09-01-preview) for gpt-3.5-turbo-0613, openai = "0.28.1", instructor = "<0.3.0". Did you encounter this problem earlier?
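If the failure really is a missing "content" key on the previous assistant message, one possible workaround is to backfill an empty string before reasking. This is a hedged sketch, not part of instructor; the helper name is made up:

```python
def ensure_content(messages):
    """Return a copy of `messages` where every message has a "content" key,
    since some Azure API versions reject assistant messages whose content
    is missing or None (e.g. pure function-call responses)."""
    return [{**m, "content": m.get("content") or ""} for m in messages]
```

You would run the message list through this helper before the retry request is sent.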
-
https://medium.com/@dipam44/using-instructor-with-a-private-azure-endpoint-df03c40e4c71