more docs passing!
jxnl committed Feb 5, 2024
1 parent 1581629 commit e45740e
Showing 7 changed files with 78 additions and 61 deletions.
20 changes: 10 additions & 10 deletions docs/concepts/distillation.md
@@ -57,16 +57,16 @@ for _ in range(10):
a = random.randint(100, 999)
b = random.randint(100, 999)
print(fn(a, b))
-#> a=464 b=781 result=362384
-#> a=260 b=758 result=197080
-#> a=509 b=801 result=407709
-#> a=403 b=694 result=279682
-#> a=834 b=282 result=235188
-#> a=767 b=757 result=580619
-#> a=849 b=844 result=716556
-#> a=549 b=437 result=239913
-#> a=478 b=637 result=304486
-#> a=545 b=969 result=528105
+#> a=373 b=297 result=110781
+#> a=988 b=392 result=387296
+#> a=688 b=817 result=562096
+#> a=804 b=592 result=475968
+#> a=397 b=386 result=153242
+#> a=584 b=842 result=491728
+#> a=887 b=772 result=684764
+#> a=187 b=648 result=121176
+#> a=632 b=450 result=284400
+#> a=947 b=749 result=709303
```
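Each logged line satisfies `result = a * b` (for instance, 373 * 297 = 110781), so a plain-Python stand-in for the distilled `fn`, hypothetical and without the instructor distillation decorator, is simply:

```python
import random

# Hypothetical stand-in for the distilled fn above: every logged output
# satisfies result = a * b, e.g. 464 * 781 = 362384 and 373 * 297 = 110781.
def fn(a: int, b: int) -> int:
    return a * b

a = random.randint(100, 999)
b = random.randint(100, 999)
print(f"a={a} b={b} result={fn(a, b)}")
```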

## The Intricacies of Fine-tuning Language Models
4 changes: 4 additions & 0 deletions docs/concepts/enums.md
@@ -19,6 +19,10 @@ class UserDetail(BaseModel):
If you're having a hard time with `Enum`, an alternative is to use `Literal` instead.

```python hl_lines="4"
from typing import Literal
from pydantic import BaseModel


class UserDetail(BaseModel):
age: int
name: str
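The diff truncates the `Literal` example, so here is a hedged stdlib-only sketch of the idea, with made-up role values and a dataclass standing in for the pydantic model (pydantic would do this validation automatically):

```python
from dataclasses import dataclass
from typing import Literal, get_args

# Hypothetical role values, not from the docs.
Role = Literal["admin", "user", "guest"]

@dataclass
class UserDetail:
    age: int
    name: str
    role: Role

    def __post_init__(self):
        # Manual check standing in for pydantic's Literal validation.
        if self.role not in get_args(Role):
            raise ValueError(f"role must be one of {get_args(Role)}")

user = UserDetail(age=25, name="Jason", role="admin")
print(user.role)  # admin
```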
1 change: 1 addition & 0 deletions docs/concepts/fastapi.md
@@ -54,6 +54,7 @@ def endpoint_function(data: UserData) -> UserDetail:
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from typing import Iterable
from pydantic import BaseModel

app = FastAPI()

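What the streaming endpoint does can be sketched without FastAPI: serialize each item of an `Iterable` as it is produced. A hypothetical stdlib-only version (the sample users are made up):

```python
import json
from typing import Iterable, Iterator

# Hypothetical stdlib-only sketch of a StreamingResponse over an Iterable:
# each extracted item is serialized as soon as it is produced, one JSON
# line per item, instead of buffering the whole list.
def stream_users(users: Iterable[dict]) -> Iterator[str]:
    for user in users:
        yield json.dumps(user) + "\n"

chunks = list(stream_users([{"name": "Jason", "age": 25}, {"name": "Sara", "age": 30}]))
print(chunks[0].strip())  # {"name": "Jason", "age": 25}
```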
59 changes: 21 additions & 38 deletions docs/concepts/maybe.md
@@ -9,7 +9,8 @@ This pattern is particularly useful when making LLM calls, as providing language
Using Pydantic, we'll first define the `UserDetail` and `MaybeUser` classes.

```python
-from pydantic import BaseModel, Field, Optional
+from pydantic import BaseModel, Field
+from typing import Optional


class UserDetail(BaseModel):
@@ -36,11 +37,28 @@ Once we have the model defined, we can create a function that uses the `Maybe` p
```python
import instructor
from openai import OpenAI
-from pydantic import BaseModel, Field, Optional
+from pydantic import BaseModel, Field
+from typing import Optional

# This enables the `response_model` keyword
client = instructor.patch(OpenAI())


class UserDetail(BaseModel):
age: int
name: str
role: Optional[str] = Field(default=None)


class MaybeUser(BaseModel):
result: Optional[UserDetail] = Field(default=None)
error: bool = Field(default=False)
message: Optional[str] = Field(default=None)

def __bool__(self):
return self.result is not None


def extract(content: str) -> MaybeUser:
    return client.chat.completions.create(
model="gpt-3.5-turbo",
@@ -52,47 +70,12 @@ def extract(content: str) -> MaybeUser:


user1 = extract("Jason is a 25-year-old scientist")
-# output:
-{
-    "result": {"age": 25, "name": "Jason", "role": "scientist"},
-    "error": false,
-    "message": null,
-}
+print(user1.model_dump_json(indent=2))

user2 = extract("Unknown user")
-# output:
-{"result": null, "error": true, "message": "User not found"}
+print(user2.model_dump_json(indent=2))
```

As you can see, when the data is extracted successfully, the `result` field contains the `UserDetail` instance. When an error occurs, the `error` field is set to `True`, and the `message` field contains the error message.
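The `__bool__` method defined on `MaybeUser` above makes the wrapper itself truthy only on success. A stdlib-only mimic (a hypothetical dataclass stand-in for the pydantic model) shows the effect:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical stdlib-only mimic of MaybeUser's truthiness: defining
# __bool__ lets callers write `if maybe_user:` instead of checking fields.
@dataclass
class MaybeUser:
    result: Optional[dict] = None
    error: bool = False
    message: Optional[str] = None

    def __bool__(self) -> bool:
        return self.result is not None

ok = MaybeUser(result={"name": "Jason", "age": 25})
missing = MaybeUser(error=True, message="User not found")
print(bool(ok), bool(missing))  # True False
```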

## Handling the result

There are a few ways we can handle the result. Normally, we can just access the individual fields.

```python
def process_user_detail(maybe_user: MaybeUser):
if not maybe_user.error:
user = maybe_user.result
print(f"User {user.name} is {user.age} years old")
else:
        print(f"Not found: {maybe_user.message}")
```

### Pattern Matching

We can also use pattern matching to handle the result. This is a great way to handle errors in a structured way.

```python
def process_user_detail(maybe_user: MaybeUser):
match maybe_user:
case MaybeUser(error=True, message=msg):
print(f"Error: {msg}")
case MaybeUser(result=user_detail) if user_detail:
assert isinstance(user_detail, UserDetail)
print(f"User {user_detail.name} is {user_detail.age} years old")
case _:
print("Unknown error")
```

If you want to learn more about pattern matching, check out Pydantic's docs on [Structural Pattern Matching](https://docs.pydantic.dev/latest/concepts/models/#structural-pattern-matching)
14 changes: 6 additions & 8 deletions docs/concepts/patching.md
@@ -15,19 +15,19 @@ There are three methods for structured output:
## Function Calling

```python
-from openai import OpenAI
import instructor
+from openai import OpenAI

-client = instructor.patch(OpenAI())
+client = instructor.patch(OpenAI(), mode=instructor.Mode.FUNCTIONS)
```

## Tool Calling

```python
import instructor
-from instructor import Mode
from openai import OpenAI

-client = instructor.patch(OpenAI(), mode=Mode.TOOLS)
+client = instructor.patch(OpenAI(), mode=instructor.Mode.TOOLS)
```

## JSON Mode
@@ -48,10 +48,9 @@ client = instructor.patch(OpenAI(), mode=Mode.JSON)

```python
import instructor
-from instructor import Mode
from openai import OpenAI

-client = instructor.patch(OpenAI(), mode=Mode.MD_JSON)
+client = instructor.patch(OpenAI(), mode=instructor.Mode.MD_JSON)
```
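Across all of these modes, "patching" means wrapping the client's `create` method so it accepts a `response_model` keyword and returns a validated instance. A hypothetical stdlib-only sketch of the mechanism (`FakeClient`, `FakeCompletions`, and the canned JSON reply are all made up; the real `instructor.patch` is far more involved):

```python
import json

# Hypothetical stand-in for an LLM client: create() returns raw JSON text.
class FakeCompletions:
    def create(self, **kwargs):
        return '{"name": "Jason", "age": 25}'

class FakeClient:
    def __init__(self):
        self.chat = self  # mimic the client.chat.completions.create nesting
        self.completions = FakeCompletions()

def patch(client):
    # Wrap the original create() so it gains a response_model keyword.
    original = client.completions.create
    def create(response_model=None, **kwargs):
        raw = original(**kwargs)
        if response_model is None:
            return raw
        return response_model(**json.loads(raw))
    client.completions.create = create
    return client

class UserDetail:
    def __init__(self, name: str, age: int):
        self.name = name
        self.age = age

client = patch(FakeClient())
user = client.chat.completions.create(model="fake-model", response_model=UserDetail)
print(user.name, user.age)  # Jason 25
```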

### Schema Integration
@@ -61,12 +60,11 @@ In JSON Mode, the schema is part of the system message:
```python
import instructor
from openai import OpenAI
-from pydantic import BaseModel

client = instructor.patch(OpenAI())


-class UserExtract(BaseModel):
+class UserExtract(instructor.OpenAISchema):
name: str
age: int

8 changes: 4 additions & 4 deletions docs/concepts/raw_response.md
@@ -25,7 +25,7 @@ user: UserExtract = client.chat.completions.create(
print(user._raw_response)
"""
ChatCompletion(
-    id='chatcmpl-8oxRV9dsLMi55VXvTsS61RGo0s8iU',
+    id='chatcmpl-8oxXT2lKjVu1yuUcPpqhAnFIA7ED8',
choices=[
Choice(
finish_reason='stop',
@@ -35,18 +35,18 @@ ChatCompletion(
content=None,
role='assistant',
function_call=FunctionCall(
-                    arguments='{\n "name": "jason",\n "age": 25\n}',
+                    arguments='{\n "name": "Jason",\n "age": 25\n}',
name='UserExtract',
),
tool_calls=None,
),
)
],
-    created=1707155589,
+    created=1707155959,
model='gpt-3.5-turbo-0613',
object='chat.completion',
system_fingerprint=None,
-    usage=CompletionUsage(completion_tokens=17, prompt_tokens=73, total_tokens=90),
+    usage=CompletionUsage(completion_tokens=16, prompt_tokens=73, total_tokens=89),
)
"""
```
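One practical use of the raw response is reading the `usage` block shown above for token accounting. A minimal sketch (the field names follow the `ChatCompletion` dump above; the cost rates are invented placeholders):

```python
from dataclasses import dataclass

# Field names mirror the CompletionUsage object in the raw response above.
@dataclass
class CompletionUsage:
    completion_tokens: int
    prompt_tokens: int
    total_tokens: int

usage = CompletionUsage(completion_tokens=16, prompt_tokens=73, total_tokens=89)
# Hypothetical per-token prices, purely illustrative.
est_cost = (usage.prompt_tokens * 0.0005 + usage.completion_tokens * 0.0015) / 1000
print(usage.total_tokens)  # 89
```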
33 changes: 32 additions & 1 deletion docs/concepts/retrying.md
@@ -70,8 +70,19 @@ If you want more control over how we define retries such as back-offs and additi
Rather than using the decorator `@retry`, we can use the `Retrying` and `AsyncRetrying` classes to define our own retry logic.

```python
import openai
import instructor
from pydantic import BaseModel
from tenacity import Retrying, stop_after_attempt, wait_fixed

client = instructor.patch(openai.OpenAI(), mode=instructor.Mode.TOOLS)


class UserDetail(BaseModel):
name: str
age: int


response = client.chat.completions.create(
model="gpt-4-turbo-preview",
response_model=UserDetail,
@@ -83,6 +94,13 @@ response = client.chat.completions.create(
wait=wait_fixed(1), # (2)!
), # (3)!
)
print(response.model_dump_json(indent=2))
"""
{
"name": "jason",
"age": 12
}
"""
```

1. We stop after 2 attempts
@@ -94,7 +112,18 @@ If you're using asynchronous code, you can use `AsyncRetrying` instead.
If you're using asynchronous code, you can use `AsyncRetrying` instead.

```python
-from tenacity import AsyncRetrying, stop_after_attempt, wait_fixed
+import openai
+import instructor
+from pydantic import BaseModel
+from tenacity import AsyncRetrying, stop_after_attempt, wait_fixed

client = instructor.patch(openai.OpenAI(), mode=instructor.Mode.TOOLS)


class UserDetail(BaseModel):
name: str
age: int


response = await client.chat.completions.create(
model="gpt-4-turbo-preview",
Expand All @@ -107,6 +136,8 @@ response = await client.chat.completions.create(
wait=wait_fixed(1),
),
)

print(response.model_dump_json(indent=2))
```
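For intuition, tenacity's `Retrying(stop=stop_after_attempt(2), wait=wait_fixed(1))` behaves roughly like this stdlib-only sketch (a simplification under stated assumptions, not tenacity's implementation):

```python
import time

# Hypothetical stdlib-only sketch of Retrying(stop=stop_after_attempt(2),
# wait=wait_fixed(1)): try up to `attempts` times, sleep `wait` seconds
# between tries, and re-raise the last error if every attempt fails.
def retry_call(fn, attempts: int = 2, wait: float = 1.0):
    last_exc = None
    for attempt in range(1, attempts + 1):
        try:
            return fn(attempt)
        except Exception as exc:
            last_exc = exc
            if attempt < attempts:
                time.sleep(wait)
    raise last_exc

calls = []

def flaky(attempt: int) -> str:
    # Fails on the first attempt, succeeds on the second.
    calls.append(attempt)
    if attempt == 1:
        raise ValueError("transient failure")
    return "ok"

print(retry_call(flaky))  # ok
```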

## Other Features of Tenacity