Commit c09fe78: wip
isahers1 committed Jan 10, 2025 (1 parent: f71402c)
Showing 23 changed files with 68 additions and 82 deletions.
@@ -34,12 +34,8 @@ The API key will be shown only once, so make sure to copy it and store it in a s
 
 ## Configure the SDK
 
-You may set the following environment variables in addition to `LANGCHAIN_API_KEY` (or equivalently `LANGSMITH_API_KEY`).
+You may set the following environment variables in addition to `LANGSMITH_API_KEY`.
 These are only required if using the EU instance.
 
-:::info
-`LANGCHAIN_HUB_API_URL` is only required if using the legacy langchainhub sdk
-:::
-
-`LANGCHAIN_ENDPOINT=`<RegionalUrl type='api' link={false} />
-`LANGCHAIN_HUB_API_URL=`<RegionalUrl type='hub' link={false} />
+`LANGSMITH_ENDPOINT=`<RegionalUrl type='api' link={false} />
@@ -174,10 +174,10 @@ import requests
 
 
 def main():
-    api_key = os.environ["LANGCHAIN_API_KEY"]
-    # LANGCHAIN_ORGANIZATION_ID is not a standard environment variable in the SDK, just used for this example
-    organization_id = os.environ["LANGCHAIN_ORGANIZATION_ID"]
-    base_url = os.environ.get("LANGCHAIN_ENDPOINT") or "https://api.smith.langchain.com"
+    api_key = os.environ["LANGSMITH_API_KEY"]
+    # LANGSMITH_ORGANIZATION_ID is not a standard environment variable in the SDK, just used for this example
+    organization_id = os.environ["LANGSMITH_ORGANIZATION_ID"]
+    base_url = os.environ.get("LANGSMITH_ENDPOINT") or "https://api.smith.langchain.com"
     headers = {
         "Content-Type": "application/json",
         "X-API-Key": api_key,
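The endpoint-fallback pattern in the hunk above (an explicit `LANGSMITH_ENDPOINT` wins, otherwise the default US endpoint is used) can be sketched as a small self-contained helper. The helper name below is illustrative, not part of the SDK:

```python
import os


def resolve_config() -> tuple:
    """Resolve the LangSmith endpoint and request headers from the environment.

    Mirrors the fallback in the diff above: an explicit LANGSMITH_ENDPOINT
    wins, otherwise the default endpoint is used.
    """
    api_key = os.environ.get("LANGSMITH_API_KEY", "")
    base_url = os.environ.get("LANGSMITH_ENDPOINT") or "https://api.smith.langchain.com"
    headers = {
        "Content-Type": "application/json",
        "X-API-Key": api_key,
    }
    return base_url, headers


os.environ.pop("LANGSMITH_ENDPOINT", None)
base_url, headers = resolve_config()
assert base_url == "https://api.smith.langchain.com"  # fallback applies
```

Because `os.environ.get(...) or default` also treats an empty string as unset, exporting `LANGSMITH_ENDPOINT=""` still falls back to the default endpoint.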
4 changes: 2 additions & 2 deletions docs/evaluation/how_to_guides/evaluate_with_attachments.mdx

@@ -48,7 +48,7 @@ png_url = "https://www.w3.org/Graphics/PNG/nurbcup2si.png"\n
 pdf_bytes = requests.get(pdf_url).content
 wav_bytes = requests.get(wav_url).content
 png_bytes = requests.get(png_url).content\n
-# Define the LANGCHAIN_API_KEY environment variable with your API key
+# Define the LANGSMITH_API_KEY environment variable with your API key
 langsmith_client = Client()\n
 dataset_name = "attachment-test-dataset:" + str(uuid.uuid4())[0:8]\n
 dataset = langsmith_client.create_dataset(
@@ -104,7 +104,7 @@ if (!response.ok) {
 const pdfArrayBuffer = await fetchArrayBuffer(pdfUrl);
 const wavArrayBuffer = await fetchArrayBuffer(wavUrl);
 const pngArrayBuffer = await fetchArrayBuffer(pngUrl);\n
-// Create the LangSmith client (Ensure LANGCHAIN_API_KEY is set in env)
+// Create the LangSmith client (Ensure LANGSMITH_API_KEY is set in env)
 const langsmithClient = new Client();\n
 // Create a unique dataset name
 const datasetName = "attachment-test-dataset:" + uuid4().substring(0, 8);\n
10 changes: 5 additions & 5 deletions docs/evaluation/how_to_guides/unit_testing.mdx

@@ -162,10 +162,10 @@ The `expect` utility is modeled off of [Jest](https://jestjs.io/docs/expect)'s e
 
 #### Dry-run mode
 
-If you want to run the tests without syncing the results to LangSmith, you can set `LANGCHAIN_TEST_TRACKING=false` in your environment.
+If you want to run the tests without syncing the results to LangSmith, you can set `LANGSMITH_TEST_TRACKING=false` in your environment.
 
 ```bash
-LANGCHAIN_TEST_TRACKING=false pytest tests/
+LANGSMITH_TEST_TRACKING=false pytest tests/
 ```
 
 The tests will run as normal, but the experiment logs will not be sent to LangSmith.
@@ -174,10 +174,10 @@ The tests will run as normal, but the experiment logs will not be sent to LangSm
 
 LLMs on every commit in CI can get expensive. To save time and resources, LangSmith lets you cache results to disk. Any identical inputs will be loaded from the cache so you don't have to call out to your LLM provider unless there are changes to the model, prompt, or retrieved data.
 
-To enable caching, run with `LANGCHAIN_TEST_CACHE=/my/cache/path`. For example:
+To enable caching, run with `LANGSMITH_TEST_CACHE=/my/cache/path`. For example:
 
 ```bash
-LANGCHAIN_TEST_CACHE=tests/cassettes pytest tests/my_llm_tests
+LANGSMITH_TEST_CACHE=tests/cassettes pytest tests/my_llm_tests
 ```
 
 All requests will be cached to `tests/cassettes` and loaded from there on subsequent runs. If you check this in to your repository, your CI will be able to use the cache as well.
@@ -188,7 +188,7 @@ With caching enabled, you can iterate quickly on your tests using `watch` mode w
 
 ```bash
 pip install pytest-watch
-LANGCHAIN_TEST_CACHE=tests/cassettes ptw tests/my_llm_tests
+LANGSMITH_TEST_CACHE=tests/cassettes ptw tests/my_llm_tests
 ```
 
 ## Explanations
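The caching behavior described in this file keys responses on the exact request inputs, so any change to the model, prompt, or retrieved data is a cache miss. A minimal stdlib-only sketch of the idea (not the SDK's actual cassette format) looks like this:

```python
import hashlib
import json
import os
import tempfile


class DiskCache:
    """Toy on-disk cache: identical inputs are served from disk, illustrating
    the LANGSMITH_TEST_CACHE cassette idea (illustrative only, not the SDK)."""

    def __init__(self, path: str):
        self.path = path
        os.makedirs(path, exist_ok=True)

    def _key(self, request: dict) -> str:
        # Hash the canonical JSON of the request, so any change to the
        # model, prompt, or retrieved data produces a different key.
        blob = json.dumps(request, sort_keys=True).encode()
        return hashlib.sha256(blob).hexdigest()

    def get_or_call(self, request: dict, call):
        file = os.path.join(self.path, self._key(request) + ".json")
        if os.path.exists(file):
            with open(file) as f:
                return json.load(f)  # cache hit: no provider call
        response = call(request)
        with open(file, "w") as f:
            json.dump(response, f)
        return response


calls = []


def fake_llm(request):
    calls.append(request)  # counts how often the "provider" is hit
    return {"output": request["prompt"].upper()}


cache = DiskCache(tempfile.mkdtemp())
first = cache.get_or_call({"model": "gpt-4o", "prompt": "hi"}, fake_llm)
second = cache.get_or_call({"model": "gpt-4o", "prompt": "hi"}, fake_llm)
assert first == second == {"output": "HI"}
assert len(calls) == 1  # the second call was served from disk
```

Checking the cache directory into the repository is what lets CI reuse it across runs, as the hunk above notes.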
@@ -195,7 +195,7 @@ body = {
 resp = requests.post(
     "https://api.smith.langchain.com/api/v1/datasets/upload-experiment",
     json=body,
-    headers={"x-api-key": os.environ["LANGCHAIN_API_KEY"]}
+    headers={"x-api-key": os.environ["LANGSMITH_API_KEY"]}
 )
 print(resp.json())
 ```
4 changes: 2 additions & 2 deletions docs/evaluation/index.mdx

@@ -44,8 +44,8 @@ To create an API key head to the <RegionalUrl text='Settings page' suffix='/sett
 
 <CodeTabs
 tabs={[
-ShellBlock(`export LANGCHAIN_TRACING_V2=true
-export LANGCHAIN_API_KEY="<your-langchain-api-key>"
+ShellBlock(`export LANGSMITH_TRACING=true
+export LANGSMITH_API_KEY="<your-langchain-api-key>"
 # The example uses OpenAI, but it's not necessary in general
 export OPENAI_API_KEY="<your-openai-api-key>"`),
 ]}
4 changes: 2 additions & 2 deletions docs/evaluation/tutorials/agents.mdx

@@ -35,8 +35,8 @@ def _set_env(var: str) -> None:
     if not os.environ.get(var):
         os.environ[var] = getpass.getpass(f"Set {var}: ")
 
-os.environ["LANGCHAIN_TRACING_V2"] = "true"
-_set_env("LANGCHAIN_API_KEY")
+os.environ["LANGSMITH_TRACING"] = "true"
+_set_env("LANGSMITH_API_KEY")
 _set_env("OPENAI_API_KEY")
 #endregion
 ```
8 changes: 4 additions & 4 deletions docs/evaluation/tutorials/backtesting.mdx

@@ -45,10 +45,10 @@ import os
 
 # Set the project name to whichever project you'd like to be testing against
 project_name = "Tweet Writing Task"
-os.environ["LANGCHAIN_PROJECT"] = project_name
-os.environ["LANGCHAIN_TRACING_V2"] = "true"
-if not os.environ.get("LANGCHAIN_API_KEY"):
-    os.environ["LANGCHAIN_API_KEY"] = getpass.getpass("YOUR API KEY")
+os.environ["LANGSMITH_PROJECT"] = project_name
+os.environ["LANGSMITH_TRACING"] = "true"
+if not os.environ.get("LANGSMITH_API_KEY"):
+    os.environ["LANGSMITH_API_KEY"] = getpass.getpass("YOUR API KEY")
 
 # Optional. You can swap OpenAI for any other tool-calling chat model.
 os.environ["OPENAI_API_KEY"] = "YOUR OPENAI API KEY"
8 changes: 4 additions & 4 deletions docs/evaluation/tutorials/rag.mdx

@@ -47,13 +47,13 @@ First, let's set our environment variables:
 
 python`
 import os
-os.environ["LANGCHAIN_TRACING_V2"] = "true"
-os.environ["LANGCHAIN_API_KEY"] = "YOUR LANGCHAIN API KEY"
+os.environ["LANGSMITH_TRACING"] = "true"
+os.environ["LANGSMITH_API_KEY"] = "YOUR LANGSMITH API KEY"
 os.environ["OPENAI_API_KEY"] = "YOUR OPENAI API KEY"
 `,
 typescript`
-process.env.LANGCHAIN_TRACING_V2 = "true";
-process.env.LANGCHAIN_API_KEY = "YOUR LANGCHAIN API KEY";
+process.env.LANGSMITH_TRACING = "true";
+process.env.LANGSMITH_API_KEY = "YOUR LANGSMITH API KEY";
 process.env.OPENAI_API_KEY = "YOUR OPENAI API KEY";
 `,
 ]}
@@ -30,7 +30,7 @@ messages = [
 # highlight-next-line
 # You can set metadata & tags **statically** when decorating a function
 # Use the @traceable decorator with tags and metadata
-# Ensure that the LANGCHAIN_TRACING_V2 environment variables are set for @traceable to work
+# Ensure that the LANGSMITH_TRACING environment variables are set for @traceable to work
 @ls.traceable(
     run_type="llm",
     name="OpenAI Call Decorator",
10 changes: 5 additions & 5 deletions docs/observability/how_to_guides/tracing/annotate_code.mdx

@@ -26,9 +26,9 @@ If you are using LangChain (either Python or JS/TS), you can skip this section a
 LangSmith makes it easy to log traces with minimal changes to your existing code with the `@traceable` decorator in Python and `traceable` function in TypeScript.
 
 :::note
-The `LANGCHAIN_TRACING_V2` environment variable must be set to `'true'` in order for traces to be logged to LangSmith, even when using `@traceable` or `traceable`. This allows you to toggle tracing on and off without changing your code.
+The `LANGSMITH_TRACING` environment variable must be set to `'true'` in order for traces to be logged to LangSmith, even when using `@traceable` or `traceable`. This allows you to toggle tracing on and off without changing your code.
 
-Additionally, you will need to set the `LANGCHAIN_API_KEY` environment variable to your API key (see [Setup](/) for more information).
+Additionally, you will need to set the `LANGSMITH_API_KEY` environment variable to your API key (see [Setup](/) for more information).
 
 By default, the traces will be logged to a project named `default`.
 To log traces to a different project, see [this section](./log_traces_to_project).
@@ -172,9 +172,9 @@ The wrapper works seamlessly with the `@traceable` decorator or `traceable` func
 Tool calls are automatically rendered
 
 :::note
-The `LANGCHAIN_TRACING_V2` environment variable must be set to `'true'` in order for traces to be logged to LangSmith, even when using `wrap_openai` or `wrapOpenAI`. This allows you to toggle tracing on and off without changing your code.
+The `LANGSMITH_TRACING` environment variable must be set to `'true'` in order for traces to be logged to LangSmith, even when using `wrap_openai` or `wrapOpenAI`. This allows you to toggle tracing on and off without changing your code.
 
-Additionally, you will need to set the `LANGCHAIN_API_KEY` environment variable to your API key (see [Setup](/) for more information).
+Additionally, you will need to set the `LANGSMITH_API_KEY` environment variable to your API key (see [Setup](/) for more information).
 
 By default, the traces will be logged to a project named `default`.
 To log traces to a different project, see [this section](./log_traces_to_project).
@@ -232,7 +232,7 @@ await chatPipeline("Can you summarize this morning's meetings?");`),
 ## Use the `RunTree` API
 
 Another, more explicit way to log traces to LangSmith is via the `RunTree` API. This API allows you more control over your tracing - you can manually
-create runs and children runs to assemble your trace. You still need to set your `LANGCHAIN_API_KEY`, but `LANGCHAIN_TRACING_V2` is not
+create runs and children runs to assemble your trace. You still need to set your `LANGSMITH_API_KEY`, but `LANGSMITH_TRACING` is not
 necessary for this method.
 
 This method is not recommended, as it's easier to make mistakes in propagating trace context.
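The toggle behavior these notes describe (tracing gated on `LANGSMITH_TRACING`, with no code changes required) can be sketched with a stdlib-only decorator. This is an illustration of the pattern, not the SDK's `@traceable`; `LOGGED_RUNS` stands in for the LangSmith backend:

```python
import functools
import os

LOGGED_RUNS = []  # stand-in for the LangSmith backend


def toy_traceable(func):
    """Record a run only when LANGSMITH_TRACING is 'true' at call time, so
    tracing toggles via the environment without changing the code."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        if os.environ.get("LANGSMITH_TRACING") == "true":
            LOGGED_RUNS.append({
                "name": func.__name__,
                "project": os.environ.get("LANGSMITH_PROJECT", "default"),
                "inputs": {"args": args, "kwargs": kwargs},
                "outputs": result,
            })
        return result
    return wrapper


@toy_traceable
def greet(name: str) -> str:
    return f"Hello, {name}!"


os.environ.pop("LANGSMITH_PROJECT", None)
os.environ["LANGSMITH_TRACING"] = "false"
greet("a")  # runs normally, nothing recorded
os.environ["LANGSMITH_TRACING"] = "true"
greet("b")  # recorded to the "default" project
assert len(LOGGED_RUNS) == 1
assert LOGGED_RUNS[0]["project"] == "default"
```

Note the env var is read at call time, not at decoration time, which is what makes the toggle work without restarting or editing the application.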
@@ -16,10 +16,10 @@ You can change the destination project of your traces both statically through en
 
 ## Set the destination project statically
 
-As mentioned in the [Tracing Concepts](/observability/concepts#projects) section, LangSmith uses the concept of a `Project` to group traces. If left unspecified, the project is set to `default`. You can set the `LANGCHAIN_PROJECT` environment variable to configure a custom project name for an entire application run. This should be done before executing your application.
+As mentioned in the [Tracing Concepts](/observability/concepts#projects) section, LangSmith uses the concept of a `Project` to group traces. If left unspecified, the project is set to `default`. You can set the `LANGSMITH_PROJECT` environment variable to configure a custom project name for an entire application run. This should be done before executing your application.
 
 ```bash
-export LANGCHAIN_PROJECT=my-custom-project
+export LANGSMITH_PROJECT=my-custom-project
 ```
 
 If the project specified does not exist, it will be created automatically when the first trace is ingested.
@@ -29,7 +29,7 @@ If the project specified does not exist, it will be created automatically when t
 You can also set the project name at program runtime in various ways, depending on how you are [annotating your code for tracing](./annotate_code). This is useful when you want to log traces to different projects within the same application.
 
 :::note
-Setting the project name dynamically using one of the below methods overrides the project name set by the `LANGCHAIN_PROJECT` environment variable.
+Setting the project name dynamically using one of the below methods overrides the project name set by the `LANGSMITH_PROJECT` environment variable.
 :::
 
 <CodeTabs
@@ -43,7 +43,7 @@ messages = [
     {"role": "user", "content": "Hello!"}
 ]\n
 # Use the @traceable decorator with the 'project_name' parameter to log traces to LangSmith
-# Ensure that the LANGCHAIN_TRACING_V2 environment variables is set for @traceable to work
+# Ensure that the LANGSMITH_TRACING environment variables is set for @traceable to work
 @traceable(
     run_type="llm",
     name="OpenAI Call Decorator",
@@ -68,7 +68,7 @@ call_openai(
 )\n
 # The wrapped OpenAI client accepts all the same langsmith_extra parameters
 # as @traceable decorated functions, and logs traces to LangSmith automatically.
-# Ensure that the LANGCHAIN_TRACING_V2 environment variables is set for the wrapper to work.
+# Ensure that the LANGSMITH_TRACING environment variables is set for the wrapper to work.
 from langsmith import wrappers
 wrapped_client = wrappers.wrap_openai(client)
 wrapped_client.chat.completions.create(
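The precedence described in this file (a dynamically supplied project name overrides `LANGSMITH_PROJECT`, which in turn overrides `default`) can be sketched as a small resolver; the function name is illustrative, not part of the SDK:

```python
import os
from typing import Optional


def resolve_project(runtime_project: Optional[str] = None) -> str:
    """Illustrative precedence: an explicitly supplied project name wins,
    then the LANGSMITH_PROJECT environment variable, then "default"."""
    if runtime_project is not None:
        return runtime_project
    return os.environ.get("LANGSMITH_PROJECT", "default")


os.environ.pop("LANGSMITH_PROJECT", None)
assert resolve_project() == "default"                   # nothing configured
os.environ["LANGSMITH_PROJECT"] = "my-custom-project"
assert resolve_project() == "my-custom-project"         # static env setting
assert resolve_project("per-call-project") == "per-call-project"  # dynamic override
```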
@@ -17,8 +17,8 @@ In some situations, you may need to prevent the inputs and outputs of your trace
 If you want to completely hide the inputs and outputs of your traces, you can set the following environment variables when running your application:
 
 ```bash
-LANGCHAIN_HIDE_INPUTS=true
-LANGCHAIN_HIDE_OUTPUTS=true
+LANGSMITH_HIDE_INPUTS=true
+LANGSMITH_HIDE_OUTPUTS=true
 ```
 
 This works for both the LangSmith SDK (Python and TypeScript) and LangChain.
@@ -98,7 +98,7 @@ This feature is available in the following LangSmith SDK versions:
 
 To mask specific data in inputs and outputs, you can use the `create_anonymizer` / `createAnonymizer` function and pass the newly created anonymizer when instantiating the client. The anonymizer can be either constructed from a list of regex patterns and the replacement values or from a function that accepts and returns a string value.
 
-The anonymizer will be skipped for inputs if `LANGCHAIN_HIDE_INPUTS = true`. Same applies for outputs if `LANGCHAIN_HIDE_OUTPUTS = true`.
+The anonymizer will be skipped for inputs if `LANGSMITH_HIDE_INPUTS = true`. Same applies for outputs if `LANGSMITH_HIDE_OUTPUTS = true`.
 
 However, if inputs or outputs are to be sent to client, the `anonymizer` method will take precedence over functions found in `hide_inputs` and `hide_outputs`. By default, the `create_anonymizer` will only look at maximum of 10 nesting levels deep, which can be configured via the `max_depth` parameter.
 
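The regex-based masking and `max_depth` cap described in this file can be illustrated with a stdlib-only sketch. This is the idea, not the SDK's `create_anonymizer` implementation:

```python
import re
from typing import Any


def make_anonymizer(patterns, max_depth: int = 10):
    """Build a function that walks nested inputs/outputs and applies regex
    replacements to strings, stopping below max_depth nesting levels."""
    compiled = [(re.compile(p), repl) for p, repl in patterns]

    def scrub(value: Any, depth: int = 0) -> Any:
        if depth >= max_depth:
            return value  # too deep: left untouched, mirroring the depth cap
        if isinstance(value, str):
            for pattern, repl in compiled:
                value = pattern.sub(repl, value)
            return value
        if isinstance(value, dict):
            return {k: scrub(v, depth + 1) for k, v in value.items()}
        if isinstance(value, list):
            return [scrub(v, depth + 1) for v in value]
        return value

    return scrub


anonymizer = make_anonymizer([
    (r"[\w.-]+@[\w.-]+\.\w+", "<email>"),  # email addresses
    (r"\b\d{3}-\d{2}-\d{4}\b", "<ssn>"),   # US SSN-shaped strings
])
masked = anonymizer({"inputs": {"question": "Mail me at jane@example.com"}})
assert masked == {"inputs": {"question": "Mail me at <email>"}}
```

A tight `max_depth` is a safety valve against deeply nested or self-referential payloads; values past the cap pass through unmasked, which is why the default of 10 should only be lowered deliberately.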
4 changes: 2 additions & 2 deletions docs/observability/how_to_guides/tracing/sample_traces.mdx

@@ -9,12 +9,12 @@ This section is relevant for those using the LangSmith SDK or LangChain, not for
 :::
 
 By default, all traces are logged to LangSmith.
-To down-sample the number of traces logged to LangSmith, set the `LANGCHAIN_TRACING_SAMPLING_RATE` environment variable to
+To down-sample the number of traces logged to LangSmith, set the `LANGSMITH_TRACING_SAMPLING_RATE` environment variable to
 any float between `0` (no traces) and `1` (all traces).
 For instance, setting the following environment variable will log 75% of the traces.
 
 ```bash
-export LANGCHAIN_TRACING_SAMPLING_RATE=0.75
+export LANGSMITH_TRACING_SAMPLING_RATE=0.75
 ```
 
 This works for the `traceable` decorator and `RunTree` objects.
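The down-sampling above can be pictured as a per-trace coin flip against the configured rate. This is an illustration of the sampling idea, not the SDK's exact mechanism:

```python
import os
import random


def should_sample(rng: random.Random) -> bool:
    """Keep a trace with probability LANGSMITH_TRACING_SAMPLING_RATE
    (an unset variable is treated as 1.0, i.e. log everything)."""
    rate = float(os.environ.get("LANGSMITH_TRACING_SAMPLING_RATE", "1.0"))
    return rng.random() < rate


os.environ["LANGSMITH_TRACING_SAMPLING_RATE"] = "0.75"
rng = random.Random(0)  # seeded so the example is reproducible
kept = sum(should_sample(rng) for _ in range(10_000))
assert 0.70 < kept / 10_000 < 0.80  # roughly 75% of traces are kept
```

Because the decision is independent per trace, the rate holds only in aggregate; short runs will deviate from the configured fraction.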
@@ -14,5 +14,5 @@ This section is only relevant for users who are
 
 :::
 
-If you've decided you no longer want to trace your runs, you can unset the `LANGCHAIN_TRACING_V2` environment variable. Traces will no longer be logged to LangSmith.
+If you've decided you no longer want to trace your runs, you can unset the `LANGSMITH_TRACING` environment variable. Traces will no longer be logged to LangSmith.
 Note that this currently does not affect the `RunTree` objects or API users, as these are meant to be low-level and not affected by the tracing toggle.
@@ -9,7 +9,7 @@ We provide a convenient integration with [Instructor](https://jxnl.github.io/ins
 In order to use, you first need to set your LangSmith API key.
 
 ```shell
-export LANGCHAIN_API_KEY=<your-api-key>
+export LANGSMITH_API_KEY=<your-api-key>
 ```
 
 Next, you will need to install the LangSmith SDK:
@@ -64,7 +64,7 @@ chain.invoke({"question": "Am I using a callback?", "context": "I'm using a call
 from langchain_core.tracers.context import tracing_v2_enabled
 with tracing_v2_enabled():
     chain.invoke({"question": "Am I using a context manager?", "context": "I'm using a context manager"})\n
-# This will NOT be traced (assuming LANGCHAIN_TRACING_V2 is not set)
+# This will NOT be traced (assuming LANGSMITH_TRACING is not set)
 chain.invoke({"question": "Am I being traced?", "context": "I'm not being traced"})`),
 TypeScriptBlock(`// You can configure a LangChainTracer instance to trace a specific invocation.
 import { LangChainTracer } from "@langchain/core/tracers/tracer_langchain";\n
@@ -84,10 +84,10 @@ await chain.invoke(
 
 ### Statically
 
-As mentioned in the [tracing conceptual guide](../../concepts) LangSmith uses the concept of a Project to group traces. If left unspecified, the tracer project is set to default. You can set the `LANGCHAIN_PROJECT` environment variable to configure a custom project name for an entire application run. This should be done before executing your application.
+As mentioned in the [tracing conceptual guide](../../concepts) LangSmith uses the concept of a Project to group traces. If left unspecified, the tracer project is set to default. You can set the `LANGSMITH_PROJECT` environment variable to configure a custom project name for an entire application run. This should be done before executing your application.
 
 ```shell
-export LANGCHAIN_PROJECT=my-project
+export LANGSMITH_PROJECT=my-project
 ```
 
 ### Dynamically
@@ -317,10 +317,10 @@ try {
 
 As mentioned in other guides, the following environment variables allow you to configure tracing enabled, the api endpoint, the api key, and the tracing project:
 
-- `LANGCHAIN_TRACING_V2`
-- `LANGCHAIN_API_KEY`
-- `LANGCHAIN_ENDPOINT`
-- `LANGCHAIN_PROJECT`
+- `LANGSMITH_TRACING`
+- `LANGSMITH_API_KEY`
+- `LANGSMITH_ENDPOINT`
+- `LANGSMITH_PROJECT`
 
 However, in some environments, it is not possible to set environment variables. In these cases, you can set the tracing configuration programmatically.
 
@@ -204,7 +204,7 @@ import { generateText } from "ai";
 
 interface Env {
   OPENAI_API_KEY: string;
-  LANGSMITH_TRACING_V2: string;
+  LANGSMITH_TRACING: string;
   LANGSMITH_ENDPOINT: string;
   LANGSMITH_API_KEY: string;
 }
@@ -218,9 +218,9 @@ const handler = {
       model,
       prompt: "Tell me a joke",
       experimental_telemetry: AISDKExporter.getSettings({
-        // As `process.env.LANGSMITH_TRACING_V2` is undefined in Cloudflare Workers,
+        // As `process.env.LANGSMITH_TRACING` is undefined in Cloudflare Workers,
         // we need to check the environment variable directly.
-        isEnabled: env.LANGSMITH_TRACING_V2 === "true",
+        isEnabled: env.LANGSMITH_TRACING === "true",
       }),
     });
 
@@ -14,16 +14,16 @@ import { RegionalUrl } from "@site/src/components/RegionalUrls";
 
 As mentioned in other guides, the following environment variables allow you to configure tracing enabled, the api endpoint, the api key, and the tracing project:
 
-- `LANGCHAIN_TRACING_V2`
-- `LANGCHAIN_API_KEY`
-- `LANGCHAIN_ENDPOINT`
-- `LANGCHAIN_PROJECT`
+- `LANGSMITH_TRACING`
+- `LANGSMITH_API_KEY`
+- `LANGSMITH_ENDPOINT`
+- `LANGSMITH_PROJECT`
 
 In some environments, it is not possible to set environment variables. In these cases, you can set the tracing configuration programmatically.
 
 :::caution Recently changed behavior
 Due to a number of asks for finer-grained control of tracing using the `trace` context manager,
-**we changed the behavior** of `with trace` to honor the `LANGCHAIN_TRACING_V2` environment variable in version **0.1.95** of the Python SDK. You can find more details in the [release notes](https://github.com/langchain-ai/langsmith-sdk/releases/tag/v0.1.95).
+**we changed the behavior** of `with trace` to honor the `LANGSMITH_TRACING` environment variable in version **0.1.95** of the Python SDK. You can find more details in the [release notes](https://github.com/langchain-ai/langsmith-sdk/releases/tag/v0.1.95).
 The recommended way to disable/enable tracing without setting environment variables is to use the `with tracing_context` context manager, as shown in the example below.
 :::
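For environments where env vars can't be set, the programmatic override described in this file can be sketched with `contextvars`: a stdlib illustration of the idea behind a `tracing_context`-style manager, not the SDK's implementation:

```python
import contextvars
import os
from contextlib import contextmanager

# None means "no override": fall back to the environment variable.
_tracing_override = contextvars.ContextVar("tracing_override", default=None)


@contextmanager
def toy_tracing_context(enabled: bool):
    """Temporarily force tracing on or off for the enclosed code,
    taking precedence over the LANGSMITH_TRACING environment variable."""
    token = _tracing_override.set(enabled)
    try:
        yield
    finally:
        _tracing_override.reset(token)  # restore the previous state


def tracing_enabled() -> bool:
    override = _tracing_override.get()
    if override is not None:
        return override
    return os.environ.get("LANGSMITH_TRACING") == "true"


os.environ["LANGSMITH_TRACING"] = "false"
with toy_tracing_context(enabled=True):
    assert tracing_enabled()   # override wins inside the block
assert not tracing_enabled()   # env var applies again outside
```

Using a `ContextVar` rather than a global flag keeps the override scoped correctly across threads and async tasks, which is the usual design choice for this kind of toggle.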