Merged
2 changes: 1 addition & 1 deletion .release-please-manifest.json
Original file line number Diff line number Diff line change
@@ -1,3 +1,3 @@
{
".": "2.30.0"
".": "2.31.0"
}
6 changes: 3 additions & 3 deletions .stats.yml
@@ -1,4 +1,4 @@
configured_endpoints: 152
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/openai%2Fopenai-00994178cc8e20d71754b00c54b0e4f5b4128e1c1cce765e9b7d696bd8c80d33.yml
openapi_spec_hash: 81f404053b663f987209b4fb2d08a230
config_hash: 5635033cdc8c930255f8b529a78de722
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/openai%2Fopenai-a6eca1bd01e0c434af356fe5275c206057216a4e626d1051d294c27016cd6d05.yml
openapi_spec_hash: 68abda9122013a9ae3f084cfdbe8e8c1
config_hash: 4975e16a94e8f9901428022044131888
29 changes: 29 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,34 @@
# Changelog

## 2.31.0 (2026-04-08)

Full Changelog: [v2.30.0...v2.31.0](https://github.com/openai/openai-python/compare/v2.30.0...v2.31.0)

### Features

* **api:** add phase field to conversations message ([3e5834e](https://github.com/openai/openai-python/commit/3e5834efb39b24e019a29dc54d890c67d18cbb54))
* **api:** add web_search_call.results to ResponseIncludable type ([ffd8741](https://github.com/openai/openai-python/commit/ffd8741dd38609a5af0159ceb800d8ddba7925f8))
* **client:** add support for short-lived tokens ([#1608](https://github.com/openai/openai-python/issues/1608)) ([22fe722](https://github.com/openai/openai-python/commit/22fe7228d4990c197cd721b3ad7931ad05cca5dd))
* **client:** support sending raw data over websockets ([f1bc52e](https://github.com/openai/openai-python/commit/f1bc52ef641dfca6fdf2a5b00ce3b09bff2552f5))
* **internal:** implement indices array format for query and form serialization ([49194cf](https://github.com/openai/openai-python/commit/49194cfa711328216ff131d6f65c9298822a7c51))


### Bug Fixes

* **client:** preserve hardcoded query params when merging with user params ([92e109c](https://github.com/openai/openai-python/commit/92e109c3d9569a942e1919e75977dc13fa015f9a))
* **types:** remove web_search_call.results from ResponseIncludable ([d3cc401](https://github.com/openai/openai-python/commit/d3cc40165cd86015833d15167cc7712b4102f932))


### Chores

* **tests:** bump steady to v0.20.1 ([d60e2ee](https://github.com/openai/openai-python/commit/d60e2eea7f6916540cd4ba901dceb07051119da4))
* **tests:** bump steady to v0.20.2 ([6508d47](https://github.com/openai/openai-python/commit/6508d474332d4e82d9615c0a9a77379f9b5e4412))


### Documentation

* **api:** update file parameter descriptions in vector_stores files and file_batches ([a9e7ebd](https://github.com/openai/openai-python/commit/a9e7ebd505b9ae90514339aa63c6f1984a08cf6b))

## 2.30.0 (2026-03-25)

Full Changelog: [v2.29.0...v2.30.0](https://github.com/openai/openai-python/compare/v2.29.0...v2.30.0)
103 changes: 103 additions & 0 deletions README.md
@@ -71,6 +71,109 @@ to add `OPENAI_API_KEY="My API Key"` to your `.env` file
so that your API key is not stored in source control.
[Get an API key here](https://platform.openai.com/settings/organization/api-keys).

### Workload Identity Authentication

In secure, automated environments such as managed Kubernetes, Azure, and Google Cloud Platform, you can authenticate with workload identity, using short-lived tokens from a cloud identity provider instead of long-lived API keys.

#### Kubernetes (service account tokens)

```python
from openai import OpenAI
from openai.auth import k8s_service_account_token_provider

client = OpenAI(
    workload_identity={
        "client_id": "your-client-id",
        "identity_provider_id": "idp-123",
        "service_account_id": "sa-456",
        "provider": k8s_service_account_token_provider(
            "/var/run/secrets/kubernetes.io/serviceaccount/token"
        ),
    },
    organization="org-xyz",
    project="proj-abc",
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}],
)
```

#### Azure (managed identity)

```python
from openai import OpenAI
from openai.auth import azure_managed_identity_token_provider

client = OpenAI(
    workload_identity={
        "client_id": "your-client-id",
        "identity_provider_id": "idp-123",
        "service_account_id": "sa-456",
        "provider": azure_managed_identity_token_provider(
            resource="https://management.azure.com/",
        ),
    },
)
```

#### Google Cloud Platform (compute engine metadata)

```python
from openai import OpenAI
from openai.auth import gcp_id_token_provider

client = OpenAI(
    workload_identity={
        "client_id": "your-client-id",
        "identity_provider_id": "idp-123",
        "service_account_id": "sa-456",
        "provider": gcp_id_token_provider(audience="https://api.openai.com/v1"),
    },
)
```

#### Custom subject token provider

```python
from openai import OpenAI


def get_custom_token() -> str:
    return "your-jwt-token"


client = OpenAI(
    workload_identity={
        "client_id": "your-client-id",
        "identity_provider_id": "idp-123",
        "service_account_id": "sa-456",
        "provider": {
            "token_type": "jwt",
            "get_token": get_custom_token,
        },
    },
)
```

You can also customize the token refresh buffer; by default, tokens are refreshed 1200 seconds (20 minutes) before they expire:

```python
from openai import OpenAI
from openai.auth import k8s_service_account_token_provider

client = OpenAI(
    workload_identity={
        "client_id": "your-client-id",
        "identity_provider_id": "idp-123",
        "service_account_id": "sa-456",
        "provider": k8s_service_account_token_provider("/var/token"),
        "refresh_buffer_seconds": 120.0,
    },
)
```
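The buffer works by treating a token as already expired once the current time falls within the buffer window; a minimal sketch of that check (hypothetical helper, not the SDK's internal code):

```python
import time


def needs_refresh(expires_at: float, refresh_buffer_seconds: float = 1200.0) -> bool:
    """True once the current time is within the buffer window of the token's expiry."""
    return time.time() >= expires_at - refresh_buffer_seconds


# A token expiring in 10 minutes is refreshed under the default 20-minute
# buffer, but not under a 2-minute buffer:
print(needs_refresh(time.time() + 600))         # → True
print(needs_refresh(time.time() + 600, 120.0))  # → False
```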

### Vision

With an image URL:
1 change: 1 addition & 0 deletions api.md
@@ -11,6 +11,7 @@ from openai.types import (
    FunctionDefinition,
    FunctionParameters,
    Metadata,
    OAuthErrorCode,
    Reasoning,
    ReasoningEffort,
    ResponseFormatJSONObject,
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "openai"
version = "2.30.0"
version = "2.31.0"
description = "The official Python library for the openai API"
dynamic = ["readme"]
license = "Apache-2.0"
6 changes: 3 additions & 3 deletions scripts/mock
@@ -22,9 +22,9 @@ echo "==> Starting mock server with URL ${URL}"
# Run steady mock on the given spec
if [ "$1" == "--daemon" ]; then
# Pre-install the package so the download doesn't eat into the startup timeout
npm exec --package=@stdy/cli@0.19.7 -- steady --version
npm exec --package=@stdy/cli@0.20.2 -- steady --version

npm exec --package=@stdy/cli@0.19.7 -- steady --host 127.0.0.1 -p 4010 --validator-form-array-format=brackets --validator-query-array-format=brackets --validator-form-object-format=brackets --validator-query-object-format=brackets "$URL" &> .stdy.log &
npm exec --package=@stdy/cli@0.20.2 -- steady --host 127.0.0.1 -p 4010 --validator-query-array-format=brackets --validator-form-array-format=brackets --validator-query-object-format=brackets --validator-form-object-format=brackets "$URL" &> .stdy.log &

# Wait for server to come online via health endpoint (max 30s)
echo -n "Waiting for server"
@@ -48,5 +48,5 @@ if [ "$1" == "--daemon" ]; then

echo
else
npm exec --package=@stdy/cli@0.19.7 -- steady --host 127.0.0.1 -p 4010 --validator-form-array-format=brackets --validator-query-array-format=brackets --validator-form-object-format=brackets --validator-query-object-format=brackets "$URL"
npm exec --package=@stdy/cli@0.20.2 -- steady --host 127.0.0.1 -p 4010 --validator-query-array-format=brackets --validator-form-array-format=brackets --validator-query-object-format=brackets --validator-form-object-format=brackets "$URL"
fi
2 changes: 1 addition & 1 deletion scripts/test
@@ -43,7 +43,7 @@ elif ! steady_is_running ; then
echo -e "To run the server, pass in the path or url of your OpenAPI"
echo -e "spec to the steady command:"
echo
echo -e " \$ ${YELLOW}npm exec --package=@stdy/cli@0.19.7 -- steady path/to/your.openapi.yml --host 127.0.0.1 -p 4010 --validator-form-array-format=brackets --validator-query-array-format=brackets --validator-form-object-format=brackets --validator-query-object-format=brackets${NC}"
echo -e " \$ ${YELLOW}npm exec --package=@stdy/cli@0.20.2 -- steady path/to/your.openapi.yml --host 127.0.0.1 -p 4010 --validator-query-array-format=brackets --validator-form-array-format=brackets --validator-query-object-format=brackets --validator-form-object-format=brackets${NC}"
echo

exit 1
2 changes: 2 additions & 0 deletions src/openai/__init__.py
@@ -16,6 +16,7 @@
from ._constants import DEFAULT_TIMEOUT, DEFAULT_MAX_RETRIES, DEFAULT_CONNECTION_LIMITS
from ._exceptions import (
    APIError,
    OAuthError,
    OpenAIError,
    ConflictError,
    NotFoundError,
@@ -57,6 +58,7 @@
    "APIResponseValidationError",
    "BadRequestError",
    "AuthenticationError",
    "OAuthError",
    "PermissionDeniedError",
    "NotFoundError",
    "ConflictError",
35 changes: 32 additions & 3 deletions src/openai/_base_client.py
@@ -30,7 +30,7 @@
    cast,
    overload,
)
from typing_extensions import Literal, override, get_origin
from typing_extensions import Unpack, Literal, override, get_origin

import anyio
import httpx
@@ -81,6 +81,7 @@
)
from ._streaming import Stream, SSEDecoder, AsyncStream, SSEBytesDecoder
from ._exceptions import (
    OpenAIError,
    APIStatusError,
    APITimeoutError,
    APIConnectionError,
@@ -542,6 +543,10 @@ def _build_request(
            files = cast(HttpxRequestFiles, ForceMultipartDict())

        prepared_url = self._prepare_url(options.url)
        # preserve hard-coded query params from the url
        if params and prepared_url.query:
            params = {**dict(prepared_url.params.items()), **params}
            prepared_url = prepared_url.copy_with(raw_path=prepared_url.raw_path.split(b"?", 1)[0])
        if "_" in prepared_url.host:
            # work around https://github.com/encode/httpx/discussions/2880
            kwargs["extensions"] = {"sni_hostname": prepared_url.host.replace("_", "-")}
@@ -932,6 +937,15 @@ def _prepare_request(
        """
        return None

    def _send_request(
        self,
        request: httpx.Request,
        *,
        stream: bool,
        **kwargs: Unpack[HttpxSendArgs],
    ) -> httpx.Response:
        return self._client.send(request, stream=stream, **kwargs)

    @overload
    def request(
        self,
@@ -1002,7 +1016,7 @@ def request(

        response = None
        try:
            response = self._client.send(
            response = self._send_request(
                request,
                stream=stream or self._should_stream_response_body(request=request),
                **kwargs,
@@ -1021,6 +1035,9 @@

            log.debug("Raising timeout error")
            raise APITimeoutError(request=request) from err
        except OpenAIError as err:
            # Propagate OpenAIErrors as-is, without retrying or wrapping in APIConnectionError
            raise err
        except Exception as err:
            log.debug("Encountered Exception", exc_info=True)

@@ -1526,6 +1543,15 @@ async def _prepare_request(
        """
        return None

    async def _send_request(
        self,
        request: httpx.Request,
        *,
        stream: bool,
        **kwargs: Unpack[HttpxSendArgs],
    ) -> httpx.Response:
        return await self._client.send(request, stream=stream, **kwargs)

    @overload
    async def request(
        self,
@@ -1601,7 +1627,7 @@ async def request(

        response = None
        try:
            response = await self._client.send(
            response = await self._send_request(
                request,
                stream=stream or self._should_stream_response_body(request=request),
                **kwargs,
@@ -1620,6 +1646,9 @@

            log.debug("Raising timeout error")
            raise APITimeoutError(request=request) from err
        except OpenAIError as err:
            # Propagate OpenAIErrors as-is, without retrying or wrapping in APIConnectionError
            raise err
        except Exception as err:
            log.debug("Encountered Exception", exc_info=True)

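The `_build_request` change above keeps query params hardcoded in a request path when merging in user-supplied params, with user params winning on key collisions. A minimal stdlib sketch of that merge behavior (illustrative helper name, not the SDK's internal code):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit


def merge_query_params(url: str, user_params: dict[str, str]) -> str:
    """Keep params hardcoded in the URL, letting user params override on collisions."""
    parts = urlsplit(url)
    merged = {**dict(parse_qsl(parts.query)), **user_params}
    return urlunsplit(parts._replace(query=urlencode(merged)))


print(merge_query_params("https://api.example.com/v1/items?api-version=2024-01", {"limit": "10"}))
# → https://api.example.com/v1/items?api-version=2024-01&limit=10
```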