@stainless-sdks

Stainless SDKs

SDKs produced by Stainless Software Inc.

Stainless

Best-in-class developer interfaces for your API.

Stainless helps you deliver robust and polished SDKs, documentation that keeps up with your API, and token-efficient MCP servers. Built from decades of experience at companies like Stripe, Heroku, and Twilio, we enable you to provide a "Stripe-quality" developer experience without the maintenance burden.

Stainless.com · Read the docs · Request a demo


Generated SDKs

We generate production-ready SDKs depended on by millions of developers every day. We handle the boilerplate so your users can focus on building.

Supported languages:

| Language | Example SDKs | Status |
| --- | --- | --- |
| TypeScript | openai-node, anthropic-sdk-typescript, lithic-node, modern-treasury-node, turbopuffer-typescript | Generally available |
| Python | openai-python, anthropic-sdk-python, metronome-python, stagehand-python, composio-base-py | Generally available |
| Go | cloudflare-go, orb-go, knock-go | Generally available |
| Java | openai-java, hiddenlayer-sdk-java, lithic-java, modern-treasury-java | Generally available |
| Kotlin | dodopayments-kotlin, lithic-kotlin, modern-treasury-kotlin, finch-api-kotlin | Generally available |
| Ruby | anthropic-sdk-ruby, increase-ruby, imagekit-ruby, telnyx-ruby | Generally available |
| PHP | anthropic-sdk-php, stagehand-php, dodopayments-php | Generally available |
| C# | anthropic-sdk-csharp, courier-csharp, arcade-dotnet | Generally available |
| Terraform | terraform-provider-cloudflare | Beta |
| | | Coming soon |
| | | Coming soon |

Why developers love Stainless SDKs

Our libraries feel hand-written by an expert in the target language. We don't just generate code; we generate idiomatic interfaces.

  • Streaming Support: Complete support for Server-Sent Events (SSE) and streaming responses.
  • Webhooks: Built-in helpers for verifying signatures and handling webhook events securely.
  • Rich Typing: End-to-end type safety with rich enums and object definitions.
  • Robustness: Automatic retries with exponential backoff, handling of pagination, and strict timeouts out-of-the-box.
  • Automated delivery: As your OpenAPI spec evolves, your SDKs update automatically.
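
The retry behavior described above can be sketched roughly as follows. This is an illustrative pattern, not the SDKs' actual internals; the base delay, cap, and attempt count are hypothetical defaults:

```typescript
// Illustrative retry-with-exponential-backoff: the delay doubles on each
// failed attempt, capped at a maximum, before the request is retried.
function backoffDelayMs(attempt: number, baseMs = 500, capMs = 8000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

async function withRetries<T>(fn: () => Promise<T>, maxRetries = 3): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries) throw err; // out of retries: surface the error
      await new Promise((r) => setTimeout(r, backoffDelayMs(attempt)));
    }
  }
}
```

Production SDKs typically also add jitter and honor `Retry-After` headers, but the shape is the same.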

Examples

TypeScript

```typescript
import OpenAI from "openai";
import { zodTextFormat } from "openai/helpers/zod";
import { z } from "zod";

const openai = new OpenAI();

const format = zodTextFormat(
  z.object({ namesOfParticipants: z.array(z.string()) }),
  "namesOfParticipants",
);

const response = await openai.responses.parse({
  model: "gpt-5",
  instructions: "Extract the names of the participants.",
  input: "Alice and Bob are going to a science fair on Friday.",
  text: { format },
});

console.log(response.output_parsed);
```
Python

```python
import asyncio

from anthropic import AsyncAnthropic


async def main():
    client = AsyncAnthropic()
    async with client.messages.stream(
        messages=[{"role": "user", "content": "What are some blue birds?"}],
        model="claude-opus-4-5",
        max_tokens=1024,
    ) as stream:
        async for text in stream.text_stream:
            print(text, end="", flush=True)
        print()


asyncio.run(main())
```
Go

```go
package main

import (
	"context"
	"fmt"

	"github.com/lithic-com/lithic-go"
)

func main() {
	client := lithic.NewClient()
	params := lithic.CardListParams{}
	iter := client.Cards.ListAutoPaging(context.Background(), params)
	for i := 0; i < 30 && iter.Next(); i++ {
		card := iter.Current()
		fmt.Printf("%+v\n", card)
	}
	if err := iter.Err(); err != nil {
		panic(err.Error())
	}
}
```
Java

```java
package com.increase.example;

import com.increase.api.client.IncreaseClient;
import com.increase.api.client.okhttp.IncreaseOkHttpClient;
import com.increase.api.models.Account;
import com.increase.api.models.AccountCreateParams;

public class Main {
    public static void main(String[] args) {
        IncreaseClient client = IncreaseOkHttpClient.fromEnv();
        AccountCreateParams params = AccountCreateParams.builder()
            .name("New Account!")
            .entityId("entity_n8y8tnk2p9339ti393yi")
            .programId("program_i2v2os4mwza1oetokh9i")
            .build();
        Account account = client.accounts().create(params);
    }
}
```
Kotlin

```kotlin
import com.increase.api.client.IncreaseClient
import com.increase.api.client.okhttp.IncreaseOkHttpClient
import com.increase.api.models.accounts.Account
import com.increase.api.models.accounts.AccountCreateParams

fun main() {
    val client = IncreaseOkHttpClient.fromEnv()
    val params = AccountCreateParams.builder()
        .name("New Account!")
        .entityId("entity_n8y8tnk2p9339ti393yi")
        .programId("program_i2v2os4mwza1oetokh9i")
        .build()
    val account = client.accounts().create(params)
    println(account)
}
```
Ruby

```ruby
require "openai"

openai = OpenAI::Client.new

chat_completion = openai.chat.completions.create(
  messages: [{role: :user, content: "Say this is a test"}],
  model: :"gpt-5.1"
)
puts(chat_completion.choices[0].message)
```
PHP

```php
<?php

use Anthropic\Client;

$client = new Client(apiKey: getenv("ANTHROPIC_API_KEY"));

$message = $client->messages->create([
    "max_tokens" => 1024,
    "messages" => [["role" => "user", "content" => "Hello, Claude"]],
    "model" => "claude-opus-4-5",
]);

var_dump($message->content);
```
C#

```csharp
using System;
using Anthropic;
using Anthropic.Models.Messages;

AnthropicClient client = new();

MessageCreateParams parameters = new()
{
    MaxTokens = 1024,
    Messages =
    [
        new() { Role = Role.User, Content = "Hello, Claude" },
    ],
    Model = Model.ClaudeOpus4_5_20251101,
};

var message = await client.Messages.Create(parameters);
Console.WriteLine(message);
```
Terraform

```hcl
provider "cloudflare" {}

resource "cloudflare_zone" "home_zone" {
  account = { id = var.account_id }
  name    = "stainless.com"
  type    = "full"
}

resource "cloudflare_dns_record" "home_record" {
  zone_id = cloudflare_zone.home_zone.id
  type    = "A"
  name    = "stainless.com"
  content = var.home_page_ipv4
}
```

Stainless Docs Platform

API docs + SDK docs + handwritten guides, perfectly in sync.

Go from an OpenAPI spec to a live, interactive documentation site in minutes. The Stainless Docs Platform uses your generated SDKs to create code-first documentation that developers actually prefer.

  • Always In Sync: Updates automatically when your API spec changes.
  • SDK-First: Shows usage examples in the languages your customers actually use, not just raw HTTP.
  • Built on Astro: High performance, static generation, and deployable to Cloudflare or your own cloud.
  • Extensible: Supports Markdown, MDX, and Markdoc.

See it in action:

Claude API Docs · Beeper · Digital Ocean · Letta · Sendblue

Explore the Docs Platform


MCP Server Generator

MCP servers built for frontier models.

Generate production-ready Model Context Protocol (MCP) servers that work seamlessly with Claude, Cursor, and other agentic clients.

  • Token Efficient: Automatically optimizes your API responses for constrained LLM context windows.
  • Plug & Play: Connect your REST API to the AI ecosystem instantly.
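
As a rough illustration of what "token efficient" can mean in practice (the field names and trimming rules here are hypothetical, not the generator's actual output), an MCP tool result can be shaped to keep only what the model needs:

```typescript
// Hypothetical token-efficiency sketch: project an API response down to the
// fields an LLM tool call actually uses, and cap long arrays to bound
// context-window usage.
type ApiCard = {
  token: string;
  last_four: string;
  state: string;
  created: string;
  raw_network_metadata?: unknown; // large payload, rarely useful to a model
};

function slimForContext(cards: ApiCard[], maxItems = 10) {
  return cards.slice(0, maxItems).map(({ token, last_four, state }) => ({
    token,
    last_four,
    state,
  }));
}
```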

Explore MCP Generator


Developer tooling

We provide a suite of tools to ensure the integration experience is seamless from start to finish.

  • Stainless CLI: A powerful command-line interface to manage your configuration and generation pipeline.
```shell
brew tap stainless-api/tap
brew install stl
stl auth login
stl init
```
  • LSP & VS Code Extension: Language Server Protocol support gives developers autocomplete, inline documentation, and validation while they write code against your API.

Trusted By

We power the developer experience for the world's leading API companies.


OpenAI  ·  Anthropic  ·  Cloudflare  ·  Lithic  ·  Modern Treasury  ·  Plaid

"Proud to have Stainless as a partner at OpenAI.

All our SDKs are generated by them. The team is extremely thoughtful about SDK design and pushes us to improve our API + other products while they're at it. Stainless is bringing a new standard of SDKs to our industry and this is a great thing for all developers."

Nikunj Handa, API Product @ OpenAI
