Strands Agents is a simple yet powerful SDK that takes a model-driven approach to building and running AI agents. From simple conversational assistants to complex autonomous workflows, from local development to production deployment, Strands Agents scales with your needs.
- Lightweight & Flexible: Simple agent loop that just works and is fully customizable
- Model Agnostic: Support for Amazon Bedrock, Anthropic, Ollama, and custom providers
- Advanced Capabilities: Multi-agent systems, autonomous agents, and streaming support
- Built-in MCP: Native support for Model Context Protocol (MCP) servers, enabling access to thousands of pre-built tools
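The streaming support mentioned above follows the usual async-iteration pattern: the agent yields events as output is produced, and the caller consumes them with `async for`. The sketch below is purely illustrative and uses a faked event stream; the `{"data": ...}` event shape is an assumption for demonstration, not a statement of the SDK's actual event format.

```python
import asyncio

# Illustrative sketch (not the Strands API): consuming a stream of
# agent events shaped like {"data": "..."} text chunks, which is how
# streaming agent SDKs commonly surface partial output.
async def fake_event_stream():
    for chunk in ["Hello", ", ", "world", "!"]:
        yield {"data": chunk}

async def collect_text(stream):
    parts = []
    async for event in stream:
        if "data" in event:
            parts.append(event["data"])
    return "".join(parts)

result = asyncio.run(collect_text(fake_event_stream()))
print(result)  # Hello, world!
```

In a real application you would iterate over the agent's streaming API instead of a fake generator, printing or forwarding each chunk as it arrives.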
```bash
# Install Strands Agents
pip install strands-agents strands-agents-tools
```

```python
from strands import Agent
from strands_tools import calculator

agent = Agent(tools=[calculator])
agent("What is the square root of 1764")
```

Note: For the default Amazon Bedrock model provider, you'll need AWS credentials configured and model access enabled for Claude 3.7 Sonnet in the us-west-2 region. See the Quickstart Guide for details on configuring other model providers.
Ensure you have Python 3.10+ installed, then:
```bash
# Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows use: .venv\Scripts\activate

# Install Strands and tools
pip install strands-agents strands-agents-tools
```

Easily build tools using Python decorators:
```python
from strands import Agent, tool

@tool
def word_count(text: str) -> int:
    """Count words in text.

    This docstring is used by the LLM to understand the tool's purpose.
    """
    return len(text.split())

agent = Agent(tools=[word_count])
response = agent("How many words are in this sentence?")
```

Seamlessly integrate Model Context Protocol (MCP) servers:
```python
from strands import Agent
from strands.tools.mcp import MCPClient
from mcp import stdio_client, StdioServerParameters

aws_docs_client = MCPClient(
    lambda: stdio_client(
        StdioServerParameters(
            command="uvx", args=["awslabs.aws-documentation-mcp-server@latest"]
        )
    )
)

with aws_docs_client:
    agent = Agent(tools=aws_docs_client.list_tools_sync())
    response = agent("Tell me about Amazon Bedrock and how to use it with Python")
```

Support for various model providers:
```python
from strands import Agent
from strands.models import BedrockModel
from strands.models.ollama import OllamaModel
from strands.models.llamaapi import LlamaAPIModel

# Bedrock
bedrock_model = BedrockModel(
    model_id="us.amazon.nova-pro-v1:0",
    temperature=0.3,
)
agent = Agent(model=bedrock_model)
agent("Tell me about Agentic AI")

# Ollama
ollama_model = OllamaModel(
    host="http://localhost:11434",
    model_id="llama3",
)
agent = Agent(model=ollama_model)
agent("Tell me about Agentic AI")

# Llama API
llama_model = LlamaAPIModel(
    model_id="Llama-4-Maverick-17B-128E-Instruct-FP8",
)
agent = Agent(model=llama_model)
response = agent("Tell me about Agentic AI")
```

Built-in providers:
Custom providers can be implemented by following the Custom Providers guide.
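As a rough illustration of the idea behind a custom provider (the class and method names below are hypothetical, not the SDK's actual interface; see the Custom Providers guide for that), the pattern is to adapt an arbitrary backend to a single model interface the agent loop can call:

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of the custom-provider pattern: any backend
# (local model, HTTP API, test stub) is wrapped behind one interface.
class ModelProvider(ABC):
    @abstractmethod
    def generate(self, prompt: str) -> str:
        """Produce a completion for the given prompt."""

class EchoProvider(ModelProvider):
    """A trivial 'model' that echoes the prompt, useful for local testing."""

    def generate(self, prompt: str) -> str:
        return f"echo: {prompt}"

provider = EchoProvider()
print(provider.generate("hello"))  # echo: hello
```

The benefit of this shape is that the rest of the agent code depends only on the interface, so swapping Bedrock for Ollama, or a real model for a test stub, requires no changes to the agent loop.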
Strands offers an optional strands-agents-tools package with pre-built tools for quick experimentation:
```python
from strands import Agent
from strands_tools import calculator

agent = Agent(tools=[calculator])
agent("What is the square root of 1764")
```

It's also available on GitHub via strands-agents/tools.
For detailed guidance and examples, explore our documentation.
We welcome contributions! See our Contributing Guide for details on:
- Reporting bugs & features
- Development setup
- Contributing via Pull Requests
- Code of Conduct
- Reporting security issues
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Strands Agents is currently in public preview. During this period:
- APIs may change as we refine the SDK
- We welcome feedback and contributions