LangChain Reference
Python · langsmith.prompt_cache.AsyncPromptCache.aset
Method · Since v0.7

aset

aset(
    self,
    key: str,
    value: Any,
    refresh_func: Callable[[], Awaitable[Any]]
) -> None

Set a value in the cache.

Parameters

Name            Type                            Description
key*            str                             The cache key (prompt identifier).
value*          Any                             The value to cache.
refresh_func*   Callable[[], Awaitable[Any]]    Async function to refresh this cache entry when stale.
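To illustrate how the `aset` signature is typically used, here is a minimal, self-contained sketch. Note that `SimplePromptCache` and its `arefresh` method are hypothetical stand-ins written for this example, not the real `AsyncPromptCache` implementation; only the `aset` signature mirrors the reference above.

```python
import asyncio
from typing import Any, Awaitable, Callable


class SimplePromptCache:
    """Hypothetical stand-in mirroring the aset signature (not the real class)."""

    def __init__(self) -> None:
        self._values: dict[str, Any] = {}
        self._refreshers: dict[str, Callable[[], Awaitable[Any]]] = {}

    async def aset(
        self,
        key: str,
        value: Any,
        refresh_func: Callable[[], Awaitable[Any]],
    ) -> None:
        # Store the value and remember how to refresh this entry when stale.
        self._values[key] = value
        self._refreshers[key] = refresh_func

    async def arefresh(self, key: str) -> Any:
        # Re-run the stored refresh function and update the cached value.
        self._values[key] = await self._refreshers[key]()
        return self._values[key]


async def demo() -> tuple[Any, Any]:
    cache = SimplePromptCache()

    async def fetch_prompt() -> str:
        # In practice this would re-fetch the prompt from LangSmith.
        return "refreshed prompt"

    await cache.aset("my-prompt", "initial prompt", fetch_prompt)
    initial = cache._values["my-prompt"]
    refreshed = await cache.arefresh("my-prompt")
    return initial, refreshed


initial, refreshed = asyncio.run(demo())
```

The key design point is that `refresh_func` takes no arguments and returns an awaitable, so the cache can regenerate a stale entry on its own schedule without the caller being involved.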