An OpenClaw-native workflow shell: typed (JSON-first) pipelines, jobs, and approval gates.
OpenClaw (or any other AI agent) can use Lobster as a workflow engine and avoid re-planning every step, saving tokens while improving determinism and resumability.
```sh
node bin/lobster.js "workflows.run --name github.pr.monitor --args-json '{\"repo\":\"openclaw/openclaw\",\"pr\":1152}'"
```

```json
[
  {
    "kind": "github.pr.monitor",
    "repo": "openclaw/openclaw",
    "prNumber": 1152,
    "key": "github.pr:openclaw/openclaw#1152",
    "changed": false,
    "summary": { "changedFields": [], "changes": {} },
    "prSnapshot": {
      "author": { "id": "MDQ6VXNlcjE0MzY4NTM=", "is_bot": false, "login": "vignesh07", "name": "Vignesh" },
      "baseRefName": "main",
      "headRefName": "feat/lobster-plugin",
      "isDraft": false,
      "mergeable": "MERGEABLE",
      "number": 1152,
      "reviewDecision": "",
      "state": "OPEN",
      "title": "feat: Add optional lobster plugin tool (typed workflows, approvals/resume)",
      "updatedAt": "2026-01-18T20:16:56Z",
      "url": "https://github.com/openclaw/openclaw/pull/1152"
    }
  }
]
```

```sh
node bin/lobster.js "workflows.run --name github.pr.monitor --args-json '{\"repo\":\"openclaw/openclaw\",\"pr\":1200}'"
```

```json
[
  {
    "kind": "github.pr.monitor",
    "repo": "openclaw/openclaw",
    "prNumber": 1200,
    "key": "github.pr:openclaw/openclaw#1200",
    "changed": true,
    "summary": {
      "changedFields": [
        "number", "title", "url", "state", "isDraft", "mergeable",
        "reviewDecision", "updatedAt", "baseRefName", "headRefName"
      ],
      "changes": {
        "number": { "from": null, "to": 1200 },
        "title": { "from": null, "to": "feat(tui): add syntax highlighting for code blocks" },
        "url": { "from": null, "to": "https://github.com/openclaw/openclaw/pull/1200" },
        "state": { "from": null, "to": "MERGED" },
        "isDraft": { "from": null, "to": false },
        "mergeable": { "from": null, "to": "UNKNOWN" },
        "reviewDecision": { "from": null, "to": "" },
        "updatedAt": { "from": null, "to": "2026-01-19T05:06:09Z" },
        "baseRefName": { "from": null, "to": "main" },
        "headRefName": { "from": null, "to": "feat/tui-syntax-highlighting" }
      }
    },
    "prSnapshot": {
      "author": { "id": "MDQ6VXNlcjE0MzY4NTM=", "is_bot": false, "login": "vignesh07", "name": "Vignesh" },
      "baseRefName": "main",
      "headRefName": "feat/tui-syntax-highlighting",
      "isDraft": false,
      "mergeable": "UNKNOWN",
      "number": 1200,
      "reviewDecision": "",
      "state": "MERGED",
      "title": "feat(tui): add syntax highlighting for code blocks",
      "updatedAt": "2026-01-19T05:06:09Z",
      "url": "https://github.com/openclaw/openclaw/pull/1200"
    }
  }
]
```

- Typed pipelines (objects/arrays), not text pipes.
- Local-first execution.
- No new auth surface: Lobster must not own OAuth/tokens.
- Composable macros that OpenClaw (or any agent) can invoke in one step to save tokens.
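The `changed` / `changedFields` / `changes` shape in the `github.pr.monitor` output above can be sketched as a snapshot diff. This is an illustrative model only (the function name and treatment of a missing previous snapshot are assumptions, not Lobster's actual implementation):

```javascript
// Sketch: derive a change summary from a previous and a current PR snapshot.
// A missing previous snapshot is treated as all-null "from" values, which
// matches the first-run output shown above (every field reported as changed).
function diffSnapshot(prev, next) {
  const changes = {};
  for (const field of Object.keys(next)) {
    const from = prev ? prev[field] : null;
    const to = next[field];
    // Compare structurally so objects and arrays diff by value, not identity.
    if (JSON.stringify(from) !== JSON.stringify(to)) {
      changes[field] = { from, to };
    }
  }
  const changedFields = Object.keys(changes);
  return { changed: changedFields.length > 0, changedFields, changes };
}
```

On a second run with an unchanged snapshot, `changedFields` comes back empty and `changed` is `false`, which is what makes the monitor cheap to poll.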
From this folder:
```sh
pnpm install
pnpm test
pnpm lint
node ./bin/lobster.js --help
node ./bin/lobster.js doctor
node ./bin/lobster.js "exec --json --shell 'echo [1,2,3]' | where '0>=0' | json"
```
`pnpm test` runs `tsc` and then executes tests against `dist/`. `bin/lobster.js` prefers the compiled entrypoint in `dist/` when present.
- `exec`: run OS commands
- `exec --stdin raw|json|jsonl`: feed pipeline input into subprocess stdin
- `where`, `pick`, `head`: data shaping
- `json`, `table`: renderers
- `approve`: approval gate (TTY prompt or `--emit` for OpenClaw integration)
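Because stages pass structured values rather than text, the shaping commands compose like ordinary functions over arrays. A minimal sketch of that idea (assumed semantics for illustration; not the actual stage implementations):

```javascript
// Typed-pipeline sketch: each stage takes a JSON array and returns a new one.
// `where` filters by a predicate, `pick` projects fields, `head` truncates.
const where = (items, pred) => items.filter(pred);
const pick = (items, keys) =>
  items.map(obj => Object.fromEntries(keys.map(k => [k, obj[k]])));
const head = (items, n) => items.slice(0, n);

// Chained the way a pipeline would chain them:
const out = head(pick(where([{ a: 1, b: 2 }, { a: -1, b: 3 }], x => x.a >= 0), ["a"]), 1);
```

Text never needs to be re-parsed between stages; each stage receives the previous stage's structured output directly.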
- OpenClaw integration: ship as an optional OpenClaw plugin tool.
Lobster workflow files are meant to read like small scripts:
- `run:` or `command:` for deterministic shell/CLI steps
- `pipeline:` for native Lobster stages like `llm.invoke`
- `approval:` for hard workflow gates between steps
- `stdin: $step.stdout` or `stdin: $step.json` to pass data forward
```sh
lobster run path/to/workflow.lobster
lobster run --file path/to/workflow.lobster --args-json '{"tag":"family"}'
```

Example file:
```yaml
name: jacket-advice
args:
  location:
    default: Phoenix
steps:
  - id: fetch
    run: weather --json ${location}
  - id: confirm
    approval: Want jacket advice from the LLM?
    stdin: $fetch.json
  - id: advice
    pipeline: >
      llm.invoke --prompt "Given this weather data, should I wear a jacket?
      Be concise and return JSON."
    stdin: $fetch.json
    when: $confirm.approved
```

Notes:
- `run:` and `command:` are equivalent; `run:` is the preferred spelling for new files.
- `pipeline:` shares the same args/env/results model as shell steps, so later steps can still reference `$step.stdout` or `$step.json`.
- If you need a human checkpoint before an LLM call, use a dedicated `approval:` step in the workflow file rather than `approve` inside the nested pipeline.
- `cwd`, `env`, `stdin`, `when`, and `condition` work for both shell and pipeline steps.
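How a `$stepId.stdout` / `$stepId.json` reference might be resolved against earlier step results can be sketched as follows (the function and the results shape are assumptions for illustration, not Lobster internals):

```javascript
// Sketch: resolve a step reference like "$fetch.json" against the recorded
// results of previously completed steps. Non-reference strings pass through.
function resolveRef(ref, results) {
  const m = /^\$(\w+)\.(stdout|json)$/.exec(ref);
  if (!m) return ref; // literal value, not a step reference
  const [, stepId, field] = m;
  const step = results[stepId];
  if (!step) throw new Error(`unknown step: ${stepId}`);
  // ".json" parses the step's stdout; ".stdout" returns the raw text.
  return field === "json" ? JSON.parse(step.stdout) : step.stdout;
}
```

This is why `stdin: $fetch.json` hands the `advice` step parsed data while `stdin: $fetch.stdout` would hand it raw text.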
Use `llm.invoke` from a native `pipeline:` step for model-backed work:
```sh
llm.invoke --prompt 'Summarize this diff'
llm.invoke --provider openclaw --prompt 'Summarize this diff'
llm.invoke --provider pi --prompt 'Summarize this diff'
```

Provider resolution order:
1. `--provider`
2. `LOBSTER_LLM_PROVIDER`
3. auto-detect from environment
Built-in providers today:
- `openclaw` via `OPENCLAW_URL` / `OPENCLAW_TOKEN`
- `pi` via `LOBSTER_PI_LLM_ADAPTER_URL` (typically supplied by the Pi extension)
- `http` via `LOBSTER_LLM_ADAPTER_URL`
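The resolution order and the per-provider environment variables above can be sketched as one selection function. The variable names come from this document; the selection logic itself (and the order auto-detection checks them in) is an assumption:

```javascript
// Sketch: pick an LLM provider. Explicit flag wins, then the generic env var,
// then auto-detection from provider-specific env vars.
function resolveProvider(flag, env) {
  if (flag) return flag;                                   // --provider
  if (env.LOBSTER_LLM_PROVIDER) return env.LOBSTER_LLM_PROVIDER;
  if (env.OPENCLAW_URL) return "openclaw";                 // OpenClaw gateway
  if (env.LOBSTER_PI_LLM_ADAPTER_URL) return "pi";         // Pi extension
  if (env.LOBSTER_LLM_ADAPTER_URL) return "http";          // generic HTTP adapter
  throw new Error("no LLM provider configured");
}
```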
`llm_task.invoke` remains available as a backward-compatible alias for the OpenClaw provider.
Shell `run:` steps execute in your system shell, so OpenClaw tool calls there must be real executables.
If you install Lobster via npm/pnpm, it installs a small shim executable named:
- `openclaw.invoke` (preferred)
- `clawd.invoke` (alias)
These shims forward to the Lobster pipeline command of the same name.
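A minimal sketch of what that forwarding could look like (the function, and mapping the alias onto the preferred name, are assumptions; only the shim names and the alias relationship come from this document):

```javascript
// Sketch: build the lobster invocation a shim would forward to. The shim's
// own executable name selects the pipeline command of the same name.
function buildForwardedCommand(shimName, argv) {
  // "clawd.invoke" is documented as an alias for "openclaw.invoke".
  const name = shimName === "clawd.invoke" ? "openclaw.invoke" : shimName;
  return ["lobster", [name, ...argv].join(" ")];
}
```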
Prereqs:
- `OPENCLAW_URL` points at a running OpenClaw gateway
- optionally `OPENCLAW_TOKEN` if auth is enabled
```sh
export OPENCLAW_URL=http://127.0.0.1:18789
# export OPENCLAW_TOKEN=...
```

In a workflow:
```yaml
name: hello-world
steps:
  - id: greeting
    run: >
      openclaw.invoke --tool llm-task --action json --args-json '{"prompt":"Hello"}'
```

Use `stdin: $stepId.stdout` to pipe output from one step into the next.
`${arg}` substitution is a raw string replace into the shell command text.
For anything that may contain quotes, `$`, backticks, or newlines, prefer env vars:
- every resolved workflow arg is exposed as `LOBSTER_ARG_<NAME>` (uppercased, non-alnum → `_`)
- the full args object is also available as `LOBSTER_ARGS_JSON`
Example:
```yaml
args:
  text:
    default: ""
steps:
  - id: safe
    env:
      TEXT: "$LOBSTER_ARG_TEXT"
    command: |
      jq -n --arg text "$TEXT" '{"result": $text}'
```
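The arg-to-env-var name mapping ("uppercased, non-alnum → `_`") can be sketched in one line; anything beyond that stated rule, such as how leading digits are treated, is not specified here and this sketch makes no claim about it:

```javascript
// Sketch: map a workflow arg name to its LOBSTER_ARG_* env var name.
// Uppercase the name, then replace every non-alphanumeric character with "_".
function argEnvName(name) {
  return "LOBSTER_ARG_" + name.toUpperCase().replace(/[^A-Z0-9]/g, "_");
}
```

So an arg named `text` is read as `$LOBSTER_ARG_TEXT` in the example above, and a name like `repo-name` becomes `LOBSTER_ARG_REPO_NAME`.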