feat: add multi-provider AI support (Groq, OpenAI, Anthropic, Google) #35
dhairyashiil wants to merge 4 commits into main from
Conversation
Replace the hardcoded Groq provider with a dynamic, env-driven provider selector supporting Groq, OpenAI, Anthropic, and Google Gemini, controlled by the AI_PROVIDER and AI_MODEL env vars.

- Rewrite lib/ai-provider.ts with a provider registry and getModel() switch
- Install @ai-sdk/anthropic and @ai-sdk/google packages
- Generalize error helpers in lib/agent.ts for all providers
- Add AI_PROVIDER validation in lib/env.ts
- Update .env.example and README.md documentation
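The registry-plus-switch shape described above can be sketched roughly as follows. This is an illustrative reconstruction, not the actual diff: the real `getModel()` returns a Vercel AI SDK `LanguageModel` built from the `@ai-sdk/*` packages, and the non-Groq default model IDs below are placeholders, since only the Groq default is discussed in this PR.

```typescript
// Hypothetical sketch of the env-driven provider registry described in
// the PR. The real implementation would construct SDK clients from
// @ai-sdk/groq, @ai-sdk/openai, @ai-sdk/anthropic, and @ai-sdk/google;
// here each entry only records a default model ID so the selection
// logic is visible on its own.
type Provider = "groq" | "openai" | "anthropic" | "google";

const PROVIDER_CONFIG: Record<Provider, { defaultModel: string }> = {
  groq: { defaultModel: "llama-3.3-70b-versatile" }, // Groq default per the PR
  openai: { defaultModel: "gpt-4o-mini" },           // placeholder
  anthropic: { defaultModel: "claude-3-5-sonnet-latest" }, // placeholder
  google: { defaultModel: "gemini-1.5-flash" },      // placeholder
};

// In the app, `provider` and `model` would come from process.env.AI_PROVIDER
// and process.env.AI_MODEL respectively.
export function resolveModelId(provider: string, model?: string): string {
  if (!(provider in PROVIDER_CONFIG)) {
    throw new Error(`Unsupported AI_PROVIDER: ${provider}`);
  }
  // AI_MODEL overrides the provider's default when set.
  return model ?? PROVIDER_CONFIG[provider as Provider].defaultModel;
}
```

Because the rest of the codebase consumes the provider-agnostic model object, only this module needs to change when a provider is added.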
Deployment failed with the following error. View Documentation: https://vercel.com/docs/accounts/team-members-and-roles
1 issue found across 7 files
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.

<file name="apps/chat/lib/agent.ts">
  <violation number="1" location="apps/chat/lib/agent.ts:888">
    P2: Treating any 429 as an AI token limit can misclassify unrelated 429 errors and return an incorrect user message.
  </violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
3 issues found across 7 files
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.

<file name="apps/chat/lib/ai-provider.ts">
  <violation number="1" location="apps/chat/lib/ai-provider.ts:21">
    P2: Keep the previous Groq fallback model or existing deployments without `AI_MODEL` will silently change behavior.
  </violation>
</file>
<file name="apps/chat/lib/agent.ts">
  <violation number="1" location="apps/chat/lib/agent.ts:868">
    P2: Don't classify every `function_call` mention as a tool-call failure.
  </violation>
  <violation number="2" location="apps/chat/lib/agent.ts:888">
    P2: Avoid treating every 429 as a quota-exhaustion error.
  </violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
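The `function_call` concern above is about substring matching being too broad. One way it could be tightened is to require failure context around the mention, or a structured error code, rather than flagging any message that contains the token. A hedged sketch (`isAIToolCallError` is the helper named in the PR, but this body and the `tool_call_failed` code are illustrative assumptions, not the actual diff):

```typescript
// Illustrative sketch: classify an error as a tool-call failure only when
// a structured code says so, or when the message pairs a failure word with
// the tool-call vocabulary -- not whenever "function_call" merely appears.
interface AgentError {
  code?: string;      // assumed structured error code, when the SDK provides one
  message: string;
}

function isAIToolCallError(err: AgentError): boolean {
  if (err.code === "tool_call_failed") return true; // hypothetical code name
  const msg = err.message.toLowerCase();
  // Require failure context, not just the substring "function_call".
  return /(failed|invalid|malformed).*(function_call|tool call)/.test(msg);
}
```

Under this shape, a model reply that happens to mention "function_call" in passing would no longer be misrouted to the tool-call error path.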
Deployment failed with the following error. View Documentation: https://vercel.com/docs/accounts/team-members-and-roles
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
1 issue found across 1 file (changes from recent commits).
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.

<file name="apps/chat/lib/ai-provider.ts">
  <violation number="1" location="apps/chat/lib/ai-provider.ts:21">
    P2: This reverts the Groq default model back to `openai/gpt-oss-120b`, which contradicts the PR's stated default change to `llama-3.3-70b-versatile`. Default deployments will keep using the old model.
  </violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
1 issue found across 1 file (changes from recent commits).
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.

<file name="apps/chat/lib/agent.ts">
  <violation number="1" location="apps/chat/lib/agent.ts:2029">
    P2: Plain HTTP 429 responses will no longer be recognized as AI rate limits.
  </violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
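The two 429-related findings in this thread pull in opposite directions: treating every 429 as AI quota exhaustion over-matches, while requiring an exact phrase under-matches. A middle ground is to gate on the status code and then look for any of several provider-style rate-limit hints. A hedged sketch (`isAIRateLimitError` is the helper named in the PR; the hint list and error shape here are assumptions for illustration):

```typescript
// Illustrative sketch: a 429 counts as an AI rate limit only when the
// message also carries a rate-limit-style hint, so unrelated 429s (e.g.
// from a different upstream service) fall through to generic handling.
interface ProviderError {
  statusCode?: number;
  message: string;
}

// Hint phrases assumed from common provider wording; not exhaustive.
const RATE_LIMIT_HINTS = ["rate limit", "rate_limit", "quota", "tokens per minute"];

function isAIRateLimitError(err: ProviderError): boolean {
  if (err.statusCode !== 429) return false;
  const msg = err.message.toLowerCase();
  return RATE_LIMIT_HINTS.some((hint) => msg.includes(hint));
}
```

Broadening the hint list (rather than matching any 429) keeps the reviewer's "plain 429" case distinguishable from genuine model-quota errors.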
feat: add multi-provider AI support (Groq, OpenAI, Anthropic, Google)
Summary
Replaces the hardcoded Groq provider in apps/chat with a dynamic, env-driven provider selector. Two env vars now control AI behavior:

- AI_PROVIDER — groq | openai | anthropic | google (default: groq)
- AI_MODEL — optional model override (each provider has a sensible default)

The Vercel AI SDK's provider-agnostic LanguageModel interface means only lib/ai-provider.ts needs to know which provider is active — the rest of the codebase (streamText, tools, etc.) is unchanged.

Files changed (6):
- apps/chat/package.json: add @ai-sdk/anthropic, @ai-sdk/google
- apps/chat/lib/ai-provider.ts: PROVIDER_CONFIG registry + getModel() exhaustive switch
- apps/chat/lib/agent.ts: generalize isAIToolCallError / isAIRateLimitError for all providers
- apps/chat/lib/env.ts: validate the AI_PROVIDER value and that the selected provider's API key is present
- apps/chat/.env.example
- apps/chat/README.md

Review & Testing Checklist for Human
- Default Groq model changed: openai/gpt-oss-120b → llama-3.3-70b-versatile. Verify this is the intended default for existing deployments, or whether existing deploys should set AI_MODEL=openai/gpt-oss-120b explicitly.
- "function_call" pattern in isAIToolCallError: this substring is fairly generic. Confirm it won't false-positive on non-error messages that mention "function_call" in passing.
- isAIRateLimitError now returns true for any 429 status code, whereas previously it also required "retry" in the error message. Verify this won't cause unintended error handling behavior.
- Exercise at least one non-Groq provider (e.g. AI_PROVIDER=openai with a real API key) to confirm end-to-end streaming works through the agent.
- New packages use ^ ranges (^3.0.58, ^3.0.43) while the existing @ai-sdk/groq and @ai-sdk/openai are pinned exactly. Confirm this inconsistency is acceptable.

Notes
Typecheck passes (bun run typecheck:chat). Pre-existing biome lint errors in packages/cli/ are unrelated to this PR.
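The env validation the PR adds in lib/env.ts (a valid AI_PROVIDER plus a present API key for that provider) could be sketched as below. This is a guess at the shape, not the actual diff; the API-key variable names are assumptions based on the SDK packages' conventional env vars.

```typescript
// Hypothetical sketch of the AI_PROVIDER validation described for
// apps/chat/lib/env.ts: the provider must be a known value, and the
// matching API key variable must be set. Variable names are assumed.
const API_KEY_VAR: Record<string, string> = {
  groq: "GROQ_API_KEY",
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  google: "GOOGLE_GENERATIVE_AI_API_KEY",
};

// Returns a list of human-readable problems; empty means the env is valid.
function validateAIEnv(env: Record<string, string | undefined>): string[] {
  const errors: string[] = [];
  const provider = env.AI_PROVIDER ?? "groq"; // PR's stated default
  const keyVar = API_KEY_VAR[provider];
  if (!keyVar) {
    errors.push(`AI_PROVIDER must be one of: ${Object.keys(API_KEY_VAR).join(", ")}`);
  } else if (!env[keyVar]) {
    errors.push(`${keyVar} is required when AI_PROVIDER=${provider}`);
  }
  return errors;
}
```

Failing fast at startup like this surfaces a missing key as a clear config error instead of a mid-request provider failure.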