
Vercel AI SDK Integration Guide

Use this path when you want a clean streaming chat endpoint and full control over model/provider selection.

Implementation Notes

This setup keeps your AI pipeline inside your own API route. Choose it when you need model and infrastructure control; better-cmdk's default hosted chat is only a free trial path (no signup, 10 requests per 10 minutes).

Best For

- Teams already shipping Vercel AI SDK routes
- Apps that need provider flexibility without changing UI components
- Products that want chat in the command menu before adding AI actions

Quick Setup Checklist

1. Install better-cmdk along with the ai and @ai-sdk/react dependencies.
2. If you need custom model control, create app/api/chat/route.ts with streaming response support.
3. Pass chatEndpoint to CommandMenu and test responses in both themes.
4. Add request logging and rate limits before production rollout.
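Step 2 can be sketched as a minimal streaming route. This is an illustrative example, not better-cmdk's canonical handler: it assumes the OpenAI provider package and AI SDK v4-era method names (`streamText`, `toDataStreamResponse`), which may differ in other SDK versions.

```typescript
// app/api/chat/route.ts — minimal streaming chat route (sketch).
// Assumes: `ai` and `@ai-sdk/openai` are installed; model choice is illustrative.
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o-mini'), // any provider/model works here
    messages,
  });

  // Stream tokens back to the command-menu chat UI.
  return result.toDataStreamResponse();
}
```

With the route in place, step 3 is just wiring: `<CommandMenu chatEndpoint="/api/chat" />`.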

Integration FAQ

Can I use better-cmdk chat without creating my own endpoint first?

Yes. better-cmdk includes a free hosted trial endpoint so you can start without signup, but it is rate-limited to 10 requests per 10 minutes. For production, set your own chatEndpoint or use modifywithai for agentic capabilities.

Can I start with the default chat and add actions later?

Yes. Many teams begin with chat only, then add structured actions and approval rules incrementally.

Does this lock me to one model provider?

No. You can switch model providers in your route without changing CommandMenu wiring.
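Concretely, a provider swap touches only the route body. A hedged fragment, assuming the Anthropic provider package is installed (model ID illustrative):

```typescript
// Same route, different provider — CommandMenu wiring is unchanged.
import { streamText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: anthropic('claude-3-5-sonnet-latest'), // was: openai('gpt-4o-mini')
    messages,
  });
  return result.toDataStreamResponse();
}
```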

Should chatEndpoint be public?

Only for public surfaces. Authenticated product routes should enforce user identity and rate limits.
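One way to enforce the rate-limit half of that rule is a per-user sliding window checked before the model call. A minimal in-memory sketch for a single server process (the identifiers, limits, and window size are illustrative; production deployments typically back this with Redis or an edge rate-limit service):

```typescript
// Sliding-window rate limiter — sketch, single-process in-memory state.
const WINDOW_MS = 60_000;  // 1-minute window (illustrative)
const MAX_REQUESTS = 20;   // per user per window (illustrative)

const hits = new Map<string, number[]>();

export function allowRequest(userId: string, now = Date.now()): boolean {
  // Keep only timestamps still inside the window.
  const recent = (hits.get(userId) ?? []).filter((t) => now - t < WINDOW_MS);
  if (recent.length >= MAX_REQUESTS) {
    hits.set(userId, recent);
    return false; // caller should respond with 429
  }
  recent.push(now);
  hits.set(userId, recent);
  return true;
}
```

In the route handler, resolve the authenticated user first, call `allowRequest(userId)`, and return a 429 response when it is false.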