
Bring your own key (BYOK)

Three ways to skip the helpbase hosted proxy and call your LLM provider directly. For CI, power users, and privacy-sensitive workflows.


helpbase is free to use with a one-command login — 500,000 tokens per day, no credit card. That covers most people.

You might want your own key instead when:

  • You're running in CI. helpbase login is interactive; CI machines need a key that's already in the environment.
  • You already have a provider key. Most teams have an Anthropic or OpenAI key lying around. Use it directly — no second provider to sign up for.
  • Free-tier limits are too low for your workload. 500k/day is ~10 full generate runs or ~250 context --ask calls. BYOK is unlimited (within your provider's own spend cap).
  • You want zero helpbase.dev round-trips. Every BYOK call goes direct to your provider. helpbase never sees your prompts.

Three keys, any one works

Set whichever one you already have. First key found wins; precedence is AI_GATEWAY_API_KEY > ANTHROPIC_API_KEY > OPENAI_API_KEY.
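The precedence rule can be sketched as a small shell function — illustrative only, not helpbase's actual source:

```shell
# Mirrors the documented precedence: AI_GATEWAY_API_KEY > ANTHROPIC_API_KEY > OPENAI_API_KEY.
resolve_byok_mode() {
  if [ -n "${AI_GATEWAY_API_KEY:-}" ]; then
    echo "gateway"           # routes any provider via Vercel AI Gateway
  elif [ -n "${ANTHROPIC_API_KEY:-}" ]; then
    echo "anthropic-direct"  # requires --model anthropic/<id>
  elif [ -n "${OPENAI_API_KEY:-}" ]; then
    echo "openai-direct"     # requires --model openai/<id>
  else
    echo "hosted"            # free-tier proxy, 500k tokens/day
  fi
}
```

Note that setting AI_GATEWAY_API_KEY alongside a direct key means the Gateway wins, which is why unsetting only one key may not change your mode.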

Env var              Routes to                  Use when
ANTHROPIC_API_KEY    @ai-sdk/anthropic direct   You already have an Anthropic key. Pair with --model anthropic/<id>.
OPENAI_API_KEY       @ai-sdk/openai direct      You already have an OpenAI key. Pair with --model openai/<id>.
AI_GATEWAY_API_KEY   Vercel AI Gateway          You want one key to route any provider (Anthropic, OpenAI, Google, etc.) with one bill.
┌──────────────────┐
│ helpbase CLI     │─────► Vercel AI SDK direct ─────► your provider
│                  │                                    (your key, your bill)
└──────────────────┘
  (no quota check, no usage log, no helpbase.dev round-trip)

No changes to prompts, no different models — same output as the hosted path, just routed direct.

Anthropic direct

export ANTHROPIC_API_KEY=sk-ant-...
helpbase generate --url https://docs.example.com \
  --model anthropic/claude-3-5-sonnet-latest

The --model prefix must be anthropic/ when only ANTHROPIC_API_KEY is set. Any other prefix will error with a clear message telling you the exact command to try instead.

Get a key at console.anthropic.com.

OpenAI direct

export OPENAI_API_KEY=sk-...
helpbase generate --url https://docs.example.com \
  --model openai/gpt-4o-mini

Same rule: the --model prefix must be openai/ when only OPENAI_API_KEY is set.

Get a key at platform.openai.com.
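The prefix rule for both direct modes can be expressed as a pre-flight check. This is a simplified sketch (it ignores Gateway precedence), not helpbase's actual validation code:

```shell
# Returns success only if the --model prefix is usable with the keys currently set.
check_model_prefix() {
  case "$1" in
    anthropic/*) [ -n "${ANTHROPIC_API_KEY:-}" ] || [ -n "${AI_GATEWAY_API_KEY:-}" ] ;;
    openai/*)    [ -n "${OPENAI_API_KEY:-}" ]    || [ -n "${AI_GATEWAY_API_KEY:-}" ] ;;
    *)           [ -n "${AI_GATEWAY_API_KEY:-}" ] ;;  # other providers need the Gateway key
  esac
}
```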

Vercel AI Gateway (multi-provider)

export AI_GATEWAY_API_KEY=vck_live_...
helpbase generate --url https://docs.example.com
# Any --model works:
helpbase generate --url https://docs.example.com --model anthropic/claude-sonnet-4.6
helpbase generate --url https://docs.example.com --model google/gemini-3.1-flash-lite-preview
helpbase generate --url https://docs.example.com --model openai/gpt-4o-mini
  1. Sign in at vercel.com/ai-gateway.
  2. Create an API key.
  3. Fund it (first $5 is typically free credit).
  4. export AI_GATEWAY_API_KEY=vck_... or drop it in .env.local.
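If you go the .env.local route, one way to load it into the current shell looks like this. It assumes the file holds plain KEY=value lines (no quoting, no export keyword); the key value shown is a placeholder:

```shell
# Persist the key, then auto-export everything sourced from the file.
echo 'AI_GATEWAY_API_KEY=vck_live_example' >> .env.local
set -a          # mark all subsequently assigned variables for export
. ./.env.local
set +a
```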

CI usage

In GitHub Actions and other CI, set whichever key you use as a repo secret:

- run: helpbase generate --url https://docs.example.com
  env:
    ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}

CI users don't need helpbase login. If you still want to run helpbase deploy from CI (which requires a helpbase account), set HELPBASE_TOKEN as well — that's the helpbase session, separate from the provider key.
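For CI systems other than GitHub Actions, a small guard that fails fast when a secret wasn't injected can save a confusing downstream error. This is a hypothetical helper, not part of helpbase:

```shell
# Fail early with a readable message if a required CI secret is missing.
require_secret() {
  eval "val=\${$1:-}"
  if [ -z "$val" ]; then
    echo "missing CI secret: $1" >&2
    return 1
  fi
}
# In a real pipeline you would then run, e.g.:
#   require_secret ANTHROPIC_API_KEY && helpbase generate --url https://docs.example.com
#   require_secret HELPBASE_TOKEN    && helpbase deploy
```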

Switching back to hosted

Unset whichever key is active:

unset ANTHROPIC_API_KEY   # or OPENAI_API_KEY, or AI_GATEWAY_API_KEY

The next CLI command will use the helpbase hosted proxy again. No state to clear, no account to delete.

Verify which mode you're in

helpbase whoami

whoami names the active key if BYOK is on: "BYOK mode: ANTHROPIC_API_KEY is set — calls bypass helpbase (no quota applied)". Otherwise it shows your usage against the free-tier cap.

Costs

BYOK costs are whatever your provider charges for the model you pick. Rough costs for a 9-article helpbase generate run (~50,000 tokens):

  • Gemini 3.1 Flash Lite (Gateway default, --test): ~$0.004 per run.
  • Claude Sonnet 4.6 (--model anthropic/claude-sonnet-4.6): ~$0.20–0.40.
  • GPT-4o-mini (--model openai/gpt-4o-mini): ~$0.01–0.03.
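To sanity-check these figures yourself, here is a back-of-envelope calculation. The prices are OpenAI's published gpt-4o-mini rates at the time of writing ($0.15 per 1M input tokens, $0.60 per 1M output tokens) and the 80/20 input/output split is an assumption — verify both against your provider's pricing page:

```shell
awk 'BEGIN {
  tokens = 50000                        # one 9-article generate run
  cost = tokens*0.8/1e6*0.15 + tokens*0.2/1e6*0.60
  printf "~$%.4f per run\n", cost       # falls inside the $0.01-0.03 range above
}'
```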

Troubleshooting

"ANTHROPIC_API_KEY is set but the model is '...'" Direct-key mode requires the --model prefix to match your key. Either pass --model anthropic/<id>, or set AI_GATEWAY_API_KEY instead to route any provider through Gateway.

"ANTHROPIC_API_KEY is not set" but I just exported it. New terminal sessions don't see variables from old sessions. Re-export in the current shell, or persist in .env.local.

"BYOK mode active" but I want to use hosted. Run unset <KEY_NAME> using whichever key helpbase whoami names. You can also delete the line from .env.local.

CI passes the env var but still shows "not signed in". helpbase login and BYOK serve different purposes. Login is for hosted quota + deploy authorization. BYOK bypasses quota only. For deploy in CI, you also need HELPBASE_TOKEN (a helpbase session token) alongside your provider key.