Setting Up Anthropic

Anthropic's Claude models are an excellent choice for tasks that demand high accuracy, complex reasoning, or large context windows. Quincy supports Anthropic as a first-class provider alongside local models.

Get an API Key

  1. Go to console.anthropic.com
  2. Create an account or sign in
  3. Navigate to API Keys and generate a new key
  4. Copy the key — you'll need it in the next step
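If you want to see what a raw Anthropic call looks like before handing the key to Quincy, the Messages API expects the key in an `x-api-key` header along with an `anthropic-version` header. A minimal sketch that builds (but does not send) such a request using only the standard library — the model name and prompt are placeholders:

```python
import json
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"

def build_request(api_key: str, prompt: str) -> urllib.request.Request:
    """Build a Messages API request; the key travels only in the x-api-key header."""
    body = json.dumps({
        "model": "claude-sonnet-4-5",  # placeholder model name
        "max_tokens": 64,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        method="POST",
    )

req = build_request("sk-ant-...", "Say hello")
print(req.full_url)  # → https://api.anthropic.com/v1/messages
```

Sending this request (for example with `urllib.request.urlopen(req)`) against a valid key is a quick way to confirm the key works before configuring Quincy.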

Add Anthropic to Quincy

If this is your first time running Quincy, the onboarding wizard will offer Anthropic as a provider option. Select it and paste your API key when prompted.

If Quincy is already set up, you can add Anthropic as an additional provider during any conversation — just ask:

> Add Anthropic as a provider

Quincy will prompt you for your API key interactively.

Where Is the Key Stored?

Your API key is stored in the macOS Keychain; it is never written to a config file, environment variable, or any other plaintext location on disk, and Keychain access controls keep it isolated from other applications.

When an agent needs to call the Anthropic API, it retrieves the key from the Keychain at runtime. The key is passed directly to the HTTP client — the LLM itself never sees it. See Security & Trust for the full picture.
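The runtime flow described above can be sketched as follows. The service name `quincy-anthropic` and the helper names are assumptions for illustration, not Quincy's actual internals; `security` is macOS's command-line interface to the Keychain:

```python
import subprocess

def read_key_from_keychain(service: str = "quincy-anthropic") -> str:
    """Fetch the API key from the macOS Keychain at call time (macOS only).

    'security find-generic-password -w' prints just the stored secret.
    The service name here is illustrative, not Quincy's actual entry.
    """
    result = subprocess.run(
        ["security", "find-generic-password", "-s", service, "-w"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def anthropic_headers(api_key: str) -> dict:
    """The key goes straight into the HTTP headers; it never enters a prompt."""
    return {
        "x-api-key": api_key,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    }
```

The point of this shape is that the key flows from the Keychain into the HTTP layer in one step, so no prompt, log, or model context ever contains it.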

When to Use Cloud Models

Cloud models like Claude shine for:

  • Complex reasoning — Multi-step analysis, planning, and decision-making
  • Long context — Processing large documents or conversation histories
  • Accuracy-critical tasks — When the task demands a level of reasoning that local models may not reliably deliver
  • Orchestration — The main orchestrator agent benefits from a more capable model since it decides how to break down and delegate tasks

For routine, focused tasks (file reading, simple queries, structured data extraction), local models are often fast enough and keep everything private. See Choosing Models for guidance on blending local and cloud models.
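One way to apply this guidance is a simple routing rule: send orchestration and long-context work to a cloud model, and keep routine, focused tasks local. A sketch with made-up model identifiers and thresholds (Quincy's actual routing logic may differ):

```python
def pick_model(task_kind: str, context_tokens: int) -> str:
    """Route a task to a cloud or local model (names and threshold are illustrative)."""
    CLOUD = "anthropic/claude-sonnet"  # placeholder identifier
    LOCAL = "local/llama-8b"           # placeholder identifier
    LONG_CONTEXT = 8_000               # tokens beyond a typical local window

    if task_kind in {"orchestration", "planning", "analysis"}:
        return CLOUD                   # complex reasoning favors the stronger model
    if context_tokens > LONG_CONTEXT:
        return CLOUD                   # large inputs need the bigger context window
    return LOCAL                       # routine, focused tasks stay private and fast

print(pick_model("file_read", 500))    # → local/llama-8b
```

Routing on task kind first and context size second keeps private data local by default while still escalating the work that genuinely needs a more capable model.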