Getting a Groq API Key for the Gaffer

The Gaffer supports three AI providers — Groq, OpenAI, and Anthropic. Groq is the recommended starting point. It is fast, generous on the free tier, and requires no credit card to get started.

This guide covers Groq setup as of March 2026.


Why Groq

Groq runs on custom LPU (Language Processing Unit) hardware purpose-built for inference. The result is inference speeds that are significantly faster than standard GPU-based providers. In the EGAF evaluation, llama-3.3-70b-versatile via Groq produced a 10.8× speedup over the pure Python baseline while returning the correct result. The ungoverned approach was 6.9× faster but returned a wrong answer.

Fast inference matters in the Gaffer because every seal acknowledgment, every execution turn, and every regen after a prune involves a real API call. Groq keeps sessions responsive.


Free tier

Signing up at console.groq.com gives access to a free tier with no credit card required. The free tier lets you call supported models, subject to rate limits on requests and tokens (per minute and per day). For governed sessions in the Gaffer — which are deliberately bounded and single-turn — the free tier is sufficient for personal and evaluation use.

If rate limits become a constraint, Groq's Developer tier offers up to 10× higher limits on a pay-as-you-go basis.


Step 1 — Create an account

  1. Go to console.groq.com
  2. Sign up with an email address or Google account
  3. Verify your email and complete the onboarding steps

Step 2 — Generate an API key

Navigate to the API Keys section and click Create API Key. Enter a descriptive name for the key — something like gaffer-personal — then click Submit. Copy the displayed API key immediately. This is the only time it will be shown.

Store it somewhere secure — a password manager is ideal. Treat it like a password. Anyone with the key can make API calls against your account.
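If you also use the key in scripts outside the Gaffer, read it from an environment variable rather than hardcoding it. A minimal sketch — the GROQ_API_KEY variable name is a common convention, not something the Gaffer requires:

```python
import os

def load_groq_key(env_var: str = "GROQ_API_KEY") -> str:
    """Read the API key from the environment instead of hardcoding it."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; export it from your password manager")
    return key

def mask_key(key: str) -> str:
    """Show only the first and last few characters when logging or debugging."""
    if len(key) <= 8:
        return "*" * len(key)
    return f"{key[:4]}...{key[-4:]}"
```

Masking the key before printing it keeps it out of logs and screenshots, which is the same exposure risk the list above warns about.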


Step 3 — Choose a model

The Gaffer accepts any model string that Groq supports. Two are recommended:

  Model                      Use case
  llama-3.3-70b-versatile    Best quality; recommended for governed sessions
  llama-3.1-8b-instant       Fastest, lowest token consumption; good for high-volume use

The model string goes into the Model field in the Gaffer's Provider Settings panel. Enter it exactly as shown above.

To see all currently available models and their rate limits, visit console.groq.com/docs/models.
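The model list can also be checked programmatically. A minimal sketch, assuming Groq exposes its OpenAI-compatible models endpoint at api.groq.com/openai/v1/models (confirm the current path on the docs page above):

```python
import json
import urllib.request

GROQ_MODELS_URL = "https://api.groq.com/openai/v1/models"

def build_models_request(api_key: str) -> urllib.request.Request:
    """Construct the authenticated GET request for Groq's model list."""
    return urllib.request.Request(
        GROQ_MODELS_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        method="GET",
    )

# To actually fetch (requires a valid key and network access):
# with urllib.request.urlopen(build_models_request(key)) as resp:
#     models = json.load(resp)["data"]
```

The fetch itself is left commented out so the sketch stays runnable without a key.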


Step 4 — Enter your key in the Gaffer

  1. Open the Gaffer at ai.jclabs.tech
  2. Expand Provider Settings at the top of the interface
  3. Select Groq from the provider dropdown
  4. Paste your API key into the API Key field
  5. Enter your chosen model string into the Model field

The key stays local to your session and is never stored or transmitted to JC Laboratories. It goes directly from your browser to Groq's API.
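For reference, the browser-to-Groq call can be sketched outside the Gaffer as well. This assumes Groq's OpenAI-compatible chat-completions endpoint; the endpoint path and payload shape are standard for that API style, but the Gaffer handles all of this for you:

```python
import json
import urllib.request

CHAT_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a direct-to-Groq chat completion request."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        CHAT_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

The model string is the same one entered in the Provider Settings panel, e.g. llama-3.3-70b-versatile.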


Key safety

  • Never paste your API key into a chat, a document, or a public repository
  • If a key is exposed, revoke it immediately from console.groq.com/keys and generate a new one
  • Keys can be scoped per project in Groq's console — create a dedicated key for the Gaffer rather than reusing one from another project

Rate limits

If a session returns a ⚠ 429 Too Many Requests error, the rate limit for the current period has been reached. Per-minute limits reset within a minute; daily limits reset the next day. No action is needed: wait for the reset and continue the session.

For sustained high-volume use, the Groq Developer tier removes this constraint.
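For scripts that call Groq directly (the Gaffer needs no such handling, as noted above), exponential backoff is the usual way to absorb 429s. A sketch, using RuntimeError as a stand-in for however your HTTP client surfaces the error:

```python
import time

def with_backoff(call, max_retries: int = 5, base_delay: float = 1.0):
    """Retry a callable that raises on HTTP 429, doubling the wait each attempt."""
    for attempt in range(max_retries):
        try:
            return call()
        except RuntimeError:  # stand-in for a 429 response from your client
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```

Doubling the delay each attempt rides out per-minute limits without hammering the API; daily limits still require waiting for the reset.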


Other providers

The Gaffer also supports OpenAI and Anthropic. The setup is identical — select the provider, enter the API key, enter the model string. Recommended models:

  Provider     Model string
  OpenAI       gpt-4o
  Anthropic    claude-sonnet-4-6

All three providers are governed identically by the framework once a vial is sealed. The provider choice affects speed, cost, and output style — not governance behavior.
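The recommended defaults above can be captured in a single lookup table, e.g. for tooling that pre-fills the Gaffer's Model field. A sketch using only the model strings from this guide:

```python
# Recommended model string per provider, taken from the tables in this guide.
RECOMMENDED_MODELS = {
    "groq": "llama-3.3-70b-versatile",
    "openai": "gpt-4o",
    "anthropic": "claude-sonnet-4-6",
}

def default_model(provider: str) -> str:
    """Look up the recommended model string for a supported provider."""
    try:
        return RECOMMENDED_MODELS[provider.lower()]
    except KeyError:
        raise ValueError(f"unsupported provider: {provider}") from None
```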