Step 9: Model Configuration
Configure API keys for cloud models and test LLM connectivity.
API Key Setup
Option 1: Groq (Recommended - Free Tier)
- Get your API key from console.groq.com
- Add it to a .env file in the project root:
cat > .env << 'ENVEOF'
GROQ_API_KEY=gsk_your_key_here
ENVEOF
Option 2: OpenRouter
echo 'OPENROUTER_API_KEY=sk-or-v1-your-key-here' >> .env
Load API Keys
set -a
source .env
set +a
# Verify
echo $GROQ_API_KEY
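If you launch tooling directly from Python rather than a shell, the same .env file can be loaded in-process. This is a minimal stand-in for the shell's set -a / source .env above (the load_env_file helper is hypothetical, not part of the project; it handles only plain KEY=value lines, with no export keywords or quoting):

```python
import os


def load_env_file(path=".env"):
    """Parse simple KEY=value lines from a .env file into os.environ.

    Skips blank lines and comments; does not handle quoting or `export`.
    """
    loaded = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip()
    os.environ.update(loaded)
    return loaded


if __name__ == "__main__":
    load_env_file()
    print("GROQ_API_KEY set:", "GROQ_API_KEY" in os.environ)
```

Libraries such as python-dotenv do the same job more robustly; this sketch just avoids an extra dependency.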
Check Model Configuration
The config/models.json file defines available models:
cat config/models.json
Default configuration includes:
- Cloud models: Groq Llama 3.3 70B (linear + agentic)
- Local models: TinyLlama 1B (fallback)
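For orientation, a config/models.json along these lines would match the defaults listed above. This is an illustrative sketch only; the field names and model identifiers are assumptions, so check the actual file in your checkout:

```json
{
  "cloud_linear": {
    "provider": "groq",
    "name": "Groq Llama 3.3 70B"
  },
  "cloud_agentic": {
    "provider": "groq",
    "name": "Groq Llama 3.3 70B (Agentic)"
  },
  "local_linear": {
    "provider": "local",
    "name": "TinyLlama 1B Offline"
  },
  "local_agentic": {
    "provider": "local",
    "name": "TinyLlama 1B Offline (Agentic)"
  }
}
```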
Test LLM Connectivity
Run the LLM test suite:
python core/execution/tests/test_llm_setup.py
Expected output:
======================================================================
LLM SETUP TEST RESULTS
======================================================================
✅ CLOUD_LINEAR
   Provider: groq
   Model: Groq Llama 3.3 70B
   Response: [actual response]
✅ CLOUD_AGENTIC
   Provider: groq
   Model: Groq Llama 3.3 70B (Agentic)
   Response: [actual response]
✅ LOCAL_LINEAR
   Provider: local
   Model: TinyLlama 1B Offline
   Response: [actual response]
✅ LOCAL_AGENTIC
   Provider: local
   Model: TinyLlama 1B Offline (Agentic)
   Response: [actual response]
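If the test suite fails and you want to isolate whether the problem is your key or the project code, a standalone probe of Groq's OpenAI-compatible chat endpoint can help. This is a hedged sketch, separate from the project's test suite; the endpoint URL and model name are assumptions based on Groq's public API:

```python
import json
import os
import urllib.request

# Groq exposes an OpenAI-compatible chat-completions endpoint (assumption).
GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"


def build_payload(model, prompt):
    """Assemble the minimal chat-completions request body."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}


def probe_groq(api_key, model="llama-3.3-70b-versatile"):
    """Send one short prompt and return the model's reply text."""
    req = urllib.request.Request(
        GROQ_URL,
        data=json.dumps(build_payload(model, "Reply with OK")).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    key = os.environ.get("GROQ_API_KEY")
    if key:
        print(probe_groq(key))
    else:
        print("GROQ_API_KEY not set - run `source .env` first")
```

A 401 response here, with the project code out of the picture, points squarely at the API key.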
⚠️ Troubleshooting LLM Issues
| Error | Solution |
|---|---|
| 401 Unauthorized | API key invalid - check .env file |
| No API key found | Keys not loaded - run source .env |
| ModuleNotFoundError | Install missing deps: pip install -r requirements.txt |
| execution_time_ms error | Update to latest code: git pull |
Next Step
Proceed to Quick Start Guide to run your first experiment!