---
title: Configuration
sidebarTitle: "Configuration"
---

Before getting started with Task Master, you'll need to set up your API keys. There are two ways to do this, depending on whether you're using the CLI or working inside MCP. It's also a good time to get familiar with the other configuration options available: even if you don't need to adjust them yet, knowing what's possible will help down the line.

## API Key Setup

Task Master uses environment variables to securely store provider API keys and optional endpoint URLs.

### MCP Usage: `mcp.json` File

For MCP/Cursor usage, configure keys in the `env` section of your `.cursor/mcp.json` file.

```json .cursor/mcp.json lines
{
  "mcpServers": {
    "task-master-ai": {
      "command": "npx",
      "args": ["-y", "task-master-ai"],
      "env": {
        "ANTHROPIC_API_KEY": "ANTHROPIC_API_KEY_HERE",
        "PERPLEXITY_API_KEY": "PERPLEXITY_API_KEY_HERE",
        "OPENAI_API_KEY": "OPENAI_API_KEY_HERE",
        "GOOGLE_API_KEY": "GOOGLE_API_KEY_HERE",
        "XAI_API_KEY": "XAI_API_KEY_HERE",
        "OPENROUTER_API_KEY": "OPENROUTER_API_KEY_HERE",
        "MISTRAL_API_KEY": "MISTRAL_API_KEY_HERE",
        "AZURE_OPENAI_API_KEY": "AZURE_OPENAI_API_KEY_HERE",
        "OLLAMA_API_KEY": "OLLAMA_API_KEY_HERE",
        "GITHUB_API_KEY": "GITHUB_API_KEY_HERE"
      }
    }
  }
}
```
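A stray comma or unquoted key in `mcp.json` will stop the MCP server from loading, so it is worth syntax-checking the file after editing it. One quick way is to run it through Python's built-in `json.tool` module. This is a rough sketch: the `/tmp/mcp.json` path and the trimmed config written below are placeholders for illustration; in practice you would point `json.tool` at your real `.cursor/mcp.json`.

```shell
# Illustrative only: write a trimmed demo config to a temp path.
# In a real project, run json.tool against .cursor/mcp.json instead.
cat > /tmp/mcp.json <<'EOF'
{
  "mcpServers": {
    "task-master-ai": {
      "command": "npx",
      "args": ["-y", "task-master-ai"],
      "env": { "ANTHROPIC_API_KEY": "ANTHROPIC_API_KEY_HERE" }
    }
  }
}
EOF

# json.tool exits non-zero (and reports the error position) on bad JSON.
if python3 -m json.tool /tmp/mcp.json > /dev/null; then
  mcp_check="valid"
else
  mcp_check="invalid"
fi
echo "mcp.json: $mcp_check JSON"
```

If the file is malformed, the error message includes the line and column of the first problem, which is usually enough to spot a trailing comma.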

<Tip>
**Optimize Context Usage**: You can control which Task Master MCP tools are loaded using the `TASK_MASTER_TOOLS` environment variable. This helps reduce LLM context usage by only loading the tools you need.

Options:

- `all` (default) - All 36 tools
- `standard` - 15 commonly used tools
- `core` or `lean` - 7 essential tools

Example:

```json
"env": {
  "TASK_MASTER_TOOLS": "standard",
  "ANTHROPIC_API_KEY": "your_key_here"
}
```

See the [MCP Tools documentation](/capabilities/mcp#configurable-tool-loading) for details.
</Tip>

### CLI Usage: `.env` File

Create a `.env` file in your project root and include the keys for the providers you plan to use:

```bash .env lines
# Required API keys for providers configured in .taskmaster/config.json
ANTHROPIC_API_KEY=sk-ant-api03-your-key-here
PERPLEXITY_API_KEY=pplx-your-key-here
# OPENAI_API_KEY=sk-your-key-here
# GOOGLE_API_KEY=AIzaSy...
# AZURE_OPENAI_API_KEY=your-azure-openai-api-key-here
# etc.

# Optional endpoint overrides
# Use a specific provider's base URL, e.g., for an OpenAI-compatible API
# OPENAI_BASE_URL=https://api.third-party.com/v1
#
# Azure OpenAI configuration
# AZURE_OPENAI_ENDPOINT=https://your-resource-name.openai.azure.com/ or https://your-endpoint-name.cognitiveservices.azure.com/openai/deployments
# OLLAMA_BASE_URL=http://custom-ollama-host:11434/api

# Google Vertex AI configuration (required if using the 'vertex' provider)
# VERTEX_PROJECT_ID=your-gcp-project-id
```
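Placeholder values like `ANTHROPIC_API_KEY_HERE` are easy to forget to replace. As a rough sketch (the `/tmp/demo.env` file below is a stand-in for your project's `.env`, and the `_HERE` suffix check is just a heuristic), you can scan for keys that are still placeholders without ever printing a secret value:

```shell
# Illustrative only: create a stand-in .env; in practice, read your real one.
cat > /tmp/demo.env <<'EOF'
ANTHROPIC_API_KEY=sk-ant-api03-real-looking-key
PERPLEXITY_API_KEY=PERPLEXITY_API_KEY_HERE
EOF

# Report each key as set or placeholder; never echo the value itself.
report=$(
  while IFS='=' read -r key value; do
    case "$key" in ''|\#*) continue ;; esac
    case "$value" in
      ''|*_HERE) echo "$key: placeholder or empty" ;;
      *)         echo "$key: set" ;;
    esac
  done < /tmp/demo.env
)
echo "$report"
```

Splitting on the first `=` with `IFS` keeps values containing `=` (common in base64-encoded keys) intact, and comparing only the value's shape avoids leaking it into logs or terminal history.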

## What Else Can Be Configured?

The main configuration file (`.taskmaster/config.json`) allows you to control nearly every aspect of Task Master's behavior. Here's a high-level look at what you can customize:

<Tip>
You don't need to configure everything up front. Most settings can be left as defaults or updated later as your workflow evolves.
</Tip>

<Accordion title="View Configuration Options">

### Models and Providers
- Role-based model setup: `main`, `research`, `fallback`
- Provider selection (Anthropic, OpenAI, Perplexity, etc.)
- Model IDs per role
- Temperature, max tokens, and other generation settings
- Custom base URLs for OpenAI-compatible APIs

### Global Settings
- `logLevel`: Logging verbosity
- `debug`: Enable/disable debug mode
- `projectName`: Optional name for your project
- `defaultTag`: Default tag for task grouping
- `defaultSubtasks`: Number of subtasks to auto-generate
- `defaultPriority`: Priority level for new tasks

### API Endpoint Overrides
- `ollamaBaseURL`: Custom Ollama server URL
- `azureBaseURL`: Global Azure endpoint
- `vertexProjectId`: Google Vertex AI project ID
- `vertexLocation`: Region for Vertex AI models

### Tag and Git Integration
- Default tag context per project
- Support for task isolation by tag
- Manual tag creation from Git branches

### State Management
- Active tag tracking
- Migration state
- Last tag switch timestamp

</Accordion>
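To make the lists above concrete, here is a hedged sketch of what a `.taskmaster/config.json` might look like. The field names (`logLevel`, `defaultTag`, `vertexProjectId`, the `main`/`research`/`fallback` roles, and so on) come from the sections above; the exact nesting, model IDs, and values shown are illustrative assumptions, not a canonical schema, so check the Advanced Configuration Guide for the authoritative shape.

```json
{
  "models": {
    "main": { "provider": "anthropic", "modelId": "example-main-model" },
    "research": { "provider": "perplexity", "modelId": "example-research-model" },
    "fallback": { "provider": "openai", "modelId": "example-fallback-model" }
  },
  "global": {
    "logLevel": "info",
    "debug": false,
    "projectName": "my-project",
    "defaultTag": "master",
    "defaultSubtasks": 5,
    "defaultPriority": "medium",
    "ollamaBaseURL": "http://localhost:11434/api",
    "vertexProjectId": "your-gcp-project-id",
    "vertexLocation": "us-central1"
  }
}
```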

<Note>
For advanced configuration options and detailed customization, see our [Advanced Configuration Guide](/best-practices/configuration-advanced) page.
</Note>