Claude Code skill pack for Clari (18 skills)
Installation
Open Claude Code and run this command:
/plugin install clari-pack@claude-code-plugins-plus
Use --global to install for all projects, or --project for the current project only.
Skills (18)
Integrate Clari export pipeline testing and validation into CI/CD.
Clari CI Integration
Overview
Add Clari export validation to CI: test API connectivity, validate export schemas, and run pipeline integration tests.
Instructions
GitHub Actions Workflow
name: Clari Pipeline Tests
on:
  push:
    paths: ["src/clari/**", "tests/clari/**"]
  schedule:
    - cron: "0 6 * * 1" # Weekly Monday check
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - name: Unit tests (mock data)
        run: pytest tests/ -v -k "not integration"
      - name: Integration test (real API)
        if: github.ref == 'refs/heads/main'
        env:
          CLARI_API_KEY: ${{ secrets.CLARI_API_KEY }}
        run: |
          python -c "
          from clari_client import ClariClient
          client = ClariClient()
          forecasts = client.list_forecasts()
          assert len(forecasts) > 0, 'No forecasts found'
          print(f'Connected: {len(forecasts)} forecasts available')
          "
      - name: Schema validation
        env:
          CLARI_API_KEY: ${{ secrets.CLARI_API_KEY }}
        run: |
          python scripts/validate_schema.py
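The workflow's final step calls scripts/validate_schema.py, which is not shown here. A minimal sketch of what such a script might check, assuming the field names used elsewhere in this pack (ownerEmail, forecastAmount, quotaAmount):

```python
# scripts/validate_schema.py -- minimal sketch; field names are assumptions
REQUIRED_FIELDS = {"ownerEmail", "forecastAmount", "quotaAmount"}

def validate_entries(entries: list[dict]) -> list[tuple[int, list[str]]]:
    """Return (index, missing_fields) for each entry missing a required key."""
    problems = []
    for i, entry in enumerate(entries):
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            problems.append((i, sorted(missing)))
    return problems
```

In the real script you would fetch an export (for example via the ClariClient from earlier skills), run validate_entries on the result, and exit non-zero if any problems are found so the CI step fails.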
Store Secrets
gh secret set CLARI_API_KEY --body "your-api-token"
Resources
Next Steps
For deployment patterns, see clari-deploy-integration.
Diagnose and fix Clari API errors including auth failures, export issues, and data mismatches.
Clari Common Errors
Overview
Diagnostic guide for the most common Clari API issues: authentication failures, empty exports, job timeouts, and data discrepancies.
Error Reference
1. 401 Unauthorized
{"error": "Unauthorized", "message": "Invalid API key"}
Fix: Regenerate token at Clari > User Settings > API Token. Tokens may expire or be revoked by admins.
2. 403 Forbidden -- API Access Not Enabled
{"error": "Forbidden", "message": "API access not enabled for this user"}
Fix: Contact your Clari admin to enable API access. Requires enterprise plan.
3. 404 Forecast Not Found
{"error": "Not Found", "message": "Forecast 'wrong_name' not found"}
Fix: List available forecasts first:
curl -s -H "apikey: ${CLARI_API_KEY}" \
https://api.clari.com/v4/export/forecast/list | jq '.forecasts[].forecastName'
4. Export Returns Empty Entries
The API returns {"entries": []} with no error.
Causes:
- Time period has no submitted forecasts
- User lacks visibility into the forecast hierarchy
- Wrong forecast name (case-sensitive)
Fix: Verify in Clari UI that the forecast has submissions for the requested period.
5. Job Stuck in PENDING
Export job never reaches COMPLETED status.
Causes:
- Very large export (all reps, all periods)
- Clari backend queue congestion
Fix: Increase polling timeout. Break large exports into per-period batches.
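A polling helper with an explicit timeout can make the fix concrete. This is a sketch only; get_job_status is an assumed method name, not part of the client shown elsewhere in this pack:

```python
import time

def wait_for_job(client, job_id: str, timeout_s: int = 600, poll_s: int = 10) -> None:
    """Poll an export job until COMPLETED, raising if the timeout elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = client.get_job_status(job_id)  # assumed helper on your client
        if status == "COMPLETED":
            return
        if status == "FAILED":
            raise RuntimeError(f"Export job {job_id} failed")
        time.sleep(poll_s)
    raise TimeoutError(f"Export job {job_id} still pending after {timeout_s}s")
```

For very large exports, call this once per time period rather than for one job covering all periods.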
6. Data Mismatch Between API and UI
Forecast numbers from API do not match what is shown in Clari UI.
Causes:
- API exports submitted calls, UI may show latest-edited values
- Currency conversion differences
- Time period boundary differences (calendar vs fiscal)
Fix: Use includeHistorical: true to get all submission versions. Match the exact time period label from the UI.
7. Copilot API OAuth Errors
{"error": "invalid_client"}
Fix: The Copilot API uses OAuth2, not API key auth. Register your app at https://api-doc.copilot.clari.com and use client credentials flow.
8. Rate Limit Exceeded
HTTP 429 Too Many Requests
Fix: Implement exponential backoff. See clari-rate-limits for patterns.
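A minimal backoff sketch, in case you need something before reading clari-rate-limits. RateLimitError is a hypothetical exception your HTTP layer would raise on HTTP 429:

```python
import random
import time

class RateLimitError(Exception):
    """Hypothetical: raised by your HTTP layer on HTTP 429."""

def with_backoff(fn, max_retries: int = 5, base_delay: float = 1.0):
    """Retry fn() on rate-limit errors; delay doubles each attempt, plus jitter."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError:
            delay = base_delay * (2 ** attempt) + random.uniform(0, base_delay)
            time.sleep(delay)
    raise RuntimeError("Rate limit retries exhausted")
```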
Quick Diagnostic Commands
# Test API key
curl -s -o /dev/null -w "%{http_code}" \
  -H "apikey: ${CLARI_API_KEY}" \
  https://api.clari.com/v4/export/forecast/list
Build a Clari forecast export pipeline to your data warehouse.
Clari Core Workflow: Forecast Export Pipeline
Overview
Primary workflow: build an automated pipeline that exports forecast submissions, quota, adjustments, and CRM data from Clari to your data warehouse. Supports Snowflake, BigQuery, and PostgreSQL as targets.
Prerequisites
- Completed clari-install-auth and clari-sdk-patterns setup
- Target database or data warehouse with write access
- Python 3.10+ with requests and your DB driver
Instructions
Step 1: Define Export Configuration
# config.py
from dataclasses import dataclass

@dataclass
class ExportConfig:
    forecast_name: str                      # From Clari forecast list
    time_periods: list[str]                 # e.g., ["2026_Q1", "2025_Q4"]
    export_types: list[str] | None = None
    currency: str = "USD"
    include_historical: bool = True

    def __post_init__(self):
        if self.export_types is None:
            self.export_types = [
                "forecast",          # Submitted forecast call
                "forecast_updated",  # Updated forecast history
                "quota",             # Quota values
                "adjustment",        # Manager adjustments
                "crm_total",         # Total CRM pipeline
                "crm_closed",        # Closed-won CRM amounts
            ]
Step 2: Build the Export Pipeline
# export_pipeline.py
from datetime import datetime, timezone

from clari_client import ClariClient
from config import ExportConfig

def run_export(config: ExportConfig) -> list[dict]:
    client = ClariClient()
    all_entries = []
    for period in config.time_periods:
        print(f"Exporting {config.forecast_name} for {period}...")
        data = client.export_and_download(
            forecast_name=config.forecast_name,
            time_period=period,
        )
        entries = data.get("entries", [])
        for entry in entries:
            entry["_exported_at"] = datetime.now(timezone.utc).isoformat()
            entry["_forecast_name"] = config.forecast_name
        all_entries.extend(entries)
        print(f"  {len(entries)} records exported")
    return all_entries

def transform_forecast_data(entries: list[dict]) -> dict:
    total_forecast = sum(e.get("forecastAmount", 0) for e in entries)
    total_quota = sum(e.get("quotaAmount", 0) for e in entries)
    total_closed = sum(e.get("crmClosed", 0) for e in entries)
    total_pipeline = sum(e.get("crmTotal", 0) for e in entries)
    return {
        "total_forecast": total_forecast,
        "total_quota": total_quota,
        "total_closed": total_closed,
        "attainment_percent": (total_closed / total_quota * 100) if total_quota else 0,
        "coverage_ratio": (total_pipeline / total_quota) if total_quota else 0,
    }
Build Clari revenue analytics: pipeline coverage, forecast accuracy, and rep performance dashboards from exported data.
Clari Core Workflow: Revenue Analytics
Overview
Build revenue analytics from Clari export data: forecast accuracy tracking, pipeline coverage analysis, rep performance dashboards, and forecast call change detection.
Prerequisites
- Completed clari-core-workflow-a (export pipeline)
- Historical forecast exports for accuracy tracking
- Pandas/SQL for data analysis
Instructions
Step 1: Forecast Accuracy Analysis
import pandas as pd

def calculate_forecast_accuracy(
    forecasts: list[dict], actuals: list[dict]
) -> pd.DataFrame:
    df_forecast = pd.DataFrame(forecasts)
    df_actual = pd.DataFrame(actuals)
    merged = df_forecast.merge(
        df_actual[["ownerEmail", "crmClosed"]],
        on="ownerEmail",
        suffixes=("_forecast", "_actual"),
    )
    merged["accuracy_pct"] = (
        1 - abs(merged["forecastAmount"] - merged["crmClosed_actual"])
        / merged["forecastAmount"]
    ) * 100
    merged["variance"] = merged["crmClosed_actual"] - merged["forecastAmount"]
    return merged[["ownerName", "forecastAmount", "crmClosed_actual",
                   "accuracy_pct", "variance"]].sort_values("accuracy_pct")
Step 2: Pipeline Coverage Report
def pipeline_coverage_report(entries: list[dict]) -> dict:
    df = pd.DataFrame(entries)
    return {
        "total_pipeline": df["crmTotal"].sum(),
        "total_closed": df["crmClosed"].sum(),
        "total_quota": df["quotaAmount"].sum(),
        "total_forecast": df["forecastAmount"].sum(),
        "coverage_ratio": df["crmTotal"].sum() / df["quotaAmount"].sum()
            if df["quotaAmount"].sum() > 0 else 0,
        "close_rate": df["crmClosed"].sum() / df["crmTotal"].sum()
            if df["crmTotal"].sum() > 0 else 0,
        "attainment_pct": df["crmClosed"].sum() / df["quotaAmount"].sum() * 100
            if df["quotaAmount"].sum() > 0 else 0,
        "at_risk_reps": len(df[df["forecastAmount"] < df["quotaAmount"] * 0.7]),
        "on_track_reps": len(df[df["forecastAmount"] >= df["quotaAmount"] * 0.9]),
    }
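As a sanity check on the ratio definitions above, the same arithmetic on two made-up rep rows, without pandas:

```python
# Two illustrative rep rows (made-up numbers)
entries = [
    {"crmTotal": 300_000, "crmClosed": 120_000, "quotaAmount": 150_000},
    {"crmTotal": 200_000, "crmClosed": 50_000, "quotaAmount": 100_000},
]
total_pipeline = sum(e["crmTotal"] for e in entries)   # 500,000
total_quota = sum(e["quotaAmount"] for e in entries)   # 250,000
coverage_ratio = total_pipeline / total_quota          # 2x pipeline coverage
print(coverage_ratio)
```

A coverage ratio of 2x or better is a common rule of thumb, but the right threshold depends on your historical close rate.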
Step 3: Forecast Change Detection
def detect_forecast_changes(
    current: list[dict], previous: list[dict], threshold_pct: float = 10.0
) -> list[dict]:
    curr = {e["ownerEmail"]: e for e in current}
    prev = {e["ownerEmail"]: e for e in previous}
    changes = []
    for email, entry in curr.items():
        old = prev.get(email, {}).get("forecastAmount", 0)
        new = entry.get("forecastAmount", 0)
        if old and abs(new - old) / old * 100 >= threshold_pct:
            changes.append({
                "ownerEmail": email,
                "previous": old,
                "current": new,
                "change_pct": round((new - old) / old * 100, 1),
            })
    return changes
Optimize Clari API usage and integration costs.
Clari Cost Tuning
Overview
Minimize Clari API overhead: reduce export frequency, cache aggressively, export only needed data types, and monitor usage.
Instructions
Export Only What You Need
# Full export (6 data types) -- more API load
full_types = ["forecast", "quota", "forecast_updated",
"adjustment", "crm_total", "crm_closed"]
# Minimal export (2 data types) -- faster and lighter
minimal_types = ["forecast", "crm_closed"]
# Use minimal for dashboards, full for audit/compliance
Optimize Export Frequency
| Use Case | Recommended Frequency |
|---|---|
| Executive dashboard | Daily |
| Forecast accuracy tracking | Weekly |
| Compliance audit | Quarterly |
| Ad-hoc analysis | On demand |
Cache to Avoid Redundant Exports
# Cache recent exports (see clari-performance-tuning)
cache = ExportCache(ttl_hours=8)

def smart_export(client, forecast_name, period):
    cached = cache.get(forecast_name, period)
    if cached:
        print(f"Cache hit for {period}")
        return cached
    data = client.export_and_download(forecast_name, period)
    entries = data.get("entries", [])
    cache.set(forecast_name, period, entries)
    return entries
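ExportCache is defined in clari-performance-tuning; if you need a stand-in, a minimal in-memory version with the same get/set interface might look like this (the TTL semantics here are an assumption):

```python
import time

class ExportCache:
    """Minimal in-memory TTL cache keyed by (forecast_name, period)."""
    def __init__(self, ttl_hours: float = 8):
        self.ttl_s = ttl_hours * 3600
        self._store: dict[tuple[str, str], tuple[float, list]] = {}

    def get(self, forecast_name: str, period: str):
        hit = self._store.get((forecast_name, period))
        if hit is None:
            return None
        stored_at, entries = hit
        if time.monotonic() - stored_at > self.ttl_s:
            del self._store[(forecast_name, period)]  # expired
            return None
        return entries

    def set(self, forecast_name: str, period: str, entries: list):
        self._store[(forecast_name, period)] = (time.monotonic(), entries)
```

An in-memory cache resets on process restart; back it with a file or Redis if your exporter runs as short-lived jobs.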
Usage Tracking
class ClariUsageTracker:
    def __init__(self):
        self.api_calls = 0
        self.exports = 0

    def track_call(self):
        self.api_calls += 1

    def track_export(self):
        self.exports += 1

    def report(self) -> dict:
        return {
            "api_calls": self.api_calls,
            "exports": self.exports,
        }
Resources
Next Steps
For architecture patterns, see clari-reference-architecture.
Collect Clari API diagnostic info for support cases.
Clari Debug Bundle
Overview
Collect Clari API diagnostic information for support: API connectivity, forecast list, job history, and error responses. All secrets are redacted.
Instructions
Debug Bundle Script
#!/bin/bash
# clari-debug-bundle.sh
set -euo pipefail
BUNDLE_DIR="clari-debug-$(date +%Y%m%d-%H%M%S)"
mkdir -p "$BUNDLE_DIR"
echo "=== Clari Debug Bundle ===" | tee "$BUNDLE_DIR/summary.txt"
echo "Generated: $(date -u)" | tee -a "$BUNDLE_DIR/summary.txt"
# 1. API connectivity
echo "--- API Connectivity ---" >> "$BUNDLE_DIR/summary.txt"
HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" \
-H "apikey: ${CLARI_API_KEY}" \
https://api.clari.com/v4/export/forecast/list)
echo "API Status: HTTP ${HTTP_CODE}" >> "$BUNDLE_DIR/summary.txt"
# 2. Forecast list (no sensitive data)
curl -s -H "apikey: ${CLARI_API_KEY}" \
https://api.clari.com/v4/export/forecast/list \
| jq '.forecasts[] | {forecastName, forecastId}' \
> "$BUNDLE_DIR/forecasts.json" 2>&1
# 3. Recent export jobs
curl -s -H "apikey: ${CLARI_API_KEY}" \
https://api.clari.com/v4/export/jobs \
| jq '.jobs[] | {jobId, status, createdAt, forecastName}' \
> "$BUNDLE_DIR/jobs.json" 2>&1
# 4. Environment info (redacted)
echo "--- Environment ---" >> "$BUNDLE_DIR/summary.txt"
echo "CLARI_API_KEY: ${CLARI_API_KEY:+[SET]}" >> "$BUNDLE_DIR/summary.txt"
python3 --version >> "$BUNDLE_DIR/summary.txt" 2>&1
pip3 show requests 2>/dev/null | grep Version >> "$BUNDLE_DIR/summary.txt" || true
# 5. Package
tar -czf "$BUNDLE_DIR.tar.gz" "$BUNDLE_DIR"
rm -rf "$BUNDLE_DIR"
echo "Bundle: $BUNDLE_DIR.tar.gz"
Safe to share: Forecast names, job IDs, HTTP status codes, library versions.
Never share: API key, forecast amounts, rep names, email addresses.
Resources
Next Steps
For rate limit handling, see clari-rate-limits.
Deploy Clari export pipelines to production with Airflow, Cloud Functions, or Lambda.
Clari Deploy Integration
Overview
Deploy Clari export pipelines to production environments: Airflow DAGs, AWS Lambda, or Google Cloud Functions for scheduled, serverless execution.
Instructions
Airflow DAG
# dags/clari_export_dag.py
from datetime import datetime, timedelta

from airflow import DAG
from airflow.models import Variable
from airflow.operators.python import PythonOperator

def export_clari_forecast(**context):
    from clari_client import ClariClient, ClariConfig
    client = ClariClient(ClariConfig(
        api_key=Variable.get("clari_api_key"),
    ))
    period = context["params"].get("period", "2026_Q1")
    data = client.export_and_download("company_forecast", period)
    entries = data.get("entries", [])
    context["ti"].xcom_push(key="entry_count", value=len(entries))
    # Load to warehouse here

dag = DAG(
    "clari_daily_export",
    schedule_interval="0 6 * * *",
    start_date=datetime(2026, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
)

export_task = PythonOperator(
    task_id="export_forecast",
    python_callable=export_clari_forecast,
    dag=dag,
)
AWS Lambda
# lambda_handler.py
import json

import boto3
from clari_client import ClariClient, ClariConfig

def handler(event, context):
    ssm = boto3.client("ssm")
    api_key = ssm.get_parameter(
        Name="/clari/api-key", WithDecryption=True
    )["Parameter"]["Value"]
    client = ClariClient(ClariConfig(api_key=api_key))
    data = client.export_and_download(
        event.get("forecast_name", "company_forecast"),
        event.get("period", "2026_Q1"),
    )
    return {
        "statusCode": 200,
        "body": json.dumps({"entries": len(data.get("entries", []))}),
    }
Google Cloud Function
# main.py
import functions_framework
from google.cloud import secretmanager
from clari_client import ClariClient, ClariConfig

@functions_framework.http
def clari_export(request):
    sm = secretmanager.SecretManagerServiceClient()
    secret = sm.access_secret_version(
        name="projects/my-proj/secrets/clari-api-key/versions/latest"
    )
    api_key = secret.payload.data.decode()
    client = ClariClient(ClariConfig(api_key=api_key))
    data = client.export_and_download("company_forecast", "2026_Q1")
    return {"entries": len(data.get("entries", []))}
Error Handling
| Issue | Cause | Solution |
|---|---|---|
| Lambda timeout | Export takes > 15 min | Use Step Functions or move the export to Airflow |
Export your first Clari forecast and pipeline snapshot.
Clari Hello World
Overview
First API calls against Clari: list available forecasts, export a forecast snapshot, and check export job status. The Clari Export API is the primary integration point for getting forecast, quota, and CRM data out of Clari.
Prerequisites
Instructions
Step 1: List Available Forecasts
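The same list endpoint used throughout this pack works here; it assumes CLARI_API_KEY is exported in your shell:

```shell
curl -s -H "apikey: ${CLARI_API_KEY}" \
  https://api.clari.com/v4/export/forecast/list | jq '.forecasts[].forecastName'
```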
Step 2: Export a Forecast
Step 3: Check Export Job Status
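The jobs endpoint used by the debug-bundle skill also works for checking status here:

```shell
curl -s -H "apikey: ${CLARI_API_KEY}" \
  https://api.clari.com/v4/export/jobs | jq '.jobs[] | {jobId, status}'
```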
Step 4: Download and Parse Results
Configure Clari API authentication with an API key and set up export access.
Clari Install & Auth
Overview
Set up Clari API access for exporting forecast data, pipeline snapshots, and revenue intelligence to your data warehouse. Clari uses API key authentication via the apikey request header.
Prerequisites
Instructions
Step 1: Generate API Token
Step 2: Configure Environment
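For example (placeholder token shown; keep the real one out of source control):

```shell
# Load the token into the shell environment used by your scripts
export CLARI_API_KEY="your-api-token"   # placeholder -- paste your real token
```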
Step 3: Test API Connectivity
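The same connectivity check used by the debug-bundle skill applies here; a valid key should return HTTP 200:

```shell
HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" \
  -H "apikey: ${CLARI_API_KEY}" \
  https://api.clari.com/v4/export/forecast/list)
echo "API status: HTTP ${HTTP_CODE}"
```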
Step 4: Copilot API Setup (Optional)
Clari Copilot (conversation intelligence) has a separate API:
Error Handling