Claude Code skill pack for Notion (30 skills)
Installation
Open Claude Code and run this command:
/plugin install notion-pack@claude-code-plugins-plus
Use --global to install for all projects, or --project for current project only.
Skills (30)
Deep debugging for Notion API: response inspection, permission chain tracing, property type mismatches, pagination edge cases, and block nesting limits.
Notion Advanced Troubleshooting
Overview
Deep debugging techniques for Notion API issues that resist standard fixes. Covers API response inspection with request IDs, permission chain tracing through page hierarchies, property type mismatch detection against database schemas, pagination edge cases with cursor validation, and block nesting limit violations (max depth of 3 levels via API). Uses Client from @notionhq/client and raw curl for comparison testing.
Prerequisites
- @notionhq/client v2.x installed (npm install @notionhq/client)
- Python: notion-client installed (pip install notion-client)
- curl available for raw API testing
- NOTION_TOKEN environment variable set (internal integration token starting with ntn_)
- Pages/databases shared with your integration via Notion UI
Instructions
Step 1: API Response Inspection with Request ID Tracking
Every Notion API response includes an x-request-id header. Capture it for debugging and support tickets.
import { Client, LogLevel, isNotionClientError, APIErrorCode } from '@notionhq/client';
const notion = new Client({
auth: process.env.NOTION_TOKEN,
logLevel: LogLevel.DEBUG, // Logs full request/response to stderr
});
// Wrapper that captures request ID and timing for every call
async function tracedCall<T>(
label: string,
fn: () => Promise<T>
): Promise<{ result: T; durationMs: number }> {
const start = Date.now();
try {
const result = await fn();
const durationMs = Date.now() - start;
console.log(`[${label}] OK ${durationMs}ms`);
return { result, durationMs };
} catch (error) {
const durationMs = Date.now() - start;
if (isNotionClientError(error)) {
console.error(`[${label}] FAILED ${durationMs}ms`, {
code: error.code,
status: error.status,
message: error.message,
body: error.body,
});
}
throw error;
}
}
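tracedCall returns the duration alongside the result, so over many calls the captured timings can be summarized into percentiles for hotspot analysis. A minimal sketch (latencyStats is a hypothetical helper, not part of the SDK):

```typescript
// Summarize durations captured by tracedCall into p50/p95 percentiles.
// Assumes a non-empty array of millisecond timings.
function latencyStats(durationsMs: number[]): { p50: number; p95: number } {
  const sorted = [...durationsMs].sort((a, b) => a - b);
  const at = (q: number) =>
    sorted[Math.min(sorted.length - 1, Math.floor(q * sorted.length))];
  return { p50: at(0.5), p95: at(0.95) };
}

// latencyStats([120, 95, 480, 110]) → { p50: 120, p95: 480 }
```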
// Compare SDK vs raw curl to isolate SDK issues
// Run in bash alongside:
// curl -v https://api.notion.com/v1/pages/PAGE_ID \
// -H "Authorization: Bearer $NOTION_TOKEN" \
// -H "Notion-Version: 2022-06-28" 2>&1 | grep x-request-id
from notion_client import Client
import logging
import os
# Enable debug logging for full request/response visibility
logging.basicConfig(level=logging.DEBUG)
notion = Client(auth=os.environ["NOTION_TOKEN"], log_level=logging.DEBUG)
# Traced wrapper for Python
import time
def traced_call(label: str, fn):
    start = time.time()
    try:
        result = fn()
        duration = (time.time() - start) * 1000
        print(f"[{label}] OK {duration:.0f}ms")
        return result
    except Exception as e:
        duration = (time.time() - start) * 1000
        print(f"[{label}] FAILED {duration:.0f}ms: {e}")
        raise
Different Notion integration architectures: CMS (headless blog), task tracker (project management), knowledge base (wiki), form submission handler, and data pipeline source.
Notion Architecture Variants
Overview
Five validated architecture patterns for using Notion as a backend via the API. Each variant shows a specific use case with real Client from @notionhq/client code: headless CMS for blogs, project management task tracker, wiki-style knowledge base, form submission handler, and data pipeline source for analytics. Includes database schema design, API integration code, and deployment considerations.
Prerequisites
- @notionhq/client v2.x installed (npm install @notionhq/client)
- Python: notion-client installed (pip install notion-client)
- NOTION_TOKEN environment variable set
- Notion databases created and shared with your integration
Instructions
Step 1: Headless CMS (Blog / Content Site)
Use Notion as a content management system — authors write in Notion, your site fetches and renders content via the API.
import { Client } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
const CONTENT_DB = process.env.NOTION_CONTENT_DB!;
// Database schema in Notion:
// Title (title), Slug (rich_text), Status (select: Draft/Review/Published),
// Published Date (date), Author (people), Tags (multi_select),
// Excerpt (rich_text), Cover Image (files)
interface BlogPost {
id: string;
title: string;
slug: string;
status: string;
publishedDate: string | null;
author: string;
tags: string[];
excerpt: string;
}
// Fetch published posts for the blog index
async function getPublishedPosts(): Promise<BlogPost[]> {
const response = await notion.databases.query({
database_id: CONTENT_DB,
filter: {
property: 'Status',
select: { equals: 'Published' },
},
sorts: [{ property: 'Published Date', direction: 'descending' }],
page_size: 100,
});
return response.results
.filter((p): p is any => 'properties' in p)
.map(page => ({
id: page.id,
title: page.properties['Title']?.title?.[0]?.plain_text ?? 'Untitled',
slug: page.properties['Slug']?.rich_text?.[0]?.plain_text ?? page.id,
status: page.properties['Status']?.select?.name ?? 'Draft',
publishedDate: page.properties['Published Date']?.date?.start ?? null,
author: page.properties['Author']?.people?.[0]?.name ?? 'Unknown',
tags: page.properties['Tags']?.multi_select?.map((t: any) => t.name) ?? [],
excerpt: page.properties['Excerpt']?.rich_text?.[0]?.plain_text ?? '',
}));
}
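The repeated `?.title?.[0]?.plain_text` chains above only read the first rich-text fragment; titles split across multiple fragments (for example by inline formatting) get truncated. A small helper (hypothetical, not an SDK export) concatenates all fragments instead:

```typescript
// Join every fragment of a title/rich_text property value into one string.
function plainText(parts: Array<{ plain_text: string }> | undefined): string {
  return (parts ?? []).map(p => p.plain_text).join('');
}

// Usage inside the mapper above, e.g.:
// title: plainText(page.properties['Title']?.title) || 'Untitled',
```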
// Fetch full page content as blocks (for rendering)
async function getPostContent(pageId: string): Promise<any[]> {
const blocks: any[] = [];
let cursor: string | undefined;
do {
const response = await notion.blocks.children.list({
block_id: pageId,
start_cursor: cursor,
page_size: 100,
});
blocks.push(...response.results);
cursor = response.has_more ? response.next_cursor ?? undefined : undefined;
} while (cursor);
return blocks;
}
Integrate the Notion API into CI/CD pipelines for automated documentation sync, deploy tracking, and configuration reads.
Notion CI Integration
Overview
Automate documentation sync, deploy tracking, and configuration management by integrating the Notion API into CI/CD pipelines. This skill covers GitHub Actions workflows that push changelogs and release notes to Notion pages, update database entries on successful deploys, create pages for incident reports, and read feature flags or configuration from Notion databases — all with proper rate limit handling for CI environments.
Prerequisites
- GitHub repository with Actions enabled
- Notion internal integration token (create at https://www.notion.so/my-integrations)
- Target Notion pages/databases shared with the integration (click "..." > "Connections" > add your integration)
- NOTION_TOKEN stored as a GitHub Actions secret
- Node.js 18+ or Python 3.9+ in CI environment
Instructions
Step 1: GitHub Actions Workflow for Documentation Sync
Push changelogs and release notes to Notion automatically on release.
# .github/workflows/notion-docs-sync.yml
name: Sync Docs to Notion
on:
release:
types: [published]
push:
branches: [main]
paths: ['CHANGELOG.md', 'docs/**']
env:
NOTION_TOKEN: ${{ secrets.NOTION_TOKEN }}
jobs:
sync-release-notes:
runs-on: ubuntu-latest
if: github.event_name == 'release'
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: '20'
cache: 'npm'
- run: npm ci
- name: Push release notes to Notion
run: node scripts/notion-release-sync.js
env:
NOTION_RELEASES_DB: ${{ secrets.NOTION_RELEASES_DB }}
RELEASE_TAG: ${{ github.event.release.tag_name }}
RELEASE_BODY: ${{ github.event.release.body }}
RELEASE_URL: ${{ github.event.release.html_url }}
sync-changelog:
runs-on: ubuntu-latest
if: github.event_name == 'push'
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: '20'
cache: 'npm'
- run: npm ci
- name: Sync CHANGELOG to Notion page
run: node scripts/notion-changelog-sync.js
env:
NOTION_CHANGELOG_PAGE: ${{ secrets.NOTION_CHANGELOG_PAGE }}
update-deploy-status:
runs-on: ubuntu-latest
needs: sync-release-notes
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: '20'
cache: 'npm'
- run: npm ci
- name: Update deploy tracker in Notion
run: node scripts/notion-deploy-update.js
env:
NOTION_DEPLOYS_DB: ${{ secrets.NOTION_DEPLOYS_DB }}
DEPLOY_VERSION: ${{ github.event.release.tag_name }}
DEPLOY_ENV: production
DEPLOY_SHA: ${{ github.sha }}
Diagnose and fix Notion API errors by HTTP status code and error code.
Notion Common Errors
Overview
Quick reference for all Notion API error codes with exact HTTP statuses, error bodies, and fixes. The API returns errors as JSON objects whose status, code, and message fields identify the failure:
{
"object": "error",
"status": 400,
"code": "validation_error",
"message": "Title is not a property that exists."
}
All requests require Authorization: Bearer and Notion-Version: 2022-06-28 headers.
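The status/code pair determines the remediation, so error routing can be automated before you reach for the per-code sections below. A sketch (the category names are this guide's convention, not SDK values):

```typescript
type ErrorAction = 'fix_token' | 'fix_permissions' | 'fix_request' | 'retry' | 'investigate';

// Map a Notion error status + code to a remediation category.
function classify(status: number, code: string): ErrorAction {
  if (status === 401) return 'fix_token';
  if (status === 403) return 'fix_permissions';
  if (status === 404) return 'fix_permissions'; // usually a sharing problem, not a bad ID
  if (status === 429 || code === 'rate_limited') return 'retry';
  if (status === 400) return 'fix_request';
  return 'investigate';
}
```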
Prerequisites
- @notionhq/client installed (npm install @notionhq/client)
- NOTION_TOKEN environment variable set (internal integration token starting with ntn_ or secret_)
- Target pages/databases shared with the integration via the Connections menu
Instructions
Step 1: Identify the Error
Run the diagnostic script below or check your application logs. Match the HTTP status and code field to the sections that follow.
Step 2: Match Error Code and Apply Fix
401 — unauthorized
{"object": "error", "status": 401, "code": "unauthorized", "message": "API token is invalid."}
Cause: Token is missing, malformed, expired, or revoked.
Fix:
# Verify token is set
echo ${NOTION_TOKEN:+SET}
# Test directly
curl -s https://api.notion.com/v1/users/me \
-H "Authorization: Bearer ${NOTION_TOKEN}" \
-H "Notion-Version: 2022-06-28" | jq .
If the response shows your integration bot user, the token is valid. Otherwise regenerate it at notion.so/my-integrations. Tokens starting with secret_ are the legacy format; new integrations use the ntn_ prefix.
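That prefix check can also run at application startup, before the first API call. A minimal sketch (looksLikeNotionToken is a hypothetical helper):

```typescript
// Validate the shape of a Notion integration token before use.
// Accepts the current ntn_ prefix and the legacy secret_ prefix.
function looksLikeNotionToken(token: string | undefined): boolean {
  if (!token) return false;
  return token.startsWith('ntn_') || token.startsWith('secret_');
}

// e.g. fail fast in config loading:
// if (!looksLikeNotionToken(process.env.NOTION_TOKEN)) throw new Error('Bad NOTION_TOKEN');
```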
403 — restricted_resource
{"object": "error", "status": 403, "code": "restricted_resource", "message": "Insufficient permissions for this resource."}
Cause: The integration exists and the page is shared, but the integration lacks the required capability (read content, update content, insert content, read comments).
Fix: Go to notion.so/my-integrations, select your integration, and enable the needed capabilities under "Capabilities." Common missing capability: "Read comments" when querying comments, or "Insert content" when creating pages.
404 — object_not_found
{"object": "error", "status": 404, "code": "object_not_found", "message": "Could not find page with ID: ..."}
Cause: The page/database ID is wrong, or (far more often) the resource has not been shared with the integration.
Fix: In Notion, open the page, click "..." > "Connections", and add your integration. Then verify the ID has no typos.
Create, update, archive, and compose Notion pages and block content.
Notion Content Management
Overview
Complete guide to creating, updating, archiving, and composing Notion pages and block content using the @notionhq/client SDK. Covers page lifecycle, all common block types, rich text formatting, and bulk content operations.
Prerequisites
- Completed notion-install-auth setup
- NOTION_TOKEN environment variable set
- Target database or page shared with your integration (via Connections menu)
- @notionhq/client v2+ installed (TypeScript) or notion-client (Python)
Instructions
Step 1: Create, Update, and Archive Pages
Create a page in a database with typed properties and initial block content:
import { Client } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
// Create a page with properties and inline content
async function createPage(databaseId: string) {
const page = await notion.pages.create({
parent: { database_id: databaseId },
icon: { emoji: '📄' },
cover: {
external: { url: 'https://images.unsplash.com/photo-cover-id' },
},
properties: {
// Title property (required for database pages)
Name: {
title: [{ text: { content: 'Q1 Sprint Retrospective' } }],
},
Status: {
select: { name: 'In Progress' },
},
Priority: {
select: { name: 'High' },
},
Tags: {
multi_select: [{ name: 'Engineering' }, { name: 'Sprint' }],
},
'Due Date': {
date: { start: '2026-04-01', end: '2026-04-05' },
},
Assignee: {
people: [{ id: 'user-uuid-here' }],
},
Effort: {
number: 8,
},
Done: {
checkbox: false,
},
URL: {
url: 'https://example.com/sprint-board',
},
},
// Initial page body (block children)
children: [
{
heading_2: {
rich_text: [{ text: { content: 'Summary' } }],
},
},
{
paragraph: {
rich_text: [{ text: { content: 'This page tracks the Q1 sprint retrospective.' } }],
},
},
],
});
console.log('Created page:', page.id);
return page;
}
Update page properties after creation:
async function updatePageProperties(pageId: string) {
const updated = await notion.pages.update({
page_id: pageId,
properties: {
Status: { select: { name: 'Done' } },
Done: { checkbox: true },
// Clear a property by setting to null
'Due Date': { date: null },
},
// Update icon/cover
icon: { emoji: '✅' },
});
console.log('Updated page:', updated.id);
return updated;
}
Query, filter, and manage Notion databases and pages.
Notion Core Workflow A — Databases & Pages
Overview
Primary workflow for Notion integrations: querying databases with filters/sorts, creating pages with typed properties, updating page properties, and retrieving page content.
Prerequisites
- Completed notion-install-auth setup
- A Notion database shared with your integration
- Understanding of your database's property schema
Instructions
Step 1: Retrieve Database Schema
import { Client } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
async function getDatabaseSchema(databaseId: string) {
const db = await notion.databases.retrieve({ database_id: databaseId });
// db.properties contains the schema
for (const [name, prop] of Object.entries(db.properties)) {
console.log(`${name}: ${prop.type}`);
// For select/multi_select, show options:
if (prop.type === 'select') {
console.log(' Options:', prop.select.options.map(o => o.name));
}
}
return db.properties;
}
Step 2: Query with Filters
Notion filters use a unique nested structure based on property type:
async function queryWithFilters(databaseId: string) {
const response = await notion.databases.query({
database_id: databaseId,
filter: {
and: [
{
property: 'Status',
select: { equals: 'In Progress' },
},
{
property: 'Priority',
select: { does_not_equal: 'Low' },
},
{
or: [
{
property: 'Assignee',
people: { contains: 'user-uuid-here' },
},
{
property: 'Tags',
multi_select: { contains: 'Urgent' },
},
],
},
],
},
sorts: [
{ property: 'Priority', direction: 'ascending' },
{ property: 'Created', direction: 'descending' },
],
page_size: 50,
});
return response.results;
}
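The nested and/or structure composes mechanically, so small builder functions can keep deep filters readable. A sketch with hypothetical helpers (select, and, or are not SDK exports; they just assemble the same JSON shown above):

```typescript
type Filter = Record<string, unknown>;

// Hypothetical helpers that assemble Notion's nested filter JSON.
const select = (property: string, equals: string): Filter =>
  ({ property, select: { equals } });
const and = (...filters: Filter[]): Filter => ({ and: filters });
const or = (...filters: Filter[]): Filter => ({ or: filters });

// Status == 'In Progress' AND (Priority == 'High' OR Priority == 'Urgent')
const filter = and(
  select('Status', 'In Progress'),
  or(select('Priority', 'High'), select('Priority', 'Urgent')),
);
```

The resulting object can be passed directly as the filter argument to notion.databases.query.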
Step 3: Filter Syntax by Property Type
// Text (title, rich_text, url, email, phone_number)
{ property: 'Name', title: { contains: 'search term' } }
{ property: 'Description', rich_text: { starts_with: 'Draft' } }
{ property: 'Email', email: { equals: 'user@example.com' } }
// Number
{ property: 'Score', number: { greater_than: 80 } }
{ property: 'Price', number: { less_than_or_equal_to: 100 } }
// Select / Multi-select
{ property: 'Status', select: { equals: 'Done' } }
{ property: 'Tags', multi_select: { contains: 'Bug' } }
// Date
{ property: 'Due Date', date: { before: '2026-01-01' } }
Work with Notion blocks, rich text, comments, and page content.
Notion Core Workflow B — Blocks, Content & Comments
Overview
Secondary workflow for content operations: reading block trees, appending content, building rich text with annotations, and managing comments.
Prerequisites
- Completed notion-install-auth setup
- A Notion page shared with your integration
- Familiarity with notion-core-workflow-a (databases/pages)
Instructions
Step 1: Retrieve Block Children
import { Client } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
async function getPageContent(pageId: string) {
const blocks = [];
let cursor: string | undefined;
do {
const response = await notion.blocks.children.list({
block_id: pageId,
start_cursor: cursor,
page_size: 100,
});
blocks.push(...response.results);
cursor = response.has_more ? response.next_cursor ?? undefined : undefined;
} while (cursor);
return blocks;
}
Step 2: Read Blocks Recursively (Nested Content)
async function getBlockTree(blockId: string, depth = 0): Promise<any[]> {
const blocks = await getPageContent(blockId);
const tree = [];
for (const block of blocks) {
const node: any = { ...block, children: [] };
// Recursively fetch children if block has them
if ('has_children' in block && block.has_children) {
node.children = await getBlockTree(block.id, depth + 1);
}
tree.push(node);
}
return tree;
}
// Extract plain text from a block tree
function blockToText(block: any): string {
const type = block.type;
if (block[type]?.rich_text) {
return block[type].rich_text.map((t: any) => t.plain_text).join('');
}
return '';
}
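Combining the tree builder and the text extractor gives a plain-text dump of a page with indentation per nesting level. A sketch (treeToText is hypothetical; it consumes the shape produced by getBlockTree):

```typescript
// Flatten a block tree (as built by getBlockTree) into indented text.
function treeToText(tree: any[]): string {
  const lines: string[] = [];
  const walk = (nodes: any[], depth: number) => {
    for (const node of nodes) {
      const type = node.type;
      const text = node[type]?.rich_text?.map((t: any) => t.plain_text).join('') ?? '';
      if (text) lines.push('  '.repeat(depth) + text);
      if (node.children?.length) walk(node.children, depth + 1);
    }
  };
  walk(tree, 0);
  return lines.join('\n');
}
```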
Step 3: Append Content Blocks
async function appendContent(pageId: string) {
await notion.blocks.children.append({
block_id: pageId,
children: [
// Heading
{
heading_1: {
rich_text: [{ text: { content: 'Section Title' } }],
},
},
// Paragraph with formatting
{
paragraph: {
rich_text: [
{ text: { content: 'Regular text, ' } },
{ text: { content: 'bold' }, annotations: { bold: true } },
{ text: { content: ', ' } },
{ text: { content: 'italic' }, annotations: { italic: true } },
{ text: { content: ', ' } },
{ text: { content: 'code' }, annotations: { code: true } },
{ text: { content: ', and ' } },
{
text: { content: 'a link', link: { url: 'https://notion.so' } },
annotations: { underline: true },
},
],
},
},
],
});
}
Optimize Notion API usage to minimize rate-limit pressure, reduce engineering overhead, and maximize throughput.
Notion Cost Tuning
Overview
The Notion API is free with every workspace plan — there is no per-call pricing. The real "cost" is the 3 requests/second rate limit (per integration token) and engineering time wasted on inefficient patterns. Apply six strategies below to reduce request volume by 80-95%.
Notion workspace pricing (for context — API access is included at every tier):
| Plan | Price | API Access | Rate Limit |
|---|---|---|---|
| Free | $0 | Full API | 3 req/sec |
| Plus | $12/user/mo | Full API | 3 req/sec |
| Business | $28/user/mo | Full API | 3 req/sec |
| Enterprise | Custom | Full API | 3 req/sec |
The rate limit is identical across all plans. Optimization is about staying within 3 req/sec, not reducing a bill.
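In practice the budget translates to minimum spacing between consecutive calls, which is the number every throttling strategy below is built around. A one-line sketch:

```typescript
// Minimum gap to leave between API calls for a requests/second budget.
// Notion's 3 req/sec means at least 334 ms between consecutive requests.
function minGapMs(reqPerSecond: number): number {
  return Math.ceil(1000 / reqPerSecond);
}

// minGapMs(3) → 334
```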
Prerequisites
- @notionhq/client v2.x installed (npm install @notionhq/client)
- Integration token from notion.so/my-integrations
- Token shared with target pages/databases via the Connections menu in Notion
- For queue patterns: p-queue v8+ (npm install p-queue)
- For caching: node-cache or lru-cache (npm install lru-cache)
Instructions
Step 1: Audit Current Request Volume
Before optimizing, measure your baseline. Instrument the Notion client to track every API call by method, endpoint, and timestamp.
import { Client } from '@notionhq/client';
interface RequestEntry {
method: string;
endpoint: string;
timestamp: number;
durationMs: number;
}
const requestLog: RequestEntry[] = [];
const notion = new Client({ auth: process.env.NOTION_TOKEN });
// Wrap any Notion call with tracking
async function tracked<T>(
method: string,
endpoint: string,
fn: () => Promise<T>,
): Promise<T> {
const start = Date.now();
try {
return await fn();
} finally {
requestLog.push({
method,
endpoint,
timestamp: start,
durationMs: Date.now() - start,
});
}
}
// Generate audit report
function auditReport() {
const last60s = requestLog.filter(r => r.timestamp > Date.now() - 60_000);
const byMethod = Object.groupBy(last60s, r => r.method);
console.table({
totalAllTime: requestLog.length,
lastMinute: last60s.length,
reqPerSecond: (last60s.length / 60).toFixed(2),
avgLatencyMs: (
last60s.reduce((sum, r) => sum + r.durationMs, 0) / last60s.length
).toFixed(0),
});
// Show hotspots — which methods consume the most budget
for (const [method, entries] of Object.entries(byMethod)) {
console.log(`${method}: ${entries?.length ?? 0} requests in the last minute`);
}
}
Implement data handling, PII protection, and GDPR/CCPA compliance for Notion integrations.
Notion Data Handling
Overview
Handle sensitive data correctly when integrating with Notion: detect PII in page properties and block content, redact sensitive fields before logging or exporting, minimize data exposure with filter_properties, and implement GDPR/CCPA compliance patterns including right-of-access exports, right-of-deletion (archive or field clearing), and retention-based archival with audit logging.
Prerequisites
- @notionhq/client v2+ installed (npm install @notionhq/client)
- Python alternative: notion-client (pip install notion-client)
- Understanding of which Notion databases contain personal data
- Audit logging infrastructure (structured logs, SIEM, or Notion audit database)
- Legal guidance on applicable regulations (GDPR, CCPA, HIPAA, etc.)
Instructions
Step 1: PII Detection in Notion Content
Notion pages can contain PII in any property type. Scan systematically:
import { Client } from '@notionhq/client';
import type { PageObjectResponse } from '@notionhq/client/build/src/api-endpoints';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
// PII pattern matchers
const PII_PATTERNS = [
{ type: 'email', pattern: /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g },
{ type: 'phone_us', pattern: /\b\d{3}[-.]?\d{3}[-.]?\d{4}\b/g },
{ type: 'phone_intl', pattern: /\+\d{1,3}[-.\s]?\d{4,14}/g },
{ type: 'ssn', pattern: /\b\d{3}-\d{2}-\d{4}\b/g },
{ type: 'credit_card', pattern: /\b\d{4}[-\s]?\d{4}[-\s]?\d{4}[-\s]?\d{4}\b/g },
{ type: 'ip_address', pattern: /\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b/g },
];
interface PIIFinding {
propertyName: string;
piiType: string;
location: 'property' | 'content';
}
function scanPageForPII(page: PageObjectResponse): PIIFinding[] {
const findings: PIIFinding[] = [];
for (const [name, prop] of Object.entries(page.properties)) {
// Direct PII property types
if (prop.type === 'email' && prop.email) {
findings.push({ propertyName: name, piiType: 'email', location: 'property' });
}
if (prop.type === 'phone_number' && prop.phone_number) {
findings.push({ propertyName: name, piiType: 'phone', location: 'property' });
}
if (prop.type === 'people' && prop.people.length > 0) {
findings.push({ propertyName: name, piiType: 'user_reference', location: 'property' });
}
// Text properties may contain embedded PII
if (prop.type === 'rich_text' || prop.type === 'title') {
const textParts = prop.type === 'title' ? prop.title : prop.rich_text;
const text = textParts.map(t => t.plain_text).join('');
for (const { type, pattern } of PII_PATTERNS) {
if (text.match(pattern)) {
findings.push({ propertyName: name, piiType: type, location: 'property' });
}
}
}
}
return findings;
}
Collect Notion API diagnostic info for troubleshooting and support tickets.
Notion Debug Bundle
Overview
Collect diagnostic information for Notion API issues: SDK version, token validity, database access, page sharing status, rate limits, and platform health. The Notion API requires integrations to be explicitly invited to each page or database — most "not found" errors are sharing problems, not code bugs.
Prerequisites
- @notionhq/client installed (npm ls @notionhq/client to verify)
- NOTION_TOKEN environment variable set (internal integration token, starts with ntn_)
- curl and jq available for shell-based diagnostics
Instructions
Step 1: Quick Connectivity and Auth Check
#!/bin/bash
echo "=== Notion Debug Check ==="
echo "Generated: $(date -u +%Y-%m-%dT%H:%M:%SZ)"
# 1. SDK version
echo -e "\n--- SDK Version ---"
npm ls @notionhq/client 2>/dev/null || echo "SDK not found — run: npm install @notionhq/client"
# 2. Runtime and token status
echo -e "\n--- Runtime ---"
node --version 2>/dev/null || echo "Node.js not found"
echo "NOTION_TOKEN: ${NOTION_TOKEN:+SET (${#NOTION_TOKEN} chars)}"
TOKEN_PREFIX="${NOTION_TOKEN:0:4}"
if [ -n "$NOTION_TOKEN" ] && [ "$TOKEN_PREFIX" != "ntn_" ]; then
echo "WARNING: Token does not start with 'ntn_' — may be using legacy format"
fi
# 3. API connectivity — /v1/users/me as health check
echo -e "\n--- API Connectivity ---"
RESPONSE=$(curl -s -w "\n%{http_code}\n%{time_total}" \
https://api.notion.com/v1/users/me \
-H "Authorization: Bearer ${NOTION_TOKEN}" \
-H "Notion-Version: 2022-06-28" 2>&1)
HTTP_CODE=$(echo "$RESPONSE" | tail -1)
LATENCY=$(echo "$RESPONSE" | tail -2 | head -1)
BODY=$(echo "$RESPONSE" | head -n -2)
echo "HTTP Status: $HTTP_CODE"
echo "Latency: ${LATENCY}s"
if [ "$HTTP_CODE" = "200" ]; then
echo "Bot Name: $(echo "$BODY" | jq -r '.name // "unknown"')"
echo "Bot Type: $(echo "$BODY" | jq -r '.type // "unknown"')"
else
echo "Error Code: $(echo "$BODY" | jq -r '.code // "unknown"')"
echo "Message: $(echo "$BODY" | jq -r '.message // "unknown"')"
fi
# 4. Notion platform status
echo -e "\n--- Notion Platform Status ---"
curl -s https://status.notion.so/api/v2/status.json \
| jq -r '.status.description // "Could not reach status page"' 2>/dev/null \
|| echo "Could not reach status.notion.so"
# 5. Rate limit baseline (3 req/sec across all endpoints)
echo -e "\n--- Rate Limit Info ---"
echo "Notion enforces an average rate limit of 3 requests/second per integration token."
Deploy Node.js applications that talk to the Notion API to Vercel, Railway, or Fly.io.
Deploy Notion-Integrated Applications
Overview
Ship Node.js apps that talk to the Notion API to Vercel, Railway, or Fly.io. This skill covers environment variable management, the Notion client singleton pattern for serverless, rate limit handling at 3 req/sec, health check endpoints that verify Notion connectivity, and caching strategies to reduce API calls.
Prerequisites
- Node.js >= 18 project with @notionhq/client installed (npm i @notionhq/client)
- Working Notion integration tested locally with a valid NOTION_TOKEN (starts with ntn_)
- Platform CLI installed for your target: vercel, railway, or fly
- Database or page IDs your integration needs access to
Instructions
Step 1 — Prepare the Application for Production
Build a production-ready entry point with a Notion client singleton, rate limit handling, response caching, and a health check endpoint.
Notion client singleton (critical for serverless):
Serverless functions recycle containers unpredictably. Creating a new Client on every invocation wastes cold-start time and risks hitting rate limits. A module-level singleton reuses the client across warm invocations.
// src/notion-client.ts — singleton for serverless environments
import { Client, LogLevel, isNotionClientError, APIErrorCode } from '@notionhq/client';
let client: Client | null = null;
export function getNotionClient(): Client {
if (!client) {
if (!process.env.NOTION_TOKEN) {
throw new Error('NOTION_TOKEN environment variable is not set');
}
client = new Client({
auth: process.env.NOTION_TOKEN,
logLevel: process.env.NODE_ENV === 'production' ? LogLevel.WARN : LogLevel.DEBUG,
timeoutMs: 30_000,
});
}
return client;
}
Rate limit handler (Notion enforces 3 requests/second):
Notion returns HTTP 429 with a Retry-After header when you exceed the rate limit. The SDK retries automatically, but production apps should add queuing to avoid cascading failures under load.
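If you bypass the SDK (for example with raw fetch) and handle 429s yourself, honor the server's Retry-After value first and fall back to capped exponential backoff. A sketch (retryAfterSeconds is the parsed header value, if present):

```typescript
// Wait time before retry attempt N (0-based). Prefers the server-provided
// Retry-After; otherwise exponential backoff capped at 30 seconds.
function backoffMs(attempt: number, retryAfterSeconds?: number): number {
  if (retryAfterSeconds !== undefined) return retryAfterSeconds * 1000;
  return Math.min(1000 * 2 ** attempt, 30_000);
}
```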
// src/rate-limiter.ts — token bucket for 3 req/sec
export class NotionRateLimiter {
private queue: Array<{ resolve: () => void }> = [];
private activeRequests = 0;
private readonly maxPerSecond = 3;
async acquire(): Promise<void> {
if (this.activeRequests < this.maxPerSecond) {
this.activeRequests++;
return;
}
return new Promise((resolve) => {
this.queue.push({ resolve });
});
}
release(): void {
this.activeRequests--;
if (this.queue.length > 0) {
const next = this.queue.shift()!;
this.activeRequests++;
setTimeout(() => next.resolve(), 1000 / this.maxPerSecond);
}
}
}
Configure Notion enterprise access control with OAuth, workspace permissions, and audit logging.
Notion Enterprise RBAC
Overview
Implement enterprise-grade access control for Notion integrations. This covers the full OAuth 2.0 authorization flow for public integrations (multi-tenant), per-workspace token storage with encryption at rest, Notion's page-level permission model and how to handle ObjectNotFound vs RestrictedResource, an application-level role system (admin/editor/viewer) layered on top of Notion's permissions, comprehensive audit logging to a Notion database, and workspace deauthorization cleanup.
Prerequisites
- Notion public integration created at https://www.notion.so/my-integrations (for OAuth)
- @notionhq/client v2+ installed (npm install @notionhq/client)
- Python alternative: notion-client (pip install notion-client)
- HTTPS endpoint for OAuth callback (required by Notion)
Instructions
Step 1: OAuth 2.0 Authorization Flow
Notion uses OAuth 2.0 for public integrations to access external workspaces:
import { Client } from '@notionhq/client';
import crypto from 'crypto';
// Step 1: Build the authorization URL
function getAuthorizationUrl(state: string): string {
const params = new URLSearchParams({
client_id: process.env.NOTION_OAUTH_CLIENT_ID!,
response_type: 'code',
owner: 'user', // 'user' = user-level token, 'workspace' = workspace-level
redirect_uri: process.env.NOTION_REDIRECT_URI!,
state, // CSRF protection — must verify on callback
});
return `https://api.notion.com/v1/oauth/authorize?${params}`;
}
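The state value needs to be issued before the redirect and verified exactly once on callback. A minimal in-memory sketch (pendingStates is a hypothetical stand-in; use a real session store in production):

```typescript
import crypto from 'crypto';

// In-memory stand-in for a session store keyed by state value.
const pendingStates = new Set<string>();

// Issue a fresh unguessable state for the authorization URL.
function newState(): string {
  const state = crypto.randomBytes(16).toString('hex');
  pendingStates.add(state);
  return state;
}

// Returns true exactly once per issued state (prevents replay).
function consumeState(state: string): boolean {
  return pendingStates.delete(state);
}
```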
// Step 2: Exchange authorization code for access token
async function exchangeCodeForToken(code: string): Promise<{
access_token: string;
bot_id: string;
workspace_id: string;
workspace_name: string;
workspace_icon: string | null;
owner: { type: string; user?: { id: string; name: string } };
duplicated_template_id: string | null;
}> {
const credentials = Buffer.from(
`${process.env.NOTION_OAUTH_CLIENT_ID}:${process.env.NOTION_OAUTH_CLIENT_SECRET}`
).toString('base64');
const response = await fetch('https://api.notion.com/v1/oauth/token', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
Authorization: `Basic ${credentials}`,
},
body: JSON.stringify({
grant_type: 'authorization_code',
code,
redirect_uri: process.env.NOTION_REDIRECT_URI,
}),
});
if (!response.ok) {
const error = await response.json();
throw new Error(`OAuth token exchange failed: ${error.error}`);
}
return response.json();
}
// Step 3: Create a Client for a specific workspace
function createWorkspaceClient(accessToken: string): Client {
return new Client({ auth: accessToken });
}
Create a minimal working Notion API example.
Notion Hello World
Overview
Three minimal examples covering the Notion API core surfaces: searching for pages, creating a test page in a database, and verifying the created page by retrieving it back.
Prerequisites
- Completed notion-install-auth setup
- NOTION_TOKEN environment variable set (internal integration token from https://www.notion.so/my-integrations)
- At least one database shared with your integration via the Connections menu
- Node.js 18+ with @notionhq/client or Python 3.8+ with notion-client
Instructions
Step 1: Search for Pages in Your Workspace
import { Client } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
async function searchPages(query: string) {
const { results } = await notion.search({
query,
filter: { property: 'object', value: 'page' },
sort: { direction: 'descending', timestamp: 'last_edited_time' },
page_size: 5,
});
for (const page of results) {
if (page.object === 'page' && 'properties' in page) {
// Title lives under a property with type "title"
const titleProp = Object.values(page.properties).find(
(p) => p.type === 'title'
);
const title = titleProp?.type === 'title'
? titleProp.title.map((t) => t.plain_text).join('')
: '(untitled)';
console.log(`Page: ${title} (${page.id})`);
}
}
return results;
}
// Usage: searchPages('meeting notes');
What this does: The search endpoint queries across all pages and databases your integration can access. The filter narrows results to pages only (use value: 'database' for databases). Results come back as partial page objects with properties included.
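The object === 'page' && 'properties' in page check above is common enough to factor out (sketch; note the official SDK also ships an isFullPage type-guard helper that serves this purpose):

```typescript
// Narrow a search result to a full page object carrying properties.
function isFullPageResult(obj: { object: string } & Record<string, unknown>): boolean {
  return obj.object === 'page' && 'properties' in obj;
}
```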
Step 2: Create a Test Page in a Database
async function createTestPage(databaseId: string) {
const page = await notion.pages.create({
parent: { database_id: databaseId },
properties: {
Name: {
title: [{ text: { content: 'Hello from the API!' } }],
},
},
// Optional: add inline content blocks
children: [
{
heading_2: {
rich_text: [{ text: { content: 'Getting Started' } }],
},
},
{
paragraph: {
rich_text: [
{ text: { content: 'This page was created via the ' } },
{ text: { content: 'Notion API' }, annotations: { bold: true } },
{ text: { content: ' at ' + new Date().toISOString() + '.' } },
],
},
},
],
});
console.log(`Created page: ${page.id}`);
console.log(`URL: ${page.url}`);
return page;
}
Execute Notion incident response procedures with triage, mitigation, and postmortem.
Notion Incident Runbook
Overview
Rapid incident response procedures for Notion API failures. This runbook covers a structured triage flow (under 5 minutes), automated health checks against both status.notion.so and your own integration, a decision tree for classifying failures (Notion-side vs. integration-side), per-error-type mitigation with real Client code, cached fallback patterns, communication templates, and postmortem structure.
Prerequisites
- Access to application monitoring dashboards and log aggregator
- NOTION_TOKEN environment variable set for diagnostic API calls
- curl and jq installed for quick CLI triage
- Python alternative: notion-client (pip install notion-client)
- Communication channels configured (Slack webhook, PagerDuty, etc.)
Instructions
Step 1: Quick Triage (Under 5 Minutes)
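The decision this step makes, Notion-side versus integration-side, can be expressed as a small classifier over the probes the script below gathers. A sketch; the function name and thresholds are ours:

```typescript
// Classify an incident from two probe results: the count of unresolved
// incidents on status.notion.so, and the HTTP status of a users/me auth check.
type Verdict = 'notion-side' | 'integration-side' | 'inconclusive';

function classifyIncident(unresolvedIncidents: number, authHttpStatus: number): Verdict {
  if (unresolvedIncidents > 0) return 'notion-side';     // platform outage confirmed
  if (authHttpStatus === 401) return 'integration-side'; // token revoked or rotated
  if (authHttpStatus === 200) return 'inconclusive';     // both probes healthy, dig deeper
  return 'integration-side';                             // auth unreachable, no platform incident
}

console.log(classifyIncident(1, 200)); // notion-side
console.log(classifyIncident(0, 401)); // integration-side
```

An "inconclusive" verdict means both probes passed, so the fault is likely in your own request path (permissions, payloads, pagination) rather than availability.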
Run this diagnostic script to determine if the issue is Notion-side or integration-side:
#!/bin/bash
# notion-triage.sh — run at first alert
set -euo pipefail
echo "=== Notion Incident Triage ==="
echo "Time: $(date -u +%Y-%m-%dT%H:%M:%SZ)"
# 1. Check Notion's public status page
echo -e "\n--- Notion Platform Status ---"
STATUS=$(curl -sf https://status.notion.so/api/v2/status.json \
| jq -r '.status.description' 2>/dev/null || echo "UNREACHABLE")
echo "Notion Status: $STATUS"
INCIDENTS=$(curl -sf https://status.notion.so/api/v2/incidents/unresolved.json \
| jq '.incidents | length' 2>/dev/null || echo "UNKNOWN")
echo "Active Incidents: $INCIDENTS"
if [ "$INCIDENTS" != "0" ] && [ "$INCIDENTS" != "UNKNOWN" ]; then
echo "INCIDENT DETAILS:"
curl -sf https://status.notion.so/api/v2/incidents/unresolved.json \
| jq -r '.incidents[] | " - \(.name) (\(.status)): \(.incident_updates[0].body)"'
fi
# 2. Test our integration authentication
echo -e "\n--- Integration Auth Check ---"
AUTH_HTTP=$(curl -sf -o /dev/null -w "%{http_code}" \
https://api.notion.com/v1/users/me \
-H "Authorization: Bearer ${NOTION_TOKEN}" \
-H "Notion-Version: 2022-06-28" 2>/dev/null || echo "000")
echo "Auth HTTP Status: $AUTH_HTTP"
if [ "$AUTH_HTTP" = "200" ]; then
BOT_NAME=$(curl -sf https://api.notion.com/v1/users/me \
-H "Authorization: Bearer ${NOTION_TOKEN}" \
-H "Notion-Version: 2022-06-28" | jq -r '.name')
echo "Bot Name: $BOT_NAME"
fi
# 3. Test database query (if test DB configured)
echo -e "\n--- API Responsiveness ---"
if [ -n "${NOTION_TEST_DATABASE_ID:-}" ]; then
QUERY_RESULT=$(curl -sf -o /dev/null -w "%{http_code}" \
  -X POST "https://api.notion.com/v1/databases/${NOTION_TEST_DATABASE_ID}/query" \
  -H "Authorization: Bearer ${NOTION_TOKEN}" \
  -H "Notion-Version: 2022-06-28" \
  -H "Content-Type: application/json" \
  --data '{"page_size": 1}' 2>/dev/null || echo "000")
  echo "Query HTTP Status: $QUERY_RESULT"
fi
Install and configure the Notion API SDK with authentication.
Notion Install & Auth
Overview
Set up the official Notion SDK and configure authentication for internal integrations. The Node.js SDK is @notionhq/client (npm) and the Python SDK is notion-client (pip). Both wrap the Notion API at https://api.notion.com/v1 using API version 2022-06-28.
Prerequisites
- Node.js 18+ or Python 3.8+
- Package manager (npm, pnpm, yarn, or pip)
- A Notion account (free or paid)
- Access to My Integrations dashboard
Instructions
Step 1: Create Integration and Install SDK
Create an internal integration at https://www.notion.so/my-integrations:
- Click New integration
- Name it, select the workspace, and choose capabilities (Read content, Update content, Insert content)
- Copy the Internal Integration Secret (starts with ntn_ or, for legacy tokens, secret_)
Install the SDK:
# Node.js / TypeScript (official SDK)
npm install @notionhq/client
# Python (official SDK)
pip install notion-client
Step 2: Configure Authentication
Store the token in environment variables -- never hardcode it:
# Set environment variable
export NOTION_TOKEN="ntn_your_integration_secret_here"
# Or add to .env file (add .env to .gitignore)
echo 'NOTION_TOKEN=ntn_your_integration_secret_here' >> .env
Share pages with your integration: In Notion, open the page or database you want to access. Click the ... menu, select Connections, and add your integration. Without this step, all API calls return object_not_found.
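The common setup failures map cleanly onto Notion's error codes, so a startup hint table saves triage time. A sketch; the error codes are Notion's, the hint text and table are ours:

```typescript
// Map common Notion API error codes to setup fixes.
const SETUP_HINTS: Record<string, string> = {
  unauthorized: 'Token invalid or expired; regenerate at notion.so/my-integrations',
  object_not_found: 'Page not shared with the integration; add it via the ... menu > Connections',
  restricted_resource: 'Integration is missing a required capability; review its settings',
};

function setupHint(errorCode: string): string {
  return SETUP_HINTS[errorCode] ?? `Unhandled Notion error code: ${errorCode}`;
}

console.log(setupHint('object_not_found'));
```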
Step 3: Verify Connection
import { Client } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
const me = await notion.users.me({});
console.log(`Authenticated as: ${me.name} (${me.type})`);
console.log(`Bot ID: ${me.id}`);
If the bot user is returned, authentication is working.
Output
- SDK package installed (@notionhq/client for Node.js, notion-client for Python)
- Environment variable NOTION_TOKEN configured
- Integration connected to target pages/databases via Connections menu
- Verified API connectivity with users.me() call
Error Handling
| Error | Cause | Solution |
|---|---|---|
| unauthorized | Invalid or expired token | Regenerate at notion.so/my-integrations |
| object_not_found | Page not shared with integration | Share the page with the integration via the Connections menu |
| Environment | Integration Name | Capabilities | Timeout | Log Level |
|---|---|---|---|---|
| Development | my-app-dev | All (read+update+insert+delete) | 60s | DEBUG |
| Staging | my-app-staging | Read + Update + Insert | 30s | WARN |
| Production | my-app-prod | Minimum required only | 30s | ERROR |
TypeScript — environment-aware client factory:
import { Client, LogLevel } from '@notionhq/client';
interface NotionEnvConfig {
token: string;
databaseIds: Record<string, string>;
logLevel: LogLevel;
timeoutMs: number;
maxRetries: number;
}
const ENV_DEFAULTS: Record<string, Omit<NotionEnvConfig, 'token' | 'databaseIds'>> = {
development: { logLevel: LogLevel.DEBUG, timeoutMs: 60_000, maxRetries: 0 },
staging: { logLevel: LogLevel.WARN, timeoutMs: 30_000, maxRetries: 2 },
production: { logLevel: LogLevel.ERROR, timeoutMs: 30_000, maxRetries: 3 },
};
function getConfig(): NotionEnvConfig {
const env = process.env.NODE_ENV || 'development';
const defaults = ENV_DEFAULTS[env] ?? ENV_DEFAULTS.development;
const token = process.env.NOTION_TOKEN;
if (!token) {
throw new Error(
`NOTION_TOKEN not set for "${env}". ` +
`Set it in .env.${env} or your secret manager.`
);
}
return {
token,
databaseIds: {
tasks: process.env.NOTION_TASKS_DB_ID!,
users: process.env.NOTION_USERS_DB_ID!,
logs: process.env.NOTION_LOGS_DB_ID!,
},
...defaults,
};
}
export function createNotionClient(): Client {
const config = getConfig();
return new Client({
auth: config.token,
logLevel: config.logLevel,
timeoutMs: config.timeoutMs,
});
}
Set up observability for Notion integrations with metrics, traces, and alerts.
Notion Observability
Overview
Instrument Notion API calls with metrics, structured logging, and alerting. Track request rates, latencies, error rates, and rate limit headroom. This skill covers a full observability stack: an instrumented client wrapper, Prometheus metrics with histogram buckets tuned for Notion's typical 200-800ms latency, structured logging via pino, health check endpoints, and Prometheus alerting rules for error rate spikes, rate limit exhaustion, high latency, and service outages.
Prerequisites
- @notionhq/client v2+ installed (npm install @notionhq/client)
- Python alternative: notion-client (pip install notion-client)
- Prometheus-compatible metrics backend (optional: Grafana, Datadog, or CloudWatch)
- Structured logging library: pino (Node.js) or structlog (Python)
Instructions
Step 1: Instrumented Notion Client Wrapper
Wrap every Notion API call with timing, error classification, and structured logging:
import { Client, isNotionClientError, APIErrorCode } from '@notionhq/client';
interface NotionMetrics {
requestCount: number;
errorCount: number;
rateLimitCount: number;
totalLatencyMs: number;
latencyBuckets: Map<string, number[]>;
lastError: { code: string; message: string; timestamp: string } | null;
}
class InstrumentedNotionClient {
private client: Client;
private metrics: NotionMetrics = {
requestCount: 0,
errorCount: 0,
rateLimitCount: 0,
totalLatencyMs: 0,
latencyBuckets: new Map(),
lastError: null,
};
constructor(auth: string, timeoutMs = 30_000) {
this.client = new Client({ auth, timeoutMs });
}
async call<T>(operation: string, fn: (client: Client) => Promise<T>): Promise<T> {
const start = performance.now();
this.metrics.requestCount++;
try {
const result = await fn(this.client);
const durationMs = Math.round(performance.now() - start);
this.metrics.totalLatencyMs += durationMs;
this.recordLatency(operation, durationMs);
console.log(JSON.stringify({
level: 'info',
service: 'notion',
operation,
durationMs,
status: 'ok',
timestamp: new Date().toISOString(),
}));
return result;
} catch (error) {
const durationMs = Math.round(performance.now() - start);
this.metrics.totalLatencyMs += durationMs;
this.metrics.errorCount++;
this.recordLatency(operation, durationMs);
let errorInfo: { code: string; message: string; status: number };
if (isNotionClientError(error)) {
errorInfo = { code: error.code, message: error.message, status: error.status };
if (error.code === APIErrorCode.RateLimited) {
this.metrics.rateLimitCount++;
}
Optimize Notion API performance with caching, batching, parallel requests, and incremental sync.
Notion Performance Tuning
Overview
Optimize Notion API performance by minimizing API calls, caching responses with TTL-based invalidation, batching block appends, parallelizing requests within rate limits, selecting only needed properties, and implementing incremental sync patterns. Target latency benchmarks: Database Query p50=150ms, Page Create p50=200ms, Search p50=300ms.
Prerequisites
- @notionhq/client installed (npm install @notionhq/client)
- p-queue for rate-limited parallelism (npm install p-queue)
- lru-cache for TTL-based caching (npm install lru-cache)
- Understanding of your access patterns (read-heavy vs write-heavy)
- Optional: Redis or ioredis for distributed caching across instances
Instructions
Step 1: Minimize API Calls and Reduce Payload
Avoid N+1 query patterns. Use page_size: 100 (the maximum) to reduce pagination requests. Select only the properties you need in database queries to shrink response payloads.
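The payoff of avoiding N+1 is easy to quantify with a rough request-count comparison. A sketch; it assumes properties arrive with the query response so no per-page fetches are needed:

```typescript
// N+1 pattern: one query plus one blocks.children.list call per page.
function requestsNPlusOne(pageCount: number): number {
  return 1 + pageCount;
}

// Filtered query: only pagination requests; properties come back in each page of results.
function requestsFilteredQuery(pageCount: number, pageSize = 100): number {
  return Math.max(1, Math.ceil(pageCount / pageSize));
}

console.log(requestsNPlusOne(250));      // 251
console.log(requestsFilteredQuery(250)); // 3
```

At 3 requests/second, that is the difference between roughly 84 seconds and 1 second of wall-clock API time for 250 pages.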
import { Client } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
// BAD: N+1 pattern — fetching content for every page individually
async function fetchAllBad(dbId: string) {
const pages = await notion.databases.query({ database_id: dbId });
for (const page of pages.results) {
// Each iteration is a separate API call — O(n) requests
const content = await notion.blocks.children.list({ block_id: page.id });
}
}
// GOOD: Use filter_properties to select only needed fields
async function fetchAllGood(dbId: string) {
const pages = await notion.databases.query({
database_id: dbId,
page_size: 100, // Maximum — reduces total pagination requests
filter: {
property: 'Status',
select: { equals: 'Active' },
},
// Select only the properties you need by property ID
// Find IDs via: notion.databases.retrieve({ database_id: dbId })
filter_properties: ['title', 'Status', 'Priority'],
});
// Properties are already in the response — no extra retrieve calls
for (const page of pages.results) {
if ('properties' in page) {
const title = page.properties.Name?.type === 'title'
? page.properties.Name.title.map(t => t.plain_text).join('')
: '';
const status = page.properties.Status?.type === 'select'
? page.properties.Status.select?.name
: undefined;
// Use properties directly — zero additional API calls
}
}
return pages;
}
// Batch block appends — up to 100 blocks per request
async function appendBlocks(pageId: string, items: string[]) {
const blocks = items.map(item => ({
object: 'block' as const,
type: 'paragraph' as const,
paragraph: { rich_text: [{ type: 'text' as const, text: { content: item } }] },
}));
// Append in chunks of 100 (the per-request maximum noted above)
for (let i = 0; i < blocks.length; i += 100) {
await notion.blocks.children.append({
block_id: pageId,
children: blocks.slice(i, i + 100),
});
}
}
Governance for Notion integrations: integration naming standards, page sharing policies, property naming conventions, database schema standards, and access audit scripts.
Notion Policy & Guardrails
Overview
Governance framework for Notion integrations at scale. Covers integration naming standards for consistent bot identification, page sharing policy enforcement to prevent accidental data exposure, property naming conventions for cross-team database consistency, database schema validation standards, and access audit scripts that scan which integrations have access to which pages. Uses Client from @notionhq/client for programmatic enforcement.
Prerequisites
- @notionhq/client v2.x installed (npm install @notionhq/client)
- Python: notion-client installed (pip install notion-client)
- NOTION_TOKEN environment variable set (admin-level integration recommended for audits)
- CI/CD pipeline (GitHub Actions examples provided)
Instructions
Step 1: Integration Naming Standards and Token Management
Establish naming conventions for integrations so teams can identify which bot accessed what.
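The convention reduces to a single regular expression, which is easy to sanity-check in isolation (a sketch; the example names are ours):

```typescript
// {team}-{env}-{purpose}: three lowercase segments separated by hyphens.
const NAME_PATTERN = /^[a-z]+-[a-z]+-[a-z]+$/;

console.log(NAME_PATTERN.test('eng-prod-sync'));          // true
console.log(NAME_PATTERN.test('Eng Prod Sync'));          // false
console.log(NAME_PATTERN.test('marketing-staging-cms'));  // true
```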
import { Client } from '@notionhq/client';
// Naming convention: {team}-{env}-{purpose}
// Examples: eng-prod-sync, marketing-staging-cms, data-prod-etl
interface IntegrationConfig {
name: string; // Must match: /^[a-z]+-[a-z]+-[a-z]+$/
token: string;
environment: 'dev' | 'staging' | 'prod';
owner: string; // Team or individual
capabilities: string[]; // What it's allowed to do
}
function validateIntegrationName(name: string): string[] {
const issues: string[] = [];
const pattern = /^[a-z]+-[a-z]+-[a-z]+$/;
if (!pattern.test(name)) {
issues.push(`Name "${name}" must match pattern: {team}-{env}-{purpose} (e.g., eng-prod-sync)`);
}
const [team, env] = name.split('-');
const validEnvs = ['dev', 'staging', 'prod'];
if (env && !validEnvs.includes(env)) {
issues.push(`Environment "${env}" must be one of: ${validEnvs.join(', ')}`);
}
return issues;
}
// Validate at startup — fail fast if misconfigured
async function validateIntegration(notion: Client, config: IntegrationConfig): Promise<void> {
const nameIssues = validateIntegrationName(config.name);
if (nameIssues.length > 0) {
throw new Error(`Integration naming violation:\n${nameIssues.join('\n')}`);
}
// Verify the token works and matches expected bot name
const me = await notion.users.me({});
if (me.type !== 'bot') {
throw new Error('Token is not a bot integration token');
}
console.log(`Integration validated: ${config.name} (bot: ${me.name})`);
}
// Token rotation tracking
interface TokenRegistry {
integrations: Array<{
name: string;
tokenPrefix: string; // First 8 chars for identification
createdDate: string;
rotateBy: string; // ISO date; rotate the token before this
}>;
}
Execute Notion API production deployment checklist and readiness verification.
Notion API Production Deployment Checklist
Overview
Structured 12-section checklist for deploying Notion API integrations to production. Covers authentication security, capability scoping, page sharing, rate limit compliance, pagination correctness, error handling, API versioning, retry logic, monitoring, graceful degradation, data validation, and OAuth token lifecycle. Each section maps to a specific failure mode observed in production Notion integrations.
This skill produces a verified pass/fail report. Every item is actionable and testable — no aspirational guidance. Full code examples for each section are in references/code-examples.md.
Prerequisites
- Node.js 18+ with @notionhq/client v2.x installed
- Production Notion API token (internal) or OAuth credentials (public integration)
- Target databases and pages identified by ID
- Deployment platform configured (Vercel, Railway, AWS, etc.)
Verify SDK is installed:
node -e "const { Client } = require('@notionhq/client'); console.log('SDK loaded')" 2>/dev/null \
|| echo "MISSING: npm install @notionhq/client"
Instructions
Work through each section sequentially. Mark items pass or fail. A single fail in sections 1-6 is a deployment blocker.
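The pass/fail bookkeeping can be sketched as a tiny accumulator. The structure is ours; the blocker rule mirrors "a single fail in sections 1-6 is a deployment blocker":

```typescript
interface CheckResult {
  section: number; // 1-12, matching the checklist sections
  item: string;
  pass: boolean;
}

// Sections 1-6 are deployment blockers per the rule above.
function deploymentBlocked(results: CheckResult[]): boolean {
  return results.some(r => !r.pass && r.section <= 6);
}

const report: CheckResult[] = [
  { section: 1, item: 'Token from env only', pass: true },
  { section: 7, item: 'API version pinned', pass: false },
];
console.log(deploymentBlocked(report)); // false: the only failure is in a non-blocking section
```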
Section 1: Token Stored in Environment Variables (Never Hardcoded)
Production tokens must never appear in source code, config files committed to git, or client-side bundles.
- [ ] NOTION_TOKEN loaded from environment variable or secret manager (AWS Secrets Manager, GCP Secret Manager, Vault, Vercel env vars)
- [ ] No tokens in source code. Verify: grep -rn "ntn_\|secret_" --include="*.ts" --include="*.js" --include="*.env" .
- [ ] No tokens in git history: git log -p --all -S "ntn_" -- "*.ts" "*.js" "*.env"
- [ ] .env and .env.* files are in .gitignore
- [ ] Token rotation procedure documented: who rotates, and how to deploy a new token without downtime
// CORRECT: Token from environment
const notion = new Client({ auth: process.env.NOTION_TOKEN });
// WRONG: Hardcoded token — immediate security incident
const notion = new Client({ auth: 'ntn_R8dkf92jfKLsd9f2...' });
Fail criteria: Any token found in source, git history, or client bundle.
Section 2: Integration Has Minimum Required Capabilities
Notion integrations request capability scopes at creation time. Production integrations must follow least-privilege.
- [ ] Integration capabilities reviewed
Manage Notion API rate limits with exponential backoff, queue-based throttling, and batch optimization.
Notion Rate Limits
Overview
The Notion API enforces 3 requests per second per integration token across all endpoints and tiers. Exceeding this returns HTTP 429 with a Retry-After header. Detect with isNotionClientError() + APIErrorCode.RateLimited, implement exponential backoff with jitter, and use queue-based throttling for high-throughput workloads.
Prerequisites
- @notionhq/client v2.x (TypeScript) or notion-client (Python)
- Integration token in NOTION_TOKEN from notion.so/my-integrations
- For queue patterns: p-queue v8+ (npm install p-queue)
Instructions
Step 1 — Detect Rate Limits and Apply Exponential Backoff
| Aspect | Value |
|---|---|
| Rate limit | 3 req/s per integration token (all tiers) |
| Throttle response | HTTP 429 + Retry-After header (seconds) |
| Scope | Per token, not per user or workspace |
| Max block children | 1,000 per blocks.children.append |
| Max page size | 100 results per paginated request |
The SDK retries 429 automatically (2 retries, 3 total attempts). For heavier workloads, use custom backoff that honors Retry-After and adds jitter to prevent thundering herd:
import { Client, isNotionClientError, APIErrorCode } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
async function withBackoff<T>(
fn: () => Promise<T>,
maxRetries = 5, baseMs = 1000, maxMs = 32_000
): Promise<T> {
for (let i = 0; i <= maxRetries; i++) {
try { return await fn(); }
catch (err) {
if (i === maxRetries) throw err;
if (isNotionClientError(err) && err.code === APIErrorCode.RateLimited) {
const wait = parseInt((err as any).headers?.['retry-after'] ?? '1', 10);
await new Promise(r => setTimeout(r, wait * 1000));
continue;
}
if (isNotionClientError(err) && err.status && err.status < 500) throw err;
const delay = Math.min(baseMs * 2 ** i + Math.random() * 500, maxMs);
await new Promise(r => setTimeout(r, delay));
}
}
throw new Error('Exhausted retries');
}
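The delay schedule that withBackoff produces can be checked in isolation. A deterministic sketch with the jitter term omitted:

```typescript
// Exponential backoff schedule: baseMs * 2^attempt, capped at maxMs.
function backoffDelayMs(attempt: number, baseMs = 1000, maxMs = 32_000): number {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

const schedule = [0, 1, 2, 3, 4, 5].map(i => backoffDelayMs(i));
console.log(schedule); // [1000, 2000, 4000, 8000, 16000, 32000]
```

The cap matters: without maxMs, attempt 10 would wait over 17 minutes; with it, every late retry settles at 32 seconds.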
Step 2 — Throttle with Queue-Based Request Management
Enforce the 3 req/s limit at the application level instead of relying on 429 responses:
import PQueue from 'p-queue';
const queue = new PQueue({
concurrency: 3, interval: 1000, intervalCap: 3,
carryoverConcurrencyCount: true,
});
async function throttledCall<T>(fn: () => Promise<T>): Promise<T> {
return queue.add(fn) as Promise<T>;
}
Design and implement a production-ready Notion integration architecture with proper layering, caching, error handling, and testing strategies.
Notion Reference Architecture
Overview
Production-grade architecture for Notion integrations using @notionhq/client. This skill defines a four-layer architecture — client singleton, repository pattern, service layer, and caching — that scales from simple scripts to enterprise applications. It covers multi-integration setups (reader + writer tokens), event-driven processing, headless CMS patterns, and comprehensive testing strategies.
Notion API version: 2022-06-28 | Rate limit: 3 requests/second per integration | Max page size: 100
Prerequisites
- Node.js 18+ with TypeScript strict mode enabled
- @notionhq/client v2.x installed (npm install @notionhq/client)
- A Notion internal integration created at https://www.notion.so/my-integrations
- NOTION_TOKEN environment variable set with the integration token
- Target databases/pages shared with the integration via "Add connections"
Instructions
Step 1: Establish the Client Singleton with Retry and Rate Limiting
The client layer wraps @notionhq/client in a singleton pattern with built-in retry logic. Notion's SDK handles basic retries, but you need explicit rate limiting and configurable timeouts for production use.
my-notion-app/
├── src/
│ ├── notion/
│ │ ├── client.ts # Singleton + retry + rate limiter
│ │ ├── types.ts # Domain types mapped from Notion properties
│ │ ├── extractors.ts # Type-safe property extraction helpers
│ │ └── errors.ts # Error classification and retry decisions
│ ├── repositories/
│ │ ├── database.repo.ts # NotionDatabaseRepo — query/create/update
│ │ └── page.repo.ts # NotionPageRepo — page CRUD + blocks
│ ├── services/
│ │ ├── notion.service.ts # NotionService — business logic orchestration
│ │ ├── sync.service.ts # Polling/webhook sync coordination
│ │ └── cms.service.ts # Headless CMS content retrieval
│ ├── cache/
│ │ └── notion-cache.ts # TTL cache between app and Notion API
│ ├── events/
│ │ ├── queue.ts # Event queue for webhook/polling events
│ │ └── processors.ts # Event handlers (page.created, page.updated)
│ └── index.ts
├── tests/
│ ├── unit/
│ │ ├── extractors.test.ts
│ │ ├── database.repo.test.ts
│ │ └── notion.service.test.ts
│ └── integration/
│ └── notion-live.test.ts
├── .env.example
└── tsconfig.json
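The rate limiter that client.ts wires in can be sketched as a token bucket tuned to Notion's 3 requests/second. A sketch; the class and its tuning values are ours:

```typescript
// Token bucket: starts full, refills at `refillPerSec`, never exceeds `capacity`.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(private capacity = 3, private refillPerSec = 3, now = Date.now()) {
    this.tokens = capacity;
    this.lastRefill = now;
  }

  // Returns true and consumes a token if one is available at `now`.
  tryAcquire(now = Date.now()): boolean {
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.refillPerSec);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```

A caller that fails tryAcquire should sleep roughly 1000 / refillPerSec milliseconds and retry, which keeps sustained throughput at the limit without ever provoking a 429.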
Create the client singleton with rate limiting:
// src/notion/client.ts
import { Client, LogLevel } from '@notionhq/client';
let readerClient: Client | null = null;
let writerClient: Client | null = null;
interface ClientOptions {
token: string;
logLevel?: LogLevel;
timeoutMs?: number;
}
Graceful degradation when Notion is down: offline cache, retry with exponential backoff, circuit breaker, health checks, and fallback content.
Notion Reliability Patterns
Overview
Production-grade reliability patterns for Notion integrations. Covers graceful degradation with offline cache when Notion is unavailable, retry with exponential backoff for transient failures, circuit breaker to prevent cascade failures, health check endpoints for monitoring, and fallback content serving when the API is unreachable. All patterns use Client from @notionhq/client and handle Notion-specific error codes.
Prerequisites
- @notionhq/client v2.x installed (npm install @notionhq/client)
- lru-cache for in-memory caching (npm install lru-cache)
- Python: notion-client installed (pip install notion-client)
- NOTION_TOKEN environment variable set
- Understanding of circuit breaker and retry patterns
Instructions
Step 1: Retry with Exponential Backoff
The Notion SDK has built-in retries, but you can customize the behavior for better control over transient errors (429, 500, 502, 503).
import { Client, isNotionClientError, APIErrorCode } from '@notionhq/client';
// Classify errors as transient (retryable) vs permanent
function isTransientError(error: unknown): boolean {
if (isNotionClientError(error)) {
return (
error.code === APIErrorCode.RateLimited ||
error.code === APIErrorCode.InternalServerError ||
error.code === APIErrorCode.ServiceUnavailable ||
error.code === 'notionhq_client_request_timeout'
);
}
// Network errors are transient
if (error instanceof Error && error.message.includes('fetch failed')) {
return true;
}
return false;
}
async function retryWithBackoff<T>(
fn: () => Promise<T>,
opts: { maxRetries?: number; baseDelayMs?: number; label?: string } = {}
): Promise<T> {
const { maxRetries = 4, baseDelayMs = 1000, label = 'notion-call' } = opts;
for (let attempt = 0; attempt <= maxRetries; attempt++) {
try {
return await fn();
} catch (error) {
if (!isTransientError(error) || attempt === maxRetries) {
throw error;
}
// Exponential backoff: 1s, 2s, 4s, 8s (with jitter)
const delay = baseDelayMs * Math.pow(2, attempt);
const jitter = delay * 0.2 * Math.random();
const waitMs = delay + jitter;
// Special handling for rate limits: use Retry-After header
if (isNotionClientError(error) && error.code === APIErrorCode.RateLimited) {
const retryAfter = parseInt((error as any).headers?.['retry-after'] ?? '1');
const rateLimitWait = retryAfter * 1000;
console.warn(`[${label}] Rate limited, waiting ${retryAfter}s (attempt ${attempt + 1}/${maxRetries})`);
await new Promise(r => setTimeout(r, rateLimitWait));
continue;
}
console.warn(`[${label}] Transient error, retrying in ${Math.round(waitMs)}ms (attempt ${attempt + 1}/${maxRetries})`);
await new Promise(r => setTimeout(r, waitMs));
}
}
throw new Error(`[${label}] Exhausted retries`);
}
Apply production-ready @notionhq/client SDK patterns for TypeScript and Python.
Notion SDK Patterns
Overview
Production-ready patterns for the official Notion SDK (@notionhq/client for TypeScript, notion-client for Python) covering client initialization, database queries with filters and sorts, cursor-based pagination, rich text construction, block manipulation, and type-safe error handling using SDK error codes.
Prerequisites
- Node.js 18+ with @notionhq/client v2.x installed, or Python 3.9+ with notion-client
- A Notion integration token (NOTION_TOKEN) from notion.so/my-integrations
- Target databases/pages shared with the integration (Share > Invite > select your integration)
- TypeScript 5+ with strict mode enabled (for TypeScript patterns)
Instructions
Step 1 — Initialize the Client and Query Databases
Set up the SDK client and execute filtered, sorted database queries.
TypeScript — Client initialization:
import { Client } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
Database query with filter and sort:
const response = await notion.databases.query({
database_id,
filter: {
property: 'Status',
select: {
equals: 'Active',
},
},
sorts: [
{
property: 'Created',
direction: 'descending',
},
],
});
Compound filters combine conditions with and/or:
const response = await notion.databases.query({
database_id,
filter: {
and: [
{ property: 'Status', select: { equals: 'Active' } },
{ property: 'Priority', select: { does_not_equal: 'Low' } },
{ property: 'Assignee', people: { is_not_empty: true } },
],
},
sorts: [
{ property: 'Priority', direction: 'ascending' },
{ property: 'Created', direction: 'descending' },
],
});
Python — Client initialization and query:
from notion_client import Client
notion = Client(auth=os.environ["NOTION_TOKEN"])
results = notion.databases.query(
database_id=db_id,
filter={
"property": "Status",
"select": {"equals": "Active"},
},
sorts=[{"property": "Created", "direction": "descending"}],
)
Step 2 — Paginate Results and Manipulate Blocks
The Notion API returns at most 100 results per request. Use cursor-based pagination to retrieve all records.
Cursor-based pagination: loop while has_more is true, passing each response's next_cursor back as start_cursor on the following request.
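A dependency-free sketch of that loop (the `Paginated` shape mirrors Notion's `{ results, has_more, next_cursor }` responses; `listFn` stands in for any paginated SDK call such as `notion.databases.query`):

```typescript
interface Paginated<T> {
  results: T[];
  has_more: boolean;
  next_cursor: string | null;
}

// Drain a paginated endpoint by feeding next_cursor back as start_cursor.
async function collectAll<T>(
  listFn: (startCursor?: string) => Promise<Paginated<T>>
): Promise<T[]> {
  const all: T[] = [];
  let cursor: string | undefined;
  while (true) {
    const page = await listFn(cursor);
    all.push(...page.results);
    if (!page.has_more || !page.next_cursor) break;
    cursor = page.next_cursor;
  }
  return all;
}
```

Each iteration costs one API call, so combine this with page_size: 100 to minimize the number of round trips.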
Search Notion workspaces and retrieve pages, databases, and block content using the Notion API.
Notion Search & Data Retrieval
Overview
Search across a Notion workspace, query databases with compound filters, retrieve individual pages, and extract nested block content. Covers the full read path: workspace-level search, database queries with filter/sort/pagination, page retrieval, and recursive block tree traversal.
Prerequisites
- @notionhq/client installed (npm install @notionhq/client)
- Notion integration token with read access to target pages/databases
- Integration added to target pages via the Share menu in Notion
- Completed notion-install-auth setup
Instructions
Step 1: Search the Workspace
Call notion.search() to find pages and databases. The integration only sees content explicitly shared with it.
import { Client } from '@notionhq/client';
const notion = new Client({ auth: process.env.NOTION_TOKEN });
// Search for pages matching a query
const searchResults = await notion.search({
query: 'meeting notes',
filter: {
property: 'object',
value: 'page', // 'page' or 'database'
},
sort: {
direction: 'descending',
timestamp: 'last_edited_time',
},
page_size: 20,
});
for (const result of searchResults.results) {
if (result.object === 'page' && 'properties' in result) {
const titleProp = Object.values(result.properties)
.find(p => p.type === 'title');
const title = titleProp?.type === 'title'
? titleProp.title.map(t => t.plain_text).join('')
: 'Untitled';
console.log(`${title} (${result.id})`);
}
}
An empty query string returns all shared content. Results are eventually consistent — newly shared pages may take a few seconds to appear in the index.
Step 2: Query Databases with Filters
Call notion.databases.query() for structured queries. Filters support compound and/or logic. See filter-operators.md for every property type and operator.
// Single filter
const activeItems = await notion.databases.query({
database_id: 'your-database-id',
filter: {
property: 'Status',
select: { equals: 'Active' },
},
sorts: [
{ property: 'Priority', direction: 'descending' },
],
page_size: 50,
});
// Compound filter with AND
const highPriorityActive = await notion.databases.query({
database_id: 'your-database-id',
filter: {
and: [
{ property: 'Status', select: { equals: 'Active' } },
{ property: 'Priority', number: { greater_than: 3 } },
],
},
});
Step 3: Paginate, Retrieve Pages,
Apply Notion API security best practices for integration tokens, OAuth2 flows, least-privilege capabilities, and page-level access control.
Notion Security Basics
Overview
Security fundamentals for the Notion API: integration token management, internal vs public integration models, principle of least privilege for capabilities, page-level access auditing, token rotation, OAuth2 flows for public integrations, and webhook verification. All examples use @notionhq/client v2.x and target the 2022-06-28 API version.
Prerequisites
- Notion integration created at notion.so/my-integrations
- Node.js 18+ with @notionhq/client installed (npm install @notionhq/client)
- Understanding of environment variables and .env file patterns
- For public integrations: OAuth2 client ID and secret from the integration dashboard
Instructions
Step 1: Secure Token Storage and .env Management
Integration tokens are secrets with the same sensitivity as database passwords. Notion tokens use the ntn_ prefix (current) or secret_ prefix (legacy). Both grant full access to every page shared with the integration.
# .gitignore — add these patterns BEFORE creating .env
.env
.env.local
.env.*.local
.env.production
.env.staging
# .env.example — commit this template (no real values)
NOTION_TOKEN=ntn_your_internal_integration_token_here
NOTION_OAUTH_CLIENT_ID=
NOTION_OAUTH_CLIENT_SECRET=
NOTION_OAUTH_REDIRECT_URI=http://localhost:3000/auth/notion/callback
import { Client } from '@notionhq/client';
// Always load tokens from environment — never hardcode
const token = process.env.NOTION_TOKEN;
if (!token) {
throw new Error(
'NOTION_TOKEN is required. ' +
'Create an integration at https://www.notion.so/my-integrations ' +
'and set the token in your .env file.'
);
}
// Validate token format before using it
if (!token.startsWith('ntn_') && !token.startsWith('secret_')) {
throw new Error(
'NOTION_TOKEN has an unexpected format. ' +
'Internal integration tokens start with ntn_ (or legacy secret_).'
);
}
const notion = new Client({ auth: token });
Git secret scanning to catch accidental commits:
# .github/workflows/secret-scan.yml
name: Secret Scan
on: [push, pull_request]
jobs:
scan:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Check for Notion tokens
run: |
# Scan for internal integration tokens
if grep -rE "(ntn_|secret_)[a-zA-Z0-9]{30,}" \
--include="*.ts" --include="*.js" --include="*.json" \
--include="*.yaml" --include="*.yml" --include="*.env" .; then
echo "Potential Notion token committed; failing the build"
exit 1
fi
Upgrade @notionhq/client SDK versions and migrate between Notion API versions.
Notion Upgrade & Migration
Overview
Step-by-step guide for upgrading @notionhq/client (Node.js) and notion-client (Python) SDK versions, migrating between Notion API versions, handling breaking changes, and adopting newly released features. Covers the current stable API version 2022-06-28 and the SDK feature timeline through v2.x.
Prerequisites
- Existing project with @notionhq/client or notion-client installed
- Git repository with clean working tree (no uncommitted changes)
- Test suite covering Notion API calls (or willingness to add verification tests)
- NOTION_TOKEN environment variable configured
Instructions
Step 1: Audit Current Versions and API Surface
Determine what you are running today before changing anything.
# Node.js — check installed SDK version
npm ls @notionhq/client
# Node.js — check latest available
npm view @notionhq/client version
# Python — check installed SDK version
pip show notion-client 2>/dev/null | grep Version
# Python — check latest available
pip index versions notion-client 2>/dev/null | head -1
# Find which API version your code specifies
grep -rn "notionVersion\|Notion-Version\|notion_version" src/ lib/ app/ 2>/dev/null
Record the current SDK version and API version before proceeding. If no notionVersion is set explicitly, the SDK uses its built-in default (typically 2022-06-28 for current releases).
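When scripting this audit, note that plain string comparison misorders versions such as 2.2.3 vs 2.2.15; `sort -V` compares version components numerically. A small helper, assuming a `sort` with `-V` support (the `semver_lt` name is ours):

```shell
# semver_lt A B: succeed (exit 0) when version A is strictly lower than B.
semver_lt() {
  [ "$1" != "$2" ] && [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
}

# Example: flag when the installed SDK lags the registry.
installed="2.2.3"
latest="2.2.15"
if semver_lt "$installed" "$latest"; then
  echo "upgrade available: $installed -> $latest"
fi
```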
SDK version history (key milestones):

| SDK Version | Notable Additions |
|---|---|
| 2.2.0 | Comments API support (notion.comments.create, notion.comments.list) |
| 2.2.3 | Status property type in database schemas |
| 2.2.4 | Unique ID property, verification property |
| 2.2.13 | Improved TypeScript discriminated unions for block types |
| 2.2.15 | Current stable: bug fixes, dependency updates |
API version timeline:
| API Version | Key Changes |
|---|---|
| 2022-02-22 | Rich text standardization, consistent pagination |
| 2022-06-28 | Current stable: most tutorials and production apps use this |
Step 2: Perform the Upgrade
Create an isolated branch, upgrade the package, and address breaking changes before merging.
Node.js upgrade:
# Create upgrade branch
git checkout -b upgrade/notionhq-client-$(npm view @notionhq/client version)
Build change detection and event handling for Notion workspaces using polling, native webhooks, and third-party connectors.
Notion Webhooks & Event Handling
Overview
Notion offers three approaches to change detection, each with different trade-offs:
| Approach | Latency | Complexity | Reliability |
|---|---|---|---|
| Polling with search / databases.query | 30s-5min (your poll interval) | Low | High: you control timing |
| Native webhooks (API 2025-02+) | Near real-time | Medium | Good: requires HTTPS endpoint, retry handling |
| Third-party connectors (Zapier, Make) | 1-15 min | Low (no-code) | Vendor-dependent |
Honest assessment: Notion's native webhook support arrived in mid-2025 and covers page, database, comment, and data source events. It works well for event notification but does not deliver full payloads; you still need API calls to fetch the changed data. For many use cases, especially incremental sync and backup, polling with last_edited_time filters remains the most battle-tested pattern.
Prerequisites
- @notionhq/client v2.3+ installed (npm install @notionhq/client)
- Notion integration created at https://www.notion.so/my-integrations
- Integration shared with target pages/databases (Connections menu in Notion)
- NOTION_TOKEN environment variable set to the integration's Internal Integration Secret
- For native webhooks: HTTPS endpoint accessible from the internet
Instructions
Step 1: Polling-Based Change Detection
Polling is the most reliable approach and works with every Notion API version. Use notion.search() to discover recently edited content across the entire workspace, or notion.databases.query() with timestamp filters for targeted change detection on a specific database.
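For the targeted per-database case, the timestamp filter passed to notion.databases.query() has a small fixed shape. A builder sketch; the `lastEditedAfter` helper name is ours, and the filter shape follows the Notion API's documented filter syntax:

```typescript
// Build a databases.query filter matching items edited at or after `cutoff`.
function lastEditedAfter(cutoff: Date) {
  return {
    timestamp: 'last_edited_time' as const,
    last_edited_time: { on_or_after: cutoff.toISOString() },
  };
}

// Usage: notion.databases.query({ database_id, filter: lastEditedAfter(since) })
```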
Workspace-Wide Change Feed
import { Client } from '@notionhq/client';
import type {
  PageObjectResponse,
  DatabaseObjectResponse,
} from '@notionhq/client/build/src/api-endpoints';

const notion = new Client({ auth: process.env.NOTION_TOKEN });

interface ChangeRecord {
  id: string;
  object: 'page' | 'database';
  lastEdited: string;
  title: string;
}

// Track the high-water mark for incremental polling
let lastPollTimestamp: string | null = null;

async function pollWorkspaceChanges(): Promise<ChangeRecord[]> {
  const changes: ChangeRecord[] = [];
  let cursor: string | undefined;
  do {
    const response = await notion.search({
      sort: {
        direction: 'descending',
        timestamp: 'last_edited_time',
      },
      start_cursor: cursor,
      page_size: 100,
    });
    for (const result of response.results) {
      // search() can also return partial objects; keep only full pages/databases
      if (!('last_edited_time' in result)) continue;
      const item = result as PageObjectResponse | DatabaseObjectResponse;
      // Results are sorted newest-first, so stop at the previous high-water mark
      if (lastPollTimestamp && item.last_edited_time <= lastPollTimestamp) {
        return changes;
      }
      const title =
        item.object === 'database'
          ? item.title.map((t) => t.plain_text).join('')
          : Object.values(item.properties)
              .flatMap((p) => (p.type === 'title' ? p.title : []))
              .map((t) => t.plain_text)
              .join('');
      changes.push({
        id: item.id,
        object: item.object,
        lastEdited: item.last_edited_time,
        title,
      });
    }
    cursor = response.has_more ? response.next_cursor ?? undefined : undefined;
  } while (cursor);
  return changes;
}
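Between polls, the high-water mark should only ever move forward; because Notion returns ISO-8601 timestamps, plain string comparison is safe. A small helper sketch (the `advanceHighWaterMark` name is ours):

```typescript
// Advance the incremental-poll cursor to the newest last_edited_time seen.
// ISO-8601 timestamps in the same zone compare correctly as strings.
function advanceHighWaterMark(
  prev: string | null,
  changes: Array<{ lastEdited: string }>
): string | null {
  let mark = prev;
  for (const c of changes) {
    if (mark === null || c.lastEdited > mark) mark = c.lastEdited;
  }
  return mark;
}

// After each poll:
//   lastPollTimestamp = advanceHighWaterMark(lastPollTimestamp, changes);
```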