Introduction
This guide explains how to effectively use our documentation with Large Language Models (LLMs) and AI coding assistants like Claude Code, ChatGPT, GitHub Copilot, and Cursor. We provide two complementary approaches for accessing our documentation with AI:
- Claude Code Skills - Offline, comprehensive reference for IDE-integrated AI assistants
- Context7 - Live, always-up-to-date documentation for web-based and MCP-enabled AI tools
Option 1: Claude Code Skill (Recommended for IDEs)
For developers using AI coding assistants in their IDE, we provide a comprehensive Claude Code skill that packages our complete documentation, integration patterns, and best practices into an easily installable format.
What is a Claude Code Skill?
A Claude Code skill is a packaged knowledge base that your AI coding assistant can reference while you work. It provides instant access to documentation, code patterns, and troubleshooting guidance without requiring internet access.
What Does the Primer Skill Include?
- Complete Component Reference - All web component APIs, props, events, and methods
- React Integration Patterns - React 18 & 19 specific guidance with hooks and patterns
- Critical Best Practices - Common mistakes and how to avoid them (like object reference stability)
- SSR Support - Server-side rendering with Next.js and SvelteKit
- CSS Theming Guide - Custom styling and theme customization
- Troubleshooting - Solutions to common integration issues
Installation Instructions
Installation differs by platform. Choose your AI assistant:
For Claude Code (claude.ai/code)
Via Marketplace (Recommended): install the skill from the Claude Code skills marketplace.
Manual Installation: copy the skill files into ~/.claude/skills/primer-web-components/.
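The manual copy step can be sketched as follows. The source path `./primer-web-components` is an assumption; point it at wherever you downloaded the skill files.

```shell
# Manual install sketch: copy the skill into Claude Code's skills directory.
# The source path below is an assumption; adjust it to your local checkout.
SKILLS_DIR="$HOME/.claude/skills"
mkdir -p "$SKILLS_DIR"
if [ -d ./primer-web-components ]; then
  cp -r ./primer-web-components "$SKILLS_DIR/primer-web-components"
fi
```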
For Cursor
Use the same manual installation method as Claude Code, but copy to ~/.cursor/skills/ instead.
For Other AI Assistants
Most AI coding assistants support directory-based skills. Check your tool’s documentation for the skills directory location and use the manual installation method above.
Verifying Installation
Ask your AI assistant a question the skill should be able to answer, for example: "What does the Primer web components skill include?" If the skill is installed correctly, the assistant should reference its contents in the reply.
When to Use the Claude Code Skill
- Working in your IDE with integrated AI assistance
- Need offline access to documentation
- Want best practices and common pitfalls highlighted automatically
- Focused on implementation and debugging
- Working with React, Next.js, or SvelteKit integrations
Option 2: Using Context7 for Live Documentation
We’ve partnered with Context7 to provide LLM-friendly versions of our documentation that are always up-to-date and optimized for AI consumption.
What is Context7?
Context7 is a service that pulls up-to-date, version-specific documentation and code examples directly from the source, making it easier for LLMs to provide accurate assistance with our SDK.
Benefits of Using Context7 for Our Documentation
- Always up-to-date: Documentation stays in sync with the latest version of our components
- Fewer hallucinations: LLMs receive accurate API information, reducing incorrect code suggestions
- Context-aware: The LLM understands the full context of our component library
- Improved code examples: Get working code snippets that follow our best practices
- Reduced token usage: Optimized documentation that focuses on what matters
When to Use Context7
- Using web-based LLMs (ChatGPT, Claude.ai web interface)
- Need to verify against the absolute latest documentation
- Researching new features or recent API changes
- Working with tools that support MCP (Model Context Protocol)
- Want dynamic documentation fetching during conversations
How to Access Context7 Documentation
You can access our Context7 documentation in several ways:
Method 1: Reference the Raw Link in Your Prompt
You can directly ask an LLM to retrieve and analyze our documentation by providing the Context7 link, for example: "Fetch https://context7.com/primer-io/examples and use it to answer my questions."
Method 2: Copy the Documentation Text
Browse our LLM-friendly documentation versions at https://context7.com/primer-io/examples, then copy the relevant sections and paste them into your conversation with the LLM.
Method 3: Use the Context7 MCP Server
For an even better experience, you can use the Context7 MCP server with compatible AI tools. This allows the AI to automatically pull the latest documentation without you needing to copy/paste anything.
Compatible Applications
- Cursor
- Windsurf
- VS Code (with MCP Support extension)
- Zed
- Claude Desktop
- Claude Code (claude.ai/code)
- BoltAI
Combining Both Approaches
For the best experience, use both the Claude Code skill and Context7 together:
- Claude Code skill provides immediate access to patterns and best practices while coding
- Context7 ensures you’re working with the latest API documentation when needed
Example Prompts
Here are some effective prompt examples for using our documentation with LLMs:
With Claude Code Skill (in your IDE)
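An illustrative prompt (the wording is ours, not taken from the skill itself):

```
Using the Primer web components skill, show me how to integrate one of the
web components into a React 18 app, and call out common pitfalls such as
object reference stability.
```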
Using Context7 Raw Link
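For example, a prompt that points the LLM at the raw documentation link might look like:

```
Please fetch https://context7.com/primer-io/examples and, based on that
documentation, show me a basic integration example.
```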
Using Context7 MCP Server
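With the MCP server configured, a prompt might look like this (illustrative wording):

```
Use Context7 to look up the latest Primer web components documentation and
explain any recent API changes that could affect my integration.
```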
General Documentation Request
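An example of a general request (illustrative wording):

```
I'm integrating the Primer web components into a Next.js app with
server-side rendering. What do I need to configure?
```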
Combining Both
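A prompt that leans on both sources might look like this (illustrative wording):

```
Draft the integration using the best practices from the Primer skill, then
verify the component APIs against the latest documentation via Context7.
```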
Installing the Context7 MCP Server
The Context7 MCP server can be integrated with various development environments to provide seamless documentation access.
Installation Instructions
The Context7 MCP server is available on GitHub at https://github.com/upstash/context7. Each client requires slightly different configuration:
For Cursor
Add this to your ~/.cursor/mcp.json file:
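A typical entry, following the configuration shown in the Context7 README (package name and flags are taken from that README; pin a version if you prefer):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```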
For Claude Desktop
Add this to your claude_desktop_config.json file:
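Claude Desktop uses the same mcpServers shape (again mirroring the Context7 README; adjust as needed):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```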
For Claude Code (claude.ai/code)
Run this command:
Best Practices for Using LLMs with Our Documentation
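Based on the Context7 README, registering the server with the Claude Code CLI looks like this (verify the exact invocation against that README, as CLI flags may change between versions):

```shell
claude mcp add context7 -- npx -y @upstash/context7-mcp
```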
- Be specific: Clearly state which components you want to use and what functionality you need
- Provide context: Mention your framework (React, Vue, etc.) and any specific requirements
- Ask for step-by-step explanations: Request that the LLM explain how the code works
- Validate the output: Always verify generated code against our official documentation
- Iterate: If the initial response isn’t quite right, refine your prompt with more details