# Agent Skills

Agent Skills is a repository of domain-specific knowledge packages for AI coding agents working with Azure SDKs and Microsoft AI Foundry services. It provides 132+ skills, custom agents, reusable prompts, and pre-configured MCP (Model Context Protocol) servers that enable AI coding assistants like GitHub Copilot, Claude Code, and VS Code to generate accurate, production-ready code for Azure services.

The repository follows a progressive disclosure pattern: agents load only skill metadata (name and description) at startup to keep context usage low, then load the full instructions when a skill is activated by matching task requirements. Skills are organized by language (Python, .NET, TypeScript, Java, Rust) with naming suffixes (`-py`, `-dotnet`, `-ts`, `-java`, `-rust`) and categorized by product area (foundry, data, messaging, monitoring, identity, security, integration).

---

## Installing Skills

Skills can be installed using the `npx skills` CLI tool, which provides an interactive wizard for selecting and installing skills into your project.

```bash
# Interactive skill installation wizard
npx skills add microsoft/skills

# Manual installation via git clone
git clone https://github.com/microsoft/skills.git agent-skills
cp -r agent-skills/.github/skills/azure-cosmos-db-py your-project/.github/skills/

# Create symlinks for multi-project setups
ln -s /path/to/agent-skills/.github/skills/mcp-builder /path/to/your-project/.github/skills/mcp-builder

# Share skills across different agent configs in the same repo
ln -s ../.github/skills .opencode/skills
ln -s ../.github/skills .claude/skills
```

---

## SKILL.md File Format

Every skill is a directory containing a `SKILL.md` file with YAML frontmatter for metadata and a Markdown body for instructions. The frontmatter triggers skill loading, while the body is loaded only when the skill activates.
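The metadata-then-body split can be sketched with a minimal loader (a hypothetical illustration, not a utility shipped by the repository): only the top-level frontmatter keys are parsed at startup, and the Markdown body is read when the skill activates.

```python
# Hypothetical sketch of progressive disclosure: parse only the YAML
# frontmatter (name, description) at startup; defer the Markdown body.
def read_skill_metadata(skill_md: str) -> dict:
    """Return top-level frontmatter keys without loading the body."""
    lines = skill_md.splitlines()
    if not lines or lines[0].strip() != "---":
        raise ValueError("SKILL.md must start with YAML frontmatter")
    meta = {}
    for line in lines[1:]:
        if line.strip() == "---":
            break  # end of frontmatter; body is never touched here
        if ":" in line and not line.startswith(" "):
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta


def read_skill_body(skill_md: str) -> str:
    """Load the full instructions only once the skill is activated."""
    _, _, rest = skill_md.partition("---")
    _, _, body = rest.partition("---")
    return body.strip()


example = """---
name: azure-cosmos-db-py
description: Cosmos DB patterns with FastAPI service layer.
---
# Azure Cosmos DB for Python
Full instructions are loaded only on activation."""

# At startup, an agent would keep just this small dict in context:
print(read_skill_metadata(example))
# On activation, the full body is pulled in:
print(read_skill_body(example).splitlines()[0])  # # Azure Cosmos DB for Python
```

A real loader would handle nested metadata maps and quoting, but the point is the same: the cheap metadata pass decides whether the expensive body ever enters the context window.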
````markdown
---
name: azure-cosmos-db-py
description: Cosmos DB patterns with FastAPI service layer, dual auth, partition strategies, and TDD. Use when building Python applications with Azure Cosmos DB NoSQL.
license: MIT
metadata:
  author: microsoft
  version: "1.0"
  compatibility: Requires Python 3.11+, azure-cosmos SDK
---

# Azure Cosmos DB for Python

## When to Use This Skill

Use this skill when:
- Building FastAPI applications with Cosmos DB
- Implementing document CRUD operations
- Setting up partition key strategies

## Authentication Pattern

Always use DefaultAzureCredential for production:

```python
from azure.identity import DefaultAzureCredential
from azure.cosmos import CosmosClient

credential = DefaultAzureCredential()
client = CosmosClient(
    url="https://myaccount.documents.azure.com:443/",
    credential=credential
)
database = client.get_database_client("mydb")
container = database.get_container_client("items")
```

## Document Operations

```python
# Create item
item = {"id": "1", "category": "electronics", "name": "Laptop"}
container.create_item(body=item)

# Query with partition key
items = container.query_items(
    query="SELECT * FROM c WHERE c.category = @category",
    parameters=[{"name": "@category", "value": "electronics"}],
    partition_key="electronics"
)
```
````

---

## Azure Identity Authentication

The Azure Identity SDK provides `DefaultAzureCredential`, which automatically chains multiple authentication methods. This is the recommended approach for all Azure SDK operations in production environments.

```python
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

# DefaultAzureCredential tries these methods in order:
# 1. Environment variables (AZURE_CLIENT_ID, AZURE_CLIENT_SECRET, AZURE_TENANT_ID)
# 2. Managed Identity (when deployed to Azure)
# 3. Azure CLI (az login)
# 4. Visual Studio Code Azure Account extension
# 5. Interactive browser authentication (fallback)
credential = DefaultAzureCredential()

# Use with Azure AI Projects
client = AIProjectClient(
    endpoint="https://myresource.services.ai.azure.com/api/projects/myproject",
    credential=credential
)

# Environment variables for local development
# AZURE_AI_PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>
# AZURE_AI_MODEL_DEPLOYMENT_NAME=gpt-4o-mini
```

```typescript
// TypeScript/Node.js authentication
import { DefaultAzureCredential } from "@azure/identity";
import { AIProjectClient } from "@azure/ai-projects";

const credential = new DefaultAzureCredential();
const client = new AIProjectClient(
  "https://myresource.services.ai.azure.com/api/projects/myproject",
  credential
);
```

```csharp
// .NET authentication
using Azure.Identity;
using Azure.AI.Projects;

var credential = new DefaultAzureCredential();
var client = new AIProjectClient(
    new Uri("https://myresource.services.ai.azure.com/api/projects/myproject"),
    credential);
```

---

## FastAPI Backend Pattern

The multi-model Pydantic pattern provides clean separation between request models, response models, and database documents. This pattern integrates with FastAPI routers and Azure Cosmos DB services.
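The camelCase-alias design the pattern relies on can be seen in isolation with a stripped-down model (a hedged sketch assuming Pydantic v2; this trimmed `ProjectCreate` is illustrative, not the full model used in the pattern):

```python
# Minimal sketch of the camelCase-alias behavior (assumes Pydantic v2).
from pydantic import BaseModel, ConfigDict, Field


class ProjectCreate(BaseModel):
    model_config = ConfigDict(populate_by_name=True)

    workspace_id: str = Field(..., alias="workspaceId")
    name: str


# A JS frontend sends camelCase keys...
p1 = ProjectCreate.model_validate({"workspaceId": "ws-1", "name": "demo"})
# ...while populate_by_name=True also accepts the snake_case field name.
p2 = ProjectCreate.model_validate({"workspace_id": "ws-1", "name": "demo"})

# Serialize back out in camelCase for the API response.
print(p1.model_dump(by_alias=True))  # {'workspaceId': 'ws-1', 'name': 'demo'}
```

The same round trip is what `populate_by_name = True` buys in the full models: Python code stays snake_case while the wire format stays camelCase.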
```python
# models/project.py - Pydantic models following Base, Create, Update, Response, InDB pattern
from datetime import datetime
from typing import Optional

from pydantic import BaseModel, Field


class ProjectBase(BaseModel):
    """Base model with common fields."""
    name: str = Field(..., min_length=1, max_length=200)
    description: Optional[str] = None
    visibility: str = "public"
    tags: list[str] = Field(default_factory=list)

    class Config:
        populate_by_name = True  # Enables camelCase aliases


class ProjectCreate(ProjectBase):
    """Request model for creation."""
    workspace_id: str = Field(..., alias="workspaceId")


class ProjectUpdate(BaseModel):
    """Request model for partial updates (all optional)."""
    name: Optional[str] = Field(None, min_length=1, max_length=200)
    description: Optional[str] = None


class Project(ProjectBase):
    """Response model."""
    id: str
    slug: str
    author_id: str = Field(..., alias="authorId")
    created_at: datetime = Field(..., alias="createdAt")

    class Config:
        from_attributes = True
        populate_by_name = True


class ProjectInDB(Project):
    """Database document model with doc_type for Cosmos DB queries."""
    doc_type: str = "project"
```

```python
# services/project_service.py - Service layer with Cosmos DB integration
import uuid
from datetime import datetime
from typing import Optional

from slugify import slugify  # python-slugify, or a project-local helper

from app.db.cosmos import get_container, query_documents, upsert_document
from app.models.project import Project, ProjectCreate, ProjectInDB


class ProjectService:
    def _use_cosmos(self) -> bool:
        return get_container() is not None

    async def get_project_by_id(self, project_id: str) -> Optional[Project]:
        if self._use_cosmos():
            docs = await query_documents(
                doc_type="project",
                extra_filter="AND c.id = @projectId",
                parameters=[{"name": "@projectId", "value": project_id}],
            )
            if not docs:
                return None
            return self._doc_to_project(docs[0])
        return None

    async def create_project(self, data: ProjectCreate, user_id: str) -> Project:
        doc = ProjectInDB(
            id=str(uuid.uuid4()),
            **data.model_dump(),
            author_id=user_id,
            created_at=datetime.utcnow(),
            slug=slugify(data.name),
        )
        await upsert_document(doc.model_dump())
        return self._doc_to_project(doc.model_dump())

    def _doc_to_project(self, doc: dict) -> Project:
        return Project(**doc)
```

```python
# routers/projects.py - FastAPI router with auth dependencies
from typing import Optional

from fastapi import APIRouter, Depends, HTTPException, status

from app.auth.jwt import get_current_user, get_current_user_required
from app.models.user import User
from app.models.project import Project, ProjectCreate, ProjectUpdate
from app.services.project_service import ProjectService

router = APIRouter(prefix="/api", tags=["projects"])


@router.get("/projects/{project_id}", response_model=Project)
async def get_project(
    project_id: str,
    current_user: Optional[User] = Depends(get_current_user),  # Optional auth
) -> Project:
    """Get project by ID (public endpoint)."""
    service = ProjectService()
    project = await service.get_project_by_id(project_id)
    if project is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND)
    return project


@router.post("/projects", status_code=status.HTTP_201_CREATED, response_model=Project)
async def create_project(
    data: ProjectCreate,
    current_user: User = Depends(get_current_user_required),  # Required auth
) -> Project:
    """Create new project (requires authentication)."""
    service = ProjectService()
    return await service.create_project(data, current_user.id)


@router.patch("/projects/{project_id}", response_model=Project)
async def update_project(
    project_id: str,
    data: ProjectUpdate,
    current_user: User = Depends(get_current_user_required),
) -> Project:
    """Update project (requires authentication)."""
    service = ProjectService()
    return await service.update_project(project_id, data, current_user.id)
```

---

## MCP Server Configuration

MCP (Model Context Protocol) servers give AI agents access to external tools and data sources. Configuration is stored in `.vscode/mcp.json`, and servers become available automatically once configured in your editor.
```json
{
  "servers": {
    "microsoft-docs": {
      "url": "https://learn.microsoft.com/api/mcp",
      "type": "http"
    },
    "context7": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    },
    "deepwiki": {
      "url": "https://mcp.deepwiki.com/sse",
      "type": "http"
    },
    "github": {
      "url": "https://api.githubcopilot.com/mcp/",
      "type": "http"
    },
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"],
      "type": "stdio"
    },
    "terraform": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "hashicorp/terraform-mcp-server"],
      "type": "stdio"
    },
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"],
      "env": { "MEMORY_FILE_PATH": "${input:memory_file_path}" },
      "type": "stdio"
    },
    "sequentialthinking": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-sequential-thinking"],
      "type": "stdio"
    },
    "markitdown": {
      "command": "uvx",
      "args": ["markitdown-mcp"],
      "type": "stdio"
    }
  }
}
```

```text
Agent Workflow Pattern: Search Before Implement

1. FIRST: Query microsoft-docs MCP
   → Search: "Azure AI Search vector index Python SDK"
   → Get: Current API signatures, required parameters

2. THEN: Load relevant skill
   → azure-search-documents-py

3. FINALLY: Implement
   → Use patterns from skill + current API from docs

Decision Tree:
├─ Azure/Microsoft SDK task?
│  └─ YES → Query microsoft-docs FIRST, then load skill
├─ Repository patterns task?
│  └─ YES → Query context7, then load skill
├─ External code/repos task?
│  └─ YES → Query deepwiki or github
└─ Otherwise → Use skill-only approach
```

---

## Creating New Skills

New skills follow a structured workflow to ensure quality and discoverability. Each skill must include a `SKILL.md` with proper frontmatter, define acceptance criteria for testing, and be categorized by language and product area.

````bash
# 1. Create skill directory with SKILL.md
mkdir -p .github/skills/azure-new-service-py
cat > .github/skills/azure-new-service-py/SKILL.md << 'EOF'
---
name: azure-new-service-py
description: Azure New Service SDK patterns for Python. Use when building applications with Azure New Service.
---

# Azure New Service for Python

## When to Use This Skill

Use this skill when working with Azure New Service in Python applications.

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.newservice import NewServiceClient

credential = DefaultAzureCredential()
client = NewServiceClient(endpoint="https://...", credential=credential)
```

## Basic Operations

```python
# Example usage
result = await client.do_something(param="value")
```
EOF

# 2. Create acceptance criteria
mkdir -p .github/skills/azure-new-service-py/references
cat > .github/skills/azure-new-service-py/references/acceptance-criteria.md << 'EOF'
# Acceptance Criteria

## Correct Patterns
- Uses `DefaultAzureCredential` for authentication
- Uses async/await for SDK operations
- Properly closes clients with context managers

## Incorrect Patterns
- Hardcoded credentials
- Synchronous operations where async is available
- Missing error handling
EOF

# 3. Categorize with symlink
cd skills/python/foundry
ln -s ../../../.github/skills/azure-new-service-py new-service

# 4. Create test scenarios
mkdir -p tests/scenarios/azure-new-service-py
cat > tests/scenarios/azure-new-service-py/scenarios.yaml << 'EOF'
name: azure-new-service-py
scenarios:
  - name: basic-usage
    prompt: "Create a basic Azure New Service client"
    expected_patterns:
      - "DefaultAzureCredential"
      - "NewServiceClient"
      - "async"
EOF

# 5. Run tests
cd tests && pnpm harness azure-new-service-py --mock --verbose
````

---

## Test Harness

The test harness validates that skills produce correct code patterns using the GitHub Copilot SDK.
It evaluates generated code against the acceptance criteria defined for each skill and supports iterative improvement via the Ralph Loop.

```bash
# Install test dependencies
cd tests
pnpm install

# List all skills with test coverage
pnpm harness --list

# Run tests for a specific skill in mock mode (for CI)
pnpm harness azure-ai-projects-py --mock --verbose

# Run with Ralph Loop (iterative improvement until quality threshold)
pnpm harness azure-ai-projects-py --ralph --mock --max-iterations 5 --threshold 85

# Run unit tests
pnpm test
```

```yaml
# tests/scenarios/azure-cosmos-py/scenarios.yaml
name: azure-cosmos-py
scenarios:
  - name: create-client
    prompt: "Create an Azure Cosmos DB client with DefaultAzureCredential"
    expected_patterns:
      - "from azure.identity import DefaultAzureCredential"
      - "from azure.cosmos import CosmosClient"
      - "DefaultAzureCredential()"
      - "CosmosClient("
  - name: query-items
    prompt: "Query items from a Cosmos DB container with a partition key"
    expected_patterns:
      - "query_items"
      - "partition_key"
      - "parameters"
  - name: error-handling
    prompt: "Handle Cosmos DB exceptions properly"
    expected_patterns:
      - "try:"
      - "except"
      - "CosmosResourceNotFoundError"
```

---

## Deep Wiki Plugin

The deep-wiki plugin generates comprehensive documentation for repositories, including catalogue structures, onboarding guides, VitePress sites with dark-mode Mermaid diagrams, and llms.txt files.
```bash
# Install the deep-wiki plugin
/plugin marketplace add microsoft/skills
/plugin install deep-wiki@skills

# Generate complete wiki
/deep-wiki:generate

# Generate specific outputs
/deep-wiki:catalogue   # JSON documentation structure
/deep-wiki:page        # Single documentation page
/deep-wiki:onboard     # Audience-tailored onboarding guides
/deep-wiki:agents      # AGENTS.md files for repository folders
/deep-wiki:llms        # llms.txt files for LLM discovery
/deep-wiki:build       # VitePress site packaging
```

```markdown
# Generated llms.txt format (following the llmstxt.org specification)

# Project Name

> Brief project summary

## Onboarding
- [Contributor Guide](wiki/onboarding/contributor-guide.md)
- [Staff Engineer Guide](wiki/onboarding/staff-engineer-guide.md)

## Architecture
- [System Overview](wiki/architecture/overview.md)
- [Data Layer](wiki/architecture/data-layer.md)

## Getting Started
- [Installation](wiki/getting-started/installation.md)
- [Quick Start](wiki/getting-started/quick-start.md)

## Optional
- [Changelog](wiki/changelog.md)
- [Contributing](wiki/contributing.md)
```

---

## Azure AI Projects Client

The Azure AI Projects SDK provides a high-level client for working with Microsoft AI Foundry projects, including agents, evaluations, connections, and OpenAI-compatible chat completions.
```python
import time

from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

# Initialize client
credential = DefaultAzureCredential()
client = AIProjectClient(
    endpoint="https://myresource.services.ai.azure.com/api/projects/myproject",
    credential=credential
)

# Create an agent
agent = client.agents.create_agent(
    model="gpt-4o-mini",
    name="coding-assistant",
    instructions="You are a helpful coding assistant.",
    tools=[{"type": "code_interpreter"}]
)
print(f"Created agent: {agent.id}")

# Create a thread and send messages
thread = client.agents.create_thread()
message = client.agents.create_message(
    thread_id=thread.id,
    role="user",
    content="Write a Python function to calculate factorial"
)

# Run the agent
run = client.agents.create_run(thread_id=thread.id, agent_id=agent.id)

# Poll for completion
while run.status in ["queued", "in_progress"]:
    time.sleep(1)
    run = client.agents.get_run(thread_id=thread.id, run_id=run.id)

# Get messages
messages = client.agents.list_messages(thread_id=thread.id)
for msg in messages.data:
    if msg.role == "assistant":
        print(msg.content[0].text.value)

# Cleanup
client.agents.delete_agent(agent.id)
```

---

## Frontend UI Dark Theme Pattern

The frontend-ui-dark-ts skill provides patterns for building dark-themed React applications using Vite, Fluent UI v9, Tailwind CSS, and Framer Motion.
```typescript
// theme/brand.ts - Custom brand color palette
import type { BrandVariants } from "@fluentui/react-components";

export const brandVariants: BrandVariants = {
  10: "#020305",
  20: "#111723",
  30: "#16263D",
  40: "#193253",
  50: "#1B3F6A",
  60: "#1B4C82",
  70: "#18599B",
  80: "#1267B4",
  90: "#3174C2",
  100: "#4F82C8",
  110: "#6790CF",
  120: "#7D9ED5",
  130: "#92ACDC",
  140: "#A6BBE2",
  150: "#BAC9E9",
  160: "#CDD8EF",
};
```

```typescript
// theme/dark-theme.ts - Dark theme configuration
import { createDarkTheme, type Theme } from "@fluentui/react-components";
import { brandVariants } from "./brand";

const baseDarkTheme = createDarkTheme(brandVariants);

export const darkTheme: Theme = {
  ...baseDarkTheme,
  colorNeutralBackground1: "#0a0a0a",
  colorNeutralBackground2: "#141414",
  colorNeutralBackground3: "#1e1e1e",
  colorNeutralBackground4: "#282828",
  colorNeutralBackground5: "#323232",
  colorNeutralBackground6: "#3c3c3c",
};
```

```tsx
// main.tsx - Application entry point with theme provider
import { StrictMode } from "react";
import { createRoot } from "react-dom/client";
import { FluentProvider } from "@fluentui/react-components";
import { darkTheme } from "./theme/dark-theme";
import App from "./App";
import "./index.css";

createRoot(document.getElementById("root")!).render(
  <StrictMode>
    <FluentProvider theme={darkTheme}>
      <App />
    </FluentProvider>
  </StrictMode>
);
```

```tsx
// App.tsx - Main component with Framer Motion animations
import { motion } from "framer-motion";
import { Title1, Text } from "@fluentui/react-components";

function App() {
  return (
    <motion.div
      initial={{ opacity: 0, y: 20 }}
      animate={{ opacity: 1, y: 0 }}
      transition={{ duration: 0.5 }}
      className="flex flex-col items-center justify-center min-h-[80vh] gap-4"
    >
      <Title1>My Foundry App</Title1>
      <Text>Built with Azure AI Foundry</Text>
    </motion.div>
  );
}

export default App;
```

---

## Agent Persona Definitions

Custom agent personas in `.github/agents/` define role-specific expertise and patterns.
Each agent has a markdown file with frontmatter metadata and expertise sections.

````markdown
---
name: Backend Developer
description: FastAPI/Python specialist for backend development with Pydantic, Cosmos DB, and Azure services
tools: ["read", "edit", "search", "execute"]
---

You are a **Backend Development Specialist**. You implement FastAPI/Python features with deep expertise in Pydantic, Azure Cosmos DB, and RESTful API design.

## Tech Stack Expertise

- **Python 3.12+** with type hints
- **FastAPI** for REST APIs
- **Pydantic v2.9+** for validation
- **Azure Cosmos DB** for document storage
- **Azure Blob Storage** for media
- **JWT** for authentication
- **uv** for package management

## Key Patterns

- Multi-model Pydantic pattern (Base, Create, Update, Response, InDB)
- Router pattern with auth dependencies
- Service layer pattern with Cosmos DB fallback

## Commands

```bash
cd src/backend
uv sync                          # Install dependencies
uv run fastapi dev app/main.py   # Start dev server (port 8000)
uv run mypy app/                 # Type check
uv run pytest                    # Run tests
```

## Rules

✅ Use multi-model Pydantic pattern
✅ Use camelCase aliases with `populate_by_name = True`
🚫 Never return raw dicts from endpoints
🚫 Never use untyped function parameters
````

---

## Scaffold Foundry App Prompt

The scaffold-foundry-app prompt creates a complete full-stack Azure AI Foundry application with a React frontend, FastAPI backend, and Azure Developer CLI (azd) infrastructure.

```yaml
# azure.yaml - azd configuration
name: my-foundry-app
metadata:
  template: foundry-fullstack
services:
  frontend:
    project: ./src/frontend
    host: containerapp
    language: ts
    docker:
      path: ./Dockerfile
      remoteBuild: true
  backend:
    project: ./src/backend
    host: containerapp
    language: python
    docker:
      path: ./Dockerfile
      remoteBuild: true
hooks:
  postprovision:
    shell: sh
    run: |
      echo "Setting up RBAC for managed identity..."
```

```bash
# Environment setup (.env.example)
# Get values from https://ai.azure.com > Project Settings

# Required: Your Foundry project endpoint
AZURE_AI_PROJECT_ENDPOINT=https://<resource>.services.ai.azure.com/api/projects/<project>

# Required: Model deployment name
AZURE_AI_MODEL_DEPLOYMENT_NAME=gpt-4o-mini

# Local development
ENVIRONMENT=development
PORT=8000
FRONTEND_URL=http://localhost:5173
```

```bash
# Deployment commands
az login        # Log in to Azure
azd auth login  # Log in to azd
azd up          # Full deployment (provision + deploy)
azd deploy      # Deploy app changes only
azd down        # Tear down all resources

# Local development
cd src/backend && uv sync && uv run fastapi dev app/main.py
cd src/frontend && pnpm install && pnpm dev
```

---

## Summary

Agent Skills provides a comprehensive framework for enhancing AI coding agents with domain-specific knowledge of Azure SDKs and Microsoft AI Foundry services. The repository contains 132+ skills across Python, .NET, TypeScript, Java, and Rust, each packaged with SKILL.md instructions, acceptance criteria, and test scenarios. Skills activate through progressive disclosure: only metadata is loaded at startup, and full instructions load when a task matches a skill's description.

The primary use cases are: (1) building full-stack applications with Azure AI Foundry using the scaffold prompt, (2) implementing Azure SDK integrations with correct patterns via language-specific skills, (3) generating comprehensive documentation with the deep-wiki plugin, and (4) connecting to external tools and data sources through pre-configured MCP servers. Integration follows the "search before implement" workflow: agents query the microsoft-docs MCP server for current API signatures, load relevant skills for established patterns, then implement using both.
Skills can be installed via `npx skills add microsoft/skills` and are designed to work with GitHub Copilot, Claude Code, VS Code extensions, and other skills-compatible AI coding agents.