### Install Connic SDK

Source: https://connic.co/docs/v1/quickstart

Installs the Connic Composer SDK using pip. This is a prerequisite for setting up your local repository.

```bash
pip install connic-composer-sdk
```

--------------------------------

### GitLab CI/CD Deployment Example

Source: https://connic.co/docs/v1/platform/deployment

An example GitLab CI/CD configuration to automate Connic deployments. This job installs the SDK and runs the `connic deploy` command, using environment variables for authentication and the environment ID. Ensure these variables are set as secrets in your GitLab project.

```yaml
# Example: GitLab CI/CD (.gitlab-ci.yml)
deploy:
  stage: deploy
  script:
    - pip install connic-composer-sdk
    - connic deploy --env $CONNIC_ENV_ID
  variables:
    CONNIC_API_KEY: $CONNIC_API_KEY
    CONNIC_PROJECT_ID: $CONNIC_PROJECT_ID
```

--------------------------------

### Initialize Connic Project

Source: https://connic.co/docs/v1/quickstart

Initializes a new Connic project and navigates into the project directory. This command creates the necessary project structure, including an `agents/` folder.

```bash
connic init my-agents
cd my-agents
```

--------------------------------

### Connect and Chat with WebSocket (Python)

Source: https://connic.co/docs/v1/connectors/websocket

Provides a Python example using the `websockets` library to connect to a WebSocket endpoint, authenticate, send a message, and handle streaming responses. Requires `pip install websockets`.

```python
import asyncio
import websockets
import json

async def chat():
    uri = "<WEBSOCKET_URL>"
    async with websockets.connect(uri) as ws:
        # Authenticate
        await ws.send(json.dumps({"secret": "<YOUR_SECRET>"}))
        response = await ws.recv()
        print(f"Connected: {json.loads(response)}")

        # Send message
        await ws.send(json.dumps({"message": "Hello!"}))

        # Receive streaming response
        while True:
            msg = await ws.recv()
            data = json.loads(msg)
            if data["type"] == "stream_chunk":
                print(data["chunk"], end="", flush=True)
            elif data["type"] == "stream_end":
                print(f"\nDone! Tokens: {data['token_usage']}")
                break

asyncio.run(chat())
```

--------------------------------

### Install Connic Composer SDK and Initialize Project

Source: https://connic.co/docs/v1/index

Shows how to install the Connic Composer SDK using pip and then initialize a new Connic project. Deployment happens by pushing to a connected Git repository.

```bash
# Install the SDK
pip install connic-composer-sdk

# Create a new project
connic init my-agents
cd my-agents

# Push to your connected repo to deploy
git push origin
```

--------------------------------

### Cron Schedule Examples (Bash)

Source: https://connic.co/docs/v1/connectors/cron

Illustrates various cron schedule expressions for different frequencies and times. These examples cover daily, hourly, weekday, and interval-based scheduling.

```bash
# Every day at midnight
0 0 * * *

# Every hour
0 * * * *

# Every Monday at 9 AM
0 9 * * 1

# Every weekday at 9 AM
0 9 * * 1-5

# Every 15 minutes
*/15 * * * *

# First day of every month at midnight
0 0 1 * *
```

--------------------------------

### Define Assistant Agent Configuration

Source: https://connic.co/docs/v1/quickstart

Defines an assistant agent using YAML. This includes the agent's name, model, description, system prompt, temperature, and maximum concurrent runs.

```yaml
# agents/assistant.yaml
name: assistant
model: gemini/gemini-2.5-flash
description: "A helpful assistant"
system_prompt: |
  You are a helpful assistant.
  Answer questions clearly and concisely.
temperature: 0.7
max_concurrent_runs: 5
```

--------------------------------
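The agent file above is plain YAML, so it can be sanity-checked locally before you commit. The sketch below is not part of the Connic tooling; it only uses PyYAML to confirm the file parses and contains the fields shown in the quickstart, and the field list it checks is an assumption for illustration.

```python
# check_agent.py - local sanity check for agents/assistant.yaml (illustrative, not a Connic command)
import sys
import yaml  # pip install pyyaml

# Assumed minimal field set, based on the quickstart example above
EXPECTED_FIELDS = {"name", "model", "description", "system_prompt"}

def check(path: str) -> None:
    with open(path, "r", encoding="utf-8") as f:
        config = yaml.safe_load(f) or {}
    missing = EXPECTED_FIELDS - set(config)
    if missing:
        sys.exit(f"{path}: missing fields: {sorted(missing)}")
    print(f"{path}: OK, agent '{config['name']}' uses model '{config['model']}'")

if __name__ == "__main__":
    check("agents/assistant.yaml")
```

--------------------------------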
### Python System Prompt Context Substitution Example

Source: https://connic.co/docs/v1/composer/context

Illustrates how context variables are substituted into a system prompt. Unmatched placeholders are safely left as-is in the final prompt.

```python
# System prompt template
system_prompt: |
  Hello {user_name}, your account ID is {user_id}.

# If context = {"user_name": "Peter", "user_id": 123}
# the agent sees:
# "Hello Peter, your account ID is 123."

# Unmatched placeholders are left as-is:
# {unknown_var} stays as {unknown_var} in the prompt
```

--------------------------------

### Commit and Push Agent Changes

Source: https://connic.co/docs/v1/quickstart

Commits the initial agent setup changes and pushes them to the main branch. This action triggers an automatic deployment in Connic.

```bash
git add .
git commit -m "Initial agent setup"
git push origin main
```

--------------------------------

### Sync Mode Response Example (JSON)

Source: https://connic.co/docs/v1/connectors/overview

An example of the JSON response for a Sync mode connector, which waits for agent completion and returns the result directly. It includes the run ID and the agent's output.

```json
{
  "status": "ok",
  "result": {
    "run_id": "uuid",
    "output": "Response..."
  }
}
```

--------------------------------

### MCP Integration - Authentication Example

Source: https://connic.co/docs/v1/composer/mcp

Illustrates how to configure authentication headers for an MCP server using environment variables.

```APIDOC
## MCP Integration - Authentication Example

### Description
This example demonstrates how to set authentication headers for an MCP server, using environment variables for secure credential management.

### Method
N/A (Configuration)

### Endpoint
N/A (Configuration)

### Parameters
#### Path Parameters
N/A

#### Query Parameters
N/A

#### Request Body
N/A

### Request Example
```yaml
mcp_servers:
  - name: github
    url: https://mcp.example.com/github
    headers:
      Authorization: "Bearer ${GITHUB_TOKEN}"
```

**Note:** Use `${VAR_NAME}` syntax for secrets. Configure variables in **Settings → Variables**.
```

--------------------------------

### Example Project Structure for Schemas

Source: https://connic.co/docs/v1/composer/output-schema

Illustrates a typical project directory structure where schema files are placed in a dedicated `schemas/` directory and referenced by name in agent configurations.

```text
my-project/
├── agents/
│   └── invoice-extractor.yaml
├── schemas/
│   ├── invoice-data.json    # Referenced as "invoice-data"
│   └── customer-info.json   # Referenced as "customer-info"
└── tools/
    └── ...
```

--------------------------------

### Inbound Mode Response Example (JSON)

Source: https://connic.co/docs/v1/connectors/overview

An example of the JSON response received when using the Inbound mode for a connector. It confirms the agent runs have been queued and provides the run IDs.

```json
{
  "status": "ok",
  "run_ids": ["uuid-1", "uuid-2"]
}
```

--------------------------------

### Triggering Webhook with GET Request (Bash)

Source: https://connic.co/docs/v1/connectors/webhook

Example using curl to trigger a webhook via a GET request. Query parameters (except `secret`) are passed to the agent.

```bash
curl "<WEBHOOK_URL>?secret=<YOUR_SECRET>&prompt=hello"
```

--------------------------------
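The same GET trigger can be sent from Python. A minimal sketch using the `requests` library is below; the webhook URL and secret are placeholders, and the `secret` and `prompt` query parameters mirror the curl example above.

```python
# Trigger the webhook via GET: Python equivalent of the curl example above
import requests

resp = requests.get(
    "<WEBHOOK_URL>",
    params={"secret": "<YOUR_SECRET>", "prompt": "hello"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```

--------------------------------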
### MCP Integration - Multiple Servers Example

Source: https://connic.co/docs/v1/composer/mcp

Demonstrates how to configure multiple MCP servers for an agent to access tools from different sources.

```APIDOC
## MCP Integration - Multiple Servers Example

### Description
This example shows how to configure an agent to connect to multiple MCP servers simultaneously, aggregating tools from various sources.

### Method
N/A (Configuration)

### Endpoint
N/A (Configuration)

### Parameters
#### Path Parameters
N/A

#### Query Parameters
N/A

#### Request Body
N/A

### Request Example
```yaml
mcp_servers:
  - name: docs
    url: https://mcp.context7.com/mcp

  - name: search
    url: https://mcp.example.com/search
    headers:
      X-API-Key: "${SEARCH_API_KEY}"

  - name: database
    url: https://mcp.internal.company.com/db
    headers:
      Authorization: "Bearer ${DB_TOKEN}"
    tools:
      - query
      - list_tables
```

**Note:** Each server's tools become available to the agent at runtime.
```

--------------------------------

### Multi-line System Prompt Example (YAML)

Source: https://connic.co/docs/v1/composer/agent-configuration

Demonstrates how to write multi-line system prompts using the pipe character (|) in YAML. This preserves newlines and allows for complex instructions for LLM agents.

```yaml
system_prompt: |
  This is a multi-line system prompt.
  You can write multiple paragraphs here.

  The pipe character (|) preserves newlines.
  Use this for complex instructions.
```

--------------------------------

### Inbound Kafka Consumer Configuration and Example

Source: https://connic.co/docs/v1/connectors/kafka

Configure an inbound Kafka connector to consume messages from a topic and trigger agent runs. This section includes an example agent configuration and Python tools for order validation and risk scoring.

```yaml
# agents/order-processor.yaml
version: "1.0"
name: order-processor
type: llm
model: gemini/gemini-2.5-flash
description: "Validate orders and compute routing"
system_prompt: |
  You receive an order event in JSON (from Kafka).
  1) Call orders.validate_order
  2) Call orders.score_risk
  3) Return JSON with order_id, status, risk_score, route
tools:
  - orders.validate_order
  - orders.score_risk
output_schema: order-result.json
```

```python
# tools/orders.py
from typing import Dict, Any

def validate_order(order_id: str, items: list[str]) -> Dict[str, Any]:
    """Basic order validation."""
    if not order_id or not items:
        return {"ok": False, "reason": "missing_fields"}
    return {"ok": True}

async def score_risk(customer: str, total: float) -> Dict[str, Any]:
    """Return a simple risk score and routing hint."""
    score = 0.02 if total < 100 else 0.12
    route = "standard" if score < 0.1 else "manual_review"
    return {"risk_score": score, "route": route}
```

--------------------------------

### Test Agent with HTTP Webhook

Source: https://connic.co/docs/v1/quickstart

Triggers the deployed agent using an HTTP webhook connector. This example demonstrates sending a JSON payload with a message to the webhook URL.

```bash
curl -X POST <WEBHOOK_URL> \
  -H "Content-Type: application/json" \
  -H "X-Connic-Secret: <YOUR_SECRET>" \
  -d '{"message": "Hello, agent!"}'
```

--------------------------------

### Start an Ephemeral Test Session with Connic CLI

Source: https://connic.co/docs/v1/composer/testing

Starts an ephemeral test session for local agent development. The session is automatically deleted when it ends and provides a fresh, isolated environment for quick experiments. Changes are synced and reflected within 2-5 seconds.

```bash
# Start an ephemeral test session (auto-deleted on exit)
connic test
```

--------------------------------
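Before opening a test session, the example Kafka tools above can be exercised as plain Python functions. The sketch below assumes they live in `tools/orders.py` as shown and is a local sanity check, not part of the Connic runtime.

```python
# local_check.py - exercise the example order tools directly (run from the project root)
import asyncio
from tools.orders import validate_order, score_risk

def main() -> None:
    print(validate_order("order-123", ["widget-a"]))   # expect {"ok": True}
    print(validate_order("", []))                      # expect {"ok": False, "reason": "missing_fields"}
    print(asyncio.run(score_risk("cus_42", 250.0)))    # expect route "manual_review"
    print(asyncio.run(score_risk("cus_42", 50.0)))     # expect route "standard"

if __name__ == "__main__":
    main()
```

--------------------------------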
### Define Assistant Agent with Tools (YAML)

Source: https://connic.co/docs/v1/composer/write-tools

Defines an assistant agent with a specified model and a list of available tools. The agent's instructions guide its behavior, and tools are referenced in `module.function` format.

```yaml
name: assistant
model: gemini/gemini-2.5-flash
description: "Assistant with calculator and search capabilities"
instruction: |
  You are a helpful assistant with access to tools.
  Use the calculator for math and web_search for current info.
tools:
  - calculator.add
  - calculator.multiply
  - search.web_search
```

--------------------------------

### GET Requests

Source: https://connic.co/docs/v1/connectors/webhook

Trigger webhooks using GET requests with query parameters for simple interactions.

```APIDOC
## GET /webhook-url

### Description
Send a GET request with query parameters to trigger an agent. All query parameters (except `secret`) are passed to the agent.

### Method
GET

### Endpoint
`<WEBHOOK_URL>`

### Parameters
#### Query Parameters
- **secret** (string) - Required - Your secret key.
- **prompt** (string) - Required - The prompt for the agent.

### Request Example
```bash
curl "<WEBHOOK_URL>?secret=<YOUR_SECRET>&prompt=hello"
```

### Response
#### Success Response (200)
- **result** (object) - Contains the agent's output.
- **output** (string) - The agent's response.

#### Response Example
```json
{
  "result": {
    "output": "Hello there! How can I help you?"
  }
}
```
```

--------------------------------

### Full JSON Schema Example with Nested Objects and Arrays

Source: https://connic.co/docs/v1/composer/output-schema

A comprehensive JSON Schema example demonstrating nested objects, arrays, and enums for defining complex data structures like invoice items. It includes descriptions, type specifications, and required fields.

```json
{
  "type": "object",
  "description": "Extracted invoice data",
  "properties": {
    "vendor": {
      "type": "string",
      "description": "Vendor/company name"
    },
    "date": {
      "type": "string",
      "description": "Invoice date (YYYY-MM-DD)"
    },
    "total": {
      "type": "number",
      "description": "Total invoice amount"
    },
    "currency": {
      "type": "string",
      "description": "Currency code",
      "enum": ["USD", "EUR", "GBP"]
    },
    "items": {
      "type": "array",
      "description": "Line items",
      "items": {
        "type": "object",
        "properties": {
          "name": { "type": "string" },
          "quantity": { "type": "integer" },
          "price": { "type": "number" }
        }
      }
    }
  },
  "required": ["vendor", "total"]
}
```

--------------------------------

### LLM Model Provider Configuration Examples (YAML)

Source: https://connic.co/docs/v1/composer/agent-configuration

Illustrates how to specify different LLM providers and models in agent configurations. It shows the required provider prefix for models from OpenAI, Anthropic, Google Gemini, Azure OpenAI, OpenRouter, AWS Bedrock, and Google Vertex AI.

```yaml
# Using OpenAI
model: openai/gpt-5.2

# Using Anthropic
model: anthropic/claude-sonnet-4-5-20250929

# Using Google Gemini
model: gemini/gemini-2.5-pro

# Using Azure OpenAI (use your deployment name)
model: azure/my-gpt5-deployment

# Using OpenRouter (provider/model format)
model: openrouter/anthropic/claude-sonnet-4.5

# Using AWS Bedrock
model: bedrock/us.anthropic.claude-sonnet-4-5-20250929-v1:0

# Using Google Vertex AI
model: vertex_ai/gemini-2.5-pro
```

--------------------------------
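Because the invoice schema shown earlier in this section is standard JSON Schema, you can also check sample output locally. The sketch below assumes the schema is saved as `schemas/invoice-data.json` and uses the third-party `jsonschema` package; it is a local convenience, not something Connic requires.

```python
# validate_output.py - check a sample payload against schemas/invoice-data.json (illustrative)
import json
from jsonschema import validate, ValidationError  # pip install jsonschema

with open("schemas/invoice-data.json", "r", encoding="utf-8") as f:
    schema = json.load(f)

sample = {
    "vendor": "Acme Corp",
    "date": "2024-01-15",
    "total": 234.56,
    "currency": "USD",
    "items": [{"name": "widget-a", "quantity": 2, "price": 117.28}],
}

try:
    validate(instance=sample, schema=schema)
    print("Sample output matches the invoice schema")
except ValidationError as err:
    print(f"Schema violation: {err.message}")
```

--------------------------------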
### Outbound Kafka Producer Configuration and Example

Source: https://connic.co/docs/v1/connectors/kafka

Configure an outbound Kafka connector to produce agent results to a Kafka topic when agent runs complete. This section illustrates how the outbound message key is correlated with the inbound message key.

```bash
# Inbound message with key "order-123"
Message received → Agent triggered → Run completes

# Outbound message uses same key "order-123"
Result published with key: "order-123" (source: original)

# If no inbound key, uses run_id
Result published with key: "550e8400-e29b..." (source: run_id)
```

--------------------------------

### MCP Integration - Tool Filtering Example

Source: https://connic.co/docs/v1/composer/mcp

Shows how to restrict the available tools from an MCP server using the `tools` configuration.

```APIDOC
## MCP Integration - Tool Filtering Example

### Description
This example illustrates how to specify a subset of tools to be exposed from an MCP server, enhancing security and control.

### Method
N/A (Configuration)

### Endpoint
N/A (Configuration)

### Parameters
#### Path Parameters
N/A

#### Query Parameters
N/A

#### Request Body
N/A

### Request Example
```yaml
mcp_servers:
  - name: filesystem
    url: https://mcp.example.com/filesystem
    # Only allow specific tools
    tools:
      - read_file
      - list_directory
```

**Note:** This is useful for limiting agents to specific operations, like read-only access.
```

--------------------------------

### Start a Named Test Session with Connic CLI

Source: https://connic.co/docs/v1/composer/testing

Starts a named, persistent test session for ongoing feature development. The environment persists after the session ends, allowing you to reuse it. Use unique names to avoid conflicts when multiple developers work on the same feature.

```bash
connic test my-feature
```

--------------------------------

### Uploading Multiple Files via Webhook (Bash)

Source: https://connic.co/docs/v1/connectors/webhook

Example using curl to upload multiple files to a webhook in a single multipart/form-data request. Files are automatically extracted and sent to the model.

```bash
curl -X POST <WEBHOOK_URL> \
  -H "X-Connic-Secret: <YOUR_SECRET>" \
  -F "message=Compare these two documents" \
  -F "file1=@document1.pdf" \
  -F "file2=@document2.pdf"
```

--------------------------------

### JSON Schema Data Types

Source: https://connic.co/docs/v1/composer/output-schema

Lists common JSON Schema data types including string, number, integer, boolean, array, object, and null. Each type is shown with an example value and a brief description.

```markdown
| Type    | Example Value     | Description                                         |
|---------|-------------------|-----------------------------------------------------|
| string  | "hello world"     | Text values                                         |
| number  | 42.5              | Any numeric value (integers and decimals)           |
| integer | 42                | Whole numbers only                                  |
| boolean | true / false      | True or false values                                |
| array   | [1, 2, 3]         | List of items (define item schema with `items`)     |
| object  | {"key": "value"}  | Nested structure (define fields with `properties`)  |
| null    | null              | Explicit null value                                 |
```

--------------------------------
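A Python counterpart to the multi-file curl upload shown earlier in this section, sketched with the `requests` library; the URL and secret are placeholders and the form field names simply mirror the curl example.

```python
# Upload two files to the webhook as multipart/form-data (Python counterpart of the curl example)
import requests

with open("document1.pdf", "rb") as f1, open("document2.pdf", "rb") as f2:
    resp = requests.post(
        "<WEBHOOK_URL>",
        headers={"X-Connic-Secret": "<YOUR_SECRET>"},
        data={"message": "Compare these two documents"},
        files={"file1": f1, "file2": f2},
        timeout=120,
    )
resp.raise_for_status()
print(resp.json())
```

--------------------------------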
### Uploading a Single File via Webhook (Bash)

Source: https://connic.co/docs/v1/connectors/webhook

Example using curl to upload a single file to a webhook using multipart/form-data. The file content is sent directly to the LLM.

```bash
curl -X POST <WEBHOOK_URL> \
  -H "X-Connic-Secret: <YOUR_SECRET>" \
  -F "instructions=What is the total of this invoice?" \
  -F "file=@/path/to/invoice.pdf"
```

--------------------------------

### Knowledge Agent Configuration (YAML)

Source: https://connic.co/docs/v1/composer/knowledge-tools

Example YAML configuration for an agent that uses knowledge tools. It specifies the agent's name, model, description, system prompt, and the available tools (`query_knowledge`, `store_knowledge`, `delete_knowledge`).

```yaml
# agents/knowledge-agent.yaml
version: "1.0"
name: knowledge-agent
model: gemini/gemini-2.5-pro
description: "Agent with persistent memory"
system_prompt: |
  You are an assistant with access to a knowledge base.
  Always search the knowledge base first before answering.
tools:
  - query_knowledge
  - store_knowledge
  - delete_knowledge
```

--------------------------------

### Inbound Webhook - GET Request

Source: https://connic.co/docs/v1/connectors/webhook

Trigger an agent using a GET request with query parameters. Suitable for simple triggers.

```APIDOC
## GET /webhook-url

### Description
Triggers an agent using a GET request with query parameters. All query parameters (except `secret`) are passed to the agent.

### Method
GET

### Endpoint
`<WEBHOOK_URL>`

### Parameters
#### Query Parameters
- **secret** (string) - Required - Your secret key for authentication
- **prompt** (string) - Example: "hello" - The input prompt for the agent.

### Request Example
```bash
curl "<WEBHOOK_URL>?secret=<YOUR_SECRET>&prompt=hello"
```

### Response
#### Success Response (200)
- **status** (string) - Indicates the status of the request, e.g., "ok".
- **dispatched_to** (integer) - The number of agents dispatched.
- **run_ids** (array) - A list of unique identifiers for the agent runs.

#### Response Example
```json
{
  "status": "ok",
  "dispatched_to": 1,
  "run_ids": [
    "550e8400-e29b-41d4-a716-446655440003"
  ]
}
```
```

--------------------------------

### Calling an Agent Tool (JSON)

Source: https://connic.co/docs/v1/connectors/mcp

An example of a JSON payload used to invoke a specific agent tool via the MCP Server's `tools/call` method. It includes the tool name and its arguments, which can contain a message and an optional payload.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "invoice_processor",
    "arguments": {
      "message": "Process this invoice and extract the total",
      "payload": {
        "invoice_id": "INV-12345",
        "customer": "Acme Corp"
      }
    }
  }
}
```

--------------------------------

### AWS SQS Outbound Payload Example (JSON)

Source: https://connic.co/docs/v1/connectors/sqs

Shows the structure of a message payload sent to an SQS queue by an outbound connector. It contains comprehensive run metadata, including agent output, status, and token usage.

```json
{
  "run_id": "550e8400-e29b-41d4-a716-446655440000",
  "agent_name": "order-processor",
  "status": "completed",
  "output": "Order processed successfully. Total: $234.56",
  "error": null,
  "started_at": "2024-01-15T10:30:00Z",
  "ended_at": "2024-01-15T10:30:05Z",
  "token_usage": {
    "prompt_tokens": 150,
    "candidates_tokens": 50,
    "total_tokens": 200
  }
}
```

--------------------------------
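The outbound payload above is plain JSON, so a downstream consumer only needs to branch on its documented fields. The handler below is an illustrative sketch rather than a Connic API; what it does with `status`, `output`, and `token_usage` is just one example of consumer-side logging.

```python
# handle_result.py - illustrative consumer for the outbound SQS payload shown above
import json

def handle_result(body: str) -> None:
    result = json.loads(body)
    tokens = result.get("token_usage") or {}
    if result["status"] == "completed":
        print(f"run {result['run_id']} ({result['agent_name']}): {result['output']}")
        print(f"  tokens used: {tokens.get('total_tokens')}")
    else:
        print(f"run {result['run_id']} failed: {result.get('error')}")

if __name__ == "__main__":
    # Feed it the example payload from the documentation above
    handle_result(json.dumps({
        "run_id": "550e8400-e29b-41d4-a716-446655440000",
        "agent_name": "order-processor",
        "status": "completed",
        "output": "Order processed successfully. Total: $234.56",
        "error": None,
        "token_usage": {"total_tokens": 200},
    }))
```

--------------------------------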
```json { "run_id": "550e8400-e29b-41d4-a716-446655440000", "agent_name": "order-processor", "status": "completed", "output": "Order processed successfully. Total: $234.56", "error": null, "started_at": "2024-01-15T10:30:00Z", "ended_at": "2024-01-15T10:30:05Z", "token_usage": { "prompt_tokens": 150, "candidates_tokens": 50, "total_tokens": 200 } } ``` -------------------------------- ### Triggering Webhook with Form Data (Bash) Source: https://connic.co/docs/v1/connectors/webhook Example using curl to send a POST request with form data to a webhook URL. Each form field becomes a key-value pair in the payload. ```bash curl -X POST \ -H "X-Connic-Secret: " \ -F "message=Process this invoice" \ -F "customer_id=12345" ``` -------------------------------- ### Setting Context in Middleware (Python) Source: https://connic.co/docs/v1/composer/agent-configuration Example Python code for a middleware function that sets context values. These values can then be used in conditional tool expressions. ```python # middleware/assistant.py async def before(content: dict, context: dict) -> dict: # Set context values that tool conditions can check context["multiply_allowed"] = True context["admin"] = content["parts"][0]["text"].startswith("/admin") return content ``` -------------------------------- ### Outbound Mode Payload Example (JSON) Source: https://connic.co/docs/v1/connectors/overview This JSON structure represents the payload sent to an external URL when an agent run completes in Outbound mode. It includes details about the run and its output. ```json { "run_id": "uuid", "agent_name": "my-agent", "status": "completed", "output": "Agent response..." } ``` -------------------------------- ### Environment Variables in Raw Editor Format Source: https://connic.co/docs/v1/platform/environments Manage environment variables in a KEY=VALUE format using the raw editor. This is useful for bulk operations, importing variables from .env files, or making quick changes. Lines starting with '#' are treated as comments. ```bash # Raw editor format (KEY=VALUE) DATABASE_URL=postgresql://user:pass@host:5432/db OPENAI_API_KEY=sk-xxxxxxxxxxxx LOG_LEVEL=info # Lines starting with # are comments ``` -------------------------------- ### Deploy Agents using Connic CLI Source: https://connic.co/docs/v1/platform/deployment Deploys agents to Connic using the CLI. You can deploy to the default environment or specify a particular environment using the `--env` flag. The CLI automatically packages necessary project files like agents, tools, and schemas. ```bash # Deploy to default environment connic deploy # Deploy to a specific environment connic deploy --env ``` -------------------------------- ### Triggering Webhook with JSON Payload (Bash) Source: https://connic.co/docs/v1/connectors/webhook Example using curl to send a POST request with a JSON payload to a webhook URL. The entire JSON payload is passed as input to the agent. ```bash curl -X POST \ -H "Content-Type: application/json" \ -H "X-Connic-Secret: " \ -d '{ \ "prompt": "Analyze this data", \ "data": {"key": "value"} \ }' ``` -------------------------------- ### Connic CLI Authentication Source: https://connic.co/docs/v1/platform/deployment Authenticates the Connic CLI using an API key and project ID. This typically creates a local configuration file. For CI/CD environments, it's recommended to use environment variables `CONNIC_API_KEY` and `CONNIC_PROJECT_ID` instead. 
### Upload File via Webhook (Bash)

Source: https://connic.co/docs/v1/connectors/webhook

Examples using cURL to upload files to a Connic webhook using multipart/form-data. Files are automatically extracted and sent to the LLM as inline data. Supports various file types up to 10MB per file.

```bash
# Upload a single file
curl -X POST <WEBHOOK_URL> \
  -H "X-Connic-Secret: <YOUR_SECRET>" \
  -F "instructions=What is the total of this invoice?" \
  -F "file=@/path/to/invoice.pdf"

# Upload multiple files
curl -X POST <WEBHOOK_URL> \
  -H "X-Connic-Secret: <YOUR_SECRET>" \
  -F "message=Compare these two documents" \
  -F "file1=@document1.pdf" \
  -F "file2=@document2.pdf"
```

--------------------------------

### Dunning Agent Configuration Example (YAML)

Source: https://connic.co/docs/v1/connectors/stripe

This YAML configuration defines a dunning agent named 'dunning-agent' using the 'gemini-2.5-flash' model. The system prompt instructs the AI to act as a dunning specialist, analyzing customer history to generate personalized recovery messages when a payment fails. The Stripe event is passed as input and can be accessed in middleware.

```yaml
# agents/dunning-agent.yaml
name: dunning-agent
model: gemini-2.5-flash
system: |
  You are a dunning specialist AI. When a payment fails,
  analyze the customer's history and generate a personalized
  recovery message.

# The Stripe event is passed as input
# Access in middleware: input["data"]["object"]["last_payment_error"]
```

--------------------------------

### Stripe Webhook Inbound Integration

Source: https://connic.co/docs/v1/connectors/stripe

This section details how Stripe events are received and processed by the Connic platform. It covers the setup process, event payload structure, and how to configure AI agents to handle these events.

```APIDOC
## Stripe Webhook Integration

### Description
This connector allows you to receive Stripe webhook events asynchronously to trigger AI agents. It's ideal for AI-driven workflows such as intelligent dunning, personalized onboarding, and fraud detection.

### How It Works
The Stripe connector listens for webhook events from your Stripe account. It verifies the event's signature to ensure authenticity and then triggers your linked AI agents, passing the complete event payload for processing.

### Setup Instructions
1. **Create Connector in Connic**: Create a new Stripe connector within your Connic project and give it a name.
2. **Copy Webhook URL**: Obtain the unique webhook URL provided on the connector's details page.
3. **Add Webhook in Stripe Dashboard**: Navigate to your Stripe Dashboard, go to **Developers > Webhooks**, and click **Add endpoint**. Paste the copied webhook URL and select the specific events you wish to receive.
4. **Paste Signing Secret**: Retrieve the signing secret from Stripe (it begins with `whsec_`) and enter it into your connector's settings in Connic. Click **Save**.

The connector will not process events until the signing secret is provided and saved, ensuring webhook authenticity.

### Event Payload Structure
Your AI agent will receive the full Stripe event object as input. Below are examples of common event payloads:

**General Event Payload Example:**
```json
{
  "id": "evt_1234567890",
  "object": "event",
  "type": "payment_intent.payment_failed",
  "data": {
    "object": {
      "id": "pi_1234567890",
      "amount": 2000,
      "currency": "usd",
      "customer": "cus_1234567890",
      "last_payment_error": {
        "code": "card_declined",
        "message": "Your card was declined."
      }
    }
  }
}
```

**Subscription Event Example:**
```json
{
  "id": "evt_1234567890",
  "object": "event",
  "type": "customer.subscription.updated",
  "data": {
    "object": {
      "id": "sub_1234567890",
      "customer": "cus_1234567890",
      "status": "active",
      "plan": {
        "id": "plan_pro",
        "amount": 4900,
        "interval": "month"
      }
    }
  }
}
```

### Example Agent Configuration
Here's how you might configure an AI agent to process Stripe events, specifically for dunning:

```yaml
# agents/dunning-agent.yaml
name: dunning-agent
model: gemini-2.5-flash
system: |
  You are a dunning specialist AI. When a payment fails,
  analyze the customer's history and generate a personalized
  recovery message.

# The Stripe event is passed as input
# Access in middleware: input["data"]["object"]["last_payment_error"]
```

### Signature Verification
Connic automatically verifies incoming webhook requests using the provided Stripe signing secret. It checks the `Stripe-Signature` header and employs HMAC-SHA256 verification along with timestamp validation to prevent replay attacks. Rate limiting is also applied to protect against abuse.
```

--------------------------------
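Connic performs the signature check for you, so no application code is needed. Purely to illustrate the HMAC-SHA256 and timestamp validation described above, the sketch below follows Stripe's published signing scheme (a `Stripe-Signature` header of the form `t=<timestamp>,v1=<signature>` computed over `"{timestamp}.{payload}"`); treat it as background reading, not something you have to implement.

```python
# Illustration of Stripe's webhook signature scheme (Connic runs this check for you)
import hashlib
import hmac
import time

def verify_stripe_signature(payload: bytes, sig_header: str, signing_secret: str,
                            tolerance_seconds: int = 300) -> bool:
    # Parse "t=...,v1=..." into a dict of header parts
    parts = dict(item.split("=", 1) for item in sig_header.split(","))
    timestamp, signature = parts["t"], parts["v1"]
    # Reject stale timestamps to prevent replay attacks
    if abs(time.time() - int(timestamp)) > tolerance_seconds:
        return False
    # HMAC-SHA256 over "<timestamp>.<raw body>" using the whsec_ signing secret
    signed_payload = f"{timestamp}.{payload.decode()}".encode()
    expected = hmac.new(signing_secret.encode(), signed_payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

--------------------------------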
### Sequential Agent Pipeline Configuration (YAML)

Source: https://connic.co/docs/v1/composer/agent-configuration

Configures a sequential agent that chains multiple agents together. This example shows a document processing pipeline where agents execute in order, passing output from one to the next.

```yaml
version: "1.0"
name: document-pipeline
type: sequential
description: "Processes documents through extraction and validation"

# Agents execute in order, each receiving the previous agent's output
agents:
  - assistant          # First: extracts key information
  - invoice-processor  # Then: validates and processes the data
```

--------------------------------

### Authenticate MCP Server with Bearer Token

Source: https://connic.co/docs/v1/composer/mcp

This example demonstrates how to secure an MCP server connection using HTTP headers, specifically authentication with a Bearer token. It uses environment variables for securely storing sensitive credentials like API tokens.

```yaml
mcp_servers:
  - name: github
    url: https://mcp.example.com/github
    headers:
      Authorization: "Bearer ${GITHUB_TOKEN}"
```

--------------------------------

### Run Connic Bridge with pip

Source: https://connic.co/docs/v1/connectors/bridge

Installs and runs the Connic Bridge using pip. It requires a bridge token and allows specifying allowed hosts and ports using the `--allow` flag. This method is an alternative to Docker deployment.

```bash
pip install connic-bridge

connic-bridge \
  --token cbr_your_token_here \
  --allow kafka:9092 \
  --allow postgres:5432
```

--------------------------------

### Trigger Deployment via Git Push

Source: https://connic.co/docs/v1/platform/deployment

Demonstrates how to commit and push changes to a Git repository, which automatically triggers a deployment in Connic if the branch is configured for deployment. Ensure your Git repository is connected to your Connic project and the branch is mapped to an environment.

```bash
# Push to trigger deployment
git add .
git commit -m "Update agents"
git push origin main
```

--------------------------------
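As a mental model for the sequential pipeline configured earlier in this section, where each agent receives the previous agent's output, the data flow is equivalent to folding a payload through an ordered list of steps. The sketch below is only a plain-Python analogy, not Connic's implementation.

```python
# Conceptual analogy for a sequential pipeline: each step receives the previous step's output
from typing import Callable

def run_pipeline(steps: list[Callable[[str], str]], payload: str) -> str:
    for step in steps:
        payload = step(payload)
    return payload

# e.g. extract then validate, mirroring the document-pipeline example above
result = run_pipeline(
    [lambda text: f"extracted({text})", lambda text: f"validated({text})"],
    "invoice.pdf",
)
print(result)  # validated(extracted(invoice.pdf))
```

--------------------------------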
git commit -m "Update agents" git push origin main ``` -------------------------------- ### Handle WebSocket Streaming Responses (JSON) Source: https://connic.co/docs/v1/connectors/websocket Illustrates the different event types received when handling streaming responses from a WebSocket connection. This includes acknowledgment, stream start, chunks of text, and the final end event with token usage. ```json // Streaming response events { "type": "ack", "id": "msg-001", "message_number": 1 } { "type": "stream_start", "id": "msg-001", "agent": "assistant" } { "type": "stream_chunk", "id": "msg-001", "agent": "assistant", "chunk": "The weather in " } { "type": "stream_chunk", "id": "msg-001", "agent": "assistant", "chunk": "New York is sunny." } { "type": "stream_end", "id": "msg-001", "agent": "assistant", "full_response": "The weather in New York is sunny.", "token_usage": { "prompt_tokens": 45, "candidates_tokens": 15 } } ``` -------------------------------- ### Send JSON Request via Webhook (Python, JavaScript) Source: https://connic.co/docs/v1/connectors/webhook Examples for sending a JSON payload to a Connic Co webhook. This is the most common method, where the entire JSON payload is passed as input to your agent. Requires the 'requests' library in Python and 'node-fetch' in JavaScript. ```python import requests url = "" secret = "" payload = {"query": "What is the status of order 123?"} resp = requests.post( url, json=payload, headers={"X-Connic-Secret": secret}, timeout=60, ) resp.raise_for_status() print(resp.json()["result"]["output"]) ``` ```javascript import fetch from "node-fetch"; const url = ""; const secret = ""; const resp = await fetch(url, { method: "POST", headers: { "Content-Type": "application/json", "X-Connic-Secret": secret, }, body: JSON.stringify({ query: "Summarize the latest ticket" }), }); if (!resp.ok) throw new Error(await resp.text()); const data = await resp.json(); console.log(data.result.output); ``` -------------------------------- ### Send JSON Request via Webhook (Bash) Source: https://connic.co/docs/v1/connectors/webhook Example using cURL to send a JSON payload to a Connic Co webhook. This method is useful for simple command-line integrations or scripting. Ensure the 'Content-Type' header is set to 'application/json'. ```bash curl -X POST \ -H "Content-Type: application/json" \ -H "X-Connic-Secret: " \ -d '{ "prompt": "Analyze this data", "data": {"key": "value"} }' ``` -------------------------------- ### Stripe Subscription Event Payload Example (JSON) Source: https://connic.co/docs/v1/connectors/stripe This JSON object illustrates a Stripe 'customer.subscription.updated' event. It contains details about the subscription, including its ID, customer ID, current status, and plan information such as the plan ID, amount, and interval. ```json { "id": "evt_1234567890", "object": "event", "type": "customer.subscription.updated", "data": { "object": { "id": "sub_1234567890", "customer": "cus_1234567890", "status": "active", "plan": { "id": "plan_pro", "amount": 4900, "interval": "month" } } } } ``` -------------------------------- ### Initialize Git Repository for Connic Deployment (Bash) Source: https://connic.co/docs/v1/composer/overview Initializes a Git repository, adds files, commits them, and sets up a remote origin for pushing to a Connic project. Pushing to the specified branch triggers automatic deployment. ```bash # Initialize git repository git init # Add your files git add . 
git commit -m "Initial agent setup" # Add your connected repository as remote git remote add origin # Push to trigger deployment git push origin ``` -------------------------------- ### AWS SQS Inbound Message Payload Example (JSON) Source: https://connic.co/docs/v1/connectors/sqs Illustrates the structure of a message payload consumed from an SQS queue, including custom data and enriched SQS metadata. The `_sqs` field contains details about the message's origin and processing. ```json { "order_id": "12345", "customer_email": "john@example.com", "items": ["widget-a", "widget-b"], "total": 234.56, "_sqs": { "message_id": "abc123-def456-ghi789", "receipt_handle": "AQEBw...", "queue_url": "https://sqs.us-east-1.amazonaws.com/123456789/orders", "approximate_receive_count": 1, "sent_timestamp": 1705312800000 } } ``` -------------------------------- ### Stripe Event Payload Example (JSON) Source: https://connic.co/docs/v1/connectors/stripe This JSON object represents a typical Stripe event payload, specifically a 'payment_intent.payment_failed' event. It includes event details, type, and the data object containing information about the payment intent, such as the amount, currency, and any payment errors. ```json { "id": "evt_1234567890", "object": "event", "type": "payment_intent.payment_failed", "data": { "object": { "id": "pi_1234567890", "amount": 2000, "currency": "usd", "customer": "cus_1234567890", "last_payment_error": { "code": "card_declined", "message": "Your card was declined." } } } } ``` -------------------------------- ### Revert a Bad Commit using Git Source: https://connic.co/docs/v1/platform/deployment This code snippet shows how to revert a specific commit in Git that introduced an issue, followed by pushing the change to keep the repository history in sync. This is a manual step to correct a deployment that went wrong, complementing Connic's rollback functionality. ```bash # Roll back by reverting a bad change git revert git push origin main ``` -------------------------------- ### Initialize API Source: https://connic.co/docs/v1/connectors/mcp Initializes the connection to the Connic server and retrieves server capabilities and protocol version. ```APIDOC ## Initialize ### Description Returns server capabilities and protocol version. ### Method GET ### Endpoint /initialize ### Parameters #### Query Parameters None ### Request Example ``` GET /initialize ``` ### Response #### Success Response (200) - **capabilities** (object) - Server capabilities. - **protocol_version** (string) - The protocol version supported by the server. #### Response Example ```json { "capabilities": { "feature1": true, "feature2": false }, "protocol_version": "1.0" } ``` ``` -------------------------------- ### Using Predefined Tools in Agent YAML Source: https://connic.co/docs/v1/composer/predefined-tools Demonstrates how to include predefined tools like query_knowledge, store_knowledge, and trigger_agent directly in an agent's YAML configuration file. No custom code is required for basic usage. ```yaml # agents/assistant.yaml version: "1.0" name: assistant model: gemini/gemini-2.5-pro description: "An assistant with knowledge and orchestration capabilities" system_prompt: | You have access to a persistent knowledge base. Search it before answering questions. 
### Importing and Using Predefined Tools in Python

Source: https://connic.co/docs/v1/composer/predefined-tools

Shows how to import and use predefined Connic tools such as trigger_agent and query_knowledge within custom Python functions. This enables complex orchestration logic by combining tool functionalities.

```python
# tools/orchestration.py
from connic.tools import trigger_agent, query_knowledge

async def research_and_summarize(topic: str) -> dict:
    """Research a topic and return a summary.

    Args:
        topic: The topic to research

    Returns:
        A dictionary with the research summary
    """
    # First, check if we have relevant knowledge
    knowledge = await query_knowledge(
        query=f"Information about {topic}",
        max_results=5
    )

    # Build context from knowledge base
    context = "\n".join([r["content"] for r in knowledge.get("results", [])])

    # Trigger the researcher agent with context
    result = await trigger_agent(
        agent_name="researcher",
        payload={"topic": topic, "context": context}
    )

    return {
        "topic": topic,
        "summary": result["response"],
        "sources": len(knowledge.get("results", []))
    }
```

--------------------------------

### Configuring an Agent to Use web_search

Source: https://connic.co/docs/v1/composer/predefined-tools

Sets up a 'researcher' agent that includes the `web_search` tool. This agent is configured to find current information on the web and cite its sources, making it suitable for research-oriented tasks.

```yaml
# agents/researcher.yaml
version: "1.0"
name: researcher
model: openai/gpt-4o
description: "Research agent with web search"
system_prompt: |
  You are a research assistant with web search capabilities.
  Search the web to find current information and cite sources.
tools:
  - web_search
```

--------------------------------

### Predefined Tools Overview

Source: https://connic.co/docs/v1/composer/predefined-tools

An overview of the ready-to-use tools provided by Connic that can be imported directly into agents.

```APIDOC
## Predefined Tools

Ready-to-use tools provided by Connic. Import these directly into your agents without writing any code.

### Available Predefined Tools

| Tool Name        | Description                                        |
|------------------|----------------------------------------------------|
| query_knowledge  | Search the knowledge base for relevant information |
| store_knowledge  | Store new information in the knowledge base        |
| delete_knowledge | Remove entries from the knowledge base             |
| trigger_agent    | Trigger another agent within the same project      |
| web_search       | Search the web for real-time information           |

### How Predefined Tools Work

Predefined tools are built-in capabilities that Connic provides out of the box. Unlike custom tools where you write Python functions, predefined tools are ready to use. Just add their name to your agent's `tools` list.

* **No Code Required:** Just add the tool name to your YAML config. Implementation is handled by Connic.
* **Secure by Default:** Tools run in isolated environments with proper authentication and access controls.
* **Environment Scoped:** Data is isolated per environment. Dev and prod never mix.
```

--------------------------------

### Connect to WebSocket and Authenticate (JavaScript)

Source: https://connic.co/docs/v1/connectors/websocket

Establishes a WebSocket connection and authenticates using a secret key. It also handles the initial connection confirmation, logging the session ID and available agents.

```javascript
// Connect to WebSocket
const ws = new WebSocket('<WEBSOCKET_URL>');

// Authenticate (required by default)
ws.onopen = () => {
  ws.send(JSON.stringify({ secret: '<YOUR_SECRET>' }));
};

// Handle connection confirmation
ws.onmessage = (event) => {
  const data = JSON.parse(event.data);
  if (data.type === 'connected') {
    console.log('Session ID:', data.session_id);
    console.log('Agents:', data.agents);
  }
};
```

--------------------------------
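Building on the `query_knowledge` call from the orchestration example earlier in this section (same `query` and `max_results` keyword arguments, same `results` shape), a smaller helper might only check whether the knowledge base already covers a topic. The helper name and threshold below are hypothetical.

```python
# tools/knowledge_checks.py - hypothetical helper built on the documented query_knowledge call
from connic.tools import query_knowledge

async def has_knowledge(topic: str, min_results: int = 1) -> bool:
    """Return True when the knowledge base already has entries about the topic."""
    knowledge = await query_knowledge(query=topic, max_results=min_results)
    return len(knowledge.get("results", [])) >= min_results
```

--------------------------------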
### Python: Creating Content Parts for Middleware

Source: https://connic.co/docs/v1/composer/middleware

Illustrates how to create different types of 'parts' for the 'content' dictionary in Python middleware. This includes creating text parts and file parts (like PDFs or images) from byte data.

```python
# Creating parts as dicts

# Text part
text_part = {"text": "Analyze this document"}

# File from bytes (PDFs, images, audio, video)
with open("document.pdf", "rb") as f:
    pdf_part = {"data": f.read(), "mime_type": "application/pdf"}

# Image from bytes
with open("image.png", "rb") as f:
    image_part = {"data": f.read(), "mime_type": "image/png"}

# Adding parts to content
content["parts"].append(pdf_part)
content["parts"].insert(0, text_part)
```

--------------------------------

### Using Custom Python Tools in Agent YAML

Source: https://connic.co/docs/v1/composer/predefined-tools

Illustrates how to reference a custom Python tool, like `orchestration.research_and_summarize`, within an agent's YAML configuration. This allows agents to leverage the complex logic defined in Python.

```yaml
tools:
  - orchestration.research_and_summarize
```

--------------------------------

### MCP Server - Supported Methods

Source: https://connic.co/docs/v1/connectors/mcp

Lists the supported methods for interacting with the MCP Server.

```APIDOC
## Supported Methods

### Description
The MCP Server supports the following methods for interaction:

- **`initialize`**: Returns server capabilities and protocol version.
- **`tools/list`**: Returns all linked agents exposed as tools.
- **`tools/call`**: Invokes a specific agent tool by its name.
- **`ping`**: Performs a health check and returns an empty object if the server is responsive.
```

--------------------------------

### PostgreSQL Manual Notifications

Source: https://connic.co/docs/v1/connectors/postgres

Demonstrates how to send notifications manually from application code or directly using SQL commands. This is useful for triggering agents based on specific application events.

```sql
-- Send a notification manually from your application
SELECT pg_notify('new_customers', '{"customer_id": "12345", "action": "created"}');

-- Or using NOTIFY directly
NOTIFY new_customers, '{"customer_id": "12345", "action": "created"}';
```

--------------------------------

### Customer Inquiry Pipeline Configuration (YAML)

Source: https://connic.co/docs/v1/composer/agent-configuration

Sets up a three-step sequential agent pipeline for customer inquiries. It includes validation, database lookup, and response formatting steps, with each step defined as a separate agent.

```yaml
# agents/customer-inquiry.yaml
version: "1.0"
name: customer-inquiry
type: sequential
description: "Validate input → fetch account → format response"

# Each step receives the previous agent's output
agents:
  - validate-inquiry
  - fetch-account
  - format-response
```

```yaml
# agents/validate-inquiry.yaml
version: "1.0"
name: validate-inquiry
type: tool
description: "Validate and normalize customer inquiry payload"
tool_name: validation.validate_inquiry
```

```yaml
# agents/fetch-account.yaml
version: "1.0"
name: fetch-account
type: tool
description: "Lookup account details from Postgres"
tool_name: postgres.fetch_user_account
```

```yaml
# agents/format-response.yaml
version: "1.0"
name: format-response
type: llm
model: gemini/gemini-2.5-pro
description: "Format a helpful customer response"
system_prompt: |
  You receive validated inquiry data plus account details.
  Respond concisely and include next steps when relevant.
```

--------------------------------
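The manual notifications from the PostgreSQL example earlier in this section can also be sent from Python application code. The sketch below assumes `psycopg2` is installed and uses a placeholder connection string; the channel name and payload mirror the SQL example.

```python
# notify.py - send the same pg_notify shown above from Python (psycopg2 assumed installed)
import json
import psycopg2  # pip install psycopg2-binary

payload = {"customer_id": "12345", "action": "created"}

conn = psycopg2.connect("<POSTGRES_DSN>")
conn.autocommit = True  # NOTIFY is delivered on commit, so autocommit sends it immediately
with conn.cursor() as cur:
    cur.execute("SELECT pg_notify(%s, %s)", ("new_customers", json.dumps(payload)))
conn.close()
```

--------------------------------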
### Authenticate with Connic CLI

Source: https://connic.co/docs/v1/composer/testing

Authenticates the Connic CLI with your project by creating a `.connic` file containing your API key and project ID. Add this file to your `.gitignore` to prevent accidental commits of sensitive credentials.

```bash
connic login
```

--------------------------------

### AWS SQS Inbound IAM Permissions (JSON)

Source: https://connic.co/docs/v1/connectors/sqs

Specifies the IAM permissions required for an inbound SQS connector to receive messages, delete them, and get queue attributes. Ensure the resource ARN matches your queue.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "sqs:ReceiveMessage",
        "sqs:DeleteMessage",
        "sqs:GetQueueAttributes"
      ],
      "Resource": "arn:aws:sqs:us-east-1:123456789:my-queue"
    }
  ]
}
```
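To exercise an inbound SQS connector end to end, you can put a test message on the queue yourself. The sketch below assumes `boto3` is installed and AWS credentials are configured; the queue URL is the example one from this section and should be replaced with your own, and the body mirrors the inbound payload shown earlier.

```python
# send_test_message.py - drop a test message onto the inbound queue (boto3 assumed configured)
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")

response = sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789/orders",  # replace with your queue URL
    MessageBody=json.dumps({
        "order_id": "12345",
        "customer_email": "john@example.com",
        "items": ["widget-a", "widget-b"],
        "total": 234.56,
    }),
)
print(f"Sent message {response['MessageId']}")
```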