### Example Usage of structured Method

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Demonstrates calling the `structured` method with a prompt and a list of tools. This shows how to get structured outputs from the client.

```python
c.structured(pr, tools=[sums])
```

--------------------------------

### Chat with Prefill Example

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Demonstrates using the `prefill` argument in the `Chat` class to guide the AI's response.

```python
q = "Very concisely, what is the meaning of life?"
pref = 'According to Douglas Adams,'
chat(q, prefill=pref)
```

--------------------------------

### AsyncChat usage examples

Source: https://github.com/answerdotai/claudette/blob/main/02_async.ipynb

Demonstrates various ways to use the AsyncChat client, including basic conversational turns, using prefill for guided responses, streaming output, and integrating custom tools.

```python
await c.structured(pr, sums)
```

```python
sp = "Always use tools if available, and calculations are requested."
chat = AsyncChat(model, sp=sp)
chat.c.use, chat.h
```

```python
await chat("I'm Jeremy")
await chat("What's my name?")
```

```python
q = "Very concisely, what is the meaning of life?"
pref = 'According to Douglas Adams,'
await chat(q, prefill=pref)
```

```python
chat = AsyncChat(model, sp=sp)
r = await chat("I'm Jeremy", stream=True)
async for o in r: print(o, end='')
r.value
```

```python
pr = f"What is {a}+{b}?"
chat = AsyncChat(model, sp=sp, tools=[sums])
r = await chat(pr)
r
```

```python
await chat()
```

```python
fn = Path('samples/puppy.jpg')
img = fn.read_bytes()
Image(img)
```

```python
q = "In brief, what color flowers are in this image?"
msg = mk_msg([img, q])
await c([msg])
```

```python
chat = AsyncChat(model, sp=sp, cache=True)
await chat("Lorem ipsum dolor sit amet" * 150)
```

```python
chat.use
```

```python
await chat("Whoops, sorry about that!")
```

```python
chat.use
```

--------------------------------

### Async Call with Prefill Example

Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt

Shows how to use the 'prefill' parameter to guide Claude's response, providing a specific starting phrase.

```python
q = "Concisely, what is the meaning of life?"
pref = 'According to Douglas Adams,'
await c(q, prefill=pref)
```

--------------------------------

### Python: Chat with Prefill using Claudette

Source: https://github.com/answerdotai/claudette/blob/main/README.txt

Illustrates how to use the 'prefill' argument to provide Claude with a starting point for its response. This is useful for guiding the model's output format or content.

```Python
from claudette import Chat

# Assuming 'model' is a pre-configured model object
# model = ...
chat = Chat(model)
response = chat("Concisely, what is the meaning of life?", prefill='According to Douglas Adams,')
print(response)
```

--------------------------------

### Amazon Bedrock Client Setup

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Demonstrates how to initialize an AnthropicBedrock client using AWS credentials from environment variables and then create a Client instance for interacting with Claude models via Amazon Bedrock. Assumes `boto3` is installed.

```python
import os
from anthropic import AnthropicBedrock
from claudette.core import Client

# Assuming models_aws is defined elsewhere and contains model identifiers
# models_aws = [...]
ab = AnthropicBedrock(
    aws_access_key=os.environ['AWS_ACCESS_KEY'],
    aws_secret_key=os.environ['AWS_SECRET_KEY'],
)
client = Client(models_aws[0], ab)
```

--------------------------------

### Using Prefill for Context

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Illustrates the use of the `prefill` parameter to provide initial context or a starting phrase for the AI's response. This is useful for guiding the conversation or response format.

```python
q = "Very concisely, what is the meaning of life?"
pref = 'According to Douglas Adams, '
c(q, prefill=pref)
```

--------------------------------

### Chat Initialization Example

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Demonstrates how to initialize the `Chat` class with a model and a system prompt.

```python
sp = "Never mention what tools you use."
chat = Chat(model, sp=sp)
```

--------------------------------

### Install Claudette

Source: https://github.com/answerdotai/claudette/blob/main/index.ipynb

Installs the Claudette library and its dependencies, including the Anthropic Python SDK if not already present. Ensure your ANTHROPIC_API_KEY environment variable is set.

```shell
pip install claudette
```

--------------------------------

### Python Example: Basic toolloop Usage

Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt

Demonstrates how to initialize a Chat object and use the toolloop method to get an email address for a customer. This example shows the simplified interaction without manual tool message passing.
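The loop that `toolloop` automates can be sketched without touching the API: keep calling the model while it asks for tools, then stop. Everything below is a stand-in (`FakeChat`, `mock_toolloop`), not claudette's actual implementation.

```python
class FakeChat:
    "Stand-in for a claudette Chat: 'requests' a tool twice, then finishes."
    def __init__(self): self.calls, self.stop_reason = 0, 'tool_use'
    def __call__(self, pr=None):
        self.calls += 1
        if self.calls >= 3: self.stop_reason = 'end_turn'
        return f'response {self.calls}'

def mock_toolloop(chat, pr, max_steps=10):
    "Keep prompting while the model keeps asking for tools."
    rs = [chat(pr)]
    for _ in range(max_steps - 1):
        if chat.stop_reason != 'tool_use': break
        rs.append(chat(None))  # follow-up turn carrying the tool results
    return rs

rs = mock_toolloop(FakeChat(), 'Can you tell me the email address for customer C1?')
print(len(rs))  # 3
```

The real method additionally executes the requested tools and appends their results to the dialog history between iterations.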
```Python
chat = Chat(model, tools=tools)
r = chat.toolloop('Can you tell me the email address for customer C1?')
r
```

--------------------------------

### Tool Loop Status Check Example

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Demonstrates how to follow up on a multi-stage tool use process by querying the status of a specific item (e.g., an order) using the `toolloop` again.

```Python
for o in chat.toolloop('What is the status of order O2?'): display(o)
```

--------------------------------

### Streaming Response with Prefill Example

Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt

Combines streaming with the 'prefill' parameter to demonstrate receiving a guided, chunked response from Claude.

```python
q = "Concisely, what is the meaning of life?"
pref = 'According to Douglas Adams,'
async for o in (await c(q, prefill=pref, stream=True)): print(o, end='')
```

--------------------------------

### Google Vertex AI Client Setup

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Illustrates setting up an AnthropicVertex client for Google Cloud's Vertex AI, requiring project ID and region. It then creates a Client instance for interacting with Claude models via Vertex AI. Assumes `google-cloud-aiplatform` and `google-auth` are installed.

```python
from anthropic import AnthropicVertex
import google.auth
from claudette.core import Client

# Assuming models_goog is defined elsewhere and contains model identifiers
# models_goog = [...]

# Get default project ID and region
project_id = google.auth.default()[1]
region = "us-east5"  # Example region, adjust as needed
gv = AnthropicVertex(project_id=project_id, region=region)
client = Client(models_goog[-1], gv)
```

--------------------------------

### Synchronous Tool Loop Usage Example

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Demonstrates the basic usage of the synchronous `toolloop` method. It initializes a `Chat` instance, provides an initial prompt, and iterates through the yielded assistant messages.

```Python
chat = Chat(model, tools=tools)
pr = 'Can you tell me the email address for customer C1?'
r = chat.toolloop(pr)
for o in r: display(o)
```

--------------------------------

### CodeChat Initialization Example

Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt

Example of initializing the `CodeChat` with a specific model, custom tools, system prompt, and execution confirmation settings.

```python
model = models[1]
chat = CodeChat(model, tools=[get_user], sp=sp, ask=True, temp=0.3)
```

--------------------------------

### Import Core Libraries

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Imports essential modules from Claudette, fastcore, and Anthropic for building AI agents and handling tool use. Includes utilities for data handling and asynchronous operations.

```python
#|default_exp toolloop
import os
# os.environ['ANTHROPIC_LOG'] = 'debug'

from claudette.core import *
from fastcore.utils import *
from fastcore.meta import delegates
from fastcore.xtras import save_iter
from functools import wraps
from anthropic.types import TextBlock, Message, ToolUseBlock
from IPython.display import display, Markdown, clear_output
from pprint import pprint
```

--------------------------------

### Streaming Output with Tool Loop

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Shows how to use the `toolloop` with the `stream=True` option to receive output incrementally. The example processes streaming messages, distinguishing between different message types and displaying content as it arrives.
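The streamed objects consumed below behave like generators that also expose the complete result via `.value` once exhausted. That pattern can be sketched independently of the API; `StreamValue` here is illustrative, not claudette's class.

```python
class StreamValue:
    "Generator wrapper that records the joined output in `.value` once exhausted."
    def __init__(self, gen): self.gen, self.value = gen, None
    def __iter__(self):
        parts = []
        for o in self.gen:
            parts.append(o)
            yield o              # hand each chunk to the caller as it arrives
        self.value = ''.join(parts)  # full text available after consumption

r = StreamValue(iter(['Hello', ', ', 'world']))
for o in r: pass   # consume the stream chunk by chunk
print(r.value)     # Hello, world
```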
```Python
orders, customers = _get_orders_customers()
chat = Chat(model, tools=tools)
r = chat.toolloop('Please cancel all orders for customer C1 for me.', stream=True)
for o in r:
    if isinstance(o, (dict, Message, list)): print(o)
    else:
        for x in o: print(x, end='')
        display(o.value)
```

--------------------------------

### Chat Interaction with Prefill

Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt

Shows how to use the 'prefill' parameter to guide the model's response, providing a specific starting phrase.

```python
q = "Concisely, what is the meaning of life?"
pref = 'According to Douglas Adams,'
await chat(q, prefill=pref)
```

--------------------------------

### Basic Async Call Example

Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt

Demonstrates initializing the AsyncClient and making a simple asynchronous call to Claude with a greeting.

```python
c = AsyncClient(model, log=True)
c.use
c.model = models[1]
await c('Hi')
```

--------------------------------

### Setup Environment Variables

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Imports the `os` module to manage environment variables. Includes a commented-out line to enable debug logging for the Anthropic SDK, useful for inspecting HTTP requests and responses.

```python
import os
# os.environ['ANTHROPIC_LOG'] = 'debug'
```

--------------------------------

### Python Usage Example with Patched __add__

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Provides an example of using the patched `__add__` method for `Usage` and `ServerToolUsage` objects in Python. It shows how to combine instances using the `+` operator.
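The convenience the example relies on is that usage records can simply be summed. The shape of such an `__add__` can be sketched with a plain class; the field names below are illustrative, not the library's exact attributes.

```python
class Usage:
    "Minimal stand-in for a token-usage record."
    def __init__(self, input_tokens=0, output_tokens=0):
        self.input_tokens, self.output_tokens = input_tokens, output_tokens
    def __add__(self, other):
        # Field-wise sum, so totals across calls accumulate with `+`
        return Usage(self.input_tokens + other.input_tokens,
                     self.output_tokens + other.output_tokens)

total = Usage(4, 179) + Usage(7, 12)
print(total.input_tokens, total.output_tokens)  # 11 191
```

claudette applies the same idea to its real `Usage`/`ServerToolUsage` classes via fastcore's `@patch` decorator.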
```Python
server_tool_usage(1) + server_tool_usage(2)
r.usage + r.usage + usage(server_tool_use=server_tool_usage(1))
```

--------------------------------

### Python Example: AsyncChat Client Initialization

Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt

Demonstrates how to initialize the AsyncChat client from the Claudette library. It shows setting a system prompt ('sp') and passing it during client instantiation, along with a model name.

```python
sp = "Never mention what tools you use."
# Assuming 'model' is defined elsewhere
# chat = AsyncChat(model, sp=sp)
# print(chat.c.use, chat.h)
```

--------------------------------

### Streaming Chat with Prefill

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Combines streaming with the `prefill` option to get a chunked response that starts with a specific phrase.

```python
q = "Very concisely, what is the meaning of life?"
pref = 'According to Douglas Adams,'
r = chat(q, prefill=pref, stream=True)
for o in r: print(o, end='')
r.value
```

--------------------------------

### Async Tool Loop Usage Example

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Demonstrates how to use the asynchronous `toolloop` method with an `AsyncChat` instance. It requires an `async for` loop to process the yielded results from the asynchronous operations.

```Python
orders, customers = _get_orders_customers()
tools = [get_customer_info, get_order_details, cancel_order]
chat = AsyncChat(model, tools=tools)
r = chat.toolloop('Can you tell me the email address for customer C1?')
async for o in r: print(o)
```

--------------------------------

### Claudette: Python Tool Use Example

Source: https://github.com/answerdotai/claudette/blob/main/README.txt

Demonstrates how Claudette uses Python docstrings for parameter and return value descriptions, making function definitions user-friendly. It also shows how to initialize the Chat constructor with a list of tools and optionally force the use of a specific tool via `tool_choice`.

```python
def sums(
    a:int,   # First thing to sum
    b:int=1  # Second thing to sum
) -> int:    # The sum of the inputs
    "Adds a + b."
    print(f"Finding the sum of {a} and {b}")
    return a + b

chat = Chat(model, sp=sp, tools=[sums], tool_choice='sums')
```

--------------------------------

### CodeChat Initialization

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Initializes the CodeChat instance with a specific model, custom tools (like `get_user`), a system prompt, and execution parameters.

```Python
model = models[1]
chat = CodeChat(model, tools=[get_user], sp=sp, ask=True, temp=0.3)
```

--------------------------------

### Import necessary libraries

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Imports essential modules for shell interaction, delegation, and exception handling required for the code interpreter.
```Python
from toolslm.shell import get_shell
from fastcore.meta import delegates
import traceback
```

--------------------------------

### Tool Loop with Division by Zero Example

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Demonstrates how the `toolloop` handles potential errors, such as division by zero, when executing tools. The output shows how the model might report or handle such exceptions.

```Python
chat = Chat(model, tools=mydiv)
r = chat.toolloop('Try dividing 1 by 0 and see what the error result is')
```

--------------------------------

### System Prompt Configuration

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Defines the system prompt string (`sp`) for the AI model, outlining its role, pre-imported modules, and behavior regarding tool execution and error handling.

```Python
sp = f'''You are a knowledgeable assistant. Do not use tools unless needed.
Don't do complex calculations yourself -- use code for them.
The following modules are pre-imported for `run_cell` automatically: {CodeChat.imps}
Never mention what tools you are using.
Note that `run_cell` interpreter state is *persistent* across calls.
If a tool returns `#DECLINED#` report to the user that the attempt was declined and no further progress can be made.
In that case, do *not* attempt to run any further code -- stop execution *IMMEDIATELY* and tell the user it was declined.
When using a tool, *ALWAYS* before every use of every tool, tell the user what you will be doing and why.'''
```

--------------------------------

### Initialize and Interact with Claudette Chat

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Initializes a Claudette Chat instance with a specified model and the defined tools. Demonstrates making a query, handling the tool call response, and executing a more complex request involving multiple tool calls.

```python
tools = [get_customer_info, get_order_details, cancel_order]
# Assuming 'model' is already defined and configured
# model = models[1]  # Example model selection
chat = Chat(model, tools=tools)
r = chat('Can you tell me the email address for customer C1?')
print(r.stop_reason)
print(r.content)

r = chat()  # Responds to the tool call
contents(r)

chat = Chat(model, tools=tools)
r = chat('Please cancel all orders for customer C1 for me.')
print(r.stop_reason)
print(r.content)
```

--------------------------------

### Initialize Chat with System Prompt

Source: https://github.com/answerdotai/claudette/blob/main/index.ipynb

Sets up a chat session with a specified system prompt. The system prompt guides the AI's behavior throughout the conversation.

```python
chat = Chat(model, sp="""You are a helpful and concise assistant.""")
```

--------------------------------

### Async Anthropic Client Setup

Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt

Shows how to initialize and use the asynchronous client for Anthropic's messaging API. It includes setting up the client and making a message creation request.
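The bare `await` calls in these async examples assume a notebook, where top-level `await` works. In a plain script you need an event loop; a minimal pattern, with a stand-in coroutine (`fake_call`) in place of the real client call:

```python
import asyncio

async def fake_call(prompt):
    "Stand-in for something like `await c('Hi')`; a real call would hit the API."
    await asyncio.sleep(0)  # yield control, as a network call would
    return f"echo: {prompt}"

async def main():
    return await fake_call('Hi')

r = asyncio.run(main())  # drives the event loop to completion
print(r)  # echo: Hi
```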
```python
from anthropic import AsyncAnthropic

# Assuming 'models' is a list of available models, e.g., models = ['claude-3-5-sonnet-20240620']
model = models[0]  # Or the desired model
cli = AsyncAnthropic()
m = {'role': 'user', 'content': "I'm Jeremy"}
r = await cli.messages.create(messages=[m], model=model, max_tokens=100)
r
```

--------------------------------

### Toolloop Execution Example

Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt

Demonstrates executing a prompt using `toolloop` with the `CodeChat` instance, including tracing and continuation functions, to create a `checksum` function.

```python
pr = '''Create a 1-line function `checksum` for a string `s`,
that multiplies together the ascii values of each character in `s` using `reduce`.'''
chat.toolloop(pr, temp=0.2, trace_func=_show_cts, cont_func=_cont_decline)
```

--------------------------------

### Send Message with Prefill

Source: https://github.com/answerdotai/claudette/blob/main/README.md

Utilizes the `prefill` parameter to guide the assistant's response by providing a starting text. This is useful for setting the tone or providing context for the model's output.
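Under the Anthropic Messages API, prefill works by ending the message list with a partial assistant turn that the model then continues verbatim. A sketch of the payload shape (no claudette required):

```python
q = "Concisely, what is the meaning of life?"
pref = 'According to Douglas Adams,'

# The request ends with a partial assistant message; the model's reply
# continues from exactly this text rather than starting fresh.
msgs = [
    {'role': 'user', 'content': q},
    {'role': 'assistant', 'content': pref},
]
```

claudette's `prefill=` argument builds this trailing assistant message for you and prepends the prefill text to the returned response.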
```python
chat("Concisely, what is the meaning of life?", prefill='According to Douglas Adams,')
```

--------------------------------

### Multi-stage Tool Loop Example

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Shows a practical application of the `toolloop` for a multi-stage task, such as cancelling multiple orders for a customer. The function handles the sequence of tool calls required to complete the request.

```Python
orders, customers = _get_orders_customers()
chat = Chat(model, tools=tools)
r = chat.toolloop('Please cancel all orders for customer C1 for me.')
for o in r: display(o)
```

--------------------------------

### Tool Loop with Limited Steps Example

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Illustrates the behavior of `toolloop` when the `max_steps` parameter is set, limiting the number of iterations. This example uses a division function to perform a sequence of calculations.

```Python
def mydiv(a:float, b:float):
    """Divide two numbers"""
    return a / b

chat = Chat(model, tools=[mydiv])
r = chat.toolloop('Please calculate this sequence using your tools: 43/23454; 652/previous result; 6843/previous result; 321/previous result', max_steps=2)
for o in r: display(o)
```

--------------------------------

### Initialize Anthropic Client

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Instantiates the Anthropic client object, which is the primary interface for interacting with the Anthropic API. This requires the `anthropic` library to be installed.

```python
cli = Anthropic()
```

--------------------------------

### Prepare Tool Use Parameters

Source: https://github.com/answerdotai/claudette/blob/main/02_async.ipynb

Sets up variables for a tool-use scenario, including input values for the `sums` function and defining the system prompt and the desired tool choice.

```python
a,b = 604542,6458932
pr = f"What is {a}+{b}?"
sp = "You are a summing expert."
```

--------------------------------

### Async Call with Prefill Text

Source: https://github.com/answerdotai/claudette/blob/main/02_async.ipynb

Makes an asynchronous API call with a specific question and a prefill string to guide the beginning of the response.

```python
q = "Very concisely, what is the meaning of life?"
pref = 'According to Douglas Adams,'
await c(q, prefill=pref)
```

--------------------------------

### Example List Manipulation

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

A simple Python snippet demonstrating list manipulation by appending empty dictionaries.

```python
foo = []
foo.append({})
foo.append({})
foo
```

--------------------------------

### Example Query for Summation

Source: https://github.com/answerdotai/claudette/blob/main/README.md

Sets up two large integer variables `a` and `b` and constructs a natural language query `pr` asking for their sum. This query is intended to be processed by Claude, triggering tool use.

```Python
a,b = 604542,6458932
pr = f"What is {a}+{b}?"
pr
```

--------------------------------

### Explain Configuration File with Tools

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Uses the chat object to explain a configuration file (`_quarto.yml`) by leveraging its tools. It then finds `ToolUseBlock` within the response.

```python
r = chat('Please explain very concisely what my _quarto.yml does. It is in the current path. Use your tools')
find_block(r, ToolUseBlock)
```

--------------------------------

### Message Structure Example

Source: https://github.com/answerdotai/claudette/blob/main/README.txt

Provides an example of the internal message structure used by the Claudette library, detailing fields such as message ID, content (text and type), model used, role, stop reason, and usage statistics. This structure is part of the API response.
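Given a message of the documented shape, pulling out the reply text is just a walk over the `content` blocks. A simplified stand-in for a helper like claudette's `contents` (the dict below mimics the structure shown, with abbreviated values):

```python
def msg_text(msg):
    "Concatenate the text of all 'text' blocks in a message-shaped dict."
    return ''.join(b['text'] for b in msg['content'] if b.get('type') == 'text')

msg = {'role': 'assistant',
       'stop_reason': 'end_turn',
       'content': [{'type': 'text',
                    'text': 'Claudette is a high-level wrapper.'}]}
print(msg_text(msg))  # Claudette is a high-level wrapper.
```

The filter on `type` matters because `content` may also hold non-text blocks, such as `tool_use` requests.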
```APIDOC
Message:
  id: string (e.g., `msg_014rVQnYoZXZuyWUCMELG1QW`)
  content: list[dict]
    - type: string (e.g., 'text')
    - text: string (e.g., 'Claudette is a high-level wrapper for Anthropic\'s Python SDK...')
  model: string (e.g., `claude-3-5-sonnet-20241022`)
  role: string (e.g., 'assistant')
  stop_reason: string (e.g., 'end_turn')
  stop_sequence: string | None (e.g., None)
  type: string (e.g., 'message')
  usage: dict
    - input_tokens: integer (e.g., 4)
    - output_tokens: integer (e.g., 179)
    - cache_creation_input_tokens: integer (e.g., 7205)
    - cache_read_input_tokens: integer (e.g., 0)
```

--------------------------------

### get_user Function

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

A mock function to retrieve the current username, demonstrating how custom tools can be integrated into the chat session for multi-stage tool use.

```Python
def get_user()->str:
    """Get the username of the user running this session"""
    print("Looking up username")
    return 'Jeremy'
```

--------------------------------

### Chat Interaction Example

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Shows a basic conversation flow using the `Chat` class, including asking a question and then asking for the user's name.

```python
chat("I'm Jeremy")
chat("What's my name?")
```

--------------------------------

### Streaming Response Example

Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt

Illustrates how to receive and process a response from Claude in real-time using the 'stream=True' option.

```python
async for o in (await c('Hi', stream=True)): print(o, end='')
```

--------------------------------

### _quarto.yml Configuration Explanation

Source: https://github.com/answerdotai/claudette/blob/main/README.md

This YAML snippet represents the configuration settings found within a `_quarto.yml` file. It details project setup, HTML styling, website features, and metadata sources for a Quarto documentation website, likely for a Python package using nbdev.

```yaml
# _quarto.yml Configuration Summary:

# Project Configuration:
# - Sets up a website project type
# - Includes all .txt files as resources
# - Configures preview server on port 3000 without auto-opening browser

# HTML Formatting:
# - Uses the "cosmo" theme with custom CSS from `styles.css`
# - Enables table of contents, code tools, and syntax highlighting
# - Sets a custom layout with wider body (1800px) and narrower sidebar (180px)
# - Uses "arrow" highlight style with custom code block styling
# - Keeps markdown files during rendering

# Website Features:
# - Enables social media cards (Twitter and Open Graph)
# - Adds a search-enabled navbar with primary background
# - Uses a floating sidebar style
# - Links to GitHub issues via repo-actions

# Metadata Sources:
# - Pulls additional configuration from `nbdev.yml` and `sidebar.yml` files

# Overall Purpose:
# Setup for a documentation website, likely for a Python package using nbdev,
# focusing on code display, readability, and navigation.
```

--------------------------------

### Select Model for Example Usage

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Assigns the second model from the `models` list to the `model` variable. This model is then used in subsequent examples, often chosen for its representative features like being the latest Sonnet version.
```python model = models[1]; model ``` -------------------------------- ### Prepare Prompt and System Prompt Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb Sets up the user's prompt string and a system prompt that instructs Claude to always use tools for calculations. ```python a,b = 604542,6458932 pr = f"What is {a}+{b}?" sp = "Always use tools when calculations are required." ``` -------------------------------- ### Python Example: toolloop with tracing Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt Illustrates using the toolloop method with the `trace_func=print` argument to visualize the multi-stage tool processing. This helps in understanding the flow of tool calls and responses during execution. ```Python chat = Chat(model, tools=tools) r = chat.toolloop('Please cancel all orders for customer C1 for me.', trace_func=print) r ``` -------------------------------- ### Initiate Dialog with Claude Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb Starts a dialog with Claude by formatting the initial prompt and passing the tool schemas and tool choice. The response `r` contains Claude's decision, potentially including a `ToolUseBlock`. ```python msgs = mk_msgs(pr) r = c(msgs, sp=sp, tools=tools, tool_choice=choice) r ``` -------------------------------- ### Access Available Models Source: https://github.com/answerdotai/claudette/blob/main/index.ipynb Retrieves a list of models available through the Claudette library, which are sourced from the Anthropic SDK. The example selects the 'Sonnet 4' model. ```python models # Select a model, e.g., Sonnet 4 model = models[1] model ``` -------------------------------- ### Python Example: Tool Integration and Async Chat Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt Illustrates the usage of the Claudette library for asynchronous chat with tool integration. 
It shows how to prepare messages, define tools using 'get_schema', make tool choices, and process responses from the asynchronous client. ```python a,b = 604542,6458932 pr = f"What is {a}+{b}?" sp = "You are a summing expert." tools=[get_schema(sums)] choice = mk_tool_choice('sums') # Assuming 'c' is an initialized client instance # r = await c(pr, sp=sp, tools=tools, tool_choice=choice) # tr = mk_toolres(r, ns=globals()) # msgs += tr # contents(await c(msgs, sp=sp, tools=tools)) ``` -------------------------------- ### Message Structure Example Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx.txt Provides an example of the internal message structure used by the Claudette library, detailing fields such as message ID, content (text and type), model used, role, stop reason, and usage statistics. This structure is part of the API response. ```APIDOC Message: id: string (e.g., `msg_014rVQnYoZXZuyWUCMELG1QW`) content: list[dict] - type: string (e.g., 'text') - text: string (e.g., 'Claudette is a high-level wrapper for Anthropic\'s Python SDK...') model: string (e.g., `claude-3-5-sonnet-20241022`) role: string (e.g., 'assistant') stop_reason: string (e.g., 'end_turn') stop_sequence: string | None (e.g., None) type: string (e.g., 'message') usage: dict - input_tokens: integer (e.g., 4) - output_tokens: integer (e.g., 179) - cache_creation_input_tokens: integer (e.g., 7205) - cache_read_input_tokens: integer (e.g., 0) ``` -------------------------------- ### Message Structure Example Source: https://github.com/answerdotai/claudette/blob/main/llms-ctx-full.txt Provides an example of the internal message structure used by the Claudette library, detailing fields such as message ID, content (text and type), model used, role, stop reason, and usage statistics. This structure is part of the API response. 
```APIDOC
Message:
  id: string (e.g., `msg_014rVQnYoZXZuyWUCMELG1QW`)
  content: list[dict]
    - type: string (e.g., 'text')
    - text: string (e.g., 'Claudette is a high-level wrapper for Anthropic\'s Python SDK...')
  model: string (e.g., `claude-3-5-sonnet-20241022`)
  role: string (e.g., 'assistant')
  stop_reason: string (e.g., 'end_turn')
  stop_sequence: string | None (e.g., None)
  type: string (e.g., 'message')
  usage: dict
    - input_tokens: integer (e.g., 4)
    - output_tokens: integer (e.g., 179)
    - cache_creation_input_tokens: integer (e.g., 7205)
    - cache_read_input_tokens: integer (e.g., 0)
```

--------------------------------

### First Toolloop Execution: Create checksum function

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Executes the `toolloop` with a prompt to create a one-line checksum function using `reduce`. This tests the code interpreter's ability to define functions.

```Python
pr = '''Create a 1-line function `checksum` for a string `s`,
that multiplies together the ascii values of each character in `s` using `reduce`.'''
for o in chat.toolloop(pr, cont_func=_cont_decline): display(o)
```

--------------------------------

### Async Tool Loop Implementation

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Provides the asynchronous implementation of the `toolloop` method for the `AsyncChat` class. It mirrors the synchronous version but uses `async`/`await` for non-blocking operations, suitable for asynchronous applications.
```Python
#| export
from claudette.asink import AsyncChat

#| exports
@patch
@delegates(AsyncChat.__call__)
def toolloop(
    self: AsyncChat,
    pr, # Prompt to pass to Claude
    max_steps=10, # Maximum number of tool requests to loop through
    cont_func: callable = noop, # Function that stops loop if returns False
    final_prompt = _final_prompt, # Prompt to add if last message is a tool call
    **kwargs
):
    """Add prompt `pr` to dialog and get a response from Claude, automatically following up with `tool_use` messages"""
    @save_iter
    async def _f(o):
        init_n = len(self.h)
        r = await self(pr, **kwargs)
        yield r
        if len(self.last)>1: yield self.last[1]
        for i in range(max_steps-1):
            if self.c.stop_reason != 'tool_use': break
            r = await self(final_prompt if i==max_steps-2 else None, **kwargs)
            yield r
            if len(self.last)>1: yield self.last[1]
            if not cont_func(*self.h[-3:]): break
        o.value = self.h[init_n+1:]
    return _f()
```

--------------------------------

### Define Mock Customer and Order Data

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Sets up mock data structures for customers and orders, including an entity relationship between them. This data is used to simulate a real-world scenario for the AI agent.
```python
def _get_orders_customers():
    orders = {
        "O1": dict(id="O1", product="Widget A", quantity=2, price=19.99, status="Shipped"),
        "O2": dict(id="O2", product="Gadget B", quantity=1, price=49.99, status="Processing"),
        "O3": dict(id="O3", product="Gadget B", quantity=2, price=49.99, status="Shipped")}
    customers = {
        "C1": dict(name="John Doe", email="john@example.com", phone="123-456-7890",
                   orders=[orders['O1'], orders['O2']]),
        "C2": dict(name="Jane Smith", email="jane@example.com", phone="987-654-3210",
                   orders=[orders['O3']])
    }
    return orders, customers

orders, customers = _get_orders_customers()
```

--------------------------------

### Client Initialization

Source: https://github.com/answerdotai/claudette/blob/main/README.txt

Initializes the Claudette client with a specified model. This is the first step before making any structured data requests.

```python
cli = Client(model)
```

--------------------------------

### Web Search Usage Tracking

Source: https://github.com/answerdotai/claudette/blob/main/README.md

Illustrates how web search usage is tracked separately from normal token usage and provides an example of the usage statistics format.

```python
# Accessing usage statistics
# Assuming 'chat' is an instance of a chat object
print(chat.usage)
```

```APIDOC
Usage Statistics Format:
{'cache_creation_input_tokens': int, 'cache_read_input_tokens': int, 'input_tokens': int, 'output_tokens': int, 'server_tool_use': {'web_search_requests': int}, 'service_tier': str}

Example:
In: 7302; Out: 325; Cache create: 0; Cache read: 0; Total Tokens: 7627; Search: 1

Note: Web searches have separate pricing (e.g., $10 per 1,000 requests as of May 2025).
```

--------------------------------

### Chat with Prefill

Source: https://github.com/answerdotai/claudette/blob/main/index.ipynb

Utilizes the 'prefill' parameter to guide Claude's response by providing the initial words of the expected output. This helps steer the AI's generation.
```python
chat("Concisely, what is the meaning of life?", prefill='According to Douglas Adams,')
```

--------------------------------

### Initialize Chat and Execute Tool Loop in Python

Source: https://github.com/answerdotai/claudette/blob/main/03_text_editor.ipynb

This snippet demonstrates how to initialize a `Chat` object with specific tools and then execute a `toolloop` command. It shows the setup for interacting with a language model that can utilize external tools, such as a text editor configuration, to process user queries.

```Python
c = Chat(model, tools=[text_editor_conf['sonnet']], ns=mk_ns(str_replace_based_edit_tool))
c.toolloop('Please explain what my _quarto.yml does. Use your tools')
```

--------------------------------

### Google Vertex AI Client Initialization

Source: https://github.com/answerdotai/claudette/blob/main/index.ipynb

Demonstrates initializing a client for Anthropic models via Google Vertex AI. This requires creating an `AnthropicVertex` instance with project details and then passing it to the `Client` constructor.

```python
from anthropic import AnthropicVertex
import google.auth
from claudette.core import Client

# Assuming 'models_goog' is a list of available models
# models_goog = [...]
# project_id = google.auth.default()[1]
# gv = AnthropicVertex(project_id=project_id, region="us-east5")
# client = Client(models_goog[-1], gv)
```

--------------------------------

### CodeChat Class Definition

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Defines the CodeChat class, inheriting from Chat, to manage a persistent IPython session. It pre-imports common modules and appends a `run_cell` tool.
```Python
@delegates()
class CodeChat(Chat):
    imps = 'os, warnings, time, json, re, math, collections, itertools, functools, dateutil, datetime, string, types, copy, pprint, enum, numbers, decimal, fractions, random, operator, typing, dataclasses'

    def __init__(self, model: Optional[str] = None, ask:bool=True, **kwargs):
        super().__init__(model=model, **kwargs)
        self.ask = ask
        self.tools.append(self.run_cell)
        self.shell = get_shell()
        self.shell.run_cell('import '+self.imps)
```

--------------------------------

### System Prompt for Tool Use

Source: https://github.com/answerdotai/claudette/blob/main/README.txt

Sets a system prompt to guide Claude's behavior, specifically instructing it not to mention the tools it uses when providing answers. This is crucial for a more natural user experience.

```python
sp = "Never mention what tools you use."
```

--------------------------------

### Streaming Chat Response

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Example of using the `stream=True` option to receive the AI's response in chunks.

```python
chat = Chat(model, sp=sp)
for o in chat("I'm Jeremy", stream=True): print(o, end='')
```

--------------------------------

### Select Model for Use

Source: https://github.com/answerdotai/claudette/blob/main/README.txt

Demonstrates how to select a specific model from the `models` list for use in subsequent API calls. This example selects the second model in the list, 'claude-3-5-sonnet-20241022'.

```python
model = models[1]
```

--------------------------------

### Execute Async Call with Tool Choice

Source: https://github.com/answerdotai/claudette/blob/main/02_async.ipynb

Performs an asynchronous API call, providing the formatted messages, system prompt, the available tools, and a specific tool choice to guide the model's response.
```python
msgs = mk_msgs(pr)
r = await c(msgs, sp=sp, tools=tools, tool_choice=choice)
r
```

--------------------------------

### Continuation Function for Toolloop

Source: https://github.com/answerdotai/claudette/blob/main/01_toolloop.ipynb

Defines a continuation function `_cont_decline` that checks the response from a tool call. If the response content is '#DECLINED#', it stops the toolloop.

```Python
def _cont_decline(call, resp, asst):
    return resp['content'][0]['content'] != '#DECLINED#'
```

--------------------------------

### Example Chat Interaction with Web Search

Source: https://github.com/answerdotai/claudette/blob/main/00_core.ipynb

Demonstrates a typical interaction with the Anthropic API using a configured chat model that includes the web search tool. It shows how to send a query and receive a response.

```Python
chat = Chat(model, sp='Be concise in your responses.', tools=[search_conf()], cache=True)
pr = 'What is the weather in San Diego?'
r = chat(pr)
r
```
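
--------------------------------

Several snippets above pass a `sums` tool (e.g., `tools=[sums]`, `tools=[get_schema(sums)]`) without defining it. A minimal sketch of such a tool function, following the claudette convention of docments-style annotated parameters (the exact signature here is illustrative, not the library's canonical one):

```python
def sums(
    a: int,      # First thing to sum
    b: int = 1   # Second thing to sum
) -> int:        # The sum of the inputs
    "Adds a + b."
    print(f"Finding the sum of {a} and {b}")
    return a + b
```

With this in scope, `Chat(model, tools=[sums])` can answer the `pr = f"What is {a}+{b}?"` prompts above by emitting a `tool_use` request that the tool loop executes locally.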