### Hello Semantic Kernel Tutorial Source: https://github.com/dotnet/ai-samples/blob/main/README.md Introduces the basic usage of Semantic Kernel for .NET applications. This tutorial covers the initial setup and a simple 'Hello World' example to get started with AI integration. ```dotnet This snippet is a placeholder for the code found at the GitHub link: ./src/build-2024/01%20-%20Hello%20Semantic%20Kernel ``` -------------------------------- ### Run Azure OpenAI Examples Application Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Demonstrates how to run the Azure OpenAI Examples application from the command line using the .NET CLI. This is the primary method to start the sample application and select different examples. ```dotnetcli dotnet run ``` -------------------------------- ### Deploy Azure Resources with Azure Developer CLI Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/azure-openai/README.md Deploys necessary Azure resources, including Azure OpenAI service and models, using the Azure Developer CLI. This command automates the setup process for the quickstart applications. It requires prior setup of prerequisites like .NET SDK, Azure CLI, and Azure OpenAI access. ```bash azd up ``` -------------------------------- ### Install `aieval` dotnet tool Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Installs the `aieval` dotnet tool for your repository. Replace `` with the desired package version. This command also creates a `.config/dotnet-tools.json` manifest if it doesn't exist. 
```bash dotnet tool install Microsoft.Extensions.AI.Evaluation.Console --version --create-manifest-if-needed ``` -------------------------------- ### Run HikerAI Console Application Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/openai/semantic-kernel/02-HikerAI/README.md Command to execute the HikerAI .NET console application after setting up the API key. This starts the application, which will then prompt for or use configured inputs to get hiking recommendations from OpenAI. ```bash dotnet run ``` -------------------------------- ### Run Ollama Examples Application Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/ollama/OllamaExamples/README.md Command to execute the .NET application from the terminal. This command starts the sample application, allowing users to select and run different Ollama integration examples. ```dotnetcli dotnet run ``` -------------------------------- ### Example Application Output Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/07 Using Semantic Kernel in WebApp.md Illustrates the console output when the .NET web application starts successfully. It shows listening addresses and environment details. ```console Building... info: Microsoft.Hosting.Lifetime[14] Now listening on: http://localhost: info: Microsoft.Hosting.Lifetime[0] Application started. Press Ctrl+C to shut down. info: Microsoft.Hosting.Lifetime[0] Hosting environment: Development info: Microsoft.Hosting.Lifetime[0] Content root path: ``` -------------------------------- ### Run .NET Web API Project Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIWebAPI/README.md Command to navigate to the project directory and start the .NET Web API application. This command compiles and runs the application, making it available for testing. 
```dotnetcli dotnet run ``` -------------------------------- ### Run .NET Application Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Command to execute a .NET application from the terminal. This is typically used to start the sample application after cloning the repository. ```dotnetcli dotnet run ``` -------------------------------- ### Get aieval Tool Help Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Displays help information for the 'aieval' dotnet tool, including available commands and options for managing cached responses and evaluation results. ```dotnet dotnet aieval --help ``` -------------------------------- ### Run Ollama Web API Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/ollama/OllamaWebAPI/README.md Command to run the Ollama Web API application from the project directory using the .NET CLI. This starts the local web server. ```dotnetcli dotnet run ``` -------------------------------- ### Ollama Model Download Commands Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/ollama/OllamaExamples/README.md Commands to download necessary models (llama3.1 for chat, all-minilm for embeddings) using the Ollama CLI. These models are prerequisites for running the provided .NET examples. ```bash ollama pull llama3.1 // chat ollama pull all-minilm // embeddings ``` -------------------------------- ### Run OpenAI Examples Project using .NET CLI Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Demonstrates how to execute the OpenAI examples project from the command line using the .NET CLI. This is a common way to run .NET applications and test their functionality. 
```dotnetcli dotnet run ``` -------------------------------- ### Sample Chat Client Implementation Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Provides a sample implementation of the IChatClient interface. This serves as a basic example for interacting with chat models. ```csharp // Sample implementation of IChatClient interface. // See ./SampleChatClient.cs ``` -------------------------------- ### Run HikerAI Pro Application Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/openai/semantic-kernel/04-HikerAIPro/README.md Executes the .NET console application to start the HikerAI Pro sample. ```bash dotnet run ``` -------------------------------- ### Middleware Combination Example Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Demonstrates combining multiple middleware components, including prompt caching, OpenTelemetry, and tool calling. This shows flexible middleware orchestration. ```csharp // Use prompt caching, OpenTelemetry and tool calling middleware // See ./Middleware.cs ``` -------------------------------- ### Middleware Combination Example in C# Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Illustrates combining multiple middleware components, such as prompt caching, OpenTelemetry, and tool calling, for enhanced AI functionality. This showcases a flexible middleware pipeline. ```csharp // Example demonstrating the combination of multiple middleware components (caching, OpenTelemetry, tool calling). // Highlights the flexibility of the middleware pipeline for advanced AI features. // See Middleware.cs for full implementation details. 
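// Illustrative sketch (not the repo's Middleware.cs). The builder methods below come from
// Microsoft.Extensions.AI and exact names can differ between package versions. Assumes an
// existing IChatClient `openAIClient` and an IDistributedCache `cache`:
IChatClient client = new ChatClientBuilder(openAIClient)
    .UseDistributedCache(cache)   // prompt caching
    .UseOpenTelemetry()           // tracing and metrics
    .UseFunctionInvocation()      // tool calling
    .Build();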
``` -------------------------------- ### Chat Example Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Demonstrates how to use the IChatClient interface to send and receive chat messages. This is a fundamental example for chat interactions. ```csharp // Use IChatClient to send and receive chat messages // See ./Chat.cs ``` -------------------------------- ### Dependency Injection Example Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Illustrates registering an IChatClient and middleware using Dependency Injection. This follows standard .NET practices for service management. ```csharp // Register an IChatClient and middleware using Dependency Injection // See ./DependencyInjection.cs ``` -------------------------------- ### Chat Example using IChatClient in C# Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Illustrates the basic usage of the `IChatClient` interface to send and receive chat messages. This is a fundamental example for interacting with chat-based AI models. ```csharp // Example usage of IChatClient for sending and receiving chat messages // Requires Microsoft.Extensions.AI.OpenAI NuGet package // See Chat.cs for full implementation details. ``` -------------------------------- ### Running Unit Tests Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Command to build and execute all unit tests for the .NET AI samples project. 
```bash dotnet test src/microsoft-extensions-ai-evaluation/api ``` -------------------------------- ### Run .NET Console Application Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/azure-openai/semantic-kernel/03-ChattingAboutMyHikes/README.md Executes the .NET console application. This command requires the .NET SDK to be installed and the application to be built. It starts the application, which then interacts with Azure OpenAI to process hike data and display results in the console. ```bash dotnet run ``` -------------------------------- ### Run Azure AI Inference Web API (dotnetcli) Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-ai-inference/AzureAIWebAPI/README.md Command to run the Azure AI Inference Web API application from the terminal. This requires the .NET SDK to be installed and configured. ```dotnetcli dotnet run ``` -------------------------------- ### Example Console Interaction Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/01 Hello Semantic Kernel.md Demonstrates a sample interaction with the console application. It shows a user providing a name and then asking a follow-up question, illustrating the stateless nature of the chat. ```console Q: Hi my name is Alice Hello Alice, pleased to meet you! How can I assist you today? Q: What is my name? I'm sorry, I cannot provide your name as I do not have that information. If you would like me to refer to you by a specific name during our conversation, please let me know. Q: ``` -------------------------------- ### Streaming Chat Example Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Shows how to use IChatClient for streaming chat messages. This enables real-time, incremental responses from the chat model. 
```csharp // Use IChatClient to send and receive a stream of chat messages // See ./Streaming.cs ``` -------------------------------- ### Logging Chat Client Usage Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Demonstrates how to use the Logging Chat Client implementation. This example highlights the logging capabilities added to chat operations. ```csharp // Use LoggingChatClient implementation // See ./Logging.cs ``` -------------------------------- ### Prompt Caching Middleware Example in C# Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Illustrates how to integrate prompt caching middleware with the AI client. Caching can significantly improve performance and reduce costs by reusing previous responses for identical prompts. ```csharp // Example of integrating prompt caching middleware with AI clients. // Improves performance and reduces costs by storing and retrieving previous responses. // See Caching.cs for full implementation details. ``` -------------------------------- ### OpenTelemetry Middleware Example Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Illustrates the integration of OpenTelemetry middleware. This allows for distributed tracing and telemetry collection for AI operations. ```csharp // Use OpenTelemetry middleware // See ./OpenTelemetry.cs ``` -------------------------------- ### Prompt Caching Middleware Example Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Demonstrates the usage of prompt caching middleware. This middleware can improve performance by caching responses to identical prompts.
```csharp // Use prompt caching middleware // See ./Caching.cs ``` -------------------------------- ### Run the Application Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-ai-inference/AzureAIInferenceExamples/README.md Command to execute the .NET application from the project directory. This command compiles and runs the main entry point of the application. ```dotnetcli dotnet run ``` -------------------------------- ### Text Embedding Example Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Demonstrates the usage of the text embedding generator. This example focuses on generating vector representations of text. ```csharp // Use text embedding generator // See ./TextEmbedding.cs ``` -------------------------------- ### Chat with Conversation History Example Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Illustrates using IChatClient with conversation history management. This example shows how to maintain context in chat interactions. ```csharp // Use IChatClient alongside conversation history to send and receive chat messages // See ./ConversationHistory.cs ``` -------------------------------- ### Run .NET AI Evaluation Unit Tests Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Builds and executes all unit tests for the AI evaluation examples. This command can be run from the command line within the specified project directory to perform evaluations and generate reports. 
```dotnet dotnet test ``` -------------------------------- ### Tool Calling Middleware Example in C# Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Demonstrates how to implement tool calling capabilities using middleware. This allows AI models to interact with external tools or functions to perform specific actions. ```csharp // Example of implementing tool calling middleware for AI models. // Enables AI to interact with external tools or functions to execute tasks. // See ToolCalling.cs for full implementation details. ``` -------------------------------- ### Using Semantic Kernel in a Web App Tutorial Source: https://github.com/dotnet/ai-samples/blob/main/README.md Guides on integrating Semantic Kernel into a .NET web application. This tutorial focuses on practical implementation for building AI-enhanced web experiences. ```dotnet This snippet is a placeholder for the code found at the GitHub link: ./src/build-2024/07%20-%20Using%20Semantic%20Kernel%20in%20WebApp ``` -------------------------------- ### Install Semantic Kernel NuGet Package Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/01 Hello Semantic Kernel.md Adds the Microsoft.SemanticKernel NuGet package to the project using the dotnet CLI. This makes the Semantic Kernel library's functionalities available for use in the application. ```shell dotnet add package Microsoft.SemanticKernel ``` -------------------------------- ### Streaming Chat Messages Example in C# Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Demonstrates how to receive chat messages as a stream of data from the `IChatClient`. This is useful for providing real-time feedback to users as responses are generated. ```csharp // Example usage of IChatClient for receiving chat messages in a streaming fashion. 
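// Illustrative sketch (not the repo's Streaming.cs). Assumes an existing IChatClient
// `client`; GetStreamingResponseAsync follows recent Microsoft.Extensions.AI releases:
await foreach (var update in client.GetStreamingResponseAsync("Tell me a short story."))
{
    Console.Write(update);   // print each partial update as it arrives
}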
// Provides real-time response generation for a better user experience. // See Streaming.cs for full implementation details. ``` -------------------------------- ### OpenTelemetry Middleware Example in C# Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Shows how to use OpenTelemetry middleware for monitoring and tracing AI interactions. This allows for better observability into application performance and request flows. ```csharp // Example of integrating OpenTelemetry middleware for monitoring AI interactions. // Provides observability into application performance and request tracing. // See OpenTelemetry.cs for full implementation details. ``` -------------------------------- ### Tool Calling Middleware Example Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Shows how to use tool calling middleware. This enables AI models to invoke external tools or functions based on user requests. ```csharp // Use tool calling middleware // See ./ToolCalling.cs ``` -------------------------------- ### Logging Embedding Generator Usage Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Demonstrates how to use the Logging Embedding Generator implementation. This example highlights the logging capabilities added to embedding generation. ```csharp // Use LoggingEmbeddingGenerator implementation // See ./Logging.cs ``` -------------------------------- ### Chat with Azure OpenAI Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Demonstrates how to use the IChatClient interface to send and receive chat messages with Azure OpenAI. This is a fundamental example for conversational AI interactions. 
```csharp // Example usage of IChatClient for chat functionality // Requires setup with Azure OpenAI endpoint and model deployment. // See Chat.cs for full implementation details. ``` -------------------------------- ### Text Embedding with Caching Example Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Shows how to use the text embedding generator with caching middleware. This combines embedding generation with performance optimization. ```csharp // Use text embedding generator with caching middleware // See ./TextEmbeddingCaching.cs ``` -------------------------------- ### Install Logging NuGet Packages Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/04 Add Logging.md Installs the necessary Microsoft.Extensions.Logging and Microsoft.Extensions.Logging.Console NuGet packages using the .NET CLI. These packages provide the core logging framework and console output capabilities. ```shell dotnet add package Microsoft.Extensions.Logging dotnet add package Microsoft.Extensions.Logging.Console ``` -------------------------------- ### Dependency Injection for IChatClient in C# Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Shows how to register and use the `IChatClient` and associated middleware through dependency injection. This follows standard .NET practices for managing service lifecycles and configurations. ```csharp // Example of registering IChatClient and middleware using Dependency Injection. // Adheres to .NET conventions for service management and configuration. // See DependencyInjection.cs for full implementation details. 
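// Illustrative sketch (not the repo's DependencyInjection.cs). Assumes an existing
// IChatClient `openAIClient`; AddChatClient and the Use* extensions come from
// Microsoft.Extensions.AI and may differ between package versions:
var builder = Host.CreateApplicationBuilder();
builder.Services.AddChatClient(openAIClient)
    .UseFunctionInvocation()
    .UseLogging();
using var host = builder.Build();
var chatClient = host.Services.GetRequiredService<IChatClient>();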
``` -------------------------------- ### Configure Azure OpenAI Environment Variables Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Sets up essential environment variables required to connect to Azure OpenAI services for running AI evaluation examples. These variables specify the endpoint, model deployment name, and a path for storing evaluation data. ```shell SET EVAL_SAMPLE_AZURE_OPENAI_ENDPOINT=https://.openai.azure.com/ SET EVAL_SAMPLE_AZURE_OPENAI_MODEL= SET EVAL_SAMPLE_STORAGE_ROOT_PATH= ``` -------------------------------- ### Text Embedding with Caching Example in C# Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Illustrates combining text embedding generation with caching middleware. This optimizes performance for repeated embedding requests, reducing computation and cost. ```csharp // Example combining text embedding generation with caching middleware. // Optimizes performance for repeated embedding requests by leveraging cached results. // See TextEmbeddingCaching.cs for full implementation details. ``` -------------------------------- ### Dependency Injection for IChatClient Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Illustrates how to register an IChatClient and associated middleware using .NET's built-in Dependency Injection system. This promotes clean architecture and testability. ```csharp // Example of registering IChatClient and middleware via Dependency Injection. // Facilitates easy integration into ASP.NET Core or other DI-enabled applications. // See DependencyInjection.cs for full implementation details. 
``` -------------------------------- ### Storage Root Path Configuration Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Environment variable to specify the root directory for caching LLM responses, evaluation data, and generated reports for the Reporting API Examples. ```bash EVAL_SAMPLE_STORAGE_ROOT_PATH= ``` -------------------------------- ### Sample Embedding Generator Implementation Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Provides a sample implementation of the IEmbeddingGenerator interface. This demonstrates how to generate text embeddings. ```csharp // Sample implementation of IEmbeddingGenerator interface. // See ./SampleEmbeddingGenerator.cs ``` -------------------------------- ### Install Web Search Plugin Package Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/05 Add Plugin (Bing Search).md Installs the Microsoft.SemanticKernel.Plugins.Web NuGet package, which is required for web search functionality. The --prerelease flag is used to include pre-release versions. ```shell dotnet add package Microsoft.SemanticKernel.Plugins.Web --prerelease ``` -------------------------------- ### Run Web API Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIWebAPI/README.md Command to execute the Azure OpenAI Web API application from the project directory using the .NET CLI. ```dotnetcli dotnet run ``` -------------------------------- ### Example Console Output Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/05 Add Plugin (Bing Search).md This is an example of the output received after running the application with the Bing Search plugin enabled. 
It demonstrates how the AI model, using web search results, responds to a user prompt about Microsoft Build 2024 announcements. ```console 1. **Windows AI Features**: Microsoft announced new AI features for Windows, enhancing its capabilities for AI-enabled applications and integrating more deeply with cloud services. 2. **Microsoft Copilot Expansion**: Updates to Microsoft's AI chatbot Copilot, including new capabilities and the introduction of Copilot+ PCs. 3. **Developer Tools**: Novel tools for developers, making it easier to innovate with cost-efficient and user-friendly cloud solutions. 4. **.NET 9 Preview 4**: Announcement of new features in .NET 9 from the .NET sessions. 5. **Microsoft Teams Tools**: New tools and updates for Microsoft Teams to improve collaboration and productivity. 6. **Surface Products**: Announcements of the new Surface Pro 11 and Surface Laptop 7. 7. **Real-Time Intelligence**: Enhancements in AI applications for businesses, allowing for in-the-moment decision making and efficient data organization at ingestion. 8. **GPT-4o**: General availability for using GPT-4o with Azure credits, aimed at aiding developers to build and deploy applications more efficiently. 9. **Rewind Feature**: Introduction of Rewind, a new feature to improve the search functionality on Windows PCs, making it as efficient as web searches. These announcements span various areas from AI advancements and new hardware to enhanced developer tools and software capabilities. ``` -------------------------------- ### C# Semantic Kernel Prompt Execution Source: https://github.com/dotnet/ai-samples/blob/main/src/local-models/phi3-llama3/README.md Demonstrates executing different types of prompts using Semantic Kernel with a local LLM. Includes examples for question-answering and code completion prompts. ```csharp var prompt = @"Instruction: A skier slides down a frictionless slope of height 40m and length 80m, what's the skier's speed at the bottom? 
Output:"; var response = await kernel.InvokePromptAsync(prompt); Console.WriteLine(response.GetValue<string>()); ``` ```csharp var codePrompt = @"Complete the following code ```python def print_prime(n): # print all prime numbers less than n"; var response = await kernel.InvokePromptAsync(codePrompt); Console.WriteLine(response.GetValue<string>()); ``` -------------------------------- ### Download Ollama Models Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/ollama/OllamaWebAPI/README.md Commands to download necessary models (llama3.1 for chat, all-minilm for embeddings) using the Ollama CLI. These models are required for the Web API to function correctly. ```bash ollama pull llama3.1 // chat ollama pull all-minilm // embeddings ``` -------------------------------- ### Combined Middleware Usage Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Shows how to combine multiple middleware components, such as prompt caching, OpenTelemetry, and tool calling, with the OpenAI client. This allows for a comprehensive and configurable AI interaction pipeline. ```csharp // Example usage combining prompt caching, OpenTelemetry, and tool calling middleware. // Demonstrates flexible middleware orchestration. // See Middleware.cs for full implementation details. ``` -------------------------------- ### Install Semantic Kernel and Swashbuckle Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/07 Using Semantic Kernel in WebApp.md Installs the necessary NuGet packages for Semantic Kernel and Swashbuckle.AspNetCore. Semantic Kernel provides AI capabilities, while Swashbuckle is used for API documentation generation.
```shell dotnet add package Microsoft.SemanticKernel dotnet add package Swashbuckle.AspNetCore ``` -------------------------------- ### Example Weather Forecast JSON Output (JSON) Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/07 Using Semantic Kernel in WebApp.md This JSON object represents an example response from the `/WeatherForecast` endpoint, showing the date, temperature in Celsius, and an AI-generated summary of the weather conditions for that temperature. ```json { "date": "2024-06-17", "temperatureC": 4, "summary": "At 4 degrees Celsius, the weather is cool and slightly chilly. It is above freezing, so there is no ice, but it's cold enough that you might want a jacket or sweater when outdoors. This temperature is typical for late autumn or early spring in temperate regions." } ``` -------------------------------- ### Environment Variables for LLM Connection Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Configuration parameters required for connecting to various LLM providers. These environment variables should be set based on your selected LLM service. ```bash EVAL_SAMPLE_OPENAI_API_KEY= EVAL_SAMPLE_OPENAI_MODEL= ``` ```bash # Example for Azure OpenAI AZURE_OPENAI_API_KEY= AZURE_OPENAI_ENDPOINT= AZURE_OPENAI_DEPLOYMENT_NAME= ``` ```bash # Example for Ollama OLLAMA_BASE_URL= OLLAMA_MODEL= ``` -------------------------------- ### Text Embedding Generation Example in C# Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Demonstrates the use of text embedding generation capabilities. This involves converting text into numerical vector representations, useful for similarity searches and semantic analysis. ```csharp // Example usage of text embedding generator. // Converts text into numerical vector representations for similarity searches. // See TextEmbedding.cs for full implementation details.
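// Illustrative sketch (not the repo's TextEmbedding.cs). Assumes an existing
// IEmbeddingGenerator<string, Embedding<float>> `generator` from Microsoft.Extensions.AI:
var embeddings = await generator.GenerateAsync(["What is AI?", "What is .NET?"]);
foreach (var embedding in embeddings)
{
    Console.WriteLine($"Vector length: {embedding.Vector.Length}");
}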
``` -------------------------------- ### Install Http NuGet Package Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/06 Modifying Kernel Behavior with Dependency Injection.md Installs the Microsoft.Extensions.Http NuGet package, offering extensions for configuring and managing HttpClient instances within .NET applications. ```shell dotnet add package Microsoft.Extensions.Http ``` -------------------------------- ### Logging Chat Client Implementation Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Extends IChatClient with logging functionality using DelegatingChatClient. This implementation adds logging to chat operations. ```csharp // Sample implementation of DelegatingChatClient that extends IChatClient with logging functionality. // See ./LoggingChatClient.cs ``` -------------------------------- ### Run HikerAI Pro Console Application Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/azure-openai/semantic-kernel/04-HikerAIPro/README.md Executes the HikerAI Pro .NET console application. This command starts the application, which then interacts with Azure OpenAI for hiking recommendations based on weather conditions. ```Bash dotnet run ``` -------------------------------- ### Chat with Conversation History in C# Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIExamples/README.md Shows how to maintain and utilize conversation history when interacting with the `IChatClient`. This enables more context-aware and coherent multi-turn conversations. ```csharp // Example usage of IChatClient with conversation history management // Enables multi-turn chat interactions by preserving context. // See ConversationHistory.cs for full implementation details. 
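// Illustrative sketch (not the repo's ConversationHistory.cs). Assumes an existing
// IChatClient `client`; GetResponseAsync and AddMessages follow recent
// Microsoft.Extensions.AI releases. The caller owns the list and resends it each turn:
List<ChatMessage> history = [new(ChatRole.System, "You are a helpful assistant.")];
history.Add(new(ChatRole.User, "Hi, my name is Alice."));
var response = await client.GetResponseAsync(history);
history.AddMessages(response);   // keep the reply so the next turn has context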
``` -------------------------------- ### Install Http Resilience NuGet Package Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/06 Modifying Kernel Behavior with Dependency Injection.md Installs the Microsoft.Extensions.Http.Resilience NuGet package, enabling the application of resilience patterns such as retries and circuit breakers to HttpClient requests. ```shell dotnet add package Microsoft.Extensions.Http.Resilience ``` -------------------------------- ### Prompt Caching Middleware Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Demonstrates the integration of prompt caching middleware with the OpenAI client. This middleware can significantly improve performance and reduce costs by caching responses to identical prompts. ```csharp // Example usage of prompt caching middleware with IChatClient. // Optimizes repeated queries by returning cached results. // See Caching.cs for full implementation details. ``` -------------------------------- ### Restore aieval Tool Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Restores the 'aieval' dotnet tool. This command should be run from the directory containing the tool's manifest file to make it available on the command line. ```dotnet dotnet tool restore ``` -------------------------------- ### Run HikerAI Console Application Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/azure-openai/semantic-kernel/02-HikerAI/README.md Executes the .NET console application to get hiking recommendations from Azure OpenAI. Ensure Azure resources are deployed and accessible. 
```bash dotnet run ``` -------------------------------- ### Install Compliance Redaction NuGet Package Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/06 Modifying Kernel Behavior with Dependency Injection.md Installs the Microsoft.Extensions.Compliance.Redaction NuGet package, which provides functionality for redacting sensitive information from logs and HTTP headers. ```shell dotnet add package Microsoft.Extensions.Compliance.Redaction ``` -------------------------------- ### Restore `aieval` dotnet tool Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Restores the `aieval` dotnet tool and other local tools defined in the `.config/dotnet-tools.json` file. This command is used by other users to set up the tool in their environment after the manifest is committed. ```bash dotnet tool restore ``` -------------------------------- ### Logging Embedding Generator Implementation Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/abstraction-implementations/AbstractionImplementationExamples/README.md Extends IEmbeddingGenerator with logging functionality using DelegatingEmbeddingGenerator. This implementation adds logging to embedding generation. ```csharp // Sample implementation of DelegatingEmbeddingGenerator that extends IEmbeddingGenerator with logging functionality. // See ./LoggingEmbeddingGenerator.cs ``` -------------------------------- ### Run .NET Console Application Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/azure-openai/extensions-ai/01-HikeBenefitsSummary/README.md Executes the .NET console application to summarize hiking benefits. Ensure you are in the `01-HikeBenefitsSummary` directory before running. This command initiates the local application which communicates with Azure OpenAI.
```bash dotnet run ``` -------------------------------- ### Sample Console Output Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/03 Add Plugin (Function Call).md Illustrates the expected output when running the application and interacting with the AI. The example shows a user asking for their age, and the AI responding with the correct information, demonstrating successful plugin invocation. ```console Q: My name is Alice. How old am I? Alice, you are 25 years old. Q: ``` -------------------------------- ### OpenTelemetry Middleware Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Illustrates how to integrate OpenTelemetry middleware for monitoring and tracing AI interactions. This allows for detailed insights into request latency, success rates, and other performance metrics. ```csharp // Example usage of OpenTelemetry middleware with OpenAI client. // Enables observability for AI service calls. // See OpenTelemetry.cs for full implementation details. ``` -------------------------------- ### Run the Web Application Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/07 Using Semantic Kernel in WebApp.md Executes the .NET application from the terminal. This command starts the web server, making the application accessible via HTTP. ```console dotnet run ``` -------------------------------- ### Initialize Kernel with OpenAI Chat Completion Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/01 Hello Semantic Kernel.md Initializes the Semantic Kernel and configures it to use the OpenAI chat completion service. It requires the model name and the OpenAI API key, which is retrieved from environment variables. 
```csharp var kernel = Kernel.CreateBuilder() .AddOpenAIChatCompletion(openAIChatCompletionModelName, Environment.GetEnvironmentVariable("OPENAI_API_KEY")) .Build(); ``` -------------------------------- ### Example Console Output with Trace Logging - Console Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/04 Add Logging.md This console output demonstrates the verbose logging generated when the application runs with trace-level logging enabled, showing details like chat history, model settings, and token usage from the Semantic Kernel. ```console Q: Hello trce: Microsoft.SemanticKernel.Connectors.OpenAI.OpenAIChatCompletionService[0] ChatHistory: [{"Role":{"Label":"user"},"Items":[{"$type":"TextContent","Text":"Hello"}]}], Settings: {"temperature":1,"top_p":1,"presence_penalty":0,"frequency_penalty":0,"max_tokens":null,"stop_sequences":null,"results_per_prompt":1,"seed":null,"response_format":null,"chat_system_prompt":null,"token_selection_biases":null,"ToolCallBehavior":null,"User":null,"logprobs":null,"top_logprobs":null,"model_id":null} info: Microsoft.SemanticKernel.Connectors.OpenAI.OpenAIChatCompletionService[0] Prompt tokens: 8. Completion tokens: 9. Total tokens: 17. Hello! How can I help you today? Q: ``` -------------------------------- ### Run the .NET Application Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/openai/extensions-ai/01-HikeBenefitsSummary/README.md Command to execute the .NET console application after configuration. This will read the benefits.md file and send it to OpenAI for summarization. ```bash dotnet run ``` -------------------------------- ### Chat with Conversation History Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Illustrates how to maintain and utilize conversation history when interacting with Azure OpenAI using the IChatClient. This enables more context-aware and coherent dialogues. 
```csharp // Example usage of IChatClient with conversation history management. // Essential for multi-turn conversations. // See ConversationHistory.cs for full implementation details. ``` -------------------------------- ### Text Embedding with Caching Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Shows how to combine text embedding generation with caching middleware. This optimizes performance for repeated embedding requests, reducing latency and cost. ```csharp // Example usage of text embedding generator with caching middleware. // Caches embedding results for identical text inputs. // See TextEmbeddingCaching.cs for full implementation details. ``` -------------------------------- ### Run .NET Console Application Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/azure-openai/semantic-kernel/01-HikeBenefitsSummary/README.md Executes the .NET console application to summarize hiking benefits. Ensure you are in the `01-HikeBenefitsSummary` directory before running. This command initiates the local application which communicates with Azure OpenAI. ```bash dotnet run ``` -------------------------------- ### Example JSON Response from /WeatherForecast Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/07 Using Semantic Kernel in WebApp.md Demonstrates the JSON output received when accessing the /WeatherForecast endpoint in a web browser. It includes the date, temperature, and an AI-generated weather summary. ```json { "date": "2024-06-17", "tempratureC": 4, "summary": "At 4 degrees Celsius, the weather is cool and slightly chilly. It is above freezing, so there is no ice, but it's cold enough that you might want a jacket or sweater when outdoors. This temperature is typical for late autumn or early spring in temperate regions." 
} ``` -------------------------------- ### Streaming Chat Messages Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Shows how to receive chat messages from Azure OpenAI as a stream of data using the IChatClient. This provides a more interactive user experience by displaying responses as they are generated. ```csharp // Example usage of IChatClient for streaming chat responses. // Improves perceived performance and user engagement. // See Streaming.cs for full implementation details. ``` -------------------------------- ### Tool Calling Middleware Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Demonstrates the use of tool calling middleware, enabling the AI model to invoke external tools or functions. This extends the AI's capabilities beyond text generation. ```csharp // Example usage of tool calling middleware with OpenAI client. // Allows AI to interact with external systems or APIs. // See ToolCalling.cs for full implementation details. ``` -------------------------------- ### Generate Evaluation Report with aieval Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Generates an evaluation report using the 'aieval' dotnet tool. It requires the path to the storage root directory and an output path for the report file. The '--open' flag automatically opens the generated report in the default browser. ```bash dotnet aieval report -p <storage-root-path> -o <output-path>\report.html --open ``` -------------------------------- ### Run Console Application Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/openai/semantic-kernel/01-HikeBenefitsSummary/README.md Executes the .NET console application from the terminal. The application will then process the 'benefits.md' file and send it to OpenAI for summarization.
```bash dotnet run ``` -------------------------------- ### Text Embedding Generation Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIExamples/README.md Demonstrates how to use the text embedding generator functionality provided by the OpenAI client. This is useful for tasks like semantic search, clustering, and classification. ```csharp // Example usage of the text embedding generator. // Converts text into numerical vector representations. // See TextEmbedding.cs for full implementation details. ``` -------------------------------- ### Test Chat Endpoint Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/ollama/OllamaWebAPI/README.md PowerShell script to send a POST request to the '/chat' endpoint of the running Web API. It sends a prompt and retrieves the AI's response. ```powershell $response = Invoke-RestMethod -Uri 'http://localhost:5078/chat' -Method Post -Headers @{'Content-Type'='application/json'} -Body '"What is AI?"'; $response.message.contents.text ``` -------------------------------- ### Configure Logging Services Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/04 Add Logging.md Configures the application's service collection to include logging services. This example sets up the console logger and specifies the minimum log level to 'Trace', capturing all diagnostic information. ```csharp // Add logging services to the builder builder.Services.AddLogging(b => b.AddConsole().SetMinimumLevel(LogLevel.Trace)); ``` -------------------------------- ### Configure Azure AI Inference Settings (JSON) Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-ai-inference/AzureAIWebAPI/README.md Configuration file for setting up Azure AI Inference parameters, including API key, endpoint, and model ID. This file is typically named appsettings.local.json and should be placed in the project directory. 
```json { "Logging": { "LogLevel": { "Default": "Information", "Microsoft.AspNetCore": "Warning" } }, "AllowedHosts": "*", "AI": { "AzureAIInference": { "Key": "YOUR-GH-PAT-TOKEN", "Chat": { "Endpoint": "https://models.inference.ai.azure.com", "ModelId": "gpt-4o-mini" } } } } ``` -------------------------------- ### appsettings.local.json Configuration Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIWebAPI/README.md Configuration file for the Azure OpenAI Web API, specifying logging levels, allowed hosts, and Azure OpenAI service details including endpoint and model IDs for chat and embedding. ```json { "Logging": { "LogLevel": { "Default": "Information", "Microsoft.AspNetCore": "Warning" } }, "AllowedHosts": "*", "AI": { "AzureOpenAI": { "Endpoint": "YOUR-AZURE-OPENAI-ENDPOINT", "Chat": { "ModelId": "gpt-4o-mini" }, "Embedding": { "ModelId": "text-embedding-3-small" } } } } ``` -------------------------------- ### Configure Azure AI Foundry Evaluation Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Set environment variables to configure .NET AI evaluation samples to use the Azure AI Foundry Evaluation service. These variables link the samples to your Azure subscription, resource group, and AI project. ```cmd set EVAL_SAMPLE_AZURE_SUBSCRIPTION_ID=<your-subscription-id> set EVAL_SAMPLE_AZURE_RESOURCE_GROUP=<your-resource-group> set EVAL_SAMPLE_AZURE_AI_PROJECT=<your-ai-project-name> ``` -------------------------------- ### HTTP Response Headers Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/06 Modifying Kernel Behavior with Dependency Injection.md Example of HTTP response headers received from a server, including rate limiting information, caching status, and content type.
```http Response Headers: Date: Mon, 17 Jun 2024 19:30:53 GMT Transfer-Encoding: chunked Connection: keep-alive openai-organization: jakerad openai-processing-ms: 9081 openai-version: 2020-10-01 Strict-Transport-Security: max-age=15724800; includeSubDomains x-ratelimit-limit-requests: 500 x-ratelimit-limit-tokens: 30000 x-ratelimit-remaining-requests: 499 x-ratelimit-remaining-tokens: 29286 x-ratelimit-reset-requests: 120ms x-ratelimit-reset-tokens: 1.428s X-Request-ID: req_d2364bb08052043963f866fce4212356 CF-Cache-Status: DYNAMIC Server: cloudflare CF-RAY: 895574b51ce0766a-SEA Alt-Svc: h3=":443" Content-Type: application/json ``` -------------------------------- ### Ollama Web API Configuration Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/ollama/OllamaWebAPI/README.md Local application settings file for the Ollama Web API. It specifies the Ollama server endpoint and the model IDs to be used for chat and embedding operations. ```json { "Logging": { "LogLevel": { "Default": "Information", "Microsoft.AspNetCore": "Warning" } }, "AllowedHosts": "*", "AI": { "Ollama": { "Chat": { "Endpoint": "http://localhost:11434/", "ModelId": "llama3.1" }, "Embedding": { "Endpoint": "http://localhost:11434/", "ModelId": "all-minilm" } } } } ``` -------------------------------- ### Test Embedding Endpoint Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/ollama/OllamaWebAPI/README.md PowerShell script to send a POST request to the '/embedding' endpoint of the running Web API. It sends text and retrieves the corresponding vector embedding. 
```powershell $response = Invoke-RestMethod -Uri 'http://localhost:5078/embedding' -Method Post -Headers @{'Content-Type'='application/json'} -Body '"What is AI?"'; $response.vector ``` -------------------------------- ### Configure OpenAI API Settings in appsettings.local.json Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIWebAPI/README.md This JSON file configures the OpenAI API key and model IDs for the Web API. It specifies the default chat and embedding models to be used by the application. ```json { "Logging": { "LogLevel": { "Default": "Information", "Microsoft.AspNetCore": "Warning" } }, "AllowedHosts": "*", "AI": { "OpenAI": { "Key": "YOUR-API-KEY", "Chat": { "ModelId": "gpt-4o-mini" }, "Embedding": { "ModelId": "text-embedding-3-small" } } } } ``` -------------------------------- ### Configure OpenAI API Key Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/openai/extensions-ai/01-HikeBenefitsSummary/README.md Commands to initialize user secrets and set your OpenAI API key for the .NET application. This is required before running the sample. ```bash dotnet user-secrets init dotnet user-secrets set OpenAIKey <your-OpenAI-key> ``` -------------------------------- ### Configure Azure Storage Reporting Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai-evaluation/api/INSTRUCTIONS.md Set environment variables to configure .NET AI evaluation samples to use Azure storage providers for storing evaluation results and cached LLM responses. These variables specify the storage account endpoint and container name. ```cmd set EVAL_SAMPLE_AZURE_STORAGE_ACCOUNT_ENDPOINT=<your-storage-account-endpoint> set EVAL_SAMPLE_AZURE_STORAGE_CONTAINER=<your-container-name> ``` -------------------------------- ### Configure OpenAI Key Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/openai/semantic-kernel/01-HikeBenefitsSummary/README.md Initializes user secrets for the .NET project and sets the OpenAI API key.
This is a prerequisite for running the application with your OpenAI credentials. ```bash dotnet user-secrets init dotnet user-secrets set OpenAIKey <your-OpenAI-key> ``` -------------------------------- ### Get Chat Message Content with Settings Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/03 Add Plugin (Function Call).md Retrieves a chat response from the service using the configured `settings` and the `kernel`. This step integrates the plugin functionality into the chat interaction flow. ```csharp var response = await chatService.GetChatMessageContentAsync(chatHistory, settings, kernel); // Get chat response based on chat history ``` -------------------------------- ### Test OpenAI Chat API with PowerShell Source: https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/openai/OpenAIWebAPI/README.md This PowerShell script sends a POST request to the '/chat' endpoint of the local Web API to test the chat functionality. It sends a JSON payload containing a prompt and displays the text content of the response. ```powershell $response = Invoke-RestMethod -Uri 'http://localhost:5208/chat' -Method Post -Headers @{'Content-Type'='application/json'} -Body '"What is AI?"'; $response.message.contents.text ``` -------------------------------- ### OpenAI Chat Completion Function Call Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/06 Modifying Kernel Behavior with Dependency Injection.md Example of a function call to the OpenAI Chat Completion service, including the query for a web search tool. This demonstrates how the Semantic Kernel library invokes external functions.
```console trce: Microsoft.SemanticKernel.Connectors.OpenAI.OpenAIChatCompletionService[0] Function call requests: WebSearchEnginePlugin-Search({"query":"major Microsoft announcements Build 2024"}) ``` -------------------------------- ### Define WeatherForecast Endpoint with AI Summary Source: https://github.com/dotnet/ai-samples/blob/main/src/build-2024/docs/Exercise/07 Using Semantic Kernel in WebApp.md Maps a GET request to the /WeatherForecast endpoint. It generates a random temperature and uses Semantic Kernel to create a short AI-generated summary for that temperature. The response is a WeatherForecast record. ```csharp app.MapGet("/WeatherForecast", async (Kernel kernel) => { int temp = Random.Shared.Next(-20, 55); return new WeatherForecast ( DateOnly.FromDateTime(DateTime.Now), temp, await kernel.InvokePromptAsync<string>($"Short description of weather at {temp} degrees Celsius") // This description will be generated by the AI model for the given temperature. ); }); app.Run(); internal record WeatherForecast(DateOnly Date, int TempratureC, string Summary); ``` -------------------------------- ### Configure Local LLM Kernel via User Secrets Source: https://github.com/dotnet/ai-samples/blob/main/src/llm-eval/README.md Demonstrates configuring a Semantic Kernel to connect to local LLMs, such as Phi-3 or Llama 3, hosted via Ollama. It uses User Secrets for configuration and specifies the model, endpoint, and an API key (often a placeholder for local setups).
```csharp public static Kernel CreateKernelEval() { var config = new ConfigurationBuilder().AddUserSecrets<Program>().Build(); var builder = Kernel.CreateBuilder(); builder.AddOpenAIChatCompletion( modelId: "phi3", endpoint: new Uri("http://localhost:11434"), apiKey: "api"); return builder.Build(); } ``` -------------------------------- ### Run HikerAI Console App (Bash) Source: https://github.com/dotnet/ai-samples/blob/main/src/azure-openai-sdk/02-HikerAI/README.md Execute the HikerAI .NET console application from your terminal. This command initiates the application, which then communicates with an Azure OpenAI Service to fetch hiking suggestions. ```bash dotnet run ``` -------------------------------- ### Cleaning Up Azure Resources with Azure Developer CLI Source: https://github.com/dotnet/ai-samples/blob/main/src/quickstarts/azure-openai/README.md This command utilizes the Azure Developer CLI (azd) to de-provision and delete all Azure resources that were previously deployed by `azd up`. It ensures a clean removal of all associated services and models, helping to manage costs and resource allocation. ```bash azd down ```
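The `CreateKernelEval()` factory in the local-LLM snippet above returns an ordinary `Kernel`, so evaluation prompts run through the standard Semantic Kernel API. A minimal, hypothetical usage sketch (assumes a local Ollama server is hosting the `phi3` model at `http://localhost:11434`; the prompt text is illustrative, not taken from the samples):

```csharp
// Hypothetical usage of the CreateKernelEval() factory shown above.
// Requires a running local Ollama instance serving the "phi3" model.
var kernel = CreateKernelEval();
var verdict = await kernel.InvokePromptAsync<string>(
    "On a scale of 1 to 5, rate the coherence of this answer: 'AI is a field of computer science.'");
Console.WriteLine(verdict); // the local model's rating
```

Because the OpenAI connector only needs an endpoint and a nominal API key, the same pattern works for any OpenAI-compatible local server.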