Building Your First AI Agent: Connecting Semantic Kernel to Real Data in .NET
Introduction
Over the last two posts, we decoded the AI terminology and looked at the .NET AI Toolbelt. We learned that if we want an AI to actually do things—like checking a database instead of just generating text—we need an AI Agent. We also learned that Semantic Kernel (SK) is our go-to orchestrator for this job in .NET.
Today, we move from theory to code.
We are going to build a simple Inventory Management Agent. Instead of our AI hallucinating answers about stock levels, we are going to give it a “Plugin” so it can query a mock database itself. Let’s dive in.
The Problem with Basic LLMs
If you ask a standard LLM, “How many wireless keyboards do we have in stock?”, it will likely respond with something like, “As an AI, I don’t have access to your inventory…”
This is because the AI is trapped in its training data. To fix this, we need to build a bridge between the LLM and our business logic. In Semantic Kernel, this bridge is called a Plugin.
Step 1: The Mock Database
First, let’s pretend we have a database. In a real-world enterprise application, you would be injecting an Entity Framework Core DbContext here. For this tutorial, we will use a simple in-memory repository to keep things clear.
```csharp
public class InventoryRepository
{
    // Case-insensitive keys, so "wireless keyboard" matches "Wireless Keyboard"
    private readonly Dictionary<string, int> _inventory = new(StringComparer.OrdinalIgnoreCase)
    {
        { "Wireless Keyboard", 42 },
        { "USB-C Monitor", 15 },
        { "Standing Desk", 5 }
    };

    public int GetStockLevel(string productName)
    {
        // Unknown products report 0 in stock rather than throwing
        return _inventory.TryGetValue(productName, out int stock) ? stock : 0;
    }
}
```
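If you do swap the dictionary for EF Core later, the repository's public surface can stay identical, so the plugin we build next never changes. Here is a minimal sketch of that swap; `AppDbContext`, its `Products` set, and the `Name`/`StockLevel` columns are assumed names for illustration, not part of this tutorial:

```csharp
// Hypothetical EF Core-backed version of the same repository.
// AppDbContext, Products, Name, and StockLevel are assumed names.
public class EfInventoryRepository
{
    private readonly AppDbContext _db;

    public EfInventoryRepository(AppDbContext db) => _db = db;

    public int GetStockLevel(string productName)
    {
        // Translated to SQL by EF Core; FirstOrDefault() yields 0 for unknown products
        return _db.Products
            .Where(p => p.Name == productName)
            .Select(p => p.StockLevel)
            .FirstOrDefault();
    }
}
```

One caveat: whether the name comparison is case-insensitive here depends on your database collation, not on `StringComparer` as in the in-memory version.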
Step 2: Creating the Semantic Kernel Plugin
Now, we need to expose this repository to the AI. We do this by creating a Plugin class.
This is the most important part to understand: The AI does not read your C# execution code. It reads your attributes. You must describe your methods and parameters in plain English so the AI knows exactly when and how to use this tool.
```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

public class InventoryPlugin
{
    private readonly InventoryRepository _repository;

    public InventoryPlugin(InventoryRepository repository)
    {
        _repository = repository;
    }

    // These attributes are what the AI actually "reads"
    [KernelFunction, Description("Gets the current stock level for a specific product.")]
    public int CheckStock([Description("The exact name of the product to check")] string productName)
    {
        Console.WriteLine($"\n[SYSTEM: AI autonomously called CheckStock for '{productName}']");
        return _repository.GetStockLevel(productName);
    }
}
```
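Once this pattern clicks, giving the agent more abilities is just a matter of adding more described methods. As a sketch (not part of the tutorial's code), a companion function could let the model discover valid product names before it calls `CheckStock`; it assumes a hypothetical `GetProductNames()` helper added to the repository:

```csharp
// Hypothetical second tool for the same InventoryPlugin class.
// Assumes InventoryRepository gains a GetProductNames() helper that returns its keys.
[KernelFunction, Description("Lists the names of all products tracked in inventory.")]
public IReadOnlyList<string> ListProducts()
{
    Console.WriteLine("\n[SYSTEM: AI autonomously called ListProducts]");
    return _repository.GetProductNames();
}
```

The model will chain tools on its own: asked "what do we sell and how much of each?", it could call `ListProducts` first, then `CheckStock` per product.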
Step 3: Wiring Up the Agent
Now we bring it all together. We will configure Semantic Kernel, add our AI provider (like OpenAI or a local model), load our Plugin, and enable Automatic Function Calling.
Automatic Function Calling is the magic that turns a standard chatbot into an Agent. It tells the AI: “If a user asks a question, check your toolbox (Plugins) first. If a tool can help, pause, execute the C# tool, read the result, and then answer the user.”
```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// 1. Initialize our mock database
var inventoryRepo = new InventoryRepository();

// 2. Build the Kernel
var builder = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion("gpt-4o", "YOUR_API_KEY"); // Or swap for local Ollama!

// 3. Add our custom plugin to the Kernel's toolbox
builder.Plugins.AddFromObject(new InventoryPlugin(inventoryRepo));

var kernel = builder.Build();

// 4. Enable Agentic behavior (Automatic Tool Calling)
OpenAIPromptExecutionSettings settings = new()
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

// 5. Ask the AI a question!
string userPrompt = "I need to fulfill an order. How many wireless keyboards and standing desks do we have left?";
Console.WriteLine($"User: {userPrompt}");

var response = await kernel.InvokePromptAsync(userPrompt, new(settings));
Console.WriteLine($"\nAI Agent: {response}");
```
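`InvokePromptAsync` is perfect for one-shot questions like this. For a back-and-forth conversation, you would typically resolve the chat service from the kernel and maintain a `ChatHistory` yourself. A minimal sketch, reusing the `kernel` and `settings` from above (the system prompt text is my own, not from the tutorial):

```csharp
using Microsoft.SemanticKernel.ChatCompletion;

var chatService = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory("You are a helpful inventory assistant.");

history.AddUserMessage("How many standing desks are left?");

// Passing the kernel is what lets the service auto-invoke our plugin functions
var reply = await chatService.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine($"AI Agent: {reply.Content}");

// Keep the assistant's answer so follow-up questions have context
history.AddAssistantMessage(reply.Content ?? string.Empty);
```

Because the whole history travels with each request, a follow-up like "and keyboards?" resolves naturally against the earlier turn.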
The Result
When you run this code, something amazing happens. You won’t just get a text response. The console output will look like this:
```
User: I need to fulfill an order. How many wireless keyboards and standing desks do we have left?

[SYSTEM: AI autonomously called CheckStock for 'wireless keyboard']

[SYSTEM: AI autonomously called CheckStock for 'standing desk']

AI Agent: We currently have 42 Wireless Keyboards and 5 Standing Desks in stock.
```
The Takeaway
Notice what happened here. We didn’t write any if/else statements to parse the user’s text. We didn’t write regex to extract the product names.
The AI understood the intent, realized it had a tool (CheckStock) that could find the answer, extracted the product names on its own, called our C# method twice, aggregated the data, and formatted a perfect response.
This is the power of Agentic AI. You build the robust, secure, and tested C# microservices, and you let Semantic Kernel handle the natural language orchestration.
Happy Coding! 🙂