AI System Design / Software Engineering

Hands-On AI Orchestration with Semantic Kernel

February 10, 2025 · 4 min read

Beyond the Prompt: Engineering Smarter AI with Semantic Kernel

As the adoption of Large Language Models (LLMs) grows, developers need a structured way to orchestrate prompts, manage AI skills, and integrate AI capabilities into software systems. This is where Semantic Kernel comes in: a powerful SDK from Microsoft that helps developers build AI-first applications using semantic functions, native code, and LLMs such as OpenAI or Azure OpenAI.

In this blog, we'll explore what Semantic Kernel is, how it works, and how you can use it to build scalable, modular, and context-aware AI applications.


What is Semantic Kernel?

Semantic Kernel (SK) is an open-source SDK that allows you to integrate AI services (like OpenAI GPT models) with traditional programming constructs such as functions, plugins, memory, and workflows.

It is available in:

  • C#
  • Python
  • Java (preview)

With SK, you can combine semantic functions (prompt templates) and native functions (code-based logic) in one seamless pipeline, enabling hybrid AI systems.


Key Features

  • Semantic functions: Use prompt engineering as first-class components
  • Pluggable AI models: Support for OpenAI, Azure OpenAI, Hugging Face, and more
  • Memory store: Persistent vector memory for context-aware reasoning
  • Planner: Auto-generates plans (sequences of steps) using AI
  • Skills architecture: Organize functions into reusable modules

How It Works

Here’s a simplified flow of a Semantic Kernel application:

User Request
     ↓
Semantic Kernel
     ↓
[Semantic Functions + Native Functions + Memory + Planner]
     ↓
External LLM APIs (OpenAI, Azure OpenAI)
     ↓
Response

You can define prompts (semantic functions), call native APIs (C#/Python code), retrieve memory from vector stores, and even generate plans automatically.
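The flow above can be illustrated without the SDK at all. Here is a minimal, framework-free Python sketch of the same pipeline; every name in it (`fake_llm`, `semantic_function`, `native_function`) is illustrative, not part of the SK API:

```python
# Sketch of the Semantic Kernel flow: a semantic function (prompt template),
# a native function (plain code), and an LLM call composed into one pipeline.
# All names here are illustrative, not actual SK APIs.

def fake_llm(prompt: str) -> str:
    """Stand-in for an external LLM API call (OpenAI, Azure OpenAI, ...)."""
    return f"[LLM answer to: {prompt}]"

def semantic_function(template: str):
    """A 'semantic function' is a prompt template bound to an LLM call."""
    def run(user_input: str) -> str:
        return fake_llm(template.replace("{{$input}}", user_input))
    return run

def native_function(text: str) -> str:
    """A 'native function' is ordinary code, e.g. post-processing."""
    return text.strip().upper()

# Compose: user request -> semantic function -> LLM -> native function
answer = semantic_function("Answer briefly: {{$input}}")("What is SK?")
result = native_function(answer)
print(result)
```

The real kernel adds memory lookups and planning between these stages, but the composition pattern is the same.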

Installing Semantic Kernel (Python)

pip install semantic-kernel

Example: Build a Chatbot with Semantic Kernel

# Note: this example targets the early (0.x) semantic-kernel Python API;
# later releases renamed several of these calls.
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

# Create the kernel and register a chat completion service
kernel = sk.Kernel()
kernel.add_chat_service(
    "chat-gpt",
    OpenAIChatCompletion("gpt-4", api_key="YOUR_OPENAI_API_KEY"),
)

# Create a semantic function from a prompt template
prompt_template = "You're a helpful assistant. Answer: {{$input}}"
chat_function = kernel.create_semantic_function(prompt_template)

# Invoke the function with user input
output = chat_function("What is Semantic Kernel?")
print(output)

Using Semantic Functions and Native Functions Together

You can define a semantic skill using a .txt prompt file and then call it from Python or C#. Native functions (like API calls or math logic) can be registered as well.

Example: Mixing prompts and Python logic:

# Native functions are registered via the skill mechanism
# (semantic-kernel 0.x API; later releases use plugins instead of skills)
from semantic_kernel.skill_definition import sk_function

class UserSkill:
    @sk_function(description="Returns the current user's name")
    def get_user_name(self) -> str:
        return "Rahul"

user_skill = kernel.import_skill(UserSkill(), skill_name="user_skill")
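Under the hood, a semantic function is little more than a prompt template plus variable substitution. A framework-free sketch of the `{{$input}}`-style templating idea (this is an illustration of the concept, not SK's actual renderer):

```python
import re

def render(template: str, variables: dict) -> str:
    """Substitute {{$name}} placeholders with values, SK-template style.
    Unknown placeholders are left untouched."""
    def repl(match: re.Match) -> str:
        name = match.group(1)
        return str(variables.get(name, match.group(0)))
    return re.sub(r"\{\{\$(\w+)\}\}", repl, template)

prompt = render(
    "You're a helpful assistant. Answer: {{$input}}",
    {"input": "What is Semantic Kernel?"},
)
print(prompt)
```

The rendered string is what actually gets sent to the LLM; native functions can supply the variable values.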

Adding Memory (Vector Store Integration)

Semantic Kernel supports:

  • Volatile memory (in-memory)
  • Persistent memory with plugins like:
    1. Azure Cognitive Search
    2. Pinecone
    3. Redis

Add memory to your kernel:

# In-memory store shown here; swap in a Pinecone/Redis/Azure connector
# for persistence (semantic-kernel 0.x API)
from semantic_kernel.memory import VolatileMemoryStore

kernel.register_memory_store(memory_store=VolatileMemoryStore())

Then you can:

  • Store summaries, documents, chat history
  • Query memory for relevant context
  • Automatically ground LLM responses with context
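The retrieval step behind all three bullets can be illustrated without any vector database: store (text, embedding) pairs and rank them by cosine similarity. A toy sketch with hand-made 2-D embeddings (a real store would compute embeddings with a model):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

class ToyMemory:
    """Minimal in-memory vector store: save texts, query by similarity."""
    def __init__(self):
        self.items = []  # list of (text, embedding) pairs

    def save(self, text, embedding):
        self.items.append((text, embedding))

    def search(self, query_embedding, top_k=1):
        ranked = sorted(self.items,
                        key=lambda it: cosine(it[1], query_embedding),
                        reverse=True)
        return [text for text, _ in ranked[:top_k]]

memory = ToyMemory()
memory.save("SK supports planners", [1.0, 0.0])
memory.save("Pinecone is a vector DB", [0.0, 1.0])
print(memory.search([0.9, 0.1]))
```

The top-ranked text is what gets spliced into the prompt to ground the LLM's response.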

Using the Planner for Auto-Generated Workflows

The Planner uses LLMs to generate step-by-step workflows (plans) based on user intent.

from semantic_kernel.planning import SequentialPlanner

planner = SequentialPlanner(kernel)
# Plan creation is asynchronous in the Python SDK
# (named create_plan_async in early 0.x releases)
plan = await planner.create_plan("Translate this text and summarize it in one line.")

This is useful for:

  • Task chaining
  • Intelligent agents
  • Zero-shot orchestration
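Stripped of the LLM, the planner pattern is: map a goal to an ordered list of callable steps, then run them in sequence, piping each output into the next input. A minimal sketch (the step names and the fake goal-to-plan mapping are illustrative; the real planner asks an LLM to produce the plan):

```python
def translate(text: str) -> str:
    """Toy step standing in for a translation function."""
    return f"<translated:{text}>"

def summarize(text: str) -> str:
    """Toy step standing in for a summarization function."""
    return f"<summary:{text}>"

STEPS = {"translate": translate, "summarize": summarize}

def fake_plan(goal: str) -> list:
    """Stand-in for the LLM call that maps a goal to ordered step names."""
    return [name for name in STEPS if name in goal.lower()]

def run_plan(goal: str, text: str) -> str:
    """Execute the planned steps in order, chaining output to input."""
    for step_name in fake_plan(goal):
        text = STEPS[step_name](text)
    return text

print(run_plan("Translate this text and summarize it", "hello"))
```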

Architecture Overview

                      ┌────────────┐
   ┌──────────────┐   │ Semantic   │
   │ User Input   ├──►│  Kernel    ├──┐
   └──────────────┘   └────┬───────┘  │
                           ▼          ▼
         ┌────────┐  ┌──────────┐ ┌──────────┐
         │ Prompt │  │  Memory  │ │  Native  │
         │Engine  │  │ (Vector) │ │ Function │
         └────────┘  └──────────┘ └──────────┘
                           ▼
                   External LLM (e.g., GPT-4)

Use Cases for Semantic Kernel

  • AI copilots: Embed AI into productivity tools (e.g., Office, VS Code)
  • Multi-step agents: Automate goal-driven behavior with planning
  • Conversational AI: Build advanced chatbots with memory and context
  • RAG systems: Retrieve content and augment prompts dynamically
  • Data enrichment: Combine AI and native logic to tag, summarize, classify
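The RAG pattern in that list is a direct composition of the pieces covered above: retrieve relevant text, splice it into the prompt, then call the model. A framework-free sketch (the keyword retriever and document store here are illustrative; a real system would use vector search):

```python
# Toy document store; a real RAG system would index embeddings instead
DOCS = {
    "sk": "Semantic Kernel is an open-source SDK from Microsoft.",
    "rag": "RAG augments prompts with retrieved context.",
}

def retrieve(query: str) -> str:
    """Toy keyword retriever standing in for vector similarity search."""
    for key, doc in DOCS.items():
        if key in query.lower():
            return doc
    return ""

def build_rag_prompt(query: str) -> str:
    """Splice retrieved context into the prompt sent to the LLM."""
    context = retrieve(query)
    return f"Context: {context}\nQuestion: {query}\nAnswer:"

print(build_rag_prompt("What is SK?"))
```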

Author:
Rahul Majumdar
Artificial Intelligence · System Design · System Architecture