This repository contains a collection of Semantic Kernel demonstration projects. Semantic Kernel is a framework for building AI applications with Large Language Models (LLMs) and other AI models. These demonstrations showcase its core features and common use cases.
A collection of .NET samples demonstrating various Semantic Kernel features:
- Kernel Prompting: Basic prompt handling and streaming
- Kernel Dependency Injection: Using DI with Semantic Kernel
- Kernel Prompt Template: Working with prompt templates and variables
- Chat Service with Chat History: Managing conversation history
- Prompt Execution Settings: Configuring prompt execution
- Grounding Prompts with Plugins: Using plugins for context
- Plugin Function Calling: Implementing and using plugins
- Using OpenAPI Plugins: Integrating external APIs via OpenAPI
- Using Kernel Filters: Implementing kernel filters
- Aspire Dashboard + OpenTelemetry: Monitoring and telemetry
- Multiple AI Models: Working with different AI providers
- Multi-Modality: Handling text, images, and audio
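As a taste of the first sample topic (basic prompt handling and streaming), here is a minimal sketch; the model id and the environment-variable key handling are illustrative assumptions, not taken from the samples themselves:

```csharp
using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion(
    modelId: "gpt-4o-mini",  // illustrative model id
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY")!);
var kernel = builder.Build();

// Non-streaming invocation: the full result arrives at once.
var result = await kernel.InvokePromptAsync("Summarize Semantic Kernel in one sentence.");
Console.WriteLine(result);

// Streaming invocation: tokens are written as they arrive.
await foreach (var chunk in kernel.InvokePromptStreamingAsync("Tell a short joke."))
{
    Console.Write(chunk);
}
```

Each sample in the list expands on one of these building blocks with its own runnable project.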
- .NET 8.0+
- OpenAI API key (for OpenAI-based samples)
- Ollama (for local AI model samples)
- Docker (for specific samples)
- ONNX runtime (for specific samples)
- Clone the repository
- Set up your OpenAI API key:

  cd Beginner-Hands-On/Samples/KernelPrompting
  dotnet user-secrets set "OpenAI:ApiKey" "YOUR_OPENAI_API_KEY"
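Inside a sample, the stored secret can be read back through the .NET configuration system; a minimal sketch, assuming the `Microsoft.Extensions.Configuration.UserSecrets` package and a `Program` entry-point class (hypothetical names for illustration):

```csharp
using Microsoft.Extensions.Configuration;

// Build configuration from user secrets. This requires a <UserSecretsId> in the
// project file, which `dotnet user-secrets set` creates automatically.
var config = new ConfigurationBuilder()
    .AddUserSecrets<Program>()  // Program is the sample's entry-point class
    .Build();

// The value set on the command line is available under the same key path.
string? apiKey = config["OpenAI:ApiKey"];
```

User secrets keep the key out of source control, which is why the samples use them instead of appsettings.json.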
- Set up Ollama (choose one):
  - Docker:

    docker run -d -e OLLAMA_KEEP_ALIVE=-1 -v C:\temp\ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

  - Windows: Download from ollama.com/download
- Install required Ollama models:

  ollama pull llama3.2
  ollama pull phi4
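Once the models are pulled, the local-model samples point Semantic Kernel at the running Ollama server. A sketch, assuming the prerelease `Microsoft.SemanticKernel.Connectors.Ollama` connector package and Ollama's default local endpoint:

```csharp
using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();
// Assumes Microsoft.SemanticKernel.Connectors.Ollama (prerelease); the connector
// is marked experimental, so suppressing its experimental-API warning may be needed.
builder.AddOllamaChatCompletion(
    modelId: "llama3.2",                       // one of the models pulled above
    endpoint: new Uri("http://localhost:11434"));
var kernel = builder.Build();

var reply = await kernel.InvokePromptAsync("Why is the sky blue?");
Console.WriteLine(reply);
```

Because the endpoint and model id are the only provider-specific pieces, the same prompting code works against OpenAI or Ollama, which is what the Multiple AI Models sample demonstrates.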
Each sample includes its own README with specific setup instructions and requirements.