Microsoft’s Semantic Kernel, an open-source SDK for integrating large language models (LLMs) with conventional programming languages, will add capabilities such as plugin testing, dynamic planners, and streaming this fall.
The fall 2023 release plan was revealed in a roadmap for Semantic Kernel published by Microsoft’s Semantic Kernel team. As part of this effort, the team said it was adopting the OpenAI plugin standard, which would allow plugins to work across OpenAI, Semantic Kernel, and the Microsoft platform.
The team also intends to enhance planners so they can handle global-scale deployments. A planner is a function that orchestrates the steps needed to fulfill a user’s request, or “ask.” Users of planners should expect features such as cold storage of plans for consistency and dynamic planners that automatically discover plugins.
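The planner idea can be illustrated with a toy sketch. This is not the actual Semantic Kernel API; the class and method names below are invented for the example. It shows the core loop: given an ask, a planner maps requested steps to registered plugin functions and runs them in sequence, with registration standing in for the dynamic plugin discovery the roadmap describes.

```python
# Toy illustration of the planner concept. Hypothetical names only --
# not the real Semantic Kernel planner API.
from typing import Callable, Dict, List


class ToyPlanner:
    """Discovers registered plugin functions and orchestrates them in order."""

    def __init__(self) -> None:
        self.plugins: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, fn: Callable[[str], str]) -> None:
        # In a real dynamic planner, plugins would be discovered and matched
        # to the ask automatically; registration stands in for that here.
        self.plugins[name] = fn

    def create_plan(self, ask: str, steps: List[str]) -> List[str]:
        # Validate that every requested step maps to a known plugin.
        missing = [s for s in steps if s not in self.plugins]
        if missing:
            raise KeyError(f"No plugin registered for: {missing}")
        return steps

    def run(self, ask: str, steps: List[str]) -> str:
        # Execute the plan: each step transforms the running result.
        result = ask
        for step in self.create_plan(ask, steps):
            result = self.plugins[step](result)
        return result


planner = ToyPlanner()
planner.register("uppercase", str.upper)
planner.register("exclaim", lambda s: s + "!")
print(planner.run("summarize this", ["uppercase", "exclaim"]))
```

The validation step in `create_plan` hints at why a persisted (“cold storage”) plan is useful: a saved, validated plan can be re-run consistently without re-deriving the step sequence.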
Another goal cited in the fall roadmap is integration with vector databases such as Pinecone, Redis, Weaviate, and Chroma, along with Azure Cognitive Search and Services. The team is also developing a document chunking service and enhancing the Semantic Kernel Tools extension for Visual Studio Code.
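To make the chunking idea concrete, here is a minimal sketch of what a document chunking service typically does: split long text into size-bounded, overlapping chunks suitable for embedding and storage in a vector database. The function is a hypothetical helper, not the Semantic Kernel service itself.

```python
# Minimal fixed-size chunker with overlap -- a sketch of the document
# chunking concept, not the actual Semantic Kernel chunking service.
from typing import List


def chunk_text(text: str, max_chars: int = 200, overlap: int = 20) -> List[str]:
    """Split text into chunks of at most max_chars, each overlapping the
    previous chunk by `overlap` characters to preserve context at boundaries."""
    if overlap >= max_chars:
        raise ValueError("overlap must be smaller than max_chars")
    chunks: List[str] = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + max_chars])
        start += max_chars - overlap
    return chunks
```

The overlap keeps a sentence that straddles a boundary partially present in both neighboring chunks, which helps retrieval quality when chunks are embedded independently.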
Telemetry and AI safety were cited as key aspects of the plan. With end-to-end telemetry, developers get insights into goal-oriented AI plan creation, token usage, and errors. Integration of Azure Content Safety hooks promises a streamlined approach to ensure AI safety.
The Semantic Kernel repo is on GitHub. In addition to rolling out the roadmap, Microsoft on July 12 announced improvements to endpoint management in the Semantic Kernel Tools extension for the Visual Studio Code editor, allowing developers to more quickly and easily switch between different AI models.