This year’s Microsoft Ignite developer conference might as well be called AIgnite, with over half of the almost 600 sessions featuring artificial intelligence in some shape or form.
Generative AI, in particular, is at the heart of many of the new product announcements Microsoft is making at the event, including new AI capabilities for wrangling large language models (LLMs) in Azure, new additions to its Copilot range of generative AI assistants, new hardware, and a new tool to help developers deploy small language models (SLMs).
Here’s some of the top AI news CIOs will want to take away from Microsoft Ignite 2023.
1. Bing Chat Enterprise goes away
When OpenAI released ChatGPT Enterprise in September, there was speculation that it could cause trouble for Microsoft’s Bing Chat Enterprise, launched just two months prior. Sure enough, Bing Chat Enterprise will soon disappear — but it’s only the name that’s going away: The product lives on and will be known simply as Copilot.
With the name change will come new capabilities, including — for organizations using Microsoft’s Entra cloud-based identity management service — the ability to protect commercial data used within the chatbot. The new Copilot will be generally available from Dec. 1, 2023.
2. Copilots a go-go
Of course, Microsoft product naming could never be so simple, and there won’t simply be one Copilot. There’s also Copilot in Dynamics 365, Copilot for Microsoft 365, Copilot in GitHub, Copilot in Viva, and now: Copilot for Service and Copilot for Sales.
Copilot for Service is intended to help agents in contact centers, ingesting customer information and knowledgebase articles and integrating with Teams, Outlook, and third-party systems, including Salesforce, ServiceNow, and Zendesk.
Confusingly, Microsoft already offers a Sales Copilot; Copilot for Sales is a different product that includes a license for Copilot for Microsoft 365, and helps sales staff prepare for customer meetings by creating custom briefing documents.
3. Copilots for sysadmins
It’s not just Microsoft 365 users who get a copilot: Admins will have one too. A forthcoming update will see the addition of Copilot to the Edge for Business management interface, helping admins with recommended policies and extensions for the workplace browser. Other Microsoft 365 apps are already covered, including SharePoint and Teams. There’s also a new adoption dashboard for Microsoft Viva to help track how the introduction of Copilot features in Microsoft 365 applications is changing the way users work.
4. Additions to Copilot for Microsoft 365
About those new features: Copilot is already starting to enrich Microsoft 365 apps, but there’s a wave of new features arriving next year, so CIOs will need to be ready with answers to users’ questions.
Starting next year, Teams, for example, will be able to take live meeting transcripts, summarize them as notes, and organize those notes on a whiteboard, suggesting more ideas to add to the whiteboard as the meeting progresses. Meeting notes will also become interactive documents, enabling participants to ask for more detailed information on a particular point after the meeting has ended. Organizations concerned about the risks of maintaining such written records will be able to turn the feature off by default or per meeting.
The additions to Copilot for Microsoft 365 won’t stop there, though, as Microsoft is opening it up to plugins and connectors from third-party vendors, enabling it to source and cite data from Jira, Trello, Confluence, Freshworks, and others.
5. Copilot Studio: A copilot for creating copilots
Inevitably, just as OpenAI has made it possible to customize chatbots — GPTs — by means of a ChatGPT-like interface, Microsoft has created Copilot Studio, a conversational copilot for creating and customizing more copilots. It will provide access to Azure features, including speech recognition and sentiment analysis, and the ability to add more sophisticated features through Power Platform connectors and Power Automate workflows, all with governance features so that IT is still in control.
6. People don’t want to give up their copilots
Each year, Microsoft publishes its Work Trend Index, which this time included a survey and observational studies of early Copilot users.
Those surveyed were clearly wowed by the tool: 70% said they were more productive with it, 77% said they didn’t want to give it up, and 22% said the tool saved them more than 30 minutes a day.
“These time savings are pretty extraordinary, but it will be key for everyone to invest their refunded time wisely,” said Frank X. Shaw, Microsoft’s chief communications officer, in a video presentation of the survey findings ahead of the event.
The observational study revealed that Copilot users were able to find information 27% faster and were able to catch up on missed meetings almost four times faster.
Perhaps a future iteration of Copilot could coach users on what to do with the time it saves them: While around half of those saving more than 30 minutes a day with it said they spent the time saved on focused work, one-sixth said they spent it in … more meetings.
7. Generative AI credentials
The domain is so new that it’s hard to evaluate who knows what, so Microsoft is stepping in with new credentials in its Microsoft Applied Skills program covering AI. They’ll cover developing generative AI with Azure OpenAI Service; creating document processing systems with Azure AI Document Intelligence; building natural language processing tools with Azure AI Language; and building Azure AI Vision systems.
8. Streamlining generative AI operations on Azure
At Build in May, Microsoft announced Azure AI Studio, a unified system for building generative AI applications, and six months later it’s finally launching a preview of the platform. (Generative AI technology is advancing fast but not, it seems, all that fast.) Developers will be able to select from a range of proprietary and open-source LLMs; choose data sources, including Microsoft Fabric OneLake and Azure AI Search for vector embeddings, enabling responses to be fine-tuned with real-time data without having to retrain the whole model; and monitor their models’ performance once deployed.
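The pattern Microsoft describes here — grounding a model’s responses in retrieved data rather than retraining it — is commonly known as retrieval-augmented generation (RAG). The following minimal sketch illustrates the idea with toy three-dimensional embeddings and a hand-built prompt; in Azure AI Studio the embeddings would live in a vector index such as Azure AI Search, and the document store, scores, and prompt template here are hypothetical illustration only:

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy document store: (text, embedding) pairs standing in for a vector index.
documents = [
    ("Q3 revenue grew 12% year over year.", [0.9, 0.1, 0.0]),
    ("The cafeteria menu changes on Mondays.", [0.0, 0.2, 0.9]),
]

def retrieve(query_embedding, k=1):
    """Return the k documents most similar to the query embedding."""
    ranked = sorted(documents, key=lambda d: cosine(d[1], query_embedding),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, query_embedding):
    """Prepend retrieved context so the model answers from fresh data,
    with no retraining of the underlying model."""
    context = "\n".join(retrieve(query_embedding))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("How did revenue change?", [1.0, 0.0, 0.0])
```

Because only the retrieved context changes between requests, updating the answers is a matter of refreshing the index rather than fine-tuning the whole model — which is the efficiency gain the Azure AI Studio data-source integration is aiming at.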
9. New Azure chips for enterprise AI workloads
Microsoft is updating its Azure infrastructure with new chips tailored for AI workloads. To accelerate AI model training and generative inferencing, the ND MI300 v5 virtual machines will soon run on AMD’s latest GPU, the Instinct MI300X, while the new NVL variant of Nvidia’s H100 chip will power the NC H100 v5 VMs, currently in preview. These will offer more memory per GPU to improve data processing efficiency.
But Microsoft is also adding custom chips of its own. It designed Azure Maia to accelerate AI training and inferencing workloads such as OpenAI models, GitHub Copilot, and ChatGPT. Maia has a companion, Azure Cobalt, for general (non-AI) workloads.
10. Easier development of small gen AI apps with Windows AI Studio
Azure AI Studio focuses on LLMs, but there’s growing interest in the use of less resource-intensive generative AI models trained for specific tasks — and small enough to run locally on a PC or mobile device. To help developers customize and deploy such SLMs, Microsoft will soon release Windows AI Studio, which will provide the option of running models in the cloud or on the network edge, and include prompt-orchestration capabilities to keep things in sync wherever they run.
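Windows AI Studio isn’t out yet, so Microsoft hasn’t published how that cloud-or-edge choice will work, but the underlying decision can be sketched as a simple dispatch policy: run the SLM on the device when it fits, and fall back to a hosted endpoint when it doesn’t. Everything below — the names, the memory figures, and the `run_local`/`run_cloud` stubs — is hypothetical illustration, not the product’s API:

```python
from dataclasses import dataclass

@dataclass
class ModelSpec:
    name: str
    memory_gb: float  # working-set memory the model needs for inference

def run_local(spec, prompt):
    """Stub for on-device inference with a small language model."""
    return f"[edge:{spec.name}] {prompt}"

def run_cloud(spec, prompt):
    """Stub for inference against a hosted model endpoint."""
    return f"[cloud:{spec.name}] {prompt}"

def dispatch(spec, prompt, device_memory_gb, online):
    """Prefer the edge when the SLM fits in device memory;
    fall back to the cloud only when it doesn't and we're online."""
    if spec.memory_gb <= device_memory_gb:
        return run_local(spec, prompt)
    if online:
        return run_cloud(spec, prompt)
    raise RuntimeError("model too large for device and no connectivity")

slm = ModelSpec("example-slm", memory_gb=4.0)
result = dispatch(slm, "Summarize this note.", device_memory_gb=8.0, online=True)
```

The appeal of SLMs is precisely that the first branch — local execution, with no data leaving the device — becomes viable far more often than it is with a full-sized LLM.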
11. Using generative AI for knowledge management
Microsoft’s Viva Engage enterprise communication tool offers a way for employees to learn from their peers by searching a database of answers to frequently asked questions provided by subject matter experts. An update to Answers in Viva due to roll out before year end will add an option to generate those answers — and even the questions they respond to — using AI, based on training files imported from other sources. This could offer enterprises a quick way to switch from a legacy knowledge management platform, or to share resources held in another system.