Published on November 16, 2023, 5:09 pm

Microsoft Embraces Generative AI at Annual Developer Conference

Microsoft’s annual developer conference, Ignite, is putting the spotlight on artificial intelligence (AI) this year: over half of the nearly 600 sessions are centered on AI in some capacity. One particular focus is generative AI, which is at the core of several new product announcements from Microsoft.

Among these announcements is the introduction of new AI capabilities for managing large language models (LLMs) in Azure. Microsoft is also expanding its range of generative AI assistants called Copilot, with new additions to support various applications such as Dynamics 365, Microsoft 365, GitHub, and Viva. Additionally, a new tool will be available to help developers deploy small language models (SLMs).

One significant change revealed during the conference is the rebranding of Bing Chat Enterprise as Copilot. The name change brings new capabilities, including enhanced data protection for organizations using Microsoft’s Entra cloud-based identity management service. This updated version of Copilot will be generally available starting December 1, 2023.

However, Copilot isn’t limited to a single application or use case. There are now distinct versions for different purposes within organizations. For instance, Copilot for Service aims to assist contact center agents by integrating customer information and knowledge base articles with collaboration tools like Teams and Outlook. Copilot for Sales, on the other hand, helps sales staff prepare for customer meetings by generating custom briefing documents.

Microsoft 365 users aren’t the only ones who benefit from Copilot; administrators get a version of their own. An upcoming update will add Copilot support to the Edge for Business management interface, providing recommended policies and extensions for the workplace browser. Meanwhile, an adoption dashboard for Microsoft Viva will help track how Copilot features affect user workflows.

Looking ahead to next year, there are even more features slated for release. For example, Teams will soon offer live meeting transcripts that can be summarized as notes and organized on a whiteboard. Participants can interact with these notes, requesting more information on specific points after the meeting concludes. Microsoft is also opening up Copilot for Microsoft 365 to third-party plugins and connectors, enabling integration with tools like Jira, Trello, Confluence, and Freshworks.

In addition to expanding its generative AI capabilities, Microsoft is addressing the need for AI skills development by offering credentials in Microsoft Applied Skills. These credentials cover topics such as developing generative AI with Azure OpenAI Service, building document processing systems with Azure AI Document Intelligence, creating natural language processing tools with Azure AI Language, and constructing Azure AI Vision systems.

Furthermore, Microsoft is introducing Azure AI Studio, a unified platform for building generative AI applications. Developers will have access to a range of proprietary and open-source LLMs, as well as various data sources. Monitoring performance once models are deployed is also simplified within the platform.
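Azure AI Studio itself is a portal, but the applications assembled there ultimately call a deployed model through the standard chat-completions interface. As a minimal sketch, the request body such an app would send looks roughly like the following; the deployment name and prompts here are illustrative placeholders, not values from the announcement:

```python
def build_chat_request(deployment: str, question: str, temperature: float = 0.2) -> dict:
    """Assemble a chat-completions request body for a deployed model.

    `deployment` is the name given to the model deployment in Azure,
    not the base model's name.
    """
    return {
        "model": deployment,
        "temperature": temperature,  # lower values give more deterministic answers
        "messages": [
            # A system message frames the assistant's behavior for the whole chat.
            {"role": "system", "content": "You are a concise enterprise assistant."},
            # The user message carries the actual question.
            {"role": "user", "content": question},
        ],
    }

request = build_chat_request("gpt-35-turbo", "Summarize yesterday's incident report.")
print(request["messages"][1]["content"])
```

In practice this dictionary would be passed to an SDK client or POSTed to the deployment's endpoint with an API key; the point is that Azure AI Studio's model catalog and monitoring wrap around this same request/response loop.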

To optimize its infrastructure for AI workloads, Microsoft is adding chips tailored for these tasks to Azure. These include AMD’s Instinct MI300X GPU for accelerated model training and inferencing in the new ND MI300X v5 virtual machines, while Nvidia’s NVL variant of the H100 chip will improve data-processing efficiency in NC H100 v5 VMs.

Microsoft isn’t relying solely on third-party chips, however; it is also investing in custom silicon designed for its own needs. One example is Azure Maia, which accelerates training and inferencing workloads for models like GitHub Copilot and ChatGPT. Its companion chip, Azure Cobalt, is aimed at general, non-AI workloads.

While Azure AI Studio emphasizes LLMs, there is growing interest in smaller, less resource-intensive generative AI models trained for specific tasks that can run locally on PCs or mobile devices. To address this, Microsoft plans to release Windows AI Studio, giving developers the option to run models in the cloud or at the network edge.

Lastly, Microsoft is enhancing its Viva Engage enterprise communication tools with AI capabilities. An update to Answers in Viva will use AI to generate questions and answers based on training files drawn from various sources. This feature offers a smoother transition from legacy knowledge management platforms and simplifies resource sharing within organizations.

Microsoft’s push to embed generative AI across its products and services underscores its commitment to the field. With a range of new features and capabilities on the horizon, CIOs can start preparing their organizations for an AI-driven future by weighing Microsoft’s offerings.
