The AI landscape continues to evolve, and Microsoft's recent announcement introduces a comprehensive platform designed to revolutionize AI development, deployment, and governance. The platform comprises three groundbreaking services: Microsoft Fabric, Prompt Flow, and Semantic Kernel. Read on to explore the capabilities of each service and their implications for AI and user experiences at large.

Microsoft Fabric

At the core of AI lies data analysis. Microsoft Fabric goes beyond traditional analytics by offering a comprehensive suite of features, including data ingestion, pipeline management, and data governance. What sets Microsoft Fabric apart is its natural language interface, which lets users interact with their data directly, posing questions and creating visualizations in plain language. By leveraging Microsoft Fabric, AI developers and data scientists can unlock the full potential of their data and fuel their AI solutions far more effectively.
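To make that concrete, here is a minimal PySpark sketch of the kind of pipeline step Fabric hosts: reading a Lakehouse table, cleaning it, and writing an aggregated table back for reporting or AI workloads. The `sales` and `daily_revenue` table names are hypothetical, and the same code runs in any Spark environment, not just a Fabric notebook.

```python
# A minimal sketch of a data pipeline step, assuming a Spark environment
# (such as a Microsoft Fabric notebook) with a hypothetical "sales" table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Fabric notebook a SparkSession is already provided; getOrCreate()
# reuses it, or builds a local session when run elsewhere.
spark = SparkSession.builder.getOrCreate()

# "sales" is an assumed Lakehouse table name used for illustration.
sales = spark.read.table("sales")

daily_revenue = (
    sales
    .dropna(subset=["order_date", "amount"])          # basic data hygiene
    .groupBy(F.to_date("order_date").alias("day"))    # roll up to daily grain
    .agg(F.sum("amount").alias("revenue"))
)

# Persist the result so reports, natural language Q&A, or ML models can use it.
daily_revenue.write.mode("overwrite").saveAsTable("daily_revenue")
```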

Prompt Flow

As we move into the future, we expect a paradigm shift in application development, with less dependence on code and a greater emphasis on prompt engineering. While this approach brings powerful benefits, it also introduces unique challenges around security, safety, testing, and deployment. Prompt Flow, although not yet available to the public, is among the first tools we’ve encountered that address these challenges, providing an end-to-end process for managing these applications from development through deployment. It offers an extensive toolkit for integrating applications, custom machine learning models, ChatGPT, and traditional data sources, and for ensuring these components operate together in a secure, safe, and reliable manner.
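Since Prompt Flow itself is not yet public, the sketch below is not its actual API; it is a plain-Python illustration of the kind of chain such a tool would manage: a templating step, a model call, and a validation step. It uses the openai package (1.x client), and the model name is a placeholder.

```python
# Illustrative prompt "flow": template -> model call -> validation.
# This is NOT Prompt Flow's API, just the pattern such a tool orchestrates.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT_TEMPLATE = (
    "You are a support assistant for an e-commerce site.\n"
    "Answer the customer's question using only the context provided.\n\n"
    "Context: {context}\n\n"
    "Question: {question}"
)


def build_prompt(question: str, context: str) -> str:
    """Templating step: combine user input with retrieved context."""
    return PROMPT_TEMPLATE.format(context=context, question=question)


def call_model(prompt: str) -> str:
    """Model step: send the rendered prompt to a chat model ("gpt-4" is a placeholder)."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


def validate(answer: str) -> str:
    """Validation step: a stand-in for the safety and quality checks a flow tool automates."""
    return answer if answer and answer.strip() else "Sorry, I couldn't find an answer to that."


if __name__ == "__main__":
    prompt = build_prompt(
        question="When will my order ship?",
        context="Orders placed before 2 pm ship within two business days.",
    )
    print(validate(call_model(prompt)))
```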

Semantic Kernel

Semantic Kernel is an open-source framework for building sophisticated applications on top of large language models such as ChatGPT. Despite only recently becoming publicly available, it has been in development for over a year, long before models like GPT-4 were known to the public, and today it forms the foundational architecture for many of Microsoft’s AI products. It gives developers the ability to combine AI models with native functions and memory, and to access external data sources and APIs. The potential applications are vast, ranging from custom large-language-model-powered pipelines that analyze large volumes of text to the next generation of chatbots.
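For a flavor of the programming model, here is a rough sketch using Semantic Kernel's Python SDK. The class and method names reflect one recent release of the semantic-kernel package and have shifted between versions, so treat the exact signatures as assumptions; the pattern to notice is a shared kernel that holds both an LLM service and native functions.

```python
# A rough Semantic Kernel sketch: one kernel, one LLM service, one native plugin.
# Exact signatures vary by semantic-kernel version; treat them as assumptions.
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.functions import kernel_function


class OrderPlugin:
    """Native code the kernel can use alongside the model."""

    @kernel_function(name="order_status", description="Look up an order's shipping status")
    def order_status(self, order_id: str) -> str:
        # Hypothetical lookup; in practice this would query a database or API.
        return f"Order {order_id} shipped yesterday."


async def main() -> None:
    kernel = Kernel()

    # Register an LLM service (reads OPENAI_API_KEY from the environment).
    kernel.add_service(OpenAIChatCompletion(ai_model_id="gpt-4"))

    # Register the native plugin.
    kernel.add_plugin(OrderPlugin(), plugin_name="orders")

    # Call the native function, then hand its output to the model.
    status = await kernel.invoke(plugin_name="orders", function_name="order_status", order_id="1234")
    answer = await kernel.invoke_prompt(f"Reply to the customer in one friendly sentence: {status}")
    print(answer)


if __name__ == "__main__":
    asyncio.run(main())
```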

For the more technically inclined, we highly recommend watching this recent Microsoft Build presentation on Semantic Kernel, which discusses the tectonic shift this new type of “semantic” development heralds.

Azure OpenAI

To deliver these powerful AI capabilities, Microsoft’s Azure OpenAI service follows a meticulous approach to data processing and governance.

Here are the key aspects of data processing within Azure OpenAI. To learn more about the privacy and security measures, check out this article.

Types of processed data:

Azure OpenAI processes three primary types of data: prompts and completions submitted by users, training data provided by customers for fine-tuning models, and results data from the training process.

Data processing functions:

The data undergoes three main types of processing within Azure OpenAI: creating fine-tuned models using customer training data, generating completions and embeddings from user prompts, and analyzing prompts and completions to detect abuse or harmful content.
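As a concrete illustration of those request types, here is a minimal sketch using the openai Python package's AzureOpenAI client (1.x), generating a completion and an embedding. The endpoint, API version, and deployment names are placeholders for your own Azure OpenAI resource.

```python
# Minimal Azure OpenAI calls: one chat completion and one embedding.
# Endpoint and deployment names are placeholders.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# Completion: the prompt and the generated response are the "prompts and
# completions" data category covered by the processing policies above.
chat = client.chat.completions.create(
    model="<chat-deployment-name>",  # the name you gave your model deployment
    messages=[{"role": "user", "content": "Summarize our return policy in one line."}],
)
print(chat.choices[0].message.content)

# Embedding: a vector representation of the input text, typically used for
# search or retrieval scenarios.
embedding = client.embeddings.create(
    model="<embedding-deployment-name>",
    input="Summarize our return policy in one line.",
)
print(len(embedding.data[0].embedding))
```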

Encryption and storage:

Fine-tuned models are stored encrypted for the customer, ensuring data security. Prompts and completions may be stored encrypted for up to 30 days, with access limited to authorized employees for debugging or abuse detection. Customers have the option to opt out of abuse monitoring and logging, ensuring their data privacy.

Content filtering and logging:

Content filtering works differently from abuse monitoring: it does not log or store data. Customers can request modifications to content filtering as needed. Customer-managed keys encrypt data at rest, except for the data logged for up to 30 days described above.

Employee access and customer lockbox:

Strict access controls are in place to safeguard customer data. Rare employee access to data requires Customer Lockbox and follows a stringent access request process. However, it’s important to note that abuse monitoring data may still be accessed by authorized personnel for security purposes.

Conclusion

Microsoft’s comprehensive AI platform, comprising Microsoft Fabric, Prompt Flow, and Semantic Kernel, marks a significant advancement in the AI space. With these services, coupled with the robust data processing and governance measures implemented in Azure OpenAI, Microsoft aims to empower data analysis, streamline complex AI workflows, and enhance custom AI development. O3 is at the forefront of the AI revolution, and we’re excited to leverage the power of Microsoft’s comprehensive AI platform to create exceptional experiences. By partnering with O3, your brand can harness the capabilities of Microsoft Fabric, Prompt Flow, and Semantic Kernel to unlock the full potential of AI in your business.

About O3

Since 2005, our team has been pushing the boundaries of innovation with a deep understanding of the current and emerging digital ecosystem. Learn more about us, our work, or innovation at O3.