Microsoft Build 2025: Azure AI Foundry Grows Up

Build 2025 was Microsoft's most AI-focused developer conference yet. We break down the announcements that actually matter for enterprise AI teams and what they mean in practice.



Microsoft Build is always a window into where enterprise AI is heading — not just the research frontier, but the tooling and platform investments that shape what's actually buildable at scale. Build 2025 was the most consequential in years, and a clear signal that Azure AI Foundry is now Microsoft's central bet for enterprise AI development.

Here's what was announced, and what it means for teams building AI systems today.

Azure AI Foundry: From Platform to Ecosystem

When Microsoft announced Azure AI Foundry at Ignite 2024, it was a unification story — bringing together Azure OpenAI Service, Azure Machine Learning, and the AI Studio experience under one roof. At Build 2025, it grew substantially.

The model catalogue expansion

Azure AI Foundry's model catalogue now includes over 1,800 models — from OpenAI's full GPT family, to Llama 4, Phi-4, Mistral, Cohere, and a growing list of domain-specific models for healthcare, legal, and finance.

The significance: you can now evaluate and swap models within a single managed platform with consistent APIs, data residency guarantees, and billing. This is the architectural flexibility we've been advocating for — building against an abstraction rather than a specific model endpoint.
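To make that "abstraction rather than a specific endpoint" point concrete, here is a minimal sketch of the pattern in application code. The `ModelRouter` wrapper and the model labels are illustrative assumptions, not Azure AI Foundry's actual SDK; in practice each backend would call a Foundry inference deployment, and swapping models becomes configuration rather than a code change.

```python
from typing import Callable, Dict

# Application code targets a generic "chat" interface; the concrete
# model behind each logical name is configuration, not code.
ChatFn = Callable[[str], str]

class ModelRouter:
    """Maps logical model names to concrete backends so they can be swapped."""

    def __init__(self) -> None:
        self._backends: Dict[str, ChatFn] = {}

    def register(self, name: str, fn: ChatFn) -> None:
        self._backends[name] = fn

    def chat(self, name: str, prompt: str) -> str:
        return self._backends[name](prompt)

router = ModelRouter()
# Stubs for illustration: real closures would call different Foundry
# model deployments behind the same managed API surface.
router.register("fast", lambda p: f"[small-model] {p}")
router.register("quality", lambda p: f"[frontier-model] {p}")

print(router.chat("fast", "Summarise this contract."))
```

The point of the indirection is that evaluating a new model from the catalogue means registering one more backend, not rewriting call sites.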

Phi-4 Silica: On-Device AI on the NPU

One of the more interesting announcements was Phi-4 Silica, a model optimised for Windows Copilot+ PCs — running directly on the Neural Processing Unit (NPU) in Qualcomm Snapdragon processors, entirely offline.

Why this matters: truly private AI that runs locally, with no internet connection required, at reasonable quality. For field workers, sensitive data environments, or simply low-connectivity contexts that are common across East Africa, on-device AI changes what's possible. Watch this space.

Microsoft 365 Copilot Gets Deeper

The Copilot wave across Microsoft 365 moved from "interesting feature" to "meaningful productivity layer" at Build 2025.

Copilot Studio: Agents Without Code

Microsoft's low-code Copilot Studio received significant upgrades. You can now build multi-step agents that:

  • Connect to SharePoint, Dynamics 365, Salesforce, and custom APIs
  • Take autonomous actions (send emails, update records, schedule meetings) based on triggers
  • Handle multi-turn conversations with memory across sessions
  • Route to human agents when confidence is low

For organisations already in the Microsoft ecosystem, this is the fastest path to deploying AI that does something useful — not just answers questions. The barrier to entry has dropped substantially.
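The last bullet above, routing to a human when confidence is low, is a pattern worth sketching because it applies well beyond Copilot Studio. This is generic escalation logic under an assumed confidence score, not Copilot Studio's actual configuration model:

```python
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.7  # illustrative cutoff; tune per use case

@dataclass
class AgentReply:
    text: str
    confidence: float  # e.g. from a classifier or a model self-score

def route(reply: AgentReply) -> str:
    """Send low-confidence answers to a human queue instead of the user."""
    if reply.confidence >= CONFIDENCE_THRESHOLD:
        return f"agent: {reply.text}"
    return "escalated: handed off to a human agent"

print(route(AgentReply("Your meeting is booked for 3pm.", 0.92)))
print(route(AgentReply("I think the refund policy might be...", 0.41)))
```

In a low-code tool the threshold and the escalation target are settings rather than code, but the decision shape is the same.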

Teams AI Library v2

For developers building custom AI into Teams experiences, the Teams AI Library v2 brings better state management, improved action handling, and tighter integration with the Semantic Kernel orchestration framework.

GitHub Copilot Workspace: AI-Native Development Loops

Microsoft announced that GitHub Copilot Workspace is moving to general availability — a fully AI-native development environment in which you describe a task and the AI proposes an implementation plan, opens the relevant files, writes the code, runs tests, and iterates.


Having used the preview, we can say the experience is genuinely different from autocomplete-style assistance. It's closer to pair programming with a junior developer who reads the whole codebase before starting.

For teams with a mix of technical skill levels — common in organisations outside major tech hubs — Copilot Workspace meaningfully expands who can contribute to a codebase.

Azure AI Agent Service: Enterprise-Grade Agent Infrastructure

Perhaps the most important announcement for teams building production AI: Azure AI Agent Service moved from preview to general availability.

This is Microsoft's answer to the question "how do I run autonomous AI agents reliably in production?" It provides:

  • Persistent state — agents maintain conversation context and task state across sessions
  • Tool calling — native integration with code interpreter, file search, Azure Functions, and external APIs
  • Tracing and observability — full audit logs of agent decisions, tool calls, and responses
  • Scale — managed infrastructure that handles concurrency without you managing queues or containers

For organisations exploring agentic AI beyond a prototype, this removes a significant amount of undifferentiated heavy lifting. You write the agent logic; Azure manages the infrastructure.
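The value of those four capabilities is easier to see as code. The sketch below shows the agent-loop shape the service manages for you — persistent state, tool dispatch, and an audit trail — in plain Python. None of these names are the Azure AI Agent Service SDK's actual API; this is an assumed illustration of what "you write the agent logic; Azure manages the infrastructure" means in practice.

```python
import json
from typing import Any, Callable, Dict, List

# Illustrative types: a tool takes JSON-style arguments and returns a result.
Tool = Callable[[Dict[str, Any]], Any]

class Agent:
    """Toy agent showing persistent state, tool calling, and tracing."""

    def __init__(self, tools: Dict[str, Tool]) -> None:
        self.tools = tools
        self.state: Dict[str, Any] = {}   # persists across turns/sessions
        self.trace: List[str] = []        # audit log of decisions and calls

    def call_tool(self, name: str, args: Dict[str, Any]) -> Any:
        # Every tool invocation and its result is recorded for observability.
        self.trace.append(f"tool_call {name} {json.dumps(args)}")
        result = self.tools[name](args)
        self.trace.append(f"tool_result {name} -> {result!r}")
        return result

agent = Agent(tools={"add": lambda a: a["x"] + a["y"]})
agent.state["task"] = "sum two numbers"
total = agent.call_tool("add", {"x": 2, "y": 3})
print(total)             # 5
print(len(agent.trace))  # 2 (the call and its result)
```

In the managed service, the state store, the tool runtime (code interpreter, file search, Azure Functions), the trace pipeline, and the concurrency handling are all infrastructure you no longer build yourself.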

What This Means for Your AI Strategy

Build 2025 reinforces a direction we've been recommending to clients for several months:

1. If you're in the Microsoft ecosystem, Azure AI Foundry is the right foundation. The breadth of the model catalogue, the compliance infrastructure, and the integration with M365 and Azure services create a coherent platform that's hard to replicate by stitching together independent tools.

2. Agents are moving from experiment to production. The tooling now exists to run reliable, observable, auditable AI agents at enterprise scale. If you've been waiting for the infrastructure to mature, it has.

3. The no-code/low-code layer is serious now. Copilot Studio's capabilities mean that for well-scoped internal tools, a technical business analyst can now build and maintain AI workflows. This has real implications for how AI delivery teams are structured.

If you're mapping your AI roadmap and want to understand how these announcements translate into specific opportunities for your organisation, we'd love to talk. Book a session and let's dig in.