
Accelerating agentic workflows with Azure AI Foundry, NVIDIA NIM, and NVIDIA AgentIQ



I’m excited to share a major leap forward in how we develop and deploy AI. In collaboration with NVIDIA, we’ve integrated NVIDIA NIM microservices and the NVIDIA AgentIQ toolkit into Azure AI Foundry, unlocking unprecedented efficiency, performance, and cost optimization for your AI projects.

A new era of AI efficiency

In today’s fast-paced digital landscape, scaling AI applications demands more than just innovation; it requires streamlined processes that deliver rapid time-to-market without compromising on performance. With enterprise AI projects often taking 9 to 12 months to move from conception to production, every efficiency gain counts. Our integration is designed to change that by simplifying every step of the AI development lifecycle.

NVIDIA NIM on Azure AI Foundry 

NVIDIA NIM™, part of the NVIDIA AI Enterprise software suite, is a set of easy-to-use microservices engineered for secure, reliable, and high-performance AI inferencing. Built on robust technologies such as NVIDIA Triton Inference Server™, TensorRT™, TensorRT-LLM, and PyTorch, NIM microservices are designed to scale seamlessly on managed Azure compute.

They provide:

  • Zero-configuration deployment: Get up and running quickly with out-of-the-box optimization.
  • Seamless Azure integration: Works effortlessly with Azure AI Agent Service and Semantic Kernel.
  • Enterprise-grade reliability: Benefit from NVIDIA AI Enterprise support for consistent performance and security.
  • Scalable inference: Tap into Azure’s NVIDIA-accelerated infrastructure for demanding workloads.
  • Optimized workflows: Accelerate applications ranging from large language models to advanced analytics.

Deploying these services is straightforward. With just a few clicks, whether you select models like Llama-3.3-70B-NIM or others from the model catalog in Azure AI Foundry, you can integrate them directly into your AI workflows and start building generative AI applications that work seamlessly within the Azure ecosystem.
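If it helps to see what consuming one of these deployments can look like, here is a minimal sketch in Python. It assumes the OpenAI-compatible chat completions route that NIM microservices expose; the endpoint URL, API key, and model name are placeholders, so substitute the values shown for your deployment in Azure AI Foundry.

import os
import requests

# Placeholder values: copy the real endpoint and key from your deployment in Azure AI Foundry.
ENDPOINT = os.environ["NIM_ENDPOINT"]   # e.g. https://<your-deployment>.<region>.inference.ml.azure.com
API_KEY = os.environ["NIM_API_KEY"]

payload = {
    "model": "meta/llama-3.3-70b-instruct",  # illustrative model name
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize what NVIDIA NIM microservices are."},
    ],
    "max_tokens": 256,
    "temperature": 0.2,
}

# NIM microservices expose an OpenAI-compatible chat completions API.
response = requests.post(
    f"{ENDPOINT}/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])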

Optimizing performance with NVIDIA AgentIQ

Once your NVIDIA NIM microservices are deployed, NVIDIA AgentIQ takes center stage. This open-source toolkit is designed to seamlessly connect, profile, and optimize teams of AI agents, allowing your systems to run at peak performance. AgentIQ delivers:

  • Profiling and optimization: Leverage real-time telemetry to fine-tune AI agent placement, reducing latency and compute overhead.
  • Dynamic inference enhancements: Continuously collect and analyze metadata, such as predicted output tokens per call, estimated time to next inference, and expected token lengths, to dynamically improve agent performance.
  • Integration with Semantic Kernel: Direct integration with Azure AI Foundry Agent Service further empowers your agents with enhanced semantic reasoning and task execution capabilities.
Image showing how NVIDIA NIM Models and Azure AI Agent Service can be used together to enable agentic apps.

This intelligent profiling not only reduces compute costs but also boosts accuracy and responsiveness, so that every part of your agentic AI workflow is optimized for success.
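To make the profiling idea concrete, here is a small, hypothetical sketch; it is not the AgentIQ API, just an illustration of the kind of per-call telemetry such profiling relies on, with latency and rough token counts recorded around each agent call so that slow or costly steps stand out.

import time
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class CallRecord:
    # One telemetry record per agent or model call.
    name: str
    latency_s: float
    prompt_tokens: int
    output_tokens: int

@dataclass
class Profiler:
    records: List[CallRecord] = field(default_factory=list)

    def profile(self, name: str, call: Callable[[str], str], prompt: str) -> str:
        # Time the call and record rough token counts (whitespace split stands in for a real tokenizer).
        start = time.perf_counter()
        output = call(prompt)
        elapsed = time.perf_counter() - start
        self.records.append(CallRecord(name, elapsed, len(prompt.split()), len(output.split())))
        return output

    def summary(self) -> None:
        # Print per-call latency and token usage so outliers are easy to spot.
        for r in self.records:
            print(f"{r.name}: {r.latency_s:.2f}s, {r.prompt_tokens} -> {r.output_tokens} tokens")

In AgentIQ itself this kind of telemetry is collected across connected agents and tools automatically; the sketch only shows the shape of the metadata being analyzed.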

In addition, we will soon be integrating the NVIDIA Llama Nemotron Reason open reasoning model. NVIDIA Llama Nemotron Reason is a powerful AI model family designed for advanced reasoning. According to NVIDIA, Nemotron excels at coding, complex math, and scientific reasoning while understanding user intent and seamlessly calling tools like search and translations to accomplish tasks.

Real-world impact

Industry leaders are already seeing the benefits of these innovations.

Drew McCombs, Vice President, Cloud and Analytics at Epic, noted:

The launch of NVIDIA NIM microservices in Azure AI Foundry provides a secure and efficient way for Epic to deploy open-source generative AI models that improve patient care, boost clinician and operational efficiency, and uncover new insights to drive medical innovation. In collaboration with UW Health and UC San Diego Health, we’re also researching methods to evaluate clinical summaries with these advanced models. Together, we’re using the latest AI technology in ways that truly improve the lives of clinicians and patients.

Epic’s experience underscores how our integrated solution can drive transformational change, not just in healthcare but across every industry where high-performance AI is a game changer. As noted by Jon Sigler, EVP, Platform and AI at ServiceNow:

This combination of ServiceNow’s AI platform with NVIDIA NIM, Microsoft Azure AI Foundry, and Azure AI Agent Service helps us bring to market industry-specific, out-of-the-box AI agents, delivering full-stack agentic AI solutions to help solve problems faster, deliver great customer experiences, and accelerate improvements in organizations’ productivity and efficiency.

Unlock AI-powered innovation 

By combining the robust deployment capabilities of NVIDIA NIM with the dynamic optimization of NVIDIA AgentIQ, Azure AI Foundry provides a turnkey solution for building, deploying, and scaling enterprise-grade agentic applications. This integration can accelerate AI deployments, enhance agentic workflows, and reduce infrastructure costs, letting you focus on what truly matters: driving innovation.

Ready to accelerate your AI journey?

Deploy NVIDIA NIM microservices and optimize your AI agents with the NVIDIA AgentIQ toolkit on Azure AI Foundry. Explore more about the Azure AI Foundry model catalog.

Let’s build a smarter, faster, and more efficient future together.


