Azure OpenAI

Description

The Azure OpenAI provider integrates Azure OpenAI chat completions and embeddings behind the IFlexAIProvider abstraction.

Provider capabilities (based on the implementation); a short call sketch follows the list:

  • Chat completions: ChatAsync(...)

  • Streaming chat: ChatStreamAsync(...)

  • Embeddings: EmbedAsync(...)
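
As a rough sketch of how the streaming and embedding calls might be consumed from application code (the method signatures, request shapes, and type names below are assumptions for illustration, not the actual FlexBase contracts; the generated templates are the source of truth):

// Illustrative only: signatures and request shapes are assumed, not the actual FlexBase contracts.
async Task ConsumeProviderAsync(IFlexAIProvider ai, CancellationToken ct)
{
    // Streaming chat: tokens arrive incrementally (assumes IAsyncEnumerable-style streaming)
    await foreach (var chunk in ai.ChatStreamAsync(new FlexChatRequest
    {
        Model = "my-gpt-4o-deployment",          // an Azure OpenAI deployment name
        Messages = new[] { new FlexChatMessage("user", "Hello") }
    }, ct))
    {
        Console.Write(chunk.Content);
    }

    // Embeddings: returns a vector for the input text
    var embedding = await ai.EmbedAsync(new FlexEmbedRequest
    {
        Model = "my-embedding-deployment",       // embedding model deployment name
        Input = "text to embed"
    }, ct);
}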

Important concepts

  • In Azure OpenAI, “model” identifiers are typically deployment names (the names you assign when deploying a model in your Azure OpenAI resource), not the underlying base model names.

  • The implementation supports API key auth and also includes a Managed Identity–based constructor.

Configuration in DI

// Registers IFlexAIProvider / IFlexAIProviderBridge
services.AddFlexAzureOpenAI(configuration);

appsettings.json

Configuration section: FlexBase:AI:AzureOpenAI
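
A minimal sketch of what this section might look like. ApiKey and DeploymentName are the settings referenced under Provider considerations below; Endpoint and the exact key names beyond the section path are assumptions, so treat the generated appsettings.json as the source of truth.

{
  "FlexBase": {
    "AI": {
      "AzureOpenAI": {
        "Endpoint": "https://<your-resource>.openai.azure.com/",
        "ApiKey": "<store securely, e.g. user secrets or Key Vault>",
        "DeploymentName": "my-gpt-4o-deployment"
      }
    }
  }
}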

Examples (template-based)

These examples mirror the generated Query and PostBus handler templates. You do not register these types manually.

Query: generate a completion
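
The generated template is not reproduced here; the sketch below only illustrates the shape of such a handler, with IFlexAIProvider supplied by constructor injection and the completion text returned to the caller. The handler name, query and request/response shapes are assumptions for illustration.

// Illustrative sketch only; this is not the generated Query template.
// GenerateCompletionQuery, FlexChatRequest, and FlexChatMessage are assumed shapes.
public class GenerateCompletionQueryHandler
{
    private readonly IFlexAIProvider _ai;

    public GenerateCompletionQueryHandler(IFlexAIProvider ai) => _ai = ai;

    public async Task<string> HandleAsync(GenerateCompletionQuery query, CancellationToken ct)
    {
        // ChatAsync performs a single (non-streaming) completion;
        // Model carries the Azure OpenAI deployment name.
        var response = await _ai.ChatAsync(new FlexChatRequest
        {
            Model = "my-gpt-4o-deployment",
            Messages = new[] { new FlexChatMessage("user", query.Prompt) }
        }, ct);

        return response.Content;
    }
}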

PostBus handler: generate a completion
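
Similarly, a rough sketch of the message-driven variant: the handler consumes a command from the bus, calls the provider, and leaves the follow-up (persisting the result, publishing another message) to whatever the generated template defines. The handler and command shapes here are assumptions.

// Illustrative sketch only; this is not the generated PostBus handler template.
// GenerateCompletionCommand and the downstream handling are assumed for illustration.
public class GenerateCompletionPostBusHandler
{
    private readonly IFlexAIProvider _ai;

    public GenerateCompletionPostBusHandler(IFlexAIProvider ai) => _ai = ai;

    public async Task HandleAsync(GenerateCompletionCommand command, CancellationToken ct)
    {
        var response = await _ai.ChatAsync(new FlexChatRequest
        {
            Model = "my-gpt-4o-deployment",
            Messages = new[] { new FlexChatMessage("user", command.Prompt) }
        }, ct);

        // The generated template determines what happens with response.Content,
        // for example persisting it or publishing a follow-up message.
    }
}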

Provider considerations

  • Deployments: set DeploymentName (and any related defaults) to match the deployment names you configured in Azure.

  • Auth: store ApiKey securely; if your generated infrastructure is configured for Managed Identity, you can avoid keys.

  • Observability: the provider logs request duration and token usage when available.
