In this Microsoft Azure OpenAI blog post, we will deploy an Azure OpenAI resource and an OpenAI GPT-4 model using Bicep.

Azure OpenAI is an enterprise-grade AI service that provides access to OpenAI models such as GPT, DALL-E, and Whisper.

With Azure OpenAI, we can deploy the service using PowerShell, Azure CLI, Terraform, or Bicep.

Bicep

In this post, we will use Bicep. Azure Bicep is a domain-specific language (DSL) and infrastructure-as-code (IaC) tool similar to Terraform, but specific to Azure.

The key features of Bicep, and its advantages over Terraform, are:

  • Simplified Syntax – Bicep offers a declarative syntax that is human-readable and easy to maintain.
  • State file management – Bicep does not rely on a local state file; Azure Resource Manager tracks the deployed resources, so there is no state file to manage or store.
  • Access to the latest Azure API – Bicep supports Azure's latest API versions (including preview versions) as soon as they are released, allowing us to write code against them immediately.
  • Bicep integration with Azure – Bicep is developed by Microsoft and deeply integrated with Azure, reducing reliance on third-party companies like HashiCorp.

Install Bicep and Create Resource Group

To install the Bicep CLI, I will use the following command, which installs the latest version.
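Azure PowerShell does not bundle the Bicep CLI, so it needs to be installed separately. A minimal sketch, assuming a Windows machine with winget available (on Linux or macOS, the Azure CLI's az bicep install achieves the same result):

```powershell
# Install the latest Bicep CLI via winget (Windows)
winget install -e --id Microsoft.Bicep

# Verify the installation
bicep --version
```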

Next, I will create a resource group using Azure PowerShell. If you prefer, the resource group can also be created with Bicep itself using a subscription-scoped deployment.
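A minimal sketch of the resource group step, assuming the Az PowerShell module is installed and you are signed in to the right subscription; the resource group name and region below are placeholders:

```powershell
# Sign in to Azure (skip if already authenticated)
Connect-AzAccount

# Create the resource group that will hold the Azure OpenAI resources
New-AzResourceGroup -Name "rg-openai" -Location "eastus"
```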

Create Azure OpenAI Account and Deployment

The following Bicep template (main.bicep) creates an Azure OpenAI account and a GPT-4 deployment.
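A sketch of what main.bicep can look like; the account name, model version, SKU, and capacity are assumptions and should be adjusted to what is available in your region and subscription:

```bicep
// Parameters - the default values below are examples only
param location string = resourceGroup().location
param accountName string = 'openai-demo-account'
param deploymentName string = 'gpt-4'

// Azure OpenAI account (a Cognitive Services account with kind 'OpenAI')
resource openAiAccount 'Microsoft.CognitiveServices/accounts@2023-05-01' = {
  name: accountName
  location: location
  kind: 'OpenAI'
  sku: {
    name: 'S0'
  }
  properties: {
    customSubDomainName: accountName
    publicNetworkAccess: 'Enabled'
  }
}

// GPT-4 model deployment under the account
resource gpt4Deployment 'Microsoft.CognitiveServices/accounts/deployments@2023-05-01' = {
  parent: openAiAccount
  name: deploymentName
  sku: {
    name: 'Standard'
    capacity: 10
  }
  properties: {
    model: {
      format: 'OpenAI'
      name: 'gpt-4'
      version: '0613'
    }
  }
}
```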

Deploy Bicep Template

The final step in our deployment process is to deploy the Bicep template, which creates the Azure OpenAI account and the GPT-4 model deployment.

Use the following Azure PowerShell command to deploy.
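Assuming the resource group created earlier and main.bicep in the current directory (both names are placeholders):

```powershell
# Deploy the Bicep template to the resource group
New-AzResourceGroupDeployment `
  -ResourceGroupName "rg-openai" `
  -TemplateFile "./main.bicep"
```

Once the deployment completes, the GPT-4 deployment appears under the Azure OpenAI resource in the Azure portal.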