
This Azure AI Services article shows how to integrate Azure AI Vision for image analysis into C# applications using .NET.

Azure AI Services offers access to many AI services, including the popular Azure OpenAI service. Today, we will focus on Azure AI Vision, which provides AI capabilities for analysing images. Its key capabilities are:

  • Optical Character Recognition (OCR): Extract text from images, whether printed or handwritten.
  • Image Analysis: Extract visual features, generate captions, and identify objects and people.
  • Face: Recognise human faces for facial-recognition scenarios, including image blurring and access control.

These capabilities cover many real-world enterprise use cases, from recognising people and objects in images to integrating facial recognition into identity and access solutions.

Azure SDK for .NET

To access Azure AI Vision and integrate the service, we use the Azure SDK for .NET with the Azure AI Vision package. The package is available on NuGet, and it’s the official Microsoft Azure package for the service.

Deploy Azure AI Vision Resource

Before accessing the AI Vision service, we need to deploy an Azure resource that will give us access to the service. To deploy it, we can use the Azure portal, Azure PowerShell, the Azure CLI, or the .NET SDK for Azure. In our case, we will use Azure Bicep with the following configuration.
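The Bicep configuration is not reproduced here, so below is a minimal sketch of such a deployment. The resource name, location, and SKU are placeholders — adjust them to your environment (the free F0 tier also works for testing):

```bicep
// Minimal sketch of an Azure AI Vision (Computer Vision) account.
// Resource name and SKU are assumptions - change them as needed.
param location string = resourceGroup().location

resource visionAccount 'Microsoft.CognitiveServices/accounts@2023-05-01' = {
  name: 'my-ai-vision'   // hypothetical resource name
  location: location
  kind: 'ComputerVision'
  sku: {
    name: 'S1'           // use 'F0' for the free tier
  }
  properties: {
    publicNetworkAccess: 'Enabled'
  }
}

// Handy for the next step, where we need the endpoint.
output endpoint string = visionAccount.properties.endpoint
```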

Once the service is deployed, open the resource in the Azure portal and note the API key and endpoint.

Application

Now that we have the service deployed to Azure and the access details for it, we can use the AI Vision library to analyse an image and generate a description of it.

Start by creating a C# console application and installing the Vision package using this command.

dotnet add package Azure.AI.Vision.ImageAnalysis --version 1.0.0-beta.3

Copy the following code into Program.cs and create a folder called images in the root directory. Add images to the directory and reference the image you would like the program to analyse.
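The original listing is not reproduced here, so below is a minimal sketch of what Program.cs could look like with the 1.0.0-beta.3 package. The endpoint, key, and image file name are placeholders — replace them with your own resource details, ideally loaded from configuration rather than hardcoded:

```csharp
using Azure;
using Azure.AI.Vision.ImageAnalysis;

// Placeholder values - in practice, load these from appsettings.json.
string endpoint = "https://<your-resource>.cognitiveservices.azure.com/";
string key = "<your-api-key>";

ImageAnalysisClient client = new(new Uri(endpoint), new AzureKeyCredential(key));

// Analyse a local image from the images folder (the file name is an assumption).
ImageAnalysisResult result = client.Analyze(
    BinaryData.FromBytes(File.ReadAllBytes("images/sample.jpg")),
    VisualFeatures.Caption | VisualFeatures.Objects);

// Print the generated caption and its confidence score.
Console.WriteLine($"Caption: {result.Caption.Text}");
Console.WriteLine($"Confidence: {result.Caption.Confidence:F2}");

// Print any detected objects as well.
foreach (DetectedObject detectedObject in result.Objects.Values)
{
    Console.WriteLine($"Object: {detectedObject.Tags[0].Name}");
}
```

This sketch uses the synchronous Analyze method; an AnalyzeAsync variant is also available if you prefer async code.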

Note: Save the Key and Endpoint in an appsettings.json file. For connecting to Azure from C# with launchSettings.json, see https://www.ntweekly.com/2024/01/27/connect-to-azure-from-c-with-launchsettings-json/.
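For example, an appsettings.json along these lines could hold the values (the section and key names are illustrative, not a required schema):

```json
{
  "AIVision": {
    "Endpoint": "https://<your-resource>.cognitiveservices.azure.com/",
    "Key": "<your-api-key>"
  }
}
```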

Save the file and run the application using the following command.

dotnet run

Once the application runs, it will output the description of the submitted image and the confidence level of the results.