Scaling with Azure Container Apps & Apache Kafka

11.06.2024

This article, written by re.alto’s development team, explains how to scale with Azure Container Apps and Apache Kafka. While such documentation already exists for use with Microsoft products, our development team did not find any comparable documentation on scaling containers with Apache Kafka and Azure Container Apps. We have since worked this out ourselves and want to share our knowledge with other developers. The article below is intended as a guide for anyone looking to do something similar.

(For an introduction to this topic, please see our previous article on containerisation and container apps here.)
 

To allow a container, or more precisely its replicas, to be scaled up and down depending on the number of messages flowing through an Apache Kafka topic, you can set up scaling rules for the container. But first, we need to create a container image that can consume messages from an Apache Kafka topic.

Consuming messages from an Apache Kafka topic:

Various solutions are already available for consuming messages from an Apache Kafka topic. Since we have chosen to host our Apache Kafka cluster with Confluent, we decided to use their NuGet package. You can find it on the nuget.org feed under the name Confluent.Kafka.

 

Once you have installed this NuGet package, you can use the ConsumerBuilder class to create a new consumer and subscribe to a topic.
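A minimal sketch of what that can look like is shown below. The broker address, credentials, consumer group and topic name are placeholders for your own values, and we assume a SASL/SSL connection as typically used with Confluent Cloud.

using Confluent.Kafka;

// Placeholder configuration: replace these values with your own cluster details.
var config = new ConsumerConfig
{
    BootstrapServers = "your-cluster.confluent.cloud:9092",
    GroupId = "my-consumer-group",
    SecurityProtocol = SecurityProtocol.SaslSsl,
    SaslMechanism = SaslMechanism.Plain,
    SaslUsername = "<api-key>",
    SaslPassword = "<api-secret>",
    AutoOffsetReset = AutoOffsetReset.Earliest
};

// Build a consumer for string keys and values and subscribe to the topic.
using var consumer = new ConsumerBuilder<string, string>(config).Build();
consumer.Subscribe("my-topic");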

All that is left now is to consume the messages.
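A simple consume loop, again only a sketch and reusing the consumer created above, could look like this:

var cts = new CancellationTokenSource();
Console.CancelKeyPress += (_, e) =>
{
    e.Cancel = true; // keep the process alive so we can shut down cleanly
    cts.Cancel();
};

try
{
    while (!cts.IsCancellationRequested)
    {
        // Consume blocks until a message is available or the token is cancelled.
        var result = consumer.Consume(cts.Token);
        Console.WriteLine($"Received '{result.Message.Value}' at {result.TopicPartitionOffset}");

        // Process the message here.
    }
}
catch (OperationCanceledException)
{
    // Expected when the consumer is shut down.
}
finally
{
    // Commit final offsets and leave the consumer group cleanly.
    consumer.Close();
}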

You now have the code to consume messages from an Apache Kafka topic. If you would like more information, have a look at the documentation provided by Confluent at https://docs.confluent.io/kafka-clients/dotnet/current/overview.html. Next, we build a container image with this code inside, push it to our container registry and create a container app based on the image. But how do we tell Azure Container Apps how to scale this container?

Scaling Azure Container Apps:

Azure Container Apps uses a KEDA scaler to handle the scaling of any container. You will not have to configure the KEDA scaler yourself, but you will have to tell the container which scaling rules to use. You will find plenty of examples in the Microsoft documentation on how to scale based on HTTP requests or messages in an Azure Service Bus. However, if you want to configure scaling rules for an Apache Kafka topic, you may find yourself out of luck with the available documentation. After some exploration, we managed to do it as follows:

You will need to set up a custom scaling rule for your container. We use Bicep to deploy our containers, so the examples shown below are in Bicep, but they translate easily to any other way of deploying to Azure. To learn more about Bicep, or to see how to create a Bicep file for an Azure Container App, have a look here: https://learn.microsoft.com/en-us/azure/templates/microsoft.app/containerapps?pivots=deployment-language-bicep

We are going to focus on the scaling part of the container. To begin with, you need to configure the minimum and maximum number of replicas that the container can scale between.
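As a sketch, the scale section, which sits under the template property of the Microsoft.App/containerApps resource, could look like this (the values match the explanation below):

// Part of the Microsoft.App/containerApps resource, under properties.template.
scale: {
  minReplicas: 0
  maxReplicas: 6
  rules: [
    // the Apache Kafka scaling rule is added in the next step
  ]
}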

In the example above, we have set the minimum number of replicas to 0, which means the app will scale down to zero running replicas when there are no messages to consume from the Apache Kafka topic. This saves on costs and frees up those resources for something else.
The maximum number of replicas is set to 6. You can go as high as you like, but going above the number of partitions in the Apache Kafka topic would be pointless, since any replica beyond the number of partitions will not be able to consume messages. So set this to a number that is less than or equal to the number of partitions of the Apache Kafka topic.

Now let’s add the rule. 
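A sketch of such a rule, with the metadata and authentication still to be filled in, could look like this (the rule name is an assumption, not a required value):

rules: [
  {
    name: 'kafka-topic-scaling-rule'
    custom: {
      // 'kafka' selects the Apache Kafka KEDA scaler.
      type: 'kafka'
      metadata: {} // filled in below
      auth: []     // filled in below
    }
  }
]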

The name of the rule can be anything you like; the name chosen here is just an example. After configuring the name, we need to define the rule itself, in our case a custom rule. The first thing the custom rule needs to know is its type, which translates to the KEDA scaler that will be used for the rule. We want the Apache Kafka KEDA scaler, but any available KEDA scaler can be used here.

Next comes the metadata: we need to provide the bootstrap servers, the consumer group and the topic used in the code sample above, so that the KEDA scaler knows which Apache Kafka topic to watch when deciding whether to scale up or down.
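As a sketch, with placeholder values that should match the consumer configuration above (lagThreshold is an optional KEDA setting, added here purely for illustration):

// Inside the custom rule defined above.
metadata: {
  bootstrapServers: 'your-cluster.confluent.cloud:9092'
  consumerGroup: 'my-consumer-group'
  topic: 'my-topic'
  // Optional: average lag per replica before KEDA scales out.
  lagThreshold: '10'
}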

As a final step, we need to allow the KEDA scaler to access the Apache Kafka topic. We have created a few secrets to store the required information (see the Microsoft documentation on deploying Azure Container Apps linked above for more details). We have to provide the bootstrap servers, the username and password, and the type of authentication so that the KEDA scaler can connect to the topic, as sketched below. Once this is all configured and the container is deployed using the Bicep module, your Azure Container App is configured to scale based on the number of messages in the Apache Kafka topic.
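As a sketch, the secrets can be defined under the container app's configuration section and then mapped to KEDA trigger parameters in the rule's auth section. The secret names and values below are assumptions (SASL/PLAIN over TLS, as typically used with Confluent Cloud), and in this sketch the bootstrap servers remain in the rule's metadata shown earlier; align all of this with your own setup.

// Under properties.configuration of the container app. The kafkaUsername and
// kafkaPassword values are assumed to be parameters of the Bicep module.
secrets: [
  {
    name: 'kafka-username'
    value: kafkaUsername
  }
  {
    name: 'kafka-password'
    value: kafkaPassword
  }
  {
    name: 'kafka-sasl-mechanism'
    value: 'plaintext'
  }
  {
    name: 'kafka-tls'
    value: 'enable'
  }
]

// Inside the custom scaling rule: each secret is mapped to the KEDA trigger
// parameter it provides.
auth: [
  {
    secretRef: 'kafka-username'
    triggerParameter: 'username'
  }
  {
    secretRef: 'kafka-password'
    triggerParameter: 'password'
  }
  {
    secretRef: 'kafka-sasl-mechanism'
    triggerParameter: 'sasl'
  }
  {
    secretRef: 'kafka-tls'
    triggerParameter: 'tls'
  }
]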
