Key Attributes of Cloud-Native Applications

Posted by Aleksandar Foltin on 29 May 2020

 

Introduction

Cloud-native application development is an approach for building and maintaining large, complex applications while taking advantage of cloud infrastructure, as the name “cloud-native” suggests. However, cloud infrastructure is not the only relevant aspect. The cloud-native approach also leverages modern technologies and development best practices so that the application can take full advantage of what modern cloud infrastructure offers today. That being said, an application may be considered “cloud-native” even if it is not running in the cloud, as long as it meets the key attributes of cloud-native applications.

What are Microservices?

Microservices represent an architectural approach in which an application is broken down into smaller, independent, standalone parts called microservices. It is the opposite of the traditional monolithic approach, where the application is deployed as a single chunk of code. Microservices communicate with each other through REST APIs and function together as one application. They are language-independent, so they can be developed with different technologies by different teams.

Since microservices promote modularity and separation of concerns, it is much easier to improve, test, and scale each of them separately. All of this is much harder, and ultimately much more expensive, to do with a monolithic application. If an application is built in a cloud-native fashion, microservices are usually deployed inside containers, which enhance these key features even further.
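
To make this more concrete, below is a minimal sketch of a single microservice exposing one REST endpoint over HTTP, written in Python with Flask; the service name, route, and data are invented for illustration.

    # A minimal "orders" microservice exposing one REST endpoint (names are illustrative).
    from flask import Flask, jsonify

    app = Flask(__name__)

    # In-memory stand-in for this service's own data store.
    ORDERS = {"1": {"id": "1", "status": "shipped"}}

    @app.route("/orders/<order_id>", methods=["GET"])
    def get_order(order_id):
        order = ORDERS.get(order_id)
        if order is None:
            return jsonify({"error": "not found"}), 404
        return jsonify(order)

    if __name__ == "__main__":
        # Each microservice runs as its own process and can be deployed, tested,
        # and scaled independently of the rest of the application.
        app.run(host="0.0.0.0", port=8080)

Another microservice, possibly written in a completely different language, would only need to call this HTTP endpoint in order to use the service.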

Containers

Containers are lightweight, self-contained, independent, and portable software units that bundle up application code and all application dependencies, so the application can run independently of the host environment. In simple terms, a containerized application can run on any computer as long as it runs the same container platform. Unlike virtual machines, containers don’t bundle up a guest operating system, which makes them lighter and faster. Furthermore, multiple containers can share their dependencies, which eliminates the need for extra copies.
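
As a small, hedged illustration of that portability, the sketch below uses the Docker SDK for Python to run a short-lived container from a public image; the image and command are arbitrary examples, and a local Docker daemon is assumed to be running.

    # Illustrative sketch using the Docker SDK for Python ("pip install docker").
    # Assumes a local Docker daemon is available.
    import docker

    client = docker.from_env()

    # The container brings its own filesystem and dependencies, so this command
    # behaves the same way regardless of what is installed on the host.
    output = client.containers.run(
        "python:3.12-slim",
        ["python", "-c", "print('hello from inside a container')"],
        remove=True,
    )
    print(output.decode())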

Image 1. Virtual Machines

Most Popular Container Platform

While there are several container platforms on offer, the most popular and widely used is Docker. Docker, software for building and running containers, is most often paired with Kubernetes, a container orchestration system. Kubernetes automates the deployment of containerized applications, their scaling on demand, their management, and their optimization, so none of this has to be done manually.

In addition, Kubernetes can perform all these operations across multiple physical or virtual machines (clusters), on public, private, and hybrid clouds or on-premise. Thus, you can containerize your microservices with Docker and use Kubernetes to automate their management across multiple hosts. You can use load balancing to distribute traffic to the deployed containers, attach storage to microservices in an automated fashion, control the creation and removal of containers based on a declared state, automatically match containers to nodes according to their resource needs, rely on self-healing of containers to keep downtime to a minimum, and much more.
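
To give a rough idea of what this automation looks like from code rather than from the command line, the sketch below uses the official Kubernetes Python client to declare a desired replica count for a deployment; the deployment name and namespace are placeholders, and a configured kubeconfig is assumed.

    # Illustrative sketch using the official Kubernetes Python client
    # ("pip install kubernetes"); assumes a kubeconfig pointing at a cluster.
    from kubernetes import client, config

    config.load_kube_config()
    apps = client.AppsV1Api()

    # Declare the desired number of replicas; Kubernetes then creates or removes
    # containers until the observed state matches this declared state.
    apps.patch_namespaced_deployment_scale(
        name="orders-service",   # placeholder deployment name
        namespace="default",
        body={"spec": {"replicas": 3}},
    )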

Serverless

In the serverless computing model, developers don’t need to worry about server management; those tasks are handled by the cloud provider. Of course, “serverless” doesn’t mean that there are no servers involved. The servers are still part of the system, but they are managed mainly by the cloud provider. This way, developers can focus more on the code, since they don’t have to worry about the infrastructure.

Though it is similar to the “Infrastructure as a Service” (IaaS) cloud model, the pricing is significantly different. It is based on the exact amount of resources the application consumes, in contrast to the prepurchase of capacity units. The idle time of an application is not included in the pricing, which means that the user doesn’t get charged for the time when the application is not executing.

There are two main implementations of serverless approach:

► “Backend as a Service” (BaaS) and

► “Function as a Service” (FaaS).

In the BaaS model, the backend part of an application is delegated to cloud provider services, which provide alternatives to custom backend code. These services include user authentication, database management, push notifications, cloud storage, and so on. For ease of communication and integration, they are usually exposed as APIs. An example of BaaS is Amazon API Gateway, a fully managed service for building REST and WebSocket APIs.

On the other hand, FaaS gives developers more control over backend functionality than BaaS. The backend logic is still in the hands of developers, but the containerization and deployment of this code are administered entirely by the cloud provider. Developers write functions that are deployed inside containers and triggered by events such as a user action or an incoming request. The most popular examples of FaaS are AWS Lambda, Google Cloud Functions, and Microsoft Azure Functions.
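
To give a feel for the FaaS model, here is a minimal AWS Lambda handler written in Python; the event shape assumes the function sits behind an HTTP trigger such as Amazon API Gateway, and the greeting logic is purely an example.

    # Minimal AWS Lambda handler (Python runtime). The platform provisions,
    # containerizes, and scales this code; the developer supplies only the function.
    import json

    def handler(event, context):
        # With an API Gateway proxy integration, query parameters arrive in the event.
        name = (event.get("queryStringParameters") or {}).get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }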

The Role Of DevOps

DevOps (development and operations) represents a set of practices that development and operation teams implement together in order to deliver quality software solutions at a faster pace. DevOps encourages an agile relationship between development and operation teams, so they can respond better and faster to consumer demands, and remain competitive in the market. This also includes the automation of as many processes as possible which fits perfectly in the cloud-native development model.

Image 2. DevOps Life Cycle

Additionally, the DevOps “agile” philosophy applies much better to the microservice architecture than to the traditional one, because microservices are more agile themselves. Combining an agile culture with agile technologies results in much better business responsiveness and productivity. For instance, the agile culture can be implemented with the Scrum framework; continuous, automated code integration and testing with a CI/CD platform such as Jenkins; code management and versioning with Git; automated microservice deployment with Docker and Kubernetes; continuous system monitoring with New Relic; and so on. Through all these aspects, DevOps brings both velocity and quality to the product release flow.

Design and Development

Because of the nature of the microservice architecture, cloud-native applications lend themselves to using different technologies within a single project. Every microservice can use the technology that best suits its functionality. This is a big advantage of cloud-native over traditional monolithic applications, and it is possible thanks to the REST APIs that microservices expose and use to communicate with each other over HTTP.

Since HTTP is technology-agnostic, the best-fitting technology can be chosen for each microservice. It is worth mentioning that a growing number of microservices can lead to complexity and communication difficulties. Service discovery mitigates this problem by eliminating the need to hardcode the IP addresses that microservices would otherwise use to locate one another. In addition, service mesh technologies like Istio abstract microservice communication into a separate infrastructure layer, which makes it much easier to optimize.
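
As a small illustration of what service discovery buys you, the call below addresses another microservice by its service name instead of a hardcoded IP; inside a Kubernetes cluster the name would be resolved by the cluster’s DNS, and the service name and route are invented for the example.

    # Calling a sibling microservice by name rather than by IP (illustrative names).
    import requests

    # "orders-service" is resolved by the platform's service discovery (e.g. cluster DNS),
    # so the caller never needs to know which host or IP currently serves the request.
    response = requests.get("http://orders-service/orders/1", timeout=5)
    response.raise_for_status()
    print(response.json())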

When it comes to the development process of cloud-native applications, the Twelve-Factor App is a widely accepted methodology. It is a set of best practices that developers follow in order to take full advantage of modern cloud environments. It emphasizes declarative automation, portability between execution environments, continuous deployment, and rapid scaling.
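
One of the twelve factors, “Config”, asks that anything varying between deployments be read from the environment rather than hardcoded; a minimal Python sketch of that single factor (the variable names are examples) could look like this.

    # Twelve-factor "Config": read deploy-specific settings from the environment,
    # so the same build artifact runs unchanged in development, staging, and production.
    import os

    DATABASE_URL = os.environ["DATABASE_URL"]        # required; fails fast if missing
    LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")  # optional, with a safe default

    print(f"Connecting to {DATABASE_URL} with log level {LOG_LEVEL}")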

Conclusion

Through all the aspects discussed above, we could say that the cloud-native philosophy is really about adding value to the business by responding adequately to a fast-paced market and high customer demands while, at the same time, reducing cost by adopting cloud-native standards. These objectives are achievable through the numerous benefits the cloud-native approach brings, such as secure and frequent releases, continuous improvement powered by robust automation, elastic scalability, application availability and resilience, pay-per-use of cloud resources, and more efficient development and operation teams. All the key attributes described here are oriented towards providing these benefits, and one could say that this is what cloud-native is all about.

Aleksandar Foltin
Software Developer, PanonIT