3 Problems IT Leaders Solve with Container Technology

In 2019, migration of legacy applications to containers will be one of the hottest IT infrastructure trends. Container technology is extremely appealing to organizations because it lets development teams move software reliably from one environment to another. A report by 451 Research finds that application containers will be the fastest-growing segment of the market, with an estimated compound annual growth rate of 40 percent. A Red Hat survey supports this finding, adding that container usage is expected to increase 89 percent over the next two years.

Yet many IT leaders, especially in larger enterprise organizations, are still struggling to define the value and return on investment. Find out how container technology can solve the top three problems IT leaders are encountering today.

Inability to Scale

Many current IT architectures are 10, or even 20, years old. They were built in the days when hardware virtualization and orchestration were the prevailing bleeding-edge technologies, and their applications were built in three tiers:

  • Front end

  • Middleware

  • Database

However, with the explosive growth of the Internet of Things (IoT) and mobile-enabled devices, Fortune 500 companies are now scrambling to create web-enabled applications. Within such applications, there is an exponential increase in application service requests, network traffic, and data storage requirements.

Today, companies can see utilization jump 20% to 30% in a single day. This growth in utilization has also driven demand for technical expertise, adding 290,000 jobs over the last year.

In a real-world example, when a bank released a feature update to its mobile app, the update added five new functions, each of which generates six new web service calls. Multiply that by 5 million daily active users, and utilization increased by 20% in a day.
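The arithmetic behind that spike can be sketched in a few lines. The call volumes are the article's own figures; the implied pre-release baseline is a derived, hypothetical number used only for illustration:

```python
# Back-of-the-envelope math for the bank's mobile app update.
new_functions = 5            # functions added by the feature update
calls_per_function = 6       # new web service calls each function generates
daily_active_users = 5_000_000

# Extra web service calls per day introduced by the release.
extra_calls_per_day = new_functions * calls_per_function * daily_active_users
print(extra_calls_per_day)   # 150000000 additional calls per day

# If that volume represents a 20% jump, the implied baseline (a derived,
# hypothetical figure, not stated in the article) would be:
baseline_calls = extra_calls_per_day / 0.20
print(int(baseline_calls))   # 750000000 calls per day before the release
```

Even with modest per-user numbers, the multiplication across millions of daily users is what overwhelms a statically provisioned architecture.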

The problem here is that legacy monolithic architectures were never built to scale on demand. Most IT shops have only primitive orchestration tools at the infrastructure layer, so they are unable to handle increased workloads at the application layer. The result is severe outages of customer-facing applications, and it can take the IT team days or weeks to add the infrastructure required to scale to the demands of the business.

Containers at their core enable application orchestration. They allow infrastructure to scale with application demand, spinning up additional compute instances as request volume grows. Coupled with a microservice software architecture, this gives IT teams an elastic approach to utilization spikes. So when a mobile app update drives a 20% increase in traffic, a flexible, well-orchestrated container architecture built on tools like Kubernetes scales to those requests automatically.

With container technology, IT administrators do not need to scramble: there are no complex reconfigurations and no outages. Kubernetes can orchestrate Docker images and adapt to the workload on demand.
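As a sketch of what that on-demand adaptation looks like in practice, a Kubernetes HorizontalPodAutoscaler manifest can tell the cluster to add pods when utilization climbs. The Deployment name `mobile-api` and the thresholds below are illustrative assumptions, not details from the article:

```yaml
apiVersion: autoscaling/v1
kind: HorizontalPodAutoscaler
metadata:
  name: mobile-api-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: mobile-api               # assumed Deployment name
  minReplicas: 3                   # steady-state footprint
  maxReplicas: 30                  # headroom for a 20-30% utilization spike
  targetCPUUtilizationPercentage: 70   # add pods when average CPU passes 70%
```

Applied with `kubectl apply -f`, a policy like this is what replaces the days-or-weeks manual provisioning cycle described above: the orchestrator watches utilization and scales the replica count within the declared bounds.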

Lack of Available Bandwidth

Kubernetes does what legacy architectures cannot. At the hardware level, physical limitations are eliminated: with containers and orchestration, DevOps teams can immediately bring additional private or public cloud resources online. Modern orchestration technologies, combined with good DevOps pipelining, let your team automatically burst workloads out to additional compute, storage, and network resources.

The most common use case is a company running a private cloud on premises. For example, a Fortune 500 retail company has a private cloud that manages its entire online presence, and its infrastructure is strained every time it runs big sales on Black Friday or Labor Day.

However, with a solid container and orchestration strategy, a portion of that traffic can be pushed to public cloud containers to meet demand on those specific sales days.

Modern orchestration platforms ship with the expected plug-ins and integrations for public clouds. The good news about this burst strategy is that when utilization drops, the orchestration layer spins the extra infrastructure back down. This pay-for-what-you-use model keeps operational costs to a minimum.

Long Feature Release Cycles

Monolithic application architectures typically require long, laborious development, quality assurance, and production lifecycles. Incremental code changes must be compiled into complete releases, and all features, new and existing, must be tested to certify the build. This demands an enormous amount of coordination and effort from development teams, who consequently cannot keep up with the needs of the business. Development cycles lag and release schedules slip.

In contrast, container technology enables development teams to separate the features and functions of a software application and isolate them into a handful of container-enabled services, an approach often called a “microservice architecture.”

The concept behind container technology is simple. By building a microservice architecture, you can isolate feature testing down to the individual microservice and reduce the number of dependency tests, in some cases by as much as 90%. This software development approach then enables your team to implement continuous integration and deployment, which is exactly how Facebook, Amazon, and other software companies roll out features, sometimes as often as every few minutes. Married to containers and orchestration, it puts you at the apex of your DevOps journey: your applications are almost alive, adapting automatically to peaks and troughs in utilization.
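To illustrate how that testing isolation works, here is a minimal Python sketch, with entirely hypothetical service and function names, of a single microservice's feature tested against a stub instead of the monolith's full dependency chain:

```python
# Hypothetical "balance" microservice: the only logic this service owns.
def format_balance(cents: int, currency: str = "USD") -> str:
    """Render an account balance as a display string."""
    if cents < 0:
        raise ValueError("balance cannot be negative")
    return f"{cents / 100:.2f} {currency}"

# A stub stands in for the account-store service this one would normally
# call over the network, so a test exercises only this service's contract.
def stub_account_store(account_id: str) -> int:
    return {"acct-1": 12_345}.get(account_id, 0)

def balance_endpoint(account_id: str, store=stub_account_store) -> str:
    """The service's request handler, with its dependency injected."""
    return format_balance(store(account_id))

print(balance_endpoint("acct-1"))   # 123.45 USD
```

Because the dependency is injected, verifying a change to this feature means re-running only this service's small test suite, rather than recertifying every feature of a monolithic build.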

In conclusion, containers offer previously unimaginable scalability, access to bandwidth, and rapid deployment of new features. There are many mature enterprise solutions and services in the marketplace that can get you started on what many in the industry call “the DevOps journey.”

It’s not too late to get started today.

About the Author

David Watson is a Red Hat Ansible and OpenShift consultant with a deep background in container platform implementation. He is part of a team of consultants at Stone Door Group, a DevOps Solutions Integrator, who help companies of all sizes execute on their digital transformation initiatives. To learn more about Stone Door Group, drop us a line at letsdothis@stonedoorgroup.com