Modern Software Engineering using Containers and Microservices


Advait Kulkarni, IT Director, Cetera Financial Group

Today's business landscape is hyper-competitive and dynamic. Fuelled by intense competition, organizations are being forced to innovate, and software engineering is no exception. Enterprises now acknowledge that downtime costs real dollars, so staying on the cutting edge means minimizing it. There is also general agreement that a steady cadence of builds and releases, together with consistently good quality, is the recipe for a sound organizational engineering strategy.

To achieve this, software needs to be delivered as a service. Some of the basic principles that make this happen are:

• Always using declarative formats for automated setup, so that configuration is abstracted and reusable
• Enabling a continuous build and deploy process through continuous integration
• Being able to replicate environment settings across environments, so that test and production systems stay as close as possible (a minimal sketch follows this list)
• Defining consistent contracts with underlying systems to provide maximum portability
• Offering the flexibility to integrate with the latest security standards for authentication, authorization and federation
• Using a consistent architecture to move to cloud platforms, removing the need for physical servers
• Facilitating scale-up without changes to architecture and tools
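
As a minimal sketch of the environment-parity and declarative-configuration principles above, the Python snippet below pulls every deployment-specific setting from environment variables, so the same build artifact can run unchanged in test and production; the variable names (SERVICE_PORT, DATABASE_URL, LOG_LEVEL) are hypothetical.

    import os
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Settings:
        """Deployment-specific values supplied by the environment, not baked into the build."""
        port: int
        database_url: str
        log_level: str

    def load_settings() -> Settings:
        # Test and production differ only in configuration, never in the artifact itself.
        return Settings(
            port=int(os.environ.get("SERVICE_PORT", "8080")),
            database_url=os.environ["DATABASE_URL"],
            log_level=os.environ.get("LOG_LEVEL", "INFO"),
        )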
To make the above principles implementable, modern software engineering offers the following practical solutions.

Infrastructure as Code

Containers are a form of virtualization in which the operating system itself is virtualized and split into compartments that run containerized applications. This approach packages code into smaller, easily transportable pieces that can run anywhere Linux is running. It is a way to make applications even more distributed and to strip them down into specific functions. The benefits of this technology include building once and deploying many times, higher consistency between test and production environments, and increased modularity, which reduces the complexity of releasing updates and patches.
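
As a minimal sketch of the build-once, deploy-many idea, the snippet below uses the Docker SDK for Python to build an image from a hypothetical ./my-service directory containing a Dockerfile and then start a container from it; it assumes the docker package is installed and a Docker daemon is running.

    import docker

    # Connect to the local Docker daemon.
    client = docker.from_env()

    # Build the image once from a hypothetical ./my-service directory.
    image, build_logs = client.images.build(path="./my-service", tag="my-service:1.0")

    # The same image can now be run unchanged wherever a Docker engine is available.
    container = client.containers.run("my-service:1.0", detach=True, ports={"8080/tcp": 8080})
    print(container.id)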

Internal workings

Linux Containers (LXC) can be defined as a combination of kernel-level features that allow applications, together with their resources, to be managed within their own environment. By using features such as namespaces, chroots, cgroups and SELinux profiles, LXC contains application processes and helps manage them by limiting their resources and preventing them from reaching beyond their own file system.
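
As a minimal sketch of the resource-limiting side of this, the snippet below creates a cgroup v2 group, caps its memory, and moves the current process into it; it assumes a Linux host with cgroup v2 mounted at /sys/fs/cgroup, root privileges, the memory controller enabled for child groups, and a hypothetical group name demo-limit.

    import os
    from pathlib import Path

    # Assumes cgroup v2 at /sys/fs/cgroup, root privileges, and the memory
    # controller enabled for child groups.
    cgroup = Path("/sys/fs/cgroup/demo-limit")
    cgroup.mkdir(exist_ok=True)

    # Cap memory for every process placed in this group at 256 MiB.
    (cgroup / "memory.max").write_text(str(256 * 1024 * 1024))

    # Move the current process into the group; the kernel now enforces the limit.
    (cgroup / "cgroup.procs").write_text(str(os.getpid()))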

Container orchestration engine

As the use of containers explodes, there is a need to manage them across infrastructure changes, and many tools help orchestrate clusters of containers. These are enterprise framework technologies that integrate and manage container environments at scale. Kubernetes is an example of an orchestration system for Docker containers: it handles scheduling and manages workloads based on user-defined parameters. Docker Swarm, Amazon EC2 Container Service and Microsoft Azure Container Service are competitors in this space.
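
As a minimal sketch of driving an orchestrator programmatically, the snippet below uses the official Kubernetes Python client to list running pods and scale a deployment; it assumes the kubernetes package, a valid local kubeconfig, and a hypothetical deployment named my-service in the default namespace.

    from kubernetes import client, config

    # Load cluster credentials from the local kubeconfig.
    config.load_kube_config()
    core = client.CoreV1Api()
    apps = client.AppsV1Api()

    # Inspect the workloads the orchestrator is currently scheduling.
    for pod in core.list_pod_for_all_namespaces().items:
        print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)

    # Scale a hypothetical deployment to three replicas; Kubernetes reschedules as needed.
    apps.patch_namespaced_deployment_scale(
        name="my-service",
        namespace="default",
        body={"spec": {"replicas": 3}},
    )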

Serverless technology

Contrary to the name, this does not mean that no servers are involved in the IT process at all; rather, the consumer of a function or service does not have to own the system that runs it. This is in line with the Platform-as-a-Service model, which lets applications make calls directly to the cloud service. The cloud provider serves the requests, and customers are billed by an abstract measure of the resources required to satisfy each request rather than per virtual machine per hour. Solutions like Microsoft Azure Functions and Amazon AWS Lambda run code without provisioning or managing any servers. Using these techniques, enterprises can run almost any type of code as a backend service without having to worry about scaling and availability.
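
As a minimal sketch, the handler below follows the standard entry-point signature that AWS Lambda expects for Python functions; the event fields and response body are hypothetical, and the platform handles provisioning, scaling and per-invocation billing.

    import json

    def lambda_handler(event, context):
        """Invoked by AWS Lambda; the caller provisions and manages no servers."""
        name = (event or {}).get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }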

Microservices

In recent years, microservices have emerged as a way of designing software applications as suites of independently deployable services. The approach is a variant of the service-oriented architecture (SOA) style, structuring an application as a collection of loosely coupled services. The benefits of decomposing an application into smaller services include easier understanding, quicker development, more granular deployments and cleaner testing.
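
As a minimal sketch of one such service, the snippet below exposes a single business capability over HTTP using Flask; the endpoint, port and data are hypothetical, and in practice a service like this would be packaged and deployed in its own container.

    from flask import Flask, jsonify

    app = Flask(__name__)

    # One narrowly scoped business capability exposed over HTTP.
    @app.route("/orders/<order_id>")
    def get_order(order_id):
        # Hypothetical lookup; a real service would query its own datastore.
        return jsonify({"id": order_id, "status": "shipped"})

    if __name__ == "__main__":
        # The service is developed, deployed and scaled independently of its peers.
        app.run(host="0.0.0.0", port=8080)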

How do Docker containers help microservices?

Some of the requirements and resulting benefits of the microservices architecture are as follows:
• Each microservice is relatively small in size
• Each service can be deployed completely independently of other services
• Development is easier to scale because of the smaller footprint
• Fault isolation is improved, so a failure does not interrupt other services and containers
• Each service can be developed and deployed independently, with no dependencies
• There is no long-term commitment to a single technology stack

Docker provides the following characteristics, which help microservices:

• Isolation: Microservices are built around business capabilities and are independently deployable by fully automated deployment machinery. Each microservice can be deployed without interrupting the others, and containers provide an ideal environment for service deployment in terms of speed, isolation management, and lifecycle. It is easy to deploy new versions of services inside containers.
• Application portability: Docker puts an application and all of its dependencies into a container that is portable across different platforms, different Linux distributions and clouds. You can build, move and run distributed applications with containers. By automating deployment inside containers, developers and system administrators can run the same application on laptops, virtual machines, bare-metal servers, and the cloud.
• Resource utilization: Containers comprise just the application and its dependencies, nothing more and nothing less. Each container runs as an isolated process in user space on the host operating system, sharing the kernel with other containers. It therefore enjoys the resource isolation and allocation benefits of virtual machines while being far more portable and efficient. This means containers can run not only on VMs but also directly on physical servers, and because of their lightweight nature you can run more containers than virtual machines on a given physical server. The result is higher resource utilization (a sketch of container-level resource limits follows this list).
• Beyond virtualization to containers: Whenever you need tight resource control and isolation for your application environment, you use virtual machines. But what if your application environment does not require the hardware resources of full virtualization? Containers can provide user environments whose resource requirements can be strictly controlled, with or without using virtualization.
• Enterprise partnerships: Container technology is the major emerging technology in the IT industry following the server virtualization revolution, and Docker is leading this trend with new partnerships. Industry-leading companies, including IBM, Google, VMware, Red Hat, and Microsoft, have signed partnership agreements with Docker. Those agreements show that Docker has huge potential in the era of cloud and virtualization.
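
As a minimal sketch of the resource-utilization point above, the snippet below starts a container with explicit memory and CPU limits through the Docker SDK for Python; it assumes a running Docker daemon and a hypothetical my-service:1.0 image.

    import docker

    client = docker.from_env()

    # Run the service with a hard memory cap and half a CPU; many such
    # containers can share one host while remaining isolated from each other.
    container = client.containers.run(
        "my-service:1.0",
        detach=True,
        mem_limit="256m",
        nano_cpus=500_000_000,  # 0.5 CPU, expressed in units of 1e-9 CPUs
    )
    print(container.short_id)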

Security Automation

As automated continuous build and deploy tasks become more mainstream, the need for security automation to monitor security concerns has grown with them. To stay proactive about security checks, policy-execution automation, which refers to automating any administrative work required of IT security, has increased drastically.
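
As a minimal sketch of such an automated policy check, the snippet below inspects locally available Docker images and flags any whose default user is root, the kind of gate that could run on every build; it assumes the Docker SDK for Python and a running daemon, and the non-root policy itself is only an example.

    import docker

    client = docker.from_env()

    # Example policy: an empty Config.User means the image defaults to root.
    violations = []
    for image in client.images.list():
        user = image.attrs.get("Config", {}).get("User") or "root"
        if user == "root":
            violations.append(image.tags or [image.short_id])

    if violations:
        print("Policy violation - images that run as root:", violations)
    else:
        print("All images pass the non-root policy.")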
