Tech Talk

What is containerisation, and what role do memory and storage play within it?


Last updated 9 February 2023

Whilst we all get our heads around the likes of IoT, Artificial Intelligence, Machine Learning and all of the other emerging trends in the data centre, one thing we are yet to discuss is ‘containerisation’ and the role that memory and storage, specifically DRAM and SSD, play within it. Let's do that.

What is containerisation?

Containerisation is a technology that has been around for almost as long as VMs (virtual machines), but it didn't really become a hot topic until the likes of Docker and Kubernetes came along, causing a ripple in the world of developers and technicians.

The technology allows such folk to package and deploy applications as containers. Developers manage a single executable package that bundles everything a piece of software needs to run: the code within, the runtime, the system tools, its libraries and its settings, all in one lightweight and isolated container.
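As a minimal sketch of what such a package looks like, here is an illustrative Dockerfile; the base image, file names and setting below are examples rather than anything from a real project:

    # Bundle the runtime, libraries, code and settings into one image
    FROM python:3.11-slim                  # the runtime
    WORKDIR /app
    COPY requirements.txt .
    RUN pip install -r requirements.txt    # the libraries
    COPY . .                               # the code within
    ENV APP_ENV=production                 # the settings
    CMD ["python", "app.py"]               # run the piece of software

Built once with docker build and started with docker run, the resulting container carries everything the application needs, wherever it is deployed.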

We mentioned VMs further up, and here is where containers differ: unlike virtual machines, which each run an entire OS (operating system) and provide a fully virtualised environment, containers share the host OS along with its resources.

What's more, containers are portable. They can run on any host that supports the container runtime. VMs, well, they are more often than not tied to a specific hypervisor and hardware architecture, which ultimately makes them less portable.

Why share the host OS? Well, this makes containers far more efficient and simply able to stop and start at a greater pace. VMs still have their place where strict isolation of resources and multi-tenancy are required.

How can containerisation benefit those who provide hosting?

Like most things in life, optimisation is key. For hosting firms and data centres, containers are definitely a solution that allows them to optimise their infrastructure and, ultimately, better their customers' experience.

Here's how:-

  1. Resource Efficiency and Utilisation: Through a container approach, hosting companies can maximise their hardware by running many containers on a single host system, as discussed. Data centres and hosting companies are constantly battling with density; containers increase the density of applications that can be hosted, thus reducing the total number of servers within the estate.
  2. Scalability: Demand changes, and the past three years have shown us that at a rate like never before. With containers, you can easily scale in two ways: vertically, by adding more resources to a single container, and horizontally, by adding more containers to the deployment (see the sketch after this list). Responding to demand for resources becomes something you control, ensuring applications always have the resources they need.
  3. Portability: As discussed in more detail above, such portability allows hosting companies who deploy containers to move applications between various and differing environments, whether those applications are at the development, testing or live stage.
  4. Cost savings: Something we mention a lot here at Simms, and how we can help our customers through enterprise-standard SSD & RAM, is TCO (total cost of ownership). Simple stuff: hosting companies can reduce the number of servers through container deployment. What's more, this allows your team to reduce the time and effort required to manage applications and infrastructure, reducing costs even more.
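To make point 2 concrete, here is a rough sketch of both directions using Docker tooling; the service and container names are illustrative, and the first command assumes swarm mode is enabled:

    # Horizontal: run five replicas of the 'web' service instead of one
    docker service scale web=5

    # Vertical: give a single running container more memory and CPU
    docker update --cpus 2 --memory 4g --memory-swap 4g web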

 

Container Deployment Example (diagram)

 

Speaking of us, what role could memory and storage play in containerisation?

We work with Micron, Solidigm and Kingston on a range of server SSDs (NVMe, SATA) and DRAM (DDR5, DDR4) for integration into a hosting company's, cloud service provider's or data centre's infrastructure.

When thinking about containerisation, the role of memory and storage, specifically enterprise-grade, can be a big one when it comes to allowing resources to function the best they possibly can.

Memory: when it comes to RAM, the memory resources of the host system determine the amount of memory that is ultimately available to the containers. Multiple containers on the same host system all pull from that same pool of memory resources. This can lead to performance issues if one or more containers consume an excess of memory: exhaustion can potentially disrupt the stability of the system and every container on it.

Where server-grade RAM, and simply having more of it, can play a critical role is in controlling the amount of memory available to each container: having enough RAM in the host ensures that the containers run smoothly and creates the best possible experience for users.
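As a sketch of what that control looks like in practice (the container name, image and limit here are illustrative), Docker lets you cap how much of the host's RAM each container may consume:

    # Cap this container at 512 MB so one greedy workload cannot
    # exhaust the host's memory and destabilise its neighbours
    docker run -d --name web --memory 512m --memory-swap 512m nginx

Sizing the host's DRAM so the sum of these limits fits comfortably within it is what keeps every container running smoothly.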

Storage: when it comes to containers, they rely on the host system to store persistent data. SSDs play an imperative role within that architecture, providing fast and reliable storage compared to the likes of HDDs, whose slower read and write speeds would negate the performance of the containers and the applications they host, especially applications that require high I/O (input/output) performance, such as databases.
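To make that persistent data concrete, here is a rough sketch using a named Docker volume; the names and image are illustrative. The volume lives on the host's storage, ideally SSD-backed, and survives the container being removed:

    # Create a named volume on the host's (ideally SSD-backed) storage
    docker volume create pgdata

    # Mount it into a database container; the data outlives the container
    docker run -d --name db -e POSTGRES_PASSWORD=example \
        -v pgdata:/var/lib/postgresql/data postgres:16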

By having enough server SSD storage available, hosting companies can have peace of mind that their containers run as well as possible.

Author

Drew

Drew is Marketing Lead at Simms, leading our marketing department. Drew has a strong knowledge of the data centre and server proposition, and also leads on the industrial and embedded side of the business.