
How Kubernetes Makes Businesses' Cloud-Based Applications Scalable


When businesses roll out cloud-based apps, whether consumer-facing or for internal use only, a legitimate concern is whether the application will be able to keep up as demand grows. Kubernetes and Google Kubernetes Engine (GKE) enable businesses to adapt quickly to increased demand on their applications and to an ever-changing marketplace by automating the deployment of containerized workloads.

 

GKE also creates simple paths for troubleshooting issues as they arise. Together, these conveniences help ensure cloud-based apps remain competitive and functional for the long haul.

The creative power of Kubernetes, always on hand

Kubernetes is a popular choice for hosting workloads because of the ease with which it lets developers define how applications should run. It does this by relying on developer-specified definitions, most often written as YAML files.

 

While there is a good amount of initial labor involved in writing those definitions, from that point on, Kubernetes takes over. Much like a creation myth, the developer “speaks the world into existence” by writing JSON or YAML definitions, and Kubernetes then sets to work making those words reality.
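To make that concrete, here is a minimal sketch of such a definition: a Deployment that asks Kubernetes to keep three replicas of a storefront container running. The names and image are illustrative placeholders, not any real configuration.

```yaml
# Minimal illustrative Deployment. Kubernetes continuously reconciles the
# cluster so that three replicas of this (hypothetical) container are running.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: storefront
  labels:
    app: storefront
spec:
  replicas: 3
  selector:
    matchLabels:
      app: storefront
  template:
    metadata:
      labels:
        app: storefront
    spec:
      containers:
        - name: storefront
          image: example.com/storefront:1.0   # placeholder image
          ports:
            - containerPort: 8080
```

Applying the file with kubectl apply -f storefront.yaml is all the “speaking” required; Kubernetes handles scheduling, restarts, and replacement of failed pods from then on.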

 

This drives scalability in two ways. First, it ensures that as applications evolve, they do so in a controlled way.

 

Suppose, for example, a retail business has a mobile application that customers use to locate products in its stores, and it wants to add a function that lets users sort items by price. That change creates the risk of some clusters drifting, unsure of how they are supposed to behave within the new workload. Kubernetes acts as the “judiciary” in this situation, applying the developers’ definitions to the new function’s workloads.
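As an illustrative sketch of how that stays controlled (the service name and image below are hypothetical), the new function can ship as its own declaratively defined workload, with a rolling-update strategy that limits how much of the cluster changes at once:

```yaml
# Hypothetical Deployment for the new sort-by-price function. The rolling
# update strategy keeps changes controlled: at most one extra pod is created
# and at most one pod is unavailable while a new version rolls out.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: price-sort
spec:
  replicas: 2
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxSurge: 1
      maxUnavailable: 1
  selector:
    matchLabels:
      app: price-sort
  template:
    metadata:
      labels:
        app: price-sort
    spec:
      containers:
        - name: price-sort
          image: example.com/price-sort:1.0   # placeholder image
```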

 

The second way Kubernetes drives scalability is by accommodating the different ways end users actually use applications, whether or not that fits the developers’ original intentions.
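One concrete mechanism for this, sketched below on the assumption that the hypothetical storefront Deployment above exists, is a HorizontalPodAutoscaler, which adds or removes replicas based on how heavily users are actually hitting the workload:

```yaml
# Illustrative HorizontalPodAutoscaler. Kubernetes scales the storefront
# Deployment between 3 and 20 replicas as real user traffic pushes average
# CPU utilization above or below 70%.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: storefront-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: storefront
  minReplicas: 3
  maxReplicas: 20
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```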

How Kubernetes responds to users’ demands on applications

One issue with providing a single, fixed set of YAML definitions is that they can be too rigid to serve application users well. The flip side of that coin is that if developers are constantly reinventing the wheel, maintenance becomes a never-ending chore.

 

Kubernetes allows for several possible solutions to this issue. For the most part, the approach depends on the developers’ preferences and which strategy may produce the best result given the specific circumstances.

 

There are ways to write definitions with a degree of flexibility built in, such as overlay configurations and parameterized templating. Kubernetes also lets developers tweak definitions “after the fact” by replicating existing definitions and then customizing them for new purposes.
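One common form of overlay configuration is a Kustomize overlay. The sketch below assumes a base/ directory whose own kustomization.yaml lists the original storefront Deployment; it reuses that base and patches only what differs:

```yaml
# overlays/high-traffic/kustomization.yaml (illustrative layout)
# Reuses the base definition and overrides only the replica count instead of
# copying and rewriting the whole Deployment.
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../../base            # base directory with the original Deployment
patches:
  - target:
      kind: Deployment
      name: storefront
    patch: |-
      - op: replace
        path: /spec/replicas
        value: 10
```

Applying it with kubectl apply -k overlays/high-traffic builds the patched definition on the fly, so the base file never has to be duplicated by hand.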

 

A real-world illustration of this benefit of Kubernetes is a company that wants its inventory-management employees to use an internal application originally built for shipping. To do this successfully, the application has to update inventory totals when items ship so that staff can respond appropriately.

 

Kubernetes gives developers several options for integrating new workload containers into the application, based on convenience and efficiency, without having to modify the entire model. Once the new definitions are in place, Kubernetes manages the new containers as if they had been there all along.
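For instance, the inventory function could be added as its own small workload beside the existing shipping service. Everything below, including the names, image, and environment variable, is a hypothetical sketch:

```yaml
# Hypothetical inventory-sync workload added alongside the existing shipping
# service. It consumes shipping events and updates inventory totals; Kubernetes
# manages it exactly like the containers that were defined from the start.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: inventory-sync
spec:
  replicas: 1
  selector:
    matchLabels:
      app: inventory-sync
  template:
    metadata:
      labels:
        app: inventory-sync
    spec:
      containers:
        - name: inventory-sync
          image: example.com/inventory-sync:1.0   # placeholder image
          env:
            - name: SHIPPING_EVENTS_TOPIC          # placeholder configuration
              value: "shipping-events"
```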

 

Another way Kubernetes can greatly aid businesses’ growth is in addressing the inevitable issues that arise with cloud-based applications. One company has already seen tremendous improvement in its e-commerce functionality through Kubernetes.

How Kubernetes eliminated downtime for Shopify

 

Shopify is one of the premier e-commerce platforms in the world, enabling small and medium-sized business owners to expand their reach globally. Using Google Kubernetes Engine has allowed Shopify to keep its customers’ online shops up and running.

 

“At Shopify, over the years, we moved from shards to the concept of ‘pods,’” said Kir Shatrov, a production engineer for Shopify. “A pod is a fully isolated instance of Shopify with its own datastores like MySQL, Redis, Memcached. A pod can be spawned in any region. This approach has helped us eliminate global outages. As of today, we have more than a hundred pods, and since moving to this architecture we haven't had any major outages that affected all of Shopify. An outage today only affects a single pod or region. As we grew into hundreds of shards and pods, it became clear that we needed a solution to orchestrate those deployments. Today, we use Docker, Kubernetes, and Google Kubernetes Engine to make it easy to bootstrap resources for new Shopify Pods.”

 

GKE integrates with two products, Cloud Logging and Cloud Monitoring. Working in tandem, they essentially provide “breadcrumbs” for developers to follow when they encounter an issue.

 

All GKE clusters come with Cloud Logging, which automates the recording, searching, and storage of logs of cluster activity. This greatly eases the process of addressing bugs.

 

If, for example, a customer trying to make a purchase on one of Shopify’s clients’ consumer-facing mobile applications is shown an error message at checkout, a developer could go straight to the checkout service’s container in the Kubernetes Engine Console. From there, the developer can use the Logs Viewer to search for error messages.
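As a sketch of what that search might look like (the container name is a placeholder, and the exact resource labels depend on how the workload is deployed), a Cloud Logging filter along these lines narrows the logs to errors from the checkout container:

```
resource.type="k8s_container"
resource.labels.container_name="checkout"
severity>=ERROR
```

The same filter can also be passed to gcloud logging read to pull matching entries from the command line.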

 

Once the developer has identified the log entries that match the issue, they can pinpoint the exact location of the defective code. From there, it’s simply a matter of fixing the defect.

Logging can also help with decisions about how to expand cloud-based applications. It’s simply a matter of treating the logs themselves as data.

What GKE logs can tell businesses about the future of their apps

Social media analytics can tell businesses who their target customers are, but GKE’s Cloud Logging can tell them how those customers actually use their apps. Essentially, it’s advanced analytics.

 

For example, developers can run custom queries against the logs to find specific users and see which requests they sent to the relevant containers. That enables them to identify and share trends, such as checkout services being executed after the app recommends particular products.
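A sketch of such a query, assuming the application writes structured logs that include a user identifier field (the field name and value here are hypothetical), might look like this:

```
resource.type="k8s_container"
resource.labels.container_name="checkout"
jsonPayload.userId="user-12345"
```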

 

This supports scalability because it pinpoints exactly which requests app users make most often and in what context. That makes it easier to shed features users don’t actually use and to direct resources toward improving the features they prefer.

 

With developers spending far less time and effort debugging applications and guiding their evolution, they are free to innovate. CloudMatos can help businesses harness the power of Kubernetes not only to make life easier today but also to prepare for the future.

 

