Wednesday, April 1, 2020

Kubernetes - Deploying Pod workloads and running Docker images with a load balancer


In this post I go over how to deploy manageable pod workloads and run Docker images behind a load balancer using Kubernetes. I first configured a cluster with three nodes on DigitalOcean and connected to it over SSH from the command line. I then set up the appropriate YAML file and checked that my nodes were up.
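The exact YAML file isn't reproduced in the post; a minimal sketch of such a pod workload manifest might look like the following (the name `nginx-example` and the replica count are illustrative assumptions, not the original file):

```yaml
# Sketch of a Deployment manifest for a pod workload.
# After downloading the cluster's kubeconfig, verify the nodes with:
#   kubectl get nodes
# and apply this file with:
#   kubectl apply -f deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nginx-example        # hypothetical name
spec:
  replicas: 3                # one pod per node in the 3-node cluster
  selector:
    matchLabels:
      app: nginx-example
  template:
    metadata:
      labels:
        app: nginx-example
    spec:
      containers:
      - name: nginx
        image: nginx:latest  # the Docker image being deployed
        ports:
        - containerPort: 80
```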


Configuring the Kubernetes nodes


More Kubernetes insights

Load averages for the virtual machines
Kubernetes on DigitalOcean also gives a lot of insight into network management, which becomes more important as sites scale, since the goal is an efficient infrastructure for the websites I am creating and deploying. I will go over the intricacies in subsequent posts; for now, suffice it to say that from this point a broad variety of workloads can be employed, from cron jobs to all sorts of daemon sets.
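As one example of those other workload types, here is a minimal CronJob sketch; the name, schedule, and image are illustrative assumptions rather than anything from this cluster:

```yaml
# Sketch of a CronJob workload. Note: clusters older than
# Kubernetes 1.21 (including those around the time of this post)
# used apiVersion: batch/v1beta1 instead.
apiVersion: batch/v1
kind: CronJob
metadata:
  name: hello-cron             # hypothetical name
spec:
  schedule: "*/5 * * * *"      # run every five minutes
  jobTemplate:
    spec:
      template:
        spec:
          containers:
          - name: hello
            image: busybox:1.36
            command: ["echo", "hello from the cluster"]
          restartPolicy: OnFailure
```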

With the nodes created, I formed the cluster
At this point the nodes formed the cluster. I then provisioned it to run an example image, with a load balancer funneling traffic from different regions to different nodes running the same image. Below is the file where I configured the server protocol, ports, and everything else.
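The configuration file itself appears only as an image in the original post; a hedged sketch of such a Service, with the protocol and ports configured as the text describes (the names and the matching label are assumptions), would be:

```yaml
# Sketch of a Service that fronts the pods with a load balancer.
# On DigitalOcean, type: LoadBalancer provisions a managed
# load balancer that routes traffic across the nodes.
apiVersion: v1
kind: Service
metadata:
  name: nginx-lb             # hypothetical name
spec:
  type: LoadBalancer
  selector:
    app: nginx-example       # must match the Deployment's pod labels
  ports:
  - protocol: TCP
    port: 80                 # external port on the load balancer
    targetPort: 80           # container port on each pod
```

Once applied, `kubectl get services` shows the load balancer's external IP when provisioning completes.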




Now NGINX is serving up the content appropriately. It is time to customize further, but the base is in place: a solid foundation of pod workloads running Docker images behind a load balancer for efficiency.


