Serverless Workloads on centron Kubernetes with Knative: A Guide

Discover in our latest blog post how to deploy serverless workloads on your centron Kubernetes cluster using Knative Serving. From setting up the cluster to deploying your Node.js application, this guide provides a comprehensive overview of the process and shows how to gain efficiency and flexibility in the cloud by running serverless workloads on Kubernetes.

Kubernetes is a powerful tool for container orchestration, enabling the deployment and management of containerized applications. However, managing the underlying infrastructure can sometimes be time-consuming. The serverless paradigm allows users to deploy applications without worrying about the underlying infrastructure. With the advent of Serverless 2.0, many platforms and tools now allow for the deployment of serverless applications on Kubernetes.

Knative is a Kubernetes-based platform that provides components to deploy and manage serverless workloads. It is open source, cloud agnostic, and extensible, offering building blocks that integrate natively with Kubernetes. Platforms such as Red Hat's OpenShift also use Knative to help users deploy their serverless workloads on Kubernetes.

Knative has two main components: Eventing and Serving. Eventing manages the events that trigger serverless workloads, while Serving provides the components for deploying and managing those workloads on Kubernetes. With Knative Serving, developers can quickly and easily deploy new services, scale them up and down, and connect them with other services and event sources. This enables developers to create and deploy modern, cloud-native applications that are flexible, scalable, and easy to maintain.

In this tutorial, you will use Knative Serving to deploy a Node.js application as a serverless workload on a centron Kubernetes cluster. You will create the cluster with the centron CLI and then use kubectl to install Knative Serving and deploy the application.

Prerequisites

To complete this tutorial, you will need:

  • A ccenter account to start a Kubernetes cluster with at least 4 GB RAM and 2 CPU cores
  • kubectl and Docker installed and configured on your computer
  • A sample Node.js application (a minimal example is sketched after this list)
  • A Docker Hub account to store the Docker images you create during this tutorial
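
If you don't already have a sample application, a minimal Node.js server like the following is sufficient for this tutorial. The file name `server.js` and the response text are only examples; the important detail is that Knative supplies the port to listen on through the PORT environment variable:

const http = require('http');

// Knative injects the listening port via the PORT environment variable.
const port = process.env.PORT || 8080;

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from Knative on centron Kubernetes!\n');
});

server.listen(port, () => {
  console.log(`Listening on port ${port}`);
});

A simple Dockerfile to containerize the application might look like this (also an example, assuming the file above is named `server.js`):

# Example Dockerfile for the sample Node.js application
FROM node:18-alpine
WORKDIR /app
COPY server.js .
CMD ["node", "server.js"]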

Step 1 – Starting the centron Kubernetes Cluster

Since Knative is a Kubernetes-based platform, you’ll use it with a Kubernetes cluster on centron. There are several ways to start a Kubernetes cluster on centron, including using the ccloud³ interface, the centron CLI, or the Terraform provider.

To use Knative effectively in this tutorial, you will need a Kubernetes cluster with at least 4 GB RAM and 2 CPU cores. You will create a cluster named `knative-tutorial` that meets these requirements using a command with the following flags:

  • --size specifies the size of the remote server (each node in the cluster).
  • --count specifies the number of nodes to be created as part of the cluster.

To create the centron Kubernetes cluster, execute the following command:


kubernetes cluster create knative-tutorial --size s-2vcpu-4gb --count 3

In this command, you create a cluster named `knative-tutorial` consisting of 3 nodes, each with 2 CPU cores and 4 GB RAM.

Note: You can also use the --region flag to specify the server's region, although this option is not used in this tutorial. If you're working from a remote server, you might want to place your cluster in the same region as that server. If using the centron Control Panel to create your cluster, you can select a datacenter region and a VPC network. For more information on centron's regional availability, see our Regional Availability Matrix.

This command takes a few minutes to complete. Once finished, you will receive a message similar to the following:


Output
Notice: Cluster is provisioning, waiting for cluster to be running
…………………………………………………..
Notice: Cluster created, fetching credentials
Notice: Adding cluster credentials to kubeconfig file found in “/home/sammy/.kube/config”
Notice: Setting current-context to do-nyc1-knative-tutorial
ID                                      Name               Region   Version       Auto Upgrade   Status    Node Pools
d2d1f9bc-114b-45e7-b109-104137f9ab62    knative-tutorial   nyc1     1.24.4-do.0   false          running   knative-tutorial-default-pool

The cluster is now ready for use.

You can now verify that kubectl is set up on your system and can access the centron Kubernetes cluster by running the following command:

kubectl cluster-info

You should see output similar to the following:


Output
Kubernetes control plane is running at https://69de217e-0284-4e18-a6d7-5606915a4e88.k8s.oncentron.de
CoreDNS is running at https://69de217e-0284-4e18-a6d7-5606915a4e88.k8s.oncentron.de/api/v1/namespaces/kube-system/services/coredns:dns/proxy

Now you have a Kubernetes cluster set up on centron and can proceed to install Knative and deploy your Node.js application.

Step 2 – Installing Knative Serving

With your centron Kubernetes cluster set up, install Knative Serving on this cluster.

You will install Knative Serving using kubectl, applying the official YAML manifests that define the Knative Serving custom resources and core components.

To install Knative Serving, execute the following commands:


kubectl apply --filename https://github.com/knative/serving/releases/download/v0.30.0/serving-crds.yaml
kubectl apply --filename https://github.com/knative/serving/releases/download/v0.30.0/serving-core.yaml

These commands apply the YAML files that define the resources for installing Knative Serving. The first line applies the Custom Resource Definitions (CRDs) for Knative Serving, and the second line applies the core resources.
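
Optionally, before checking the individual pods, you can wait for the Knative Serving deployments to report as available. This is a standard kubectl command, and the timeout value below is just an example:

kubectl wait deployment --all --namespace knative-serving --for=condition=Available --timeout=300s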

Once the installation completes, check if all Knative Serving pods are running:


kubectl get pods --namespace knative-serving

You should see output similar to the following:


Output
NAME                                READY   STATUS    RESTARTS   AGE
activator-7859d98ff7-7trbh          1/1     Running   0          2m
autoscaler-6d76669c99-7dh8d         1/1     Running   0          2m
autoscaler-hpa-5b47b68f4b-v9c59     1/1     Running   0          2m
controller-6bdfb87d8b-9kn6f         1/1     Running   0          2m
networking-istio-6c57dd5d8c-2k99z   1/1     Running   0          2m
webhook-79fc8f468d-9nkjz            1/1     Running   0          2m

With Knative Serving successfully installed, you can proceed to deploy your Node.js application.

Step 3 – Deploying Your Node.js Application as a Serverless Workload

After successfully installing Knative Serving, you can deploy your Node.js application as a serverless workload.

To deploy your Node.js application, you will first create a YAML manifest that defines the Knative Service resource for your serverless workload and then apply this manifest to your Kubernetes cluster with kubectl.

First, create a file named `service.yaml` and add the following content:


apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: nodejs-app
spec:
  template:
    spec:
      containers:
        - image: YOUR_DOCKER_USERNAME/nodejs-app:latest

Replace `YOUR_DOCKER_USERNAME` with your Docker username. This YAML manifest defines a serverless workload named `nodejs-app` that uses the container image of your Node.js application.
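
If you haven't pushed the image to Docker Hub yet, you can build and push it from your application directory with the standard Docker commands below. The image name must match the one in the manifest, and the Dockerfile sketched in the prerequisites section is assumed; run `docker login` first if you aren't already authenticated with Docker Hub:

docker build -t YOUR_DOCKER_USERNAME/nodejs-app:latest .
docker push YOUR_DOCKER_USERNAME/nodejs-app:latest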

Then apply this manifest to your Kubernetes cluster:


kubectl apply --filename service.yaml

This command applies the YAML manifest and creates a resource of type Service on your Kubernetes cluster. This resource deploys your Node.js application as a serverless workload.

To check the status of your deployment, use the following command:

kubectl get ksvc

You should see output similar to the following:


Output
NAME         URL                                                                                LATESTCREATED      LATESTREADY        READY   REASON
nodejs-app   http://nodejs-app.default.69de217e-0284-4e18-a6d7-5606915a4e88.k8s.oncentron.de   nodejs-app-00001   nodejs-app-00001   True

The URL displayed in the output is the URL of your deployed Node.js application. You can use this URL to access your application.
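
For example, you can send a test request with curl, substituting the URL from your own output (the one below is the sample URL shown above):

curl http://nodejs-app.default.69de217e-0284-4e18-a6d7-5606915a4e88.k8s.oncentron.de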

That’s it! You have successfully deployed a Node.js application as a serverless workload on a centron Kubernetes cluster using Knative Serving.

Conclusion

In this tutorial, you learned how to deploy serverless workloads using Knative Serving on a centron Kubernetes cluster. You started a Kubernetes cluster on centron, installed Knative Serving, and deployed your Node.js application as a serverless workload. You can now deploy additional applications and experiment with Knative on Kubernetes to create and manage modern, cloud-native applications.

Create a Free Account

Register now and get access to our Cloud Services.
