Deploying with containers

Deploy HCL Workload Automation quickly and easily with containers.

The following sections describe how to deploy HCL Workload Automation with containers, based on your environment.

Docker containers
An easy and fast deployment method for HCL Workload Automation. With Docker Compose you can instantly download the product image, create a container, and start the product.

Docker is a state-of-the-art technology that creates, deploys, and runs applications by using containers. Packages are provided containing an application with all of the components it requires, such as libraries, specific configurations, and other dependencies. You can deploy a package in no time on any other Linux or Windows workstation, regardless of any differences in settings between the source and the target workstation.

Adopting Docker standardizes your workload scheduling environment and provides an easy way to replicate environments quickly across development, build, test, and production, significantly speeding up the time it takes to get from build to production. Install your environment using Docker to improve scalability, portability, and efficiency.

Docker containers are available for UNIX, Windows, and Linux on Z operating systems.
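As a minimal sketch of the Docker Compose flow described above (the service and file names are standard Docker Compose conventions; the actual images, variables, and compose file for each component are documented in the readme files):

```shell
# Hypothetical sketch: deploy with Docker Compose from the directory that
# contains the component's docker-compose.yml file. The compose file contents
# are component-specific; see the related readme file for the real values.
docker compose up -d     # download the product image, create the container, and start it
docker compose ps        # verify that the container is up and running
docker compose logs -f   # follow the container logs to monitor startup
```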

For more information, see the introductory readme file for all components, available at HCL Workload Automation. You can also find detailed information for each component in the related readme file, as follows:
Amazon Elastic Kubernetes Service (Amazon EKS)
You can use Amazon EKS to run HCL Workload Automation containerized product components on the Amazon Web Services (AWS) secure cloud platform.
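As an illustrative sketch of preparing an EKS cluster to host the containerized components (the cluster name and region below are assumptions, not product defaults):

```shell
# Hypothetical sketch: create an EKS cluster and point kubectl at it.
# The cluster name "wa-cluster" and region are illustrative assumptions.
eksctl create cluster --name wa-cluster --region us-east-1
aws eks update-kubeconfig --name wa-cluster --region us-east-1
kubectl get nodes   # confirm the worker nodes are ready before deploying
```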

For more information, see Deploying on Amazon EKS.

Azure Kubernetes Service (AKS)

Deploy and manage HCL Workload Automation containerized product components on Azure Kubernetes Service (AKS), a container orchestration service available on the Microsoft Azure public cloud. You can use Azure AKS to deploy, scale up, scale down, and manage containers in the cluster environment. You can also deploy and run an Azure SQL database.
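The scale-up and scale-down operations mentioned above can be sketched with standard Azure and Kubernetes commands (the resource group, cluster, and deployment names here are illustrative assumptions):

```shell
# Hypothetical sketch: connect to an AKS cluster and scale a deployment.
# "wa-rg", "wa-aks-cluster", and "wa-server" are illustrative names.
az aks get-credentials --resource-group wa-rg --name wa-aks-cluster
kubectl scale deployment wa-server --replicas=3   # scale up to three pods
kubectl scale deployment wa-server --replicas=1   # scale back down to one pod
```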

For more information, see Deploying on Azure AKS.
Google GKE

Google Kubernetes Engine (GKE) provides a managed environment for deploying, managing, and scaling your containerized applications using Google infrastructure. The Google GKE environment consists of multiple machines grouped together to form a cluster. You can also deploy and run Google Cloud SQL for SQL Server.

Google GKE supports session affinity in a load balancing cluster, a feature that keeps each user session active on the same pod. This ensures that the Dynamic Workload Console always connects to the same server during a session, so that the user can perform any number of operations smoothly and seamlessly.
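In Kubernetes terms, this behavior corresponds to ClientIP session affinity on a Service. As an illustrative sketch (the Service name "dwc-console" is an assumption, not a product-defined name):

```shell
# Hypothetical sketch: enable ClientIP session affinity on an existing Service
# so that requests from the same client keep reaching the same pod.
# The Service name "dwc-console" is an illustrative assumption.
kubectl patch service dwc-console \
  -p '{"spec": {"sessionAffinity": "ClientIP"}}'

# Verify the setting took effect.
kubectl get service dwc-console -o jsonpath='{.spec.sessionAffinity}'
```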

For more information, see Deploying on Google GKE.

Red Hat OpenShift
Starting with version 9.5 Fix Pack 2, all HCL Workload Automation product components can be deployed on Red Hat OpenShift, V4.x. Red Hat OpenShift is a container application platform based on Kubernetes that orchestrates containers.

On the Red Hat OpenShift, V3.x container platform, the HCL Workload Automation agent container continues to be supported. You can deploy the HCL Workload Automation agent container with a template.yml file to quickly configure and run it as a Docker container application in a Kubernetes cluster. You can then manage the agent container from the OpenShift dashboard or from the command-line interface. For more information, see Deploying HCL Workload Automation components on Red Hat OpenShift.
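The template-based deployment described above can be sketched with the standard OpenShift CLI (the parameter name "AGENT_NAME" is an illustrative assumption; the actual parameters are defined in the template.yml file shipped with the product):

```shell
# Hypothetical sketch: process the agent template and create its objects in
# the current OpenShift project. "AGENT_NAME=wa-agent" is an illustrative
# parameter; the real parameters are listed in the provided template.yml.
oc process -f template.yml -p AGENT_NAME=wa-agent | oc apply -f -

# Check that the agent pod is running.
oc get pods
```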