LB6535 - Sovereign Cloud Architecture with OpenShift, Multi-Cluster Data Residency and Compliance Management

Complete the modules in order and follow the instructions carefully. Be sure to run all of the scripts on this page before proceeding to the next module.

Introduction

Customers today want flexibility. They want to do more with less while accelerating AI adoption to stay ahead of the market. Geography-specific compliance standards, petabytes of data in restricted environments, and rising provider costs are all pushing organizations to reevaluate their technology strategies. Customers are seeking sovereign cloud solutions that provide the flexibility to stay adaptive in the age of AI and geography-specific requirements.

Red Hat is well positioned to address these customer needs. Building on Red Hat’s existing hybrid cloud approach, setting up an automated sovereign cloud architecture is straightforward. Red Hat provides the tools to manage data privacy and compliance concerns, simplifying management of OpenShift clusters, containers, virtual machines, and AI applications while providing hardened authorization, authentication, data controls, encryption, vulnerability management, and more.

In this lab, you will progress through a simplified migration process in which you, the customer, bring your workflows and applications to on-premises and cloud environments, bringing your platform to your data in a compliant, secure, and repeatable manner.

What You’ll Learn

This lab demonstrates how Red Hat solutions can empower enterprises to manage their own sovereign cloud architectures, enabling application and data mobility as well as policy enforcement.

Through hands-on activities, you will:

  • Configure geographically-distributed OpenShift clusters with automated compliance policies

  • Implement workload placement strategies that enforce data residency requirements

  • Deploy applications with built-in sovereignty enforcement

  • Establish audit trails and compliance reporting for regulatory demonstrations

  • Design disaster recovery scenarios that maintain geographic data constraints

Prerequisites

Before beginning this lab, you should have:

  • Solid OpenShift administration experience

  • Familiarity with multi-cluster concepts

Lab Modules

This lab is organized into four modules:

Getting Started

The environment for this lab already provides the following links for you:

  • This content on the left

  • On the right there are three tabs:

    • OpenShift Console: this is where you will spend most of your time configuring the various environments and applications

    • Red Hat Advanced Cluster Security: the console for configuring security-related items

    • Bastion: command-line access to the environment. You will run commands in this tab to deploy a few initial settings and interact with the applications.

Before proceeding with the lab modules, you will need to configure kubectl contexts for the clusters, install a few insecure applications, and install the roxctl CLI tool.

Configure kubectl contexts

This lab uses two OpenShift clusters. You’ll need to set up kubectl contexts to easily switch between them.

Procedure

  1. Switch to the Bastion tab, located on the right side of the screen.

  2. Log in to the AWS cluster to create the aws-us context, and rename the existing admin context to local-cluster:

    oc config rename-context admin local-cluster
    oc login -u kubeadmin -p {aws_openshift_kubeadmin_password} {aws_openshift_api_url} --insecure-skip-tls-verify
    oc config rename-context $(oc config current-context) aws-us
  3. Verify that the changes have occurred and the contexts are configured:

    oc config get-contexts
    Sample output:

    [lab-user@bastion ~]$ oc config get-contexts
    CURRENT   NAME            CLUSTER            AUTHINFO
    *         aws-us          api-ocp-...:6443   kube:admin/...:6443
              local-cluster   ocp                admin

    Now you are able to switch between contexts as needed.

  4. Make sure to switch back to the local cluster context before continuing.

    # Switch back to local cluster
    oc config use-context local-cluster
    You will need to switch between contexts throughout the lab. Please ensure you have access to both clusters before proceeding; otherwise, contact the instructor.
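Besides changing the active context with `oc config use-context`, the CLI also accepts the standard kubectl `--context` flag, which is handy for one-off commands against the other cluster without switching. A minimal sketch of the pattern (a mock `oc` shell function stands in for the real CLI here, so no live cluster is needed to try it):

```shell
#!/usr/bin/env bash
# Mock 'oc' so the pattern can be demonstrated without a live cluster.
oc() { echo "oc invoked with: $*"; }

# Run a one-off command against the AWS cluster without changing the
# active context:
oc --context=aws-us get nodes

# Subsequent commands still target the current context (local-cluster):
oc get nodes
```

With the real CLI, the first command would list the AWS cluster's nodes while leaving `local-cluster` as the active context for everything else.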

Set Up the Lab While You Work on Module 1

First, you will set up a few lab environment variables.

Procedure

  1. Add the RHACS environment variables (shown in the Bastion tab in the right panel) to your ~/.bashrc file:

    cat >> ~/.bashrc << 'EOF'
    export ROX_CENTRAL_ADDRESS="{acs_route}"
    export ACS_PORTAL_USERNAME="{acs_portal_username}"
    export ACS_PORTAL_PASSWORD="{acs_portal_password}"
    export GRPC_ENFORCE_ALPN_ENABLED=false
    EOF
    source ~/.bashrc
  2. Next, clone the repository and run the setup scripts to configure your environment:

    The following scripts can run in the background while you work on the first module. You DO NOT have to wait for them to finish before proceeding to Module 1.

    cd ~ && git clone https://github.com/jalvarez-rh/rh1-svc-lab.git
    cd ~/rh1-svc-lab
    ./lab-setup/run-all-setup.sh
    ./tssc-setup/setup.sh
    ./ai-setup/setup.sh

    You will be asked to run the same scripts at the beginning of Modules 2, 3, and 4 to verify that everything is working as expected.
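Since the setup does not need to block your work on Module 1, one common pattern is to launch it in the background with `nohup` and capture the output to a log file you can check later. The sketch below uses a stand-in script rather than the lab's actual setup scripts, so it is runnable anywhere:

```shell
#!/usr/bin/env bash
# Stand-in for the lab's setup scripts, so the pattern is runnable anywhere.
cat > /tmp/demo-setup.sh <<'EOF'
#!/usr/bin/env bash
echo "setup started"
sleep 1
echo "setup finished"
EOF
chmod +x /tmp/demo-setup.sh

# Launch in the background; all output (stdout and stderr) goes to a log:
nohup /tmp/demo-setup.sh > /tmp/demo-setup.log 2>&1 &
SETUP_PID=$!

# ...keep working on Module 1, then check on it whenever you like.
# Here we wait just so the log is complete before reading it:
wait "$SETUP_PID"
cat /tmp/demo-setup.log
```

In the lab, the same idea would apply to `./lab-setup/run-all-setup.sh` and the other setup scripts; `tail -f` on the log file is a convenient way to watch progress.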

OpenShift Console Access

The Red Hat Advanced Cluster Management for Kubernetes (RHACM) Console is now integrated with OpenShift Console. To access it, select Fleet Management from the dropdown menu on the left once you log in.

Switch to the OpenShift Console tab in the right panel. In case you want to open it in another window it is here: {openshift_cluster_console_url}[window=blank]

Ctrl + click a URL to open it in a new tab.

Administrator login credentials:

Username:

{openshift_cluster_admin_username}

Password:

{openshift_cluster_admin_password}

RHACS Console Access

Your RHACS Console is also available in the right panel. In case you want to open it in another window it is here: {acs_route}[window=blank]

Administrator login credentials:

RHACS Console Username:

{acs_portal_username}

RHACS Console Password:

{acs_portal_password}

What’s Next

Click on any module above to begin. We recommend starting with Module 1 and proceeding sequentially, as each module builds upon concepts from previous modules.

Customer Value: Upon completing this lab, you will be equipped to propose and implement sovereign cloud solutions that enable organizations to pursue digital transformation while meeting strict regulatory requirements, turning compliance from a barrier into a competitive advantage.