Private Model as a Service - A Practical Introduction with Red Hat AI
Welcome to the Workshop
This isn’t a simple tutorial; it’s the beginning of an adventure into the command center of an AI-powered enterprise!
In this hands-on session, you will rotate between key personas critical to delivering modern, AI-infused, production-level enterprise services:
- As a developer, you will access model credentials through a Models-as-a-Service interface and integrate AI code assistants into your development workflow. You will then connect real applications to private LLM endpoints and use AI-assisted coding to build applications and boost your productivity.
- As a DevOps practitioner or site reliability engineer, you will explore the capabilities of agentic AI to monitor your OpenShift cluster resources and interact with collaboration tools.
- As a technical decision maker or platform stakeholder, you will monitor and analyze model usage across the cluster to understand cost, usage patterns, and business impact.
By the end, you will walk away with practical experience across the model deployment lifecycle.
Workshop Agenda
Module 1: Introduction to your environment and Models-as-a-Service Infrastructure
As a developer, get acquainted with your OpenShift cluster environment and navigate to the OpenShift AI dashboard to view your model deployment and retrieve your endpoint and API token.
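Once you have your endpoint URL and API token, you can call the model from any application. The sketch below assumes the deployment exposes an OpenAI-compatible chat completions API (common for vLLM-based model serving on OpenShift AI); the endpoint URL, token, and model name are placeholders for the values you retrieve from the dashboard.

```python
# Minimal sketch of calling a private MaaS model endpoint, assuming an
# OpenAI-compatible /v1/chat/completions API. MODEL_ENDPOINT, API_TOKEN,
# and the model name are placeholders -- substitute your own values.
import json
import urllib.request

MODEL_ENDPOINT = "https://your-model.apps.example.com"  # placeholder
API_TOKEN = "your-api-token"                            # placeholder


def build_chat_request(endpoint: str, token: str, prompt: str) -> urllib.request.Request:
    """Construct an authenticated chat-completion request for the endpoint."""
    body = json.dumps({
        "model": "granite",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{endpoint}/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request(MODEL_ENDPOINT, API_TOKEN, "Say hello")
print(req.full_url)
# Sending the request requires network access to your cluster:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same bearer-token pattern works with any OpenAI-compatible client library; you only need to point it at your private endpoint instead of a public one.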
Module 2: AI-Assisted Development
Set up OpenShift Dev Spaces, install an AI-powered code assistant, and connect it to your private MaaS model endpoint to enhance your coding workflow.
Module 3: Game Time
Build a simple game using AI assistance. Explore, experiment, and build confidence with AI-assisted coding through a fun, low-pressure project.