Stop writing brittle glue code for your MLOps pipelines. See how Argo Workflows provides a Kubernetes-native way to define and execute complex processes declaratively.
#1 · about 3 minutes
Understanding the core principles and lifecycle of MLOps
MLOps applies DevOps principles to machine learning to automate and streamline the entire model lifecycle from data collection to deployment and monitoring.
#2 · about 4 minutes
Why Argo Workflows is a powerful Kubernetes-native engine
Argo Workflows is a Kubernetes-native engine that orchestrates complex, multi-stage processes as custom resources, eliminating the need for extensive glue code.
#3 · about 4 minutes
Building and running a basic workflow with YAML
A simple workflow is defined in YAML using templates for each step, which are then executed inside containers on a Kubernetes cluster.
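To make this concrete, here is a minimal sketch of what such a manifest typically looks like. It is an illustrative example, not the exact workflow from the talk; the names (`hello-` and `say-hello`) are hypothetical.

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-        # Argo appends a random suffix to each run
spec:
  entrypoint: say-hello       # the template to run first
  templates:
    - name: say-hello
      container:              # each step runs as a container on the cluster
        image: alpine:3.19
        command: [echo, "hello from Argo Workflows"]
```

Submitting this with `argo submit` (or `kubectl create`) schedules a pod that runs the container to completion.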
#4 · about 4 minutes
Managing data files in pipelines using artifacts
Argo artifacts simplify data handling by automatically downloading input files from cloud storage into a container and uploading outputs upon completion.
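As a rough sketch, an artifact is declared with a name, a path inside the container, and a storage location. This example assumes a default S3-compatible artifact repository has already been configured for the cluster, so only the object key is given inline; the template name, paths, and key are hypothetical.

```yaml
templates:
  - name: preprocess
    inputs:
      artifacts:
        - name: raw-data
          path: /data/input.csv       # downloaded here before the step starts
          s3:
            key: datasets/input.csv   # resolved against the configured repository
    outputs:
      artifacts:
        - name: clean-data
          path: /data/output.csv      # uploaded automatically when the step ends
    container:
      image: python:3.12
      command: [python, /scripts/preprocess.py]   # hypothetical script
```

The step's code only reads and writes local files; Argo handles the transfer to and from cloud storage on either side of execution.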
#5 · about 4 minutes
Orchestrating complex training jobs with DAGs
Directed acyclic graph (DAG) templates in Argo allow you to define complex workflows with multiple dependencies, enabling parallel and sequential task execution for model training.
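The shape of such a pipeline can be sketched as a DAG template like the one below: two training tasks run in parallel once data preparation finishes, and evaluation waits for both. The task and template names are illustrative, and the referenced templates (`prepare-data`, `train-model`, `evaluate`) are assumed to be defined elsewhere in the same workflow.

```yaml
spec:
  entrypoint: training-pipeline
  templates:
    - name: training-pipeline
      dag:
        tasks:
          - name: prepare
            template: prepare-data
          - name: train-a               # runs in parallel with train-b
            template: train-model
            dependencies: [prepare]
          - name: train-b
            template: train-model
            dependencies: [prepare]
          - name: evaluate-both         # waits for both branches
            template: evaluate
            dependencies: [train-a, train-b]
```

Argo derives the execution order purely from the `dependencies` lists, so adding or reordering tasks never requires hand-written sequencing code.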
#6 · about 4 minutes
Building resilient batch inference pipelines with retries
For reliable batch inference, Argo's retry strategies with configurable limits and backoff policies can automatically recover from transient failures in individual steps.
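A retry policy of this kind attaches to a template as a `retryStrategy`. The following sketch retries a failed step up to three times with exponential backoff; the template name, image, and script are hypothetical placeholders.

```yaml
templates:
  - name: batch-predict
    retryStrategy:
      limit: "3"                 # at most 3 retries after the first attempt
      retryPolicy: OnFailure     # retry on step failure (other options include OnError)
      backoff:
        duration: "30s"          # wait before the first retry
        factor: "2"              # double the wait each time: 30s, 1m, 2m
        maxDuration: "10m"       # give up retrying after this total elapsed time
    container:
      image: python:3.12
      command: [python, /scripts/predict.py]   # hypothetical inference script
```

Transient failures such as a spot-instance eviction or a flaky network call are then absorbed by the workflow engine instead of failing the whole pipeline.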
#7 · about 3 minutes
Evaluating if Argo Workflows is right for your team
Argo is ideal for teams already using Kubernetes to manage complex, multi-stage ML pipelines, but may be overkill for small projects or teams without Kubernetes expertise.
#8 · about 1 minute
Integrating Argo with tools like Argo CD and MLflow
Argo Workflows can be used alongside Argo CD for deployment and MLflow for experiment tracking, with Argo providing more flexible, language-agnostic container orchestration.