Generative AI in Production

Course 1485 Advantage Plan Course

  • Duration: 1 day
  • Language: English
  • Level: Intermediate

Traditional MLOps is a set of practices for productionizing machine learning systems in enterprise applications. Generative AI raises new challenges in managing and productionizing applications at scale, which the emerging field of generative AI operations seeks to address. In this course, you learn about the challenges that arise when deploying and productionizing generative AI-powered applications, how to secure those applications, and best practices for logging and monitoring them in production.

Generative AI in Production Course Delivery Methods

  • In-Person

  • Online

  • Upskill your whole team by bringing Private Team Training to your facility.

Generative AI in Production Course Information

This course will empower you to:

  • Understand the challenges in productionizing applications using generative AI
  • Manage experimentation and evaluation for LLM-powered applications
  • Productionize LLM-powered applications
  • Secure generative AI applications
  • Implement logging and monitoring for LLM-powered applications

Prerequisites

Completion of the "Application Development with LLMs on Google Cloud" course or equivalent knowledge

Who Should Attend

Developers and DevOps Engineers who wish to operationalize GenAI-based applications

Generative AI in Production Course Outline

Module 1: Introduction to Generative AI in Production

  • Understand generative AI operations
  • Compare traditional MLOps and GenAIOps
  • Analyze the components of an LLM system
  • Define and compare RAG and ReAct

Module 2: Generative AI Application Deployment

  • Evaluate application deployment options
  • Deploy, package, and version apps

Module 3: Productionizing Generative AI

  • Maintain and update LLM models
  • Test and evaluate gen AI-powered apps
  • Deploy CI/CD pipelines for gen AI-powered apps

Module 4: Logging and Monitoring for Production LLM Systems

  • Utilize Cloud Logging
  • Version, evaluate, and generalize prompts
  • Monitor for evaluation-serving skew
  • Utilize continuous validation

Module 5: Securing Generative AI Applications

  • Identify security challenges for gen AI applications
  • Understand prompt security issues
  • Apply sensitive data protection and DLP API
  • Implement Model Armor

Module 6: Observability for Production LLM Systems

  • Describe the purpose and capabilities of Google Cloud Observability
  • Explain the purpose of Cloud Monitoring
  • Explain the purpose of Cloud Logging
  • Explain the purpose of Cloud Trace


Generative AI in Production Course FAQs

Are there prerequisites for this course?

Yes, a foundational understanding of machine learning and AI concepts is recommended. Familiarity with Python and cloud-based AI tools is also beneficial.

Does this course lead to a Google certification?

While this course provides valuable practical knowledge, it does not directly lead to a Google certification. However, the insights and skills gained can serve as a strong foundation for those interested in pursuing certifications related to generative AI, such as those offered by Google Cloud.

What tools and products are used in this course?

  • Vertex AI
  • Vertex AI Pipelines
  • Vertex AI Evaluation
  • Vertex AI Studio
  • Vertex AI Gemini API
  • Gemini