This repository contains the notebooks from the LLMOps short course by DeepLearning.AI, with additional comments, notes, and explanatory lines to enhance understanding.
Upon completion of this course, you will be able to:
- Gain a holistic view of the entire MLOps framework.
- Prepare and manage data in a Google Cloud data warehouse. Specifically, you'll work with BigQuery in this course.
- Implement smooth orchestration using `dsl` components and pipelines in Python.
- Automate the data preparation cycle.
- Automate detailed model tuning processes.
- Implement the fundamentals of the Kubeflow framework.
- Understand the model deployment process and how to serve models via two different methods: batch prediction or a REST API.
- Gain insights into important topics such as model maintenance and monitoring.
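The component-and-pipeline pattern behind the orchestration topics above can be sketched without any dependencies. In the labs, each step would carry Kubeflow's `@dsl.component` decorator and the chaining function would be a `@dsl.pipeline`; the function and variable names below are illustrative, not taken from the course code.

```python
# Library-free sketch of the component/pipeline pattern that Kubeflow's
# dsl formalizes. In kfp, each step would be an @dsl.component and the
# chain below an @dsl.pipeline; names here are purely illustrative.

def prepare_data(raw_rows):
    """Component 1: clean raw records (stand-in for a BigQuery extract)."""
    return [r.strip().lower() for r in raw_rows if r.strip()]

def tune_model(rows, learning_rate=0.01):
    """Component 2: stand-in for model tuning; returns a run summary."""
    return {"n_examples": len(rows), "learning_rate": learning_rate}

def pipeline(raw_rows):
    """Pipeline: wires each component's output to the next one's input."""
    rows = prepare_data(raw_rows)
    return tune_model(rows)

result = pipeline(["  Hello ", "", "World"])
print(result)  # {'n_examples': 2, 'learning_rate': 0.01}
```

The key design idea, which kfp makes explicit, is that each step declares its inputs and outputs, so the framework can schedule, cache, and rerun steps independently.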
The labs are divided into three directories, each containing all the necessary data to successfully run the code. You just need to add your specific environment variables to your `.env` file so that `utils.py` can use them.
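To show what the `.env` mechanism does, here is a minimal, stdlib-only sketch of loading `KEY=VALUE` pairs into the process environment. The course code may rely on a library such as python-dotenv instead, and the variable names below are illustrative, not the ones `utils.py` expects.

```python
# Minimal stdlib-only loader for a .env file: reads KEY=VALUE lines
# into os.environ. Variable names below are illustrative only.
import os

def load_env(path=".env"):
    """Parse KEY=VALUE lines from a .env file into os.environ."""
    loaded = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blank lines and comments
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip()
    os.environ.update(loaded)
    return loaded

# Example: write a sample .env file and load it.
with open(".env", "w") as f:
    f.write("PROJECT_ID=my-gcp-project\nREGION=us-central1\n")
vars_loaded = load_env()
print(vars_loaded["PROJECT_ID"])  # → my-gcp-project
```

Keeping credentials and project IDs in `.env` (and out of version control) is the point: the notebooks read configuration from the environment rather than hard-coding it.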
Finally, a special thanks to DeepLearning.AI and Google Cloud for offering this insightful course.
Feel free to use, customize, and experiment with the code to explore different ideas.