12-04, 19:00–20:30 (UTC), AI/ML Track
This tutorial empowers deep learning practitioners to master the entire PyTorch workflow, from efficient model creation to advanced tracking and optimization techniques. We'll begin with a practical PyTorch workflow, then integrate popular experiment tracking tools such as MLflow and Weights & Biases. You'll learn to log custom metrics, artifacts, and interactive visualizations, enhancing your model development process. Finally, we'll tackle hyperparameter optimization with Optuna's Bayesian search, all while maintaining meticulous experiment tracking for easy comparison and reproducibility.
By the end of the session, you'll have constructed a robust, modular pipeline for managing experiments and optimizing model performance. Whether you're new to PyTorch or an experienced data scientist looking to improve your workflow, this hands-on tutorial offers immediately applicable insights and techniques to enhance your deep learning projects across diverse domains.
This session offers a hands-on journey through the PyTorch ecosystem: a comprehensive guide to creating, tracking, and optimizing deep learning models.
Outline:
1. Introduction to PyTorch Workflow (10 minutes)
- The challenges of modern deep learning workflows
- The importance of reproducibility and efficiency
2. PyTorch Fundamentals Recap (15 minutes)
- A quick review of PyTorch basics
- Building a sample model (e.g., image classification with ResNet; see the baseline sketch after the outline)
3. Experiment Tracking Deep Dive (20 minutes)
- Introduction to experiment tracking tools (MLflow, W&B, TensorBoard)
- Live demo: Setting up MLflow for comprehensive experiment tracking (see the tracking sketch after the outline)
- Advanced tracking techniques: custom metrics, artifact logging, static and interactive visualizations
4. Hyperparameter Optimization (20 minutes)
- Introduction to Optuna
- Live demo: Integrating Optuna with PyTorch and MLflow (see the search sketch after the outline)
- Tuning hyperparameters such as learning rate, batch size, and optimizer choice
5. Putting It All Together (20 minutes)
- Demonstration of a complete workflow combining PyTorch, MLflow, and Optuna
- Best practices for code organization and scalability
- Tips for adapting the workflow to different project requirements
6. Q&A (5 minutes)
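To make the outline concrete, here is a minimal sketch of the kind of baseline model covered in the fundamentals recap (item 2): a ResNet-18 image classifier trained with a standard PyTorch loop. The dataset (CIFAR-10), class count, optimizer settings, and training length are placeholder assumptions, not necessarily the exact example used in the session.

```python
# Minimal baseline sketch: a ResNet-18 image classifier in PyTorch.
# Dataset, number of classes, and training length are placeholder assumptions;
# normalization and augmentation are omitted for brevity.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.ToTensor()
train_set = datasets.CIFAR10(root="data", train=True, download=True, transform=transform)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet18(weights=None, num_classes=10).to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

model.train()
for epoch in range(2):  # short run, purely for illustration
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```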
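For the experiment-tracking segment (item 3), the sketch below shows one way to wrap a training run with MLflow: hyperparameters logged once, a custom metric logged per epoch, a loss-curve PNG logged as an artifact, and the trained model logged at the end. The experiment and run names, metric names, and the `train_one_epoch` helper are illustrative assumptions; the model, loader, optimizer, and criterion are reused from the previous sketch.

```python
# Sketch: wrapping a training run with MLflow tracking.
# Experiment/run names, metric names, and train_one_epoch are illustrative assumptions.
import matplotlib.pyplot as plt
import mlflow
import mlflow.pytorch

mlflow.set_experiment("pytorch-tutorial")

with mlflow.start_run(run_name="resnet18-baseline"):
    # Log hyperparameters once at the start of the run.
    mlflow.log_params({"lr": 1e-3, "batch_size": 64, "optimizer": "adam"})

    losses = []
    for epoch in range(2):
        # train_one_epoch is a hypothetical helper returning the mean training loss.
        epoch_loss = train_one_epoch(model, train_loader, optimizer, criterion)
        losses.append(epoch_loss)
        # Custom metric logged per epoch with an explicit step.
        mlflow.log_metric("train_loss", epoch_loss, step=epoch)

    # Artifact logging: save a loss curve as a PNG and attach it to the run.
    fig, ax = plt.subplots()
    ax.plot(losses)
    ax.set_xlabel("epoch")
    ax.set_ylabel("train loss")
    fig.savefig("loss_curve.png")
    mlflow.log_artifact("loss_curve.png")

    # Log the trained model so it can be reloaded or compared later.
    mlflow.pytorch.log_model(model, "model")
```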
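Finally, for items 4 and 5, here is a sketch of how Optuna's default TPE sampler (a Bayesian-style search) can drive the same training code, with each trial logged as a nested MLflow run so the whole search remains comparable and reproducible. The search space, number of trials, and the `build_model` / `build_loader` / `train_and_evaluate` helpers are illustrative assumptions.

```python
# Sketch: Optuna hyperparameter search with per-trial MLflow logging.
# Search space, n_trials, and the build_*/train_and_evaluate helpers are assumptions.
import mlflow
import optuna
import torch

def objective(trial):
    # Sample the kinds of hyperparameters discussed in the session.
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    batch_size = trial.suggest_categorical("batch_size", [32, 64, 128])
    optimizer_name = trial.suggest_categorical("optimizer", ["adam", "sgd"])

    with mlflow.start_run(nested=True):
        mlflow.log_params(trial.params)
        model = build_model()              # hypothetical helper
        loader = build_loader(batch_size)  # hypothetical helper
        opt_cls = torch.optim.Adam if optimizer_name == "adam" else torch.optim.SGD
        optimizer = opt_cls(model.parameters(), lr=lr)
        # Hypothetical helper returning validation accuracy after training.
        val_acc = train_and_evaluate(model, loader, optimizer)
        mlflow.log_metric("val_accuracy", val_acc)
    return val_acc

with mlflow.start_run(run_name="optuna-search"):
    study = optuna.create_study(direction="maximize")  # TPE sampler by default
    study.optimize(objective, n_trials=20)
    mlflow.log_params(study.best_params)
    mlflow.log_metric("best_val_accuracy", study.best_value)
```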
Who Should Attend: This tutorial is suitable for both beginner and advanced deep learning practitioners. Attendees should have a basic understanding of machine learning and of how deep learning models work; specific PyTorch experience is not required. Python programming experience is expected.
Takeaways: By the end of the session, participants will:
- Understand the fundamentals of an efficient PyTorch workflow
- Know how to log and track deep learning experiments, including custom artifacts and interactive visualizations
- Understand how to integrate hyperparameter optimization into their projects
- Gain hands-on experience with a modular framework they can adapt to their own DL projects
All tutorial code will be available on GitHub, enabling attendees to immediately apply and extend the learned concepts in their own projects.
Previous knowledge expected
Data scientist with 10+ years of experience spanning academia and industry. Published research in prestigious scientific journals and developed end-to-end AI products for startups and global companies. Passionate educator contributing to data training programs as a professor and consultant.