Virtual Event
May 4 - May 7
Learn More and Register to Attend

The Sched app allows you to build your schedule but is not a substitute for your event registration. You must be registered for KubeCon + CloudNativeCon Europe 2021 Virtual to participate in the sessions. If you have not registered but would like to join us, please go to the event registration page to purchase a registration.

Please note: This schedule is automatically displayed in Central European Summer Time (UTC +2). To see the schedule in your preferred timezone, please select from the drop-down menu to the right, above "Filter by Date." The schedule is subject to change.

Thursday, May 6 • 13:30 - 14:05
Optimizing Knowledge Distillation Training With Volcano - Ti Zhou, Baidu & William Wang, Huawei

Knowledge distillation is a classic model compression technique in which knowledge is migrated from a complex model (the Teacher) to a lightweight model (the Student). EDL uses Volcano as the scheduler to deploy the Teacher model onto an online Kubernetes GPU inference cluster, using the spare capacity of the online inference GPUs to increase the Teacher's throughput during distillation. Because Volcano can flexibly reschedule the Teacher model, there is no need to worry about task failures caused by preemption from online instances during peak hours. The Teacher model can also be deployed onto fragmented or low-utilization resources, such as K40 cards, to make full use of the cluster's idle and fragmented capacity. In this talk, we will explain in detail how to use Volcano to optimize elastic distillation training and present the corresponding benchmark data.
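As a rough illustration of the scheduling setup the abstract describes, the following is a minimal sketch of a Volcano Job that co-schedules a required Student trainer with elastic Teacher inference replicas. This is not from the talk: the job name, container images, replica counts, and queue are illustrative assumptions; only the `batch.volcano.sh/v1alpha1` Job shape, `schedulerName: volcano`, and `minAvailable` gang-scheduling field come from the Volcano API.

```yaml
# Hypothetical sketch of an elastic distillation job on Volcano.
# Only the Student task is required (minAvailable: 1); Teacher replicas
# can be preempted or scaled when online inference traffic peaks.
apiVersion: batch.volcano.sh/v1alpha1
kind: Job
metadata:
  name: edl-distillation            # illustrative name
spec:
  schedulerName: volcano
  minAvailable: 1                   # gang-schedule only the Student; Teachers are elastic
  queue: default
  tasks:
    - name: student
      replicas: 1
      template:
        spec:
          restartPolicy: OnFailure
          containers:
            - name: student
              image: registry.example.com/edl-student:latest   # illustrative image
              resources:
                limits:
                  nvidia.com/gpu: 1
    - name: teacher
      replicas: 4                   # shrinks under preemption without failing the job
      template:
        spec:
          restartPolicy: OnFailure
          containers:
            - name: teacher
              image: registry.example.com/edl-teacher:latest   # illustrative image
              resources:
                limits:
                  nvidia.com/gpu: 1
```

With `minAvailable: 1`, the job stays healthy even when all Teacher pods are preempted by online serving workloads, which is the failure mode the abstract says Volcano avoids.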

Speakers

Ti Zhou

Senior Architect, Baidu
Ti Zhou, a Kubernetes member and LF AI & Data TAC member, currently serves as a senior architect at Baidu Inc., focusing on the PaddlePaddle deep learning framework and Baidu Cloud Container Engine, helping developers deploy cloud-native machine learning on private and public clouds.

William Wang

Software Architect, HuaWei
William Wang, Volcano community tech lead, is experienced in batch systems, big data, and AI workload performance acceleration. He is currently working on the multi-cluster scheduling and hybrid scheduling projects.
Thursday May 6, 2021 13:30 - 14:05 CEST
ML Theater
  Machine Learning + Data