The second training session in the series Foundations of LLM Mastery, organized by EuroCC Austria, is scheduled for 26 February 2025. The event, Fine-tuning LLMs on Multi GPUs, is conducted in collaboration with the VSC Research Center and TU Wien.
Scaling the fine-tuning of large language models (LLMs) across multiple GPUs can unlock new levels of performance and efficiency, making the technique accessible to organizations of all sizes. In this 3.5-hour course, participants from start-ups, SMEs, and large enterprises will gain hands-on experience with multi-GPU fine-tuning techniques, optimizing their LLM workflows for both speed and scalability.
Key topics include:
Through guided exercises and real-world examples, this course equips participants with the skills to fine-tune LLMs efficiently across multiple GPUs, preparing them to tackle complex, large-scale projects. By the end of the course, participants will have a foundational understanding of techniques for distributing model parameters and optimizer states across GPUs on multiple nodes.
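As a purely conceptual illustration of the distribution idea mentioned above (a framework-free sketch, not the course's actual tooling, which would typically use a library such as PyTorch with FSDP or DeepSpeed), sharding a flattened set of parameters across several workers and gathering them back can be sketched as:

```python
# Hypothetical, framework-free sketch of ZeRO/FSDP-style parameter sharding.
# Real multi-GPU training delegates this to a library; this only shows the
# partition-and-gather idea.

def shard_parameters(params, world_size):
    """Split a flat parameter list into near-equal shards, one per rank."""
    base, rem = divmod(len(params), world_size)
    shards, start = [], 0
    for rank in range(world_size):
        size = base + (1 if rank < rem else 0)  # earlier ranks take the remainder
        shards.append(params[start:start + size])
        start += size
    return shards

def all_gather(shards):
    """Reconstruct the full parameter list from every rank's shard."""
    return [p for shard in shards for p in shard]

params = list(range(10))                        # stand-in for a flattened tensor
shards = shard_parameters(params, world_size=4)  # each "rank" holds one slice
assert all_gather(shards) == params              # gathering recovers the full set
```

Each rank stores and updates only its own shard, and the full parameter set is materialized only when needed; this is the memory-saving principle behind sharded data parallelism.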