FDM-Kolloquium: Open Source LLMs: Functionality, Scaling, and HPC Training


For further information, see StudOn: https://www.studon.fau.de/crs5918040.html


Details

Date: 13 December 2024
Time: 14:00 – 15:30
Location: Hybrid

---

Large Language Models (LLMs) are revolutionizing the way we interact with artificial intelligence, and the open-source community plays a pivotal role in driving their accessibility and innovation. This talk delves into the inner workings of LLMs, exploring their foundational mechanisms and architectures. Additionally, we examine how these models can be efficiently trained on high-performance computing (HPC) systems, leveraging state-of-the-art scaling strategies and principles derived from scaling laws. By understanding these methodologies, attendees will gain valuable insights into the challenges and opportunities of developing and deploying LLMs in diverse computational environments.
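
As a brief illustration of the scaling-law principles mentioned above, the compute-optimal analysis of Hoffmann et al. (2022, "Chinchilla") models validation loss as a function of model size N (parameters) and training data D (tokens):

```latex
% Illustrative sketch of a Chinchilla-style scaling law (Hoffmann et al., 2022).
% L is the validation loss, N the parameter count, D the number of training tokens;
% E, A, B, \alpha and \beta are constants fitted to empirical training runs.
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```

Minimizing this loss under a fixed compute budget (roughly C \approx 6ND floating-point operations) suggests growing parameter count and training data in tandem rather than parameters alone, one of the principles that informs how such models are trained at scale on HPC systems.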