Exploring large language models (LLMs) through interactive Python activities.
This paper presents an approach to introducing physics students to the basic concepts of large language models (LLMs) using Python-based activities in Google Colab. The teaching approach integrates active-learning techniques and combines theoretical ideas with practical, physics-related examples. Students engage with key technical concepts, such as word embeddings, through hands-on exploration of the Word2Vec neural network and GPT-2, an LLM that attracted considerable attention in 2019 for its ability to generate coherent and plausible text from simple prompts. The activities highlight how words acquire meaning and how LLMs predict subsequent tokens by simulating simplified, physics-related scenarios. By focusing on Word2Vec and GPT-2, the exercises illustrate fundamental principles underlying modern LLMs, such as semantic representation and contextual prediction. Through interactive experimentation in Google Colab, students observe the relationship between model parameters (such as temperature) in GPT-2 and output behaviour, understand scaling laws relating data quantity to model performance, and gain practical insights into the predictive capabilities of LLMs. By linking these systems to physics concepts, the approach allows students to begin to understand how they work: systems that will shape their academic studies, professional careers and roles in society. [ABSTRACT FROM AUTHOR]
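The temperature parameter mentioned in the abstract can be illustrated with a small, self-contained sketch. The abstract does not include the paper's actual Colab code; the toy next-token distribution below (the word list and logit values are invented for illustration, not taken from GPT-2) simply shows the standard mechanism: logits are divided by the temperature before a softmax, so low temperature makes sampling nearly deterministic while high temperature spreads probability over more tokens.

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=None):
    """Sample one token from a temperature-scaled softmax over raw logits."""
    rng = rng or random.Random()
    # Divide logits by T, then apply a numerically stable softmax.
    scaled = [value / temperature for value in logits.values()]
    peak = max(scaled)
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling over the resulting distribution.
    r = rng.random()
    cumulative = 0.0
    for token, p in zip(logits, probs):
        cumulative += p
        if r < cumulative:
            return token
    return list(logits)[-1]

# Hypothetical logits for the word following "the electron carries negative ...".
logits = {"charge": 4.0, "mass": 2.0, "spin": 1.0, "banana": -3.0}

# Count how often the most likely token is chosen at different temperatures.
counts = {}
for T in (0.2, 1.0, 2.0):
    rng = random.Random(42)  # fixed seed for reproducibility
    samples = [sample_next_token(logits, T, rng) for _ in range(50)]
    counts[T] = samples.count("charge")
print(counts)
```

Running this shows the qualitative behaviour students would observe in GPT-2: at T = 0.2 almost every sample is "charge", while at T = 2.0 the output becomes noticeably more varied.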
Copyright of Physics Education is the property of IOP Publishing and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)