KSETA Topical Courses September/October 2023

Timezone: Europe/Berlin
Description

The next KSETA Topical Courses will be held from September 18th to 25th, 2023 and from October 2nd to 6th, 2023.

See schedules and rooms in the program.

Please register by September 14, 2023.

If you will not attend a course you have registered for, please update your registration or send an email to Raquel Lujan.

Please note that you must attend at least 6 hours of a course for it to appear on your KSETA transcript.

    • FPGA programming with hands on: Part I 216 (30.10)

      Room 216, Bldg. 30.10 (ITIV)

      Field-Programmable Gate Arrays (FPGAs) are powerful devices that combine the advantages of custom ASICs with the flexibility of microcontrollers. They can perform real-time, parallel signal processing and data analysis while remaining reconfigurable: their functionality is described using hardware description languages such as VHDL or Verilog. One field of application of these devices is the readout of detectors in large-scale physics experiments. This course gives an introduction to FPGAs in this specific use case, covering the underlying technologies and exploring their possibilities and limits. In addition, we will dive into FPGA programming through several hands-on practical examples.
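
      The hardware-description style mentioned above can be illustrated with a small sketch. The Python code below (names and structure invented for illustration; the course itself uses VHDL or Verilog) mimics a 1-bit full adder, the kind of combinational building block one describes in an HDL, and a ripple-carry chain built from it:

```python
# Illustrative sketch only: a 1-bit full adder expressed as pure
# combinational logic, mirroring what an HDL entity would describe.
# Names and structure are hypothetical, not taken from the course.

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Return (sum, carry_out) for one-bit inputs, like an HDL entity."""
    s = a ^ b ^ carry_in                          # sum: XOR of all inputs
    carry_out = (a & b) | (carry_in & (a ^ b))    # majority logic for carry
    return s, carry_out

# On an FPGA, many such adders evaluate in parallel every clock cycle;
# chaining carry_out into the next carry_in yields a ripple-carry adder.
def ripple_add(x: int, y: int, bits: int = 8) -> int:
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result
```

      In hardware, unlike in this sequential loop, all bit slices exist simultaneously as physical logic, which is the source of the parallelism the course description refers to.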

      Conveners: Lukas Scheller, Timo Muscheid (KIT-IPE)
    • FPGA programming with hands on: Part II 216 (30.10)

      Room 216, Bldg. 30.10 (ITIV)

      Course description as for Part I above.

      Conveners: Lukas Scheller, Timo Muscheid (KIT-IPE)
    • Presenting Science: How to prepare a scientific presentation with impact!: Introduction 10/1 (30/23)

      Room 10/1, Bldg. 30/23 (10th floor)

      Acquire effective communication and presentation skills for presenting scientific results and concepts.

      The course is spread over three events; the dates for the second and third events will be set later. The introductory event will take place on September 18.

      In this introductory event, Professor Issever will explain how to prepare a scientific presentation: what the pitfalls are and how to ensure your presentation has impact.
      After the introduction, each student will be asked to pick a scientific topic they want to present. The presentation should address physicists who need not be experts in the chosen field and should last no more than 10 minutes. Every student will give a rehearsal talk to the other participants (date to be set later), at which the other students and Prof. Issever will give detailed feedback on the presentation.
      This feedback then needs to be incorporated into a final version of the presentation. At the end of the course, all students present their talks to everybody in a conference-like event that simulates a parallel session at a scientific conference. The date for this final event will also be set later.

      Convener: Cigdem Issever
    • Medical Physics in Radiation Therapy: Part I: Physical basis of radiotherapy - interaction of radiation with matter 10/1 (30/23)

      Room 10/1, Bldg. 30/23 (10th floor)

      Part I: Physical basis of radiotherapy - interaction of radiation with matter
      Part II: Dose calculation and Treatment planning
      Part III: Dosimetry and Detectors for RT
      Part IV: Sources of radiation - accelerator technology and medical isotopes
      Part V: Modern developments and future of RT
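
      Part I's topic, the interaction of radiation with matter, can be illustrated with the exponential attenuation law I(x) = I0·exp(−μx). The sketch below computes the transmitted beam fraction and the half-value layer; the coefficient value is an arbitrary example, not data from the course:

```python
import math

# Illustrative sketch of exponential photon-beam attenuation,
# I(x) = I0 * exp(-mu * x). The coefficient below is an arbitrary
# example value, not material from the course.

def transmitted_fraction(mu_per_cm: float, depth_cm: float) -> float:
    """Fraction of the primary beam surviving a given depth of material."""
    return math.exp(-mu_per_cm * depth_cm)

def half_value_layer(mu_per_cm: float) -> float:
    """Thickness that halves the beam intensity: HVL = ln(2) / mu."""
    return math.log(2) / mu_per_cm

mu = 0.2  # example attenuation coefficient in 1/cm
hvl = half_value_layer(mu)  # by construction, transmission here is 0.5
```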

      Convener: Niklas Wahl
    • Medical Physics in Radiation Therapy: Part II: Dose calculation and Treatment planning 10/1 (30/23)

      Room 10/1, Bldg. 30/23 (10th floor)

      Course outline as for Part I above.

      Convener: Niklas Wahl
    • FPGA programming with hands on: Part III 216 (30.10)

      Room 216, Bldg. 30.10 (ITIV)

      Course description as for Part I above.

      Conveners: Lukas Scheller, Timo Muscheid (KIT-IPE)
    • FPGA programming with hands on: Part IV 216 (30.10)

      Room 216, Bldg. 30.10 (ITIV)

      Course description as for Part I above.

      Conveners: Lukas Scheller, Timo Muscheid (KIT-IPE)
    • Medical Physics in Radiation Therapy: Part III: Dosimetry and Detectors for RT 10/1 (30/23)

      Room 10/1, Bldg. 30/23 (10th floor)

      Course outline as for Part I above.

      Convener: Tim Gehrke
    • Medical Physics in Radiation Therapy: Part IV: Sources of radiation - accelerator technology and medical isotopes 10/1 (30/23)

      Room 10/1, Bldg. 30/23 (10th floor)

      Course outline as for Part I above.

      Convener: Tim Gehrke
    • Philosophy of Science: Part I (POSTPONED) 10/1 (30/23)

      Room 10/1, Bldg. 30/23 (10th floor)

      Physics and philosophy have a longstanding and close relation – be it through their common roots in antiquity or via the discussions about interpretations of quantum mechanics in the first half of the 20th century. For a long time, Philosophy of Science (PoS) was a synonym for philosophy of physics. This course aims (i) to give a short introduction to classical philosophical views on concepts and methods that are central to the sciences, such as models, natural laws, explanation, or understanding. We will also (ii) discuss some recent developments in PoS, such as the philosophy of computer simulations or of AI, as far as they are relevant for physicists. Finally, (iii) we will briefly touch on ethical aspects of scientific practice.

      Convener: Rafaela Hillerbrand
    • Philosophy of Science: Part II (POSTPONED) 10/1 (30/23)

      Room 10/1, Bldg. 30/23 (10th floor)

      Course description as for Part I above.

      Convener: Rafaela Hillerbrand
    • Philosophy of Science: Part III (POSTPONED) 10/1 (30/23)

      Room 10/1, Bldg. 30/23 (10th floor)

      Course description as for Part I above.

      Convener: Rafaela Hillerbrand
    • Philosophy of Science: Part IV (POSTPONED) 10/1 (30/23)

      Room 10/1, Bldg. 30/23 (10th floor)

      Course description as for Part I above.

      Convener: Rafaela Hillerbrand
    • Medical Physics in Radiation Therapy: Part V: Modern developments and future of RT online

      online

      Course outline as for Part I above.

      Convener: Oliver Jäkel
    • Selected Topics on Instrumentation for Particle and Astroparticle Physics: Part I 10/1 (30/23)

      Room 10/1, Bldg. 30/23 (10th floor)

      The lectures will cover the basic interaction processes of radiation with matter before focusing on one class of detectors based on the scintillation and ionization produced by the passage of radiation through noble liquids. Some applications of noble-liquid detectors in searches for rare physics processes will be reviewed.
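
      The ionization signal in a noble-liquid detector scales with the deposited energy through the W-value, the average energy needed to create one electron-ion pair. The back-of-the-envelope sketch below assumes W ≈ 15.6 eV, a value commonly quoted for liquid xenon, purely as an example; it is not a figure taken from the lectures:

```python
# Back-of-the-envelope sketch of the ionization yield in a noble-liquid
# detector: N_e ~ E_deposited / W. The W-value of 15.6 eV is a commonly
# quoted literature figure for liquid xenon, used here as an example
# assumption rather than course material.

W_EV = 15.6  # assumed W-value for liquid xenon, in eV

def ionization_electrons(energy_kev: float, w_ev: float = W_EV) -> float:
    """Mean number of electron-ion pairs for a given energy deposit."""
    return energy_kev * 1000.0 / w_ev
```

      A 10 keV deposit thus yields a few hundred electrons, which is why low-noise charge readout matters for rare-event searches.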

      Convener: Elena Aprile
    • Selected Topics on Instrumentation for Particle and Astroparticle Physics: Part II 514 (CN, 601)

      Room 514, Bldg. 601 (Campus North)

      Course description as for Part I above.

      Convener: Elena Aprile
    • Selected Topics on Instrumentation for Particle and Astroparticle Physics: Part III 514 (CN, 601)

      Room 514, Bldg. 601 (Campus North)

      Course description as for Part I above.

      Convener: Elena Aprile
    • The BPHZ theorem: a decisive turn in the history of quantum field theory: Part I 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      At present, quantum field theory is very successful. This success would be impossible without a procedure called "renormalization", which absorbs all the (ultraviolet) infinities contained in the bare theory through an infinite redefinition of the theory's constants.
      The divergent (infinite) counterterms produced by this procedure require very fine tuning; for example, there are so-called subdivergences and overlapping divergences. The Bogoliubov-Parasiuk-Hepp-Zimmermann (BPHZ) theorem, proved in the 1950s and 1960s, guarantees that renormalization eliminates all ultraviolet divergences at every order of the perturbation series. The theorem raised hopes that quantum field theory makes sense and can be studied with mathematical rigour. The formulation and proof of the BPHZ theorem will be explained, as well as related constructions.
      Despite its success, the foundations of quantum field theory appear unsatisfactory to many scientists: it gives precise results for many observables, but the reasoning consists of a long sequence of logically questionable transitions. In this sense, the BPHZ theorem provides an oasis of mathematical rigour. The possibility or impossibility of making something logically consistent can be regarded as another very important "experimental fact". We should also consider the possibility that quantum field theory must be completely rebuilt on entirely different principles in order to become logically correct. Such a reconstruction must preserve the present, very precise agreement with experiments; to make this possible, some "clues" should be taken from the existing theory. We do not know what these principles and clues are; however, the mechanism underlying divergence cancellation in the BPHZ approach may serve as a good clue.
      Not only the BPHZ theorem itself but also the reasoning in its proof is very useful to understand: the ideas and combinatorial constructions underlying it can be used to develop effective computational methods.
      In perturbative quantum field theory, each additional order increases the demand on computer resources by several orders of magnitude. Since high-order results are needed in many areas, calculational methods based on nontrivial mathematical ideas are very welcome. Examples will be demonstrated.
      Related topics will also be discussed, such as correct definitions of integrals, regularizations, approaches to a rigorous justification of dimensional regularization, Feynman and Schwinger parameters, Symanzik polynomials, power-counting theorems, physical renormalization conditions, infrared divergences, mixing of infrared and ultraviolet divergences, convergence of the whole series, as well as the limited applicability of renormalization, relations between physics and mathematics, social processes in the community of theoretical physicists, and so on.
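
      The power-counting theorems mentioned above can be illustrated with the standard textbook exercise for φ⁴ theory in four dimensions: from the number of vertices V and external legs E of a connected diagram, the superficial degree of divergence works out to D = 4 − E, independent of the order. A minimal sketch (standard formulas, not material from the course itself):

```python
# Superficial degree of divergence of a connected Feynman diagram in
# phi^4 theory in d = 4 dimensions -- standard textbook power counting,
# not course material. Each loop contributes d^4 k, each scalar
# propagator a factor 1/k^2, so D = 4L - 2I.

def degree_of_divergence(vertices: int, external_legs: int) -> int:
    """D = 4L - 2I, using leg counting 4V = E + 2I and L = I - V + 1."""
    internal = (4 * vertices - external_legs) // 2  # I from leg counting
    loops = internal - vertices + 1                 # L, Euler relation
    return 4 * loops - 2 * internal
```

      For any V this reduces to D = 4 − E, which is why only the two- and four-point functions are superficially divergent in φ⁴ theory; the subtlety the BPHZ theorem resolves is that subdivergences can hide inside diagrams whose overall D is negative.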

      Convener: Sergey Volkov
    • The BPHZ theorem: a decisive turn in the history of quantum field theory: Part II 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Course description as for Part I above.

      Convener: Sergey Volkov
    • Tools and Techniques for Sustainable Research Software Development: Part I 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Software is becoming an increasingly important part of modern scientific endeavours and everyday work – most likely of yours, too.
      Readability, reusability, robustness and extensibility can be important qualities of the code we write, making it easier to share and more valuable to the relevant scientific community. In particular, they make collaboration easier, so that junior contributors or other interested parties can get started with and contribute to the code more easily.
      In this course we will present, with a critical eye, tools and techniques that can improve the development of research software and make it more sustainable.
      Topics that we will cover (depending on the prior knowledge of the participants) include:

      • Git and version control systems
      • Software management and collaboration tools (e.g. GitLab or GitHub)
      • Automated testing
      • Test-driven development
      • Continuous integration and testing
        and more.

      The goal of the course is to give an overview of the trade-offs encountered when developing research software.
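
      The test-driven style covered in the course can be sketched in a few lines: write a failing test first, then the minimal implementation that makes it pass. A hypothetical example (function name and behaviour invented for illustration, not course material):

```python
# Minimal test-driven-development sketch: the test below was (notionally)
# written first, then running_mean was implemented to make it pass.
# The function and its behaviour are invented purely for illustration.

def running_mean(values: list[float]) -> list[float]:
    """Return the cumulative mean after each element."""
    means, total = [], 0.0
    for i, v in enumerate(values, start=1):
        total += v
        means.append(total / i)
    return means

def test_running_mean():
    # The "red" phase: these assertions fail until running_mean exists.
    assert running_mean([]) == []
    assert running_mean([2.0]) == [2.0]
    assert running_mean([1.0, 3.0, 5.0]) == [1.0, 2.0, 3.0]
```

      With a runner such as pytest, every function named `test_*` is discovered and executed automatically, which is what makes the red-green cycle cheap to repeat.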

      Conveners: Michele Mesiti (KIT), René Caspart (Karlsruhe Institute of Technology (KIT))
    • Tools and Techniques for Sustainable Research Software Development: Part II 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Course description as for Part I above.

      Conveners: Michele Mesiti (KIT), René Caspart (Karlsruhe Institute of Technology (KIT))
    • Tools and Techniques for Sustainable Research Software Development: Part III 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Course description as for Part I above.

      Conveners: Michele Mesiti (KIT), René Caspart (Karlsruhe Institute of Technology (KIT))
    • Tools and Techniques for Sustainable Research Software Development: Part IV 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Course description as for Part I above.

      Conveners: Michele Mesiti (KIT), René Caspart (Karlsruhe Institute of Technology (KIT))
    • Scalable Artificial Intelligence: Part I 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Artificial intelligence methods have led to astonishing breakthroughs in science and technology over the past decade. In this context, there is an increasing trend towards processing ever larger amounts of data and using parallel and distributed computing resources. A prominent example is the large language model Generative Pre-trained Transformer 3 (GPT-3), which pushes the limits of conventional AI hardware with 175 billion trainable parameters on 285,000 processor cores and 10,000 graphics cards. In the lecture, the audience is introduced to scalability approaches for different AI algorithms. The focus is on the advantages and approaches of parallel computing for AI methods, the different software packages available for their implementation, and algorithm-specific challenges. In particular, we will take a deeper look at different flavours of parallel neural networks as well as scalable hyperparameter optimization.
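
      The core idea behind data-parallel training, one of the flavours of parallelism the lecture covers, can be sketched without any framework: each worker computes the gradient on its own shard of the batch, and for equal-sized shards the averaged shard gradients equal the full-batch gradient. The model and data below are toy examples, not from the lecture:

```python
# Sketch of data parallelism's core identity: for equal-sized shards,
# the average of per-shard gradients equals the full-batch gradient.
# Toy one-parameter model y = w * x with mean-squared-error loss;
# all names and numbers are illustrative only.

def grad_mse(w: float, xs: list[float], ys: list[float]) -> float:
    """Gradient of the MSE loss with respect to w for the model y = w*x."""
    n = len(xs)
    return sum(2.0 * (w * x - y) * x for x, y in zip(xs, ys)) / n

xs, ys = [1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0]
w = 0.5

# "Workers" each hold an equal-sized shard of the batch and compute
# their local gradient independently (in parallel on real hardware).
shards = [(xs[:2], ys[:2]), (xs[2:], ys[2:])]
shard_grads = [grad_mse(w, sx, sy) for sx, sy in shards]

# The all-reduce step: average the local gradients.
avg_grad = sum(shard_grads) / len(shard_grads)
```

      Distributed frameworks implement exactly this averaging as an all-reduce over the network; the communication cost of that step is one of the scalability challenges the lecture addresses.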

      Convener: Markus Götz (KIT/SCC)
    • Scalable Artificial Intelligence: Part II 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Course description as for Part I above.

      Convener: Markus Götz (KIT/SCC)
    • The BPHZ theorem: a decisive turn in the history of quantum field theory: Part III 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Course description as for Part I above.

      Convener: Sergey Volkov
    • The BPHZ theorem: a decisive turn in the history of quantum field theory: Part IV 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Course description as for Part I above.

      Convener: Sergey Volkov
    • Scalable Artificial Intelligence: Part III 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Course description as for Part I above.

      Convener: Markus Götz (KIT/SCC)
    • Scalable Artificial Intelligence: Part IV 6/1 (30/23)

      Room 6/1, Bldg. 30/23 (6th floor)

      Course description as for Part I above.

      Convener: Markus Götz (KIT/SCC)