Aug 26 – 30, 2019
KIT, Campus North, FTU
Europe/Berlin timezone

Tutors

Sebastien Binet (LPC)

Sebastien Binet received his PhD in Particle Physics from the Université Blaise Pascal for his work on jet calibration and analysis tools in ATLAS. As a post-doc at LBL, he worked on core issues of the Athena control framework for the ATLAS experiment (monitoring, event size reduction, I/O optimization) and started the work on AthenaMP, the multi-process-based parallelization of Athena. He then worked on multithreaded parallelization aspects, scouted for better suited languages, and fell in love with Go and its elegant approach to concurrency.

Fabio Baruffa (Intel)

Fabio Baruffa is a senior software technical consulting engineer at Intel. He provides customer support in the high performance computing (HPC) area and for artificial intelligence software solutions at large scale.

Prior to joining Intel, he worked as an HPC application specialist and developer in the largest supercomputing centers in Europe, mainly the Leibniz Supercomputing Center and the Max Planck Computing and Data Facility in Munich, as well as Cineca in Italy. He has been involved in software development, analysis of scientific code and optimization for HPC systems. He holds a PhD in Physics from the University of Regensburg for his research in the area of spintronics devices and quantum computing.

Timo Bingmann (Karlsruhe Institute of Technology)

Timo Bingmann recently received his PhD from the Karlsruhe Institute of Technology, where he works in Prof. Peter Sanders' group. He received a Diploma degree in Computer Science in 2010 at the same university, and a second Diploma degree in Mathematics from the FernUniversität in Hagen in 2016. His main research topic is algorithm engineering for the basic toolbox, focused on modern parallel and distributed computing architectures. His current project is Thrill, a C++ framework for high-performance distributed computing with a convenient interface.

Enrico Bocchi (CERN)

Enrico is a computing engineer in the IT Department of the European Organisation for Nuclear Research (CERN). As a member of the Storage Group, he shares responsibility for many distributed storage services used to collect and store all the data produced by CERN physics experiments. He is currently the main person responsible for the operations and evolution of CVMFS, a critical WLCG service used to distribute software and conditions databases across the worldwide computing grid. He is also the main developer and maintainer of "Science Box": a container-based version of CERN cloud services including EOS (distributed storage system), CERNBox (sync and share cloud storage), and SWAN (web-based interactive data analysis service). Before joining CERN, Enrico was a researcher in the field of computer networks. He holds a joint PhD degree in Electronics and Computer Science issued by Politecnico di Torino (Italy) and Télécom ParisTech (France).

Oliver Freyermuth (Bonn University)

Oliver got his PhD in the field of hadron physics at the BGO-OD experiment located at the ELSA facility in Bonn in 2017. During his analysis work on meson photoproduction, he tended to spend a significant amount of time on C++ analysis framework development, contributions to ROOT and other open source projects within and outside of particle physics, and IT services within the context of the experiment.

Following his strong interest in computing, he is now in the IT department of the Institute of Physics at the University of Bonn and administers the local high-throughput computing cluster used by groups mainly from high energy particle physics, photonics, hadron physics and different fields of theory. The variety of requirements of these groups has led to the development of a flexible setup based on HTCondor and Singularity containers, backed by CephFS. As part of this work, he also comes into close contact with the software used by the various groups, always trying to understand the different workflows and helping the users to optimize them in the context of a common computing cluster. Additional tasks include the administration of all Linux desktop machines in the institute, the full automation of the server and desktop infrastructure, and the operation of services specific to physics, such as XRootD for Grid-enabled storage and CVMFS servers.

Markus Götz (Karlsruhe Institute of Technology)

Markus Götz received his Bachelor's and Master's degrees in Software Engineering from the University of Potsdam in 2010 and 2014, respectively. Afterwards, he was with the Research Center Jülich and the University of Iceland, from which he obtained his PhD degree in Computational Engineering for his work on parallel data-analysis algorithms on high-performance computing (HPC) systems.

Since the beginning of 2018, Markus has been with the Steinbuch Centre for Computing (SCC) at the Karlsruhe Institute of Technology (KIT). There, he manages the Helmholtz Analytics Framework project, a Germany-wide initiative with the aim of advancing the data sciences in the Helmholtz Association. His research topics include applied machine learning, scalable data analysis frameworks and parallel algorithms.

Andreas Herten (FZ Jülich)

Andreas did his PhD as an experimental particle physicist at Forschungszentrum Jülich/Ruhr University Bochum. He investigated the application of graphics processing units (GPUs) for track reconstruction in the online event selection system of the PANDA experiment. After graduating he joined the NVIDIA Application Lab of the Supercomputing Centre of Forschungszentrum Jülich, where he enables scientific applications for GPUs and improves their performance.

Florian Jetter (Blue Yonder GmbH)

coming soon...

Philipp Krenn (Elastic)

Philipp lives to demo interesting technology. Having worked as a web, infrastructure, and database engineer for over ten years, Philipp is now working as a developer advocate at Elastic — the company behind the open source Elastic Stack consisting of Elasticsearch, Kibana, Beats, and Logstash. Based in Vienna, Austria, he is constantly traveling Europe and beyond to speak and discuss open source software, search, databases, infrastructure, and security.

Mario Lassnig (CERN)

Dr. Mario Lassnig has been working as a Software Engineer at the European Organisation for Nuclear Research (CERN) since 2006. Within the ATLAS Experiment, he is responsible for many aspects of large-scale distributed data management, database applications, as well as descriptive and predictive analytics for large data samples.
In his previous life, he developed mobile navigation software for public transportation in Vienna at Seibersdorf Research, as well as cryptographic smartcard applications for access control at the University of Klagenfurt. He holds a Master's degree in Computer Science from the University of Klagenfurt, and a doctoral degree in Computer Science from the University of Innsbruck.

Zheng Meyer-Zhao (ASTRON)

Zheng works at ASTRON (Netherlands Institute for Radio Astronomy) as Science Data Centre (SDC) Development Lead. The ASTRON SDC is under development with the goal of providing astronomers easy access to astronomical data, compute infrastructure and storage, in order to prepare for the data challenge generated by future telescopes such as the Square Kilometre Array (SKA).

Zheng has worked as a software engineer in radio astronomy for many years. She is passionate about parallel programming with MPI and data processing on HPC/HTC systems. Zheng obtained two Master's degrees in Computer Science at the Free University of Brussels (Vrije Universiteit Brussel), one specialised in parallel programming and the other in Artificial Intelligence.

Alexandr Mikula (Czech Academy of Sciences)

Alexandr is the only senior Linux administrator at the Computing Centre of the Institute of Physics of the Czech Academy of Sciences and also an administrator of the CESNET WLCG site. His tasks range from basic maintenance of the site infrastructure, through WLCG middleware and storage clusters, to proposing new hardware and advising other sites on their problems. He has been a self-educated Linux administrator and admirer since 2007. He is a supporter of new technologies and approaches, with a strong emphasis on solving problems using only FOSS solutions.

Sebastian Neubauer (Blue Yonder GmbH)

Sebastian joined Blue Yonder as a Data Scientist after completing his PhD in physics. One of the first things Sebastian learned on the job was that it takes a lot more than just software and algorithms to deliver value to customers. Realizing this, he focused his efforts on configuration management and application life cycle management, as well as the design, development and operation of distributed systems. He is now a Senior Data Scientist at Blue Yonder and has already contributed to nearly all layers, from infrastructure to algorithms.

Oliver Oberst (IBM)

Oliver Oberst received his PhD in particle physics at KIT, focused on the analysis of QCD data from the CMS Experiment and the development of tools to manage virtualized worker nodes. After two years as a post-doc at KIT, managing the CMS Tier 1 contact team for GridKa as well as coordinating the German CMS grid computing group (DCMS), Oliver joined IBM as an Industry Solution Architect. Since then he has been designing solutions for research and higher-education customers with an emphasis on technical computing (HPC, HTC), big data analytics and cloud computing.

Graeme Stewart (CERN)

Graeme received his PhD in Plasma Physics from the University of Glasgow,
programming numerical simulations of plasma waves. He worked in Mexico on
modelling the Sun's magnetic field structure before switching fields to
high-energy particle physics. Graeme is a member of the ATLAS collaboration and
as Software Coordinator had a major hand in ATLAS's ongoing migration to a
multi-threaded framework. Graeme now works in the Software for Experiments group
at CERN and promotes common software projects for high-energy physics through
the HEP Software Foundation. He keeps an active interest in all cool parts of
software and computing for particle physics.

Oskar Taubert (Karlsruhe Institute of Technology)

Oskar studied Physics at the Karlsruhe Institute of Technology in Germany and is currently working on his PhD in Physics at the Steinbuch Centre for Computing.

His research interests include machine learning on the co-evolution of biological macromolecules.

Oliver Scherer (Karlsruhe Institute of Technology)

Oliver Scherer has worked as a computer science researcher at KIT since 2014 and started teaching Rust regularly in 2016. Since Rust 1.0 he has contributed frequently to the Rust compiler and its constant evaluator, and he joined the compiler team in 2018. In his time at KIT he has mentored several bachelor theses on developing embedded software safely by using Rust-specific features.