We cordially invite you to the 8th bwHPC Symposium on November 28, 2022. The event will take place as a Zoom conference and will be organized by the High-Performance Computing Center Stuttgart (HLRS).
The bwHPC Symposium offers a unique opportunity to actively engage in a dialog between scientific users, operators of bwHPC services, and the bwHPC support centers. Its focus is on the presentation of scientific projects and success stories carried out with the help of bwHPC high-performance computing in the context of the BaWü data federation. The symposium will be free of charge and open to researchers from all scientific fields.
For further information on the event, or if you are interested in presenting at the symposium, please contact symposium2022@bwhpc.de.
We are looking forward to your participation.
Sincerely,
Your bwHPC Symposium team
Andrea Beck obtained an M.Sc. degree in aerospace engineering with a focus on fluid dynamics from the Georgia Institute of Technology in Atlanta (USA) and a doctoral degree in computational fluid dynamics (CFD) from the University of Stuttgart (Germany). She held the Dorothea Erxleben professorship at the Institute of Fluid Dynamics and Thermodynamics of the Otto von Guericke University in Magdeburg (Germany) from 2020 to 2022 and is currently a professor of numerical methods in fluid dynamics at the Faculty of Aerospace Engineering and Geodesy of the University of Stuttgart. Her areas of interest include numerical discretization schemes for multiscale and multiphysics problems, in particular high-order methods, high-performance computing and visualization, large eddy simulation (LES) methods and models, shock-capturing schemes, uncertainty quantification methods, and machine learning. She is a co-developer of the open-source high-order discontinuous Galerkin CFD framework FLEXI. Recent fields of application include uncertainty quantification of feedback loops in acoustics, particle-laden flow in turbomachines, wake-boundary-layer interaction for transport aircraft at realistic flight conditions, shock-droplet interactions, and data-driven models for LES closures.
The presentation outlines the current status of the user support project bwHPC-S5 for high-performance computing (HPC), data-intensive computing (DIC), and large-scale scientific data management (LS2DM) in the state of Baden-Württemberg, Germany.
The state-wide project bwHPC-S5 provides federated support for bwHPC users and coordinates all associated provisions, including the development of policies and services. As the connecting body between scientists and HPC systems, bwHPC-S5 has proven to create synergies for the development of state-wide user support. The implementation of professional competence centers guarantees the support expertise required to embed the scientific communities into the HPC, DIC, and LS2DM world and to increase efficiency and effectiveness in utilizing both compute and storage resources by optimizing workflows as well as the performance and scalability of applications. bwHPC-S5 extends the federated services established in previous projects to include the acquisition, processing, storage, and archiving of large scientific datasets.
The Large Hadron Collider (LHC), the largest particle collider in the world, was built to study the fundamental building blocks of nature. Large amounts of computational resources are required for many projects involving LHC physics. We present some of the studies performed with the help of the computing cluster NEMO, involving tasks such as the global analysis of LHC measurements, the simulation of collider events from first principles, and the application of modern machine learning methods.
This contribution describes a proof of concept for archiving the entire life cycle of the data, the analysis code, and the necessary software stack of an analysis carried out during a typical PhD or master's thesis in high-energy physics. Altogether, the derived datasets, the container with the appropriate software, and the dedicated analysis code are wrapped into a single dataset and stored on appropriate long-term storage, e.g. the S3 storage of bwSFS (Baden-Württemberg – Storage for Science) hosted at the compute centers of the Universities of Freiburg and Tübingen.
A database keeps track of the individual datasets, so that the corresponding dataset can be retrieved based on the author name, the thesis title, or the DOI of the given thesis. Furthermore, we describe how the setup can be generalized for use by the entire Physics Department at the University of Freiburg.
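To make this lookup concrete, here is a minimal Python sketch of how such an index and S3 upload could look, using boto3 and sqlite3; the endpoint URL, bucket name, and metadata schema are hypothetical stand-ins, not the project's actual implementation.

```python
# Illustrative sketch only: schema, bucket name, endpoint, and paths are hypothetical.
import sqlite3
import boto3

# Index of archived analyses (dataset + software container + analysis code).
con = sqlite3.connect("archive_index.db")
con.execute("""CREATE TABLE IF NOT EXISTS theses (
    doi TEXT PRIMARY KEY, author TEXT, title TEXT, s3_key TEXT)""")

def archive_thesis(doi, author, title, tarball_path):
    """Upload the wrapped dataset to S3 and index it for later lookup."""
    s3 = boto3.client("s3", endpoint_url="https://s3.example.org")  # placeholder endpoint
    key = f"archive/{doi}.tar"
    s3.upload_file(tarball_path, "hep-archive", key)
    con.execute("INSERT OR REPLACE INTO theses VALUES (?, ?, ?, ?)",
                (doi, author, title, key))
    con.commit()

def lookup(field, value):
    """Find the archived dataset by author name, thesis title, or DOI."""
    assert field in ("doi", "author", "title")
    return con.execute(f"SELECT * FROM theses WHERE {field} = ?", (value,)).fetchall()
```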
In the framework of the German Astrophysical Virtual Observatory (GAVO), several services and tools have been developed over the last two decades. These include, e.g., the Theoretical Stellar Spectra Access (TheoSSA) service, which provides synthetic stellar spectra calculated with the Tuebingen non-local thermodynamic equilibrium Model-Atmosphere Package (TMAP). About 500,000 precalculated stellar energy distributions (SEDs) are presently available; individual SEDs can be calculated on demand. These may be used for precise spectral analyses of hot, compact stars, e.g., subdwarfs, white dwarfs, and even neutron stars.
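For illustration, Virtual Observatory spectral services of this kind are typically queried through the IVOA Simple Spectral Access protocol (SSAP). The sketch below issues a generic SSAP query with Python's requests; the endpoint URL is a placeholder, and any service-specific constraints (e.g. on effective temperature) would be added as further parameters.

```python
# Generic IVOA SSAP query; the endpoint URL below is a placeholder.
import requests

SSAP_ENDPOINT = "https://example.org/theossa/ssap"  # replace with the actual service URL

params = {
    "REQUEST": "queryData",   # mandatory SSAP parameter
    "FORMAT": "votable",      # ask for a VOTable listing matching SEDs
    "MAXREC": 10,             # limit the number of records returned
}
response = requests.get(SSAP_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
print(response.text[:500])    # VOTable XML describing available spectra
```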
Earth system models of intermediate complexity, like the Planet Simulator (PlaSim), can be used for studying the long-term climate response to external forcing, for example from greenhouse gas emissions. The equilibrium climate sensitivity (ECS) quantifies this response and is one of the most important parameters for projecting future global warming. Convective parameterization plays a key role in the hydrological cycle of a model, but its influence on the ECS remains uncertain. The aim of this study is to investigate the influence of two different convection schemes on the ECS in PlaSim. To achieve this goal, we implement the Tiedtke convection scheme in PlaSim and tune the model. The results show that the new scheme reduces the ECS compared to the operational Kuo scheme. Six of eight configurations yield ECS values within the likely range obtained from more complex climate simulations. In addition to the convection scheme, we identify other parameters, such as the horizontal heat diffusion, that influence the ECS in PlaSim.
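The abstract does not state how the ECS is diagnosed; one widely used technique (not necessarily the one used in this study) is the Gregory regression, in which the top-of-atmosphere radiative imbalance N is regressed against the global mean temperature anomaly dT after an abrupt CO2 increase, and the ECS is read off where the fit crosses N = 0. A minimal sketch with synthetic numbers:

```python
# Gregory-style ECS estimate from synthetic yearly means (not PlaSim output).
import numpy as np

rng = np.random.default_rng(0)
# Synthetic abrupt-4xCO2 response: N = F + lam * dT plus noise, with an
# assumed forcing F = 7.4 W/m^2 and feedback parameter lam = -1.0 W/m^2/K.
dT = np.linspace(0.5, 6.5, 150) + rng.normal(0, 0.1, 150)
N = 7.4 - 1.0 * dT + rng.normal(0, 0.3, 150)

lam, F = np.polyfit(dT, N, 1)               # linear fit of N against dT
ecs_4x = -F / lam                           # dT at which the fit reaches N = 0
print(f"ECS (2xCO2) ~ {ecs_4x / 2:.2f} K")  # halve for a single CO2 doubling
```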
An introduction to the team and race car concept of Rennstall Esslingen. We will talk about the aerodynamic concept and the development process from design to simulation and validation. A special focus will be on the CFD workflow using STAR-CCM+ on bwHPC resources.
A workflow for battery performance optimization is presented that couples wetting and flow simulations based on pore network models (PNM) and the lattice Boltzmann method (LBM) with electrochemical simulations using the tool BEST. All included software packages show appropriate scaling behavior on high-performance computing (HPC) clusters. Electrolyte filling of battery components is discussed as a showcase; this step is time-critical and therefore cost-intensive in battery manufacturing. A special focus is given to the unwanted side effect of gas entrapment during filling, which is also known to have a strong influence on the electrochemical performance of batteries.
We present the structure of bwForCluster Helix and possible use cases.
The bwForCluster JUSTUS 2 is part of the implementation concept for high-performance computing in the state of Baden-Württemberg (bwHPC) and is available statewide to all researchers in the fields of "Computational Chemistry and Quantum Sciences". The technical features of JUSTUS 2 were tailored exactly to this intended purpose in order to achieve high efficiency of the HPC applications in these scientific disciplines.
The cluster is operated by the Communication and Information Centre (kiz) of Ulm University, the seat of the state-wide bwHPC Competence Center "Computational Chemistry and Quantum Sciences" Baden-Württemberg, which provides federated user support by qualified HPC experts for scientists from these areas of research.
This poster will present the key features of the system and invite discussion on best practices for using JUSTUS 2 as productively as possible.
A poster is presented showing the key facts of bwHPC and its application in the field of engineering. The objective of the poster is to recruit new users for the bwHPC hardware at the different university locations and to help new users gain easy access to the bwHPC infrastructure. The poster illustrates the steps towards a first simulation on the cluster, points out support channels for users, and presents the "Kompetenzzentrum Ingenieurswissenschaften".
Both the poster and the presentation outline the current status of the user support project bwHPC-S5 for HPC, DIC, and LS2DM in the state of Baden-Württemberg, Germany.
The state-wide project bwHPC-S5 provides federated support for bwHPC users and coordinates all associated provisions, including the development of policies and services. As the connecting body between scientists and HPC systems, bwHPC-S5 has proven to create synergies for the development of state-wide user support. The implementation of professional competence centers guarantees the support expertise required to embed the scientific communities into the HPC, DIC, and LS2DM world and to increase efficiency and effectiveness in utilizing both compute and storage resources by optimizing workflows as well as the performance and scalability of applications. bwHPC-S5 extends the federated services established in previous projects to include the acquisition, processing, storage, and archiving of large scientific datasets.
Poster on recent projects carried out in the field of Global System Science. Project topics include COVID-19, Climate and Weather, Digital Twin, Heavy Rain Simulation.
Gasdermins (GSDMs) execute a form of programmed cell death, pyroptosis, by forming medium-sized membrane pores. GSDMA3, a variant native to mice, is involved in asthma, systemic sclerosis, alopecia, and inflammatory bowel disease. The exact pathway of GSDM pore formation remains a mystery, so we investigated the influence of charged amino acid residues on GSDMA3's membrane insertion process using both a monomer and a 7-mer arc. Our results show that salt-bridge formation and the protein surroundings reduce the energetic insertion cost dramatically, allowing spontaneous self-insertion. Monomeric gasdermin prefers the membrane-adsorbed over the membrane-inserted state, supporting the hypothesis of oligomers preassembling on the membrane surface before membrane penetration. Furthermore, the inserting oligomer can be small and does not have to comprise a full ring of approximately 26-30 subunits.
Internal combustion engines use highly dispersed noble metals such as Pt, Pd, and Rh as catalytically active substances and oxides such as ceria, titania, and alumina as carriers for exhaust gas after-treatment. The identification of the active species is complicated by the fact that catalysts undergo dynamic structural changes under reaction conditions.
Single-atom catalysts (SACs) sit on the very boundary between heterogeneous and homogeneous catalysis. In terms of complex chemistry, the support acts as a multidentate ligand that takes an active part in the reaction and enhances the activity of the catalyst. Ceria and Pt are known to be very active catalysts, but the exact nature of this activity is still unknown, and elucidating it requires many small steps from different fields. Here we start at the very bottom and provide a thorough foundation, which will then guide further investigations: other noble metals, other supports, or other simulation methods such as kinetics, as well as input for characterization methods and activity testing.
In microbiome analysis, one main approach is to align metagenomic sequencing reads against a protein reference database, such as NCBI-nr, and then to perform taxonomic and functional binning based on the alignments. This approach is embodied, for example, in the standard DIAMOND+MEGAN analysis pipeline, which first aligns reads against NCBI-nr using DIAMOND and then performs taxonomic and functional binning using MEGAN. Here, we propose the use of the AnnoTree protein database, rather than NCBI-nr, in such alignment-based analyses to determine the prokaryotic content of metagenomic samples. We demonstrate a 2-fold speedup over the usage of the prokaryotic part of NCBI-nr and increased assignment rates, in particular assigning twice as many reads to KEGG. In addition to binning to the NCBI taxonomy, MEGAN now also bins to the GTDB taxonomy.
IMPORTANCE The NCBI-nr database is not explicitly designed for the purpose of microbiome analysis, and its increasing size makes it unwieldy and computationally expensive for this purpose. The AnnoTree protein database is only one-quarter the size of the full NCBI-nr database and is explicitly designed for metagenomic analysis, so it should be supported by alignment-based pipelines.
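For orientation, a typical alignment-plus-binning run might look like the following Python wrapper; the file names and database names are placeholders, and the exact options should be checked against the installed DIAMOND and MEGAN versions.

```python
# Sketch of a DIAMOND+MEGAN run; paths and database names are placeholders,
# and option names should be verified against the installed tool versions.
import subprocess

# 1. Align reads against the (AnnoTree or NCBI-nr) protein database,
#    writing a DAA file that the MEGAN tools can then "meganize".
subprocess.run([
    "diamond", "blastx",
    "--db", "annotree.dmnd",        # DIAMOND-formatted protein database
    "--query", "reads.fastq.gz",
    "--out", "reads.daa",
    "--outfmt", "100",              # 100 = DIAMOND archive (DAA) format
], check=True)

# 2. Add taxonomic and functional bins to the DAA file.
subprocess.run([
    "daa-meganizer",
    "--in", "reads.daa",
    "--mapDB", "megan-map.db",      # MEGAN mapping database (placeholder name)
], check=True)
```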
Electrochemistry is at the heart of many important industrial processes and systems, ranging from electrolysis and hydrogen evolution over corrosion and metal deposition up to batteries, fuel cells, and photoelectrochemical devices. Here, the interface between the electrode and the electrolyte, which can be considered the site where electrocatalytic processes occur, is of particular interest. Due to its complexity and multi-component environment, our understanding of the properties of this interface is still scarce even after many decades of research. In this talk we will discuss how modern theoretical multi-scale methods, in conjunction with in-situ experiments on well-defined model systems, are capable of unraveling the structure and composition of these interfaces as well as the ongoing electrochemical reactions and diffusion processes.
We will focus on different interfacial systems, including Li-ion and Mg-ion batteries. In our work we concentrated on realizing and investigating deposition and growth on model electrodes in order to understand the apparent nucleation and growth processes (e.g. dendrite formation and growth). By combining different theoretical methods we could gain insights into self-diffusion on transition metal electrodes, battery-relevant materials, and finally the dynamics of alloy-based electrocatalysts under operating conditions, showing the importance of and urgent need for in-operando experimental studies.
In my talk I will give a few examples of universal dynamics in ultracold atomic quantum gases that resemble turbulence in a fluid. I will highlight, in particular, the importance of massively parallel computation on graphics processing units, which allows for a statistical evaluation of the properties that reveal the universal nature of the dynamics.
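The abstract does not name the numerical scheme; a common workhorse for such quantum gas simulations is split-step Fourier integration of the Gross-Pitaevskii equation, which maps well onto GPUs because it consists almost entirely of FFTs and pointwise operations. A minimal 1D numpy sketch (hbar = m = 1; a GPU code would swap in a GPU array and FFT library):

```python
# Minimal 1D Gross-Pitaevskii split-step integrator (hbar = m = 1).
# Illustrative only; production runs would use GPU FFT libraries.
import numpy as np

n, L, dt, g = 1024, 100.0, 1e-3, 1.0               # grid size, box, step, interaction
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)         # angular wavenumbers
psi = np.exp(-x**2) * (1 + 0.1 * np.cos(0.5 * x))  # perturbed initial condition

kinetic_half = np.exp(-0.5j * k**2 * dt / 2)       # half-step kinetic propagator

for _ in range(1000):                              # Strang splitting
    psi = np.fft.ifft(kinetic_half * np.fft.fft(psi))
    psi *= np.exp(-1j * g * np.abs(psi)**2 * dt)   # nonlinear (interaction) step
    psi = np.fft.ifft(kinetic_half * np.fft.fft(psi))
```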
Free-space quantum communication is an active, application-oriented research area responding to the ever-growing demand for secure and flexible communication channels. Although the recently developed high-dimensional encoding of information into photonic orbital angular momentum may provide broadband links, its successful implementation has proven challenging in the presence of atmospheric turbulence. To this end, we strive to develop a new paradigm of high-dimensional encoding into spatial singular modes of light in turbulence. By exploiting the intrinsic features of the atmosphere, such instantaneous modes are inherently robust to turbulence. In this talk, we will provide an overview of the concept and of how numerical calculations on the bwHPC cluster simulate the transmission of light through turbulence.
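Such transmission simulations are commonly built from split-step propagation through random Kolmogorov phase screens; the sketch below (plain numpy, all parameters illustrative) shows the core of that approach without claiming it is the exact scheme used in this work.

```python
# Split-step propagation of a beam through one Kolmogorov phase screen.
# Parameters are illustrative; the screen normalization is simplified.
import numpy as np

n, dx, wavelength, dz, r0 = 512, 1e-3, 1.55e-6, 100.0, 0.1
fx = np.fft.fftfreq(n, d=dx)
FX, FY = np.meshgrid(fx, fx)
f2 = FX**2 + FY**2

def phase_screen():
    """FFT-based Kolmogorov phase screen with Fried parameter r0."""
    psd = 0.023 * r0**(-5 / 3) * (f2 + 1e-12)**(-11 / 6)  # avoid f = 0 singularity
    noise = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    screen = np.fft.ifft2(noise * np.sqrt(psd)) / dx       # rough normalization
    return np.real(screen)

x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
field = np.exp(-(X**2 + Y**2) / (2 * 0.01**2))             # Gaussian input beam

# One step: apply the screen, then propagate dz in the Fresnel approximation.
field *= np.exp(1j * phase_screen())
H = np.exp(-1j * np.pi * wavelength * dz * f2)             # Fresnel transfer function
field = np.fft.ifft2(np.fft.fft2(field) * H)
```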
Over the past decade, advances in computational power have vastly increased the interest in deep neural networks (DNNs) as an efficient machine learning method. Despite their impressive performance on state-of-the-art classification tasks, our understanding of DNNs remains incomplete, mainly owing to their black-box nature, which prohibits understanding of the geometry of the decision boundary. This shortcoming particularly hampers the application of DNNs in neuroscience, where the decision boundary needs to be interpreted in terms of anatomical units such as neurons or brain regions, and further linked to cognitive functions and behavior. Here we explore working-memory-related activity in the hippocampus. We used a linear neural network to examine whether the previous behavior of an animal could be decoded from 100 ms long time bins. To that end, permutation tests were performed by randomly shuffling the labels of leftward and rightward trials in combination with a 2-fold cross-validation protocol. To directly identify the neuronal basis of the classifier's prediction scores, we visualized its decision boundary (DB) by applying adversarial attack techniques from machine learning. From the resulting set of boundary positions we then constructed most informative directions (MIDs) as clusters of vectors orthogonal to the boundary. For low signal strength and artificial surrogate data, we show that the method outperforms estimating the weight vector by bootstrapping.
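As a schematic of the decoding-plus-permutation protocol described above (a linear classifier, 2-fold cross-validation, and label shuffling), here is a hedged sklearn sketch in which random surrogate data stand in for the recorded activity:

```python
# Linear decoding with 2-fold CV and a label-permutation null distribution.
# Surrogate random data stand in for the binned hippocampal activity.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))          # trials x neurons (100 ms bins)
y = rng.integers(0, 2, size=200)        # leftward vs. rightward trials

clf = LinearSVC(dual=False)
true_score = cross_val_score(clf, X, y, cv=2).mean()

# Null distribution: shuffle the trial labels and redo the 2-fold CV.
null = np.array([
    cross_val_score(clf, X, rng.permutation(y), cv=2).mean()
    for _ in range(200)
])
p_value = (np.sum(null >= true_score) + 1) / (len(null) + 1)
print(f"accuracy = {true_score:.3f}, p = {p_value:.3f}")
```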
The large-scale structural organisation of the adult brain is relatively stable and unchanging. However, in some animal species, including humans, new neurons are born and integrated into the hippocampal network throughout life. The hippocampus is involved in functions such as memory formation, pattern separation, pattern completion, and spatial navigation, among others. However, how the integration of newborn neurons in the hippocampus affects its function is not clear. Most neurons in the hippocampus are born early in development and have functional properties that are distinct from those of the newborn cells. We suspect that this age-dependent functional distinction is responsible for the integration of the newborn cells into the existing brain networks. To assay the process of integration, we have simulated models of the hippocampus in which, following a maturation process, newborn neurons form plastic connections and interact with other neurons in the network. Furthermore, we aim to explore how the integration of newborn neurons contributes to hippocampal function.
We have employed the bwForCluster NEMO to run simulations in a computational neuroscience project in which the integration of newborn neurons into mature brain networks is analysed. Neurogenesis in the adult brain has been reported for certain brain regions; however, it is not known how newborn neurons affect brain function. Our simulations enable us to monitor the dynamics of activity and connectivity of newborn neurons at a level of detail which is currently not possible in real brains. First, I will briefly describe the biological background and motivation behind our work; this will be followed by a short discussion of the MPI-based simulation protocol. Finally, I will outline the results and prospects of our work.
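The simulation details are beyond the abstract; as a toy illustration of the idea (newborn units joining a mature network through plastic, Hebbian-style connections), consider the following self-contained rate-network sketch, which is not the MPI-based model itself:

```python
# Toy rate network: a "newborn" unit grows Hebbian connections into a
# mature network. Purely illustrative, not the MPI-based model itself.
import numpy as np

rng = np.random.default_rng(2)
n_mature = 50
W = rng.normal(0, 0.1, (n_mature, n_mature))   # fixed mature connectivity
w_new = np.zeros(n_mature)                     # plastic weights onto the newborn unit
eta = 0.01                                     # learning rate

r = rng.random(n_mature)                       # initial firing rates
for step in range(1000):
    r = np.tanh(W @ r + rng.normal(0, 0.05, n_mature))  # mature network dynamics
    r_new = np.tanh(w_new @ r)                          # newborn unit's rate
    # Hebbian update with weight decay keeps the weights bounded.
    w_new += eta * (r_new * r - 0.01 * w_new)

print("newborn in-degree (|w| > 0.05):", np.sum(np.abs(w_new) > 0.05))
```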
Surveys are facing pressure to shorten questionnaires: long questionnaires are associated with low response rates and poor response quality, and are considered particularly inappropriate for the increasingly popular online mode. This is why survey designs with planned missing data, such as split questionnaire designs, are becoming more and more common in large-scale social surveys: they help reduce survey length by administering varying components of the whole questionnaire to each respondent. However, imputation may be needed to obtain reasonably analyzable data with such a design. Yet these data can be difficult to impute due to common features of social survey data, such as low correlations, predominantly categorical data, and relatively small sample sizes available to support imputation models with many potential predictor variables.
In this presentation, I will discuss findings from a series of Monte Carlo simulation studies in which split questionnaire designs are simulated using real social survey data from the German Internet Panel and subsequently imputed. Estimates based on the imputed data are compared to population benchmarks to determine their accuracy. In the course of these studies, several different strategies for the design of the planned missing data and their imputation are examined with respect to their effects on the accuracy of estimates.
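Schematically, one Monte Carlo repetition of such a study looks like the following Python sketch, with synthetic data and sklearn's iterative imputer standing in for the imputation models actually examined:

```python
# One Monte Carlo repetition: impose a split questionnaire design on
# synthetic data, impute, and compare an estimate to the full-data value.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(3)
n, modules = 1000, 3
# Six items (three two-item modules) with modest correlations.
X = rng.multivariate_normal(np.zeros(6), np.eye(6) + 0.2, size=n)

# Split design: each respondent answers only 2 of the 3 modules.
X_obs = X.copy()
for i in range(n):
    skipped = rng.integers(modules)            # module withheld by design
    X_obs[i, 2 * skipped:2 * skipped + 2] = np.nan

X_imp = IterativeImputer(random_state=0).fit_transform(X_obs)
print("full-data mean :", X[:, 0].mean().round(3))
print("imputed mean   :", X_imp[:, 0].mean().round(3))
```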
Can we predict if (and when) a patient will develop cancer? What are the risk factors? Is it possible to tailor cancer treatment to individual patients? To answer these questions, we need to understand how the process of cancer development works on a cellular level. As part of the interdisciplinary field of systems biology, we try to answer these questions by mechanistically modelling cell signal transduction by means of ordinary differential equations. Fitting models with many parameters to biological data requires high computational power.
In this talk we introduce our work and discuss the necessity of HPC in cancer research.
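As a concrete (and deliberately tiny) illustration of this kind of model fitting, the following Python fragment fits a hypothetical one-state signalling ODE to synthetic data with scipy; real models in this field have many more states and parameters, which is what drives the HPC demand.

```python
# Fit a toy one-state signalling ODE, dx/dt = k_on*(1 - x) - k_off*x,
# to synthetic data. Real models have far more states and parameters.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

t_data = np.linspace(0, 10, 20)
true = (0.8, 0.3)                                  # "unknown" k_on, k_off

def simulate(params):
    k_on, k_off = params
    sol = solve_ivp(lambda t, x: k_on * (1 - x) - k_off * x,
                    (0, 10), [0.0], t_eval=t_data)
    return sol.y[0]

rng = np.random.default_rng(4)
data = simulate(true) + rng.normal(0, 0.02, t_data.size)

fit = least_squares(lambda p: simulate(p) - data, x0=[0.5, 0.5],
                    bounds=([0, 0], [5, 5]))
print("estimated parameters:", fit.x.round(3))
```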