Prof. Hannes Hartenstein (KIT)
Dr. Manuel Giffels (KIT)
Prof. John Wood (RDA Council Co-Chair)
This presentation will give an overview of the rapid development over the past 15 years of what is often called the “Fourth Paradigm”, leading to Open Science and Open Innovation. A number of examples from research infrastructures will be used to illustrate the current situation and the role that global organisations like the Research Data Alliance have in bringing about the revolution in Data...
Dr. Lorenzo Moneta (CERN)
The Jupyter Notebook is a narrative that combines data visualisation, text, code and multimedia material, all in the same web application. It can be used, for instance, for data analysis, simulations, statistical inference, machine learning or teaching. Moreover, Notebooks encourage suitably documented scientific code, a desirable feature when reproducibility and preservation of analyses is...
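The combination the abstract describes — code, narrative and inline results in one document — can be illustrated by a typical notebook cell (a sketch; the measurement values below are invented):

```python
# A typical notebook cell: compute a result and summarise it inline.
# The measurement values are invented illustration data.
measurements = [4.1, 3.9, 4.3, 4.0, 4.2]

mean = sum(measurements) / len(measurements)
variance = sum((x - mean) ** 2 for x in measurements) / (len(measurements) - 1)

# In a notebook, the last expression of a cell is rendered directly below it,
# alongside Markdown narrative and any plots.
summary = f"mean = {mean:.2f}, sample variance = {variance:.4f}"
print(summary)
```

Because the documentation, the code and its output live in one file, rerunning the notebook reproduces the analysis end to end.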
Manuela Kuhn (DESY)
This presentation will give an overview of the data life cycle in photon science and the resulting challenges for data handling. The discussed topics include data taking, analysis, storage and archival. The technical realization is illustrated using the example of currently developed infrastructures for synchrotrons and free electron lasers.
Dr. Lena Wiese (Georg-August-Universität Göttingen)
The relational data model - where data are stored in tables and hence structured according to some fixed set of attributes (that is, table columns) - has been a success for several decades. Furthermore, SQL is a standardized and widely used query and management language for relational databases. A transformation of commonly occurring data into the relational table format is however not always...
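The fixed-column model and its standardised query language can be made concrete with a minimal example using Python's built-in sqlite3 module (the table, columns and data are invented for illustration):

```python
import sqlite3

# A minimal relational schema: every row must fit a fixed set of columns.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE samples (id INTEGER PRIMARY KEY, material TEXT, temperature REAL)"
)
conn.executemany(
    "INSERT INTO samples (material, temperature) VALUES (?, ?)",
    [("silicon", 293.0), ("graphene", 77.0), ("silicon", 350.0)],
)

# SQL, the standardised query language for this model:
rows = conn.execute(
    "SELECT material, AVG(temperature) FROM samples "
    "GROUP BY material ORDER BY material"
).fetchall()
print(rows)  # [('graphene', 77.0), ('silicon', 321.5)]
```

Data with variable, nested or graph-like structure resists this fixed-column shape, which is precisely the mismatch that motivates the non-relational (NoSQL) systems the talk goes on to discuss.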
Dr. Jurry De la Mar (T-Systems)
The Helix Nebula initiative continues to expand with research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities...
Elvin Sindrilaru (CERN)
Linux containers (LXC) are a technology that provides operating-system-level virtualisation not via virtual machines but by using a single kernel to run multiple isolated instances on the same OS. Linux namespaces and control groups (cgroups) represent the foundation on which LXC is built. Containers are fast to deploy, and they introduce none of the overhead or indirection found in traditional...
Jürgen Krebs (Hitachi)
Hardware and beyond: In a world where new hypes and trends flood CIOs on a daily basis, it is hard to keep up with innovations while staying on a tight budget. Extract, Transform and Load (ETL) has been seen as the solution for years. We show a different starting point that solves existing issues without any vendor lock-in.
Jose Castro Leon (CERN)
OpenStack is an open-source cloud computing platform for public and private clouds. OpenStack software controls large pools of compute, storage, and networking resources throughout a datacentre, managed through a dashboard or via the API. OpenStack works with popular enterprise and open source technologies making it ideal for heterogeneous infrastructure. The presentation will outline our...
Brendan Bouffler (Amazon.com, Inc.)
Every day, HPC clusters help scientists make breakthroughs, such as proving the existence of gravitational waves, screening new compounds for new drugs and designing better headlights for cars. No industry is untouched by HPC, yet owning HPC clusters is out of reach for most organizations due to the upfront hardware and ongoing operational costs. Now with the cloud, not owning an HPC cluster...
Dr. Andrew Lahiff (Rutherford Appleton Laboratory)
Originally developed over 20 years ago as a means of making use of idle computing resources on desktops, today HTCondor plays an important role in providing high-throughput computing for CERN’s Large Hadron Collider. For example, it is used as a batch system at an increasing number of sites, as a grid computing element, to provision both grid and cloud resources, and as an...
Dr. Benedikt Hegner (CERN)
Prof. Peter Braesicke (KIT)
The presentation will describe the state of the art in composition-climate modelling of the atmosphere. Which equations do we use, and how are they discretised? Which phenomena can we describe, and how? What can we learn about the atmosphere by characterising a model's sensitivities? Issues of scalability and data access will be discussed, because comprehensive numerical experiments are...
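As a toy illustration of the discretisation question, here is a first-order upwind scheme for the 1D linear advection equation ∂u/∂t + a ∂u/∂x = 0 on a periodic grid (grid size, velocity and time step are arbitrary; real composition-climate models solve far richer equation sets):

```python
# First-order upwind finite differences for du/dt + a*du/dx = 0 (a > 0)
# on a periodic 1D grid -- a toy stand-in for the transport schemes
# used in atmospheric models.
n, a, dx, dt = 100, 1.0, 1.0, 0.5            # CFL number a*dt/dx = 0.5 (stable)
u = [1.0 if 40 <= i < 60 else 0.0 for i in range(n)]   # initial tracer pulse

c = a * dt / dx
for _ in range(50):                          # advance 50 time steps
    # u[i-1] with i == 0 picks u[-1]: the periodic boundary condition.
    u = [u[i] - c * (u[i] - u[i - 1]) for i in range(n)]

# Upwind differencing on a periodic grid conserves the total tracer "mass"
# and, being monotone, introduces no new extrema (only numerical diffusion).
print(sum(u))
```

The scheme's numerical diffusion (the pulse smears out over time) is one concrete example of the model sensitivities the abstract alludes to.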
6. Big Data Analytics in Biology: Biomolecular Structure Prediction and Beyond by Tracing Residue Co-Evolution
Dr. Alexander Schug (KIT)
One grand challenge of life sciences in the coming years is to fully leverage experimental progress like high-throughput sequencing by taking advantage of recent advances in other disciplines, in particular in information technology. Exploring the interrelationship of structure and function is crucial for understanding life on the molecular level. Yet despite significant progress of...
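The core idea of tracing residue co-evolution can be sketched in a few lines: given a multiple sequence alignment, score column pairs by their statistical coupling, here with simple mutual information (the toy alignment is invented; production methods such as direct-coupling analysis are considerably more sophisticated):

```python
from collections import Counter
from itertools import combinations
from math import log2

# Toy multiple sequence alignment (invented): rows = sequences, cols = residues.
msa = [
    "AKLE",
    "AKLD",
    "GRLE",
    "GRLD",
]

def mutual_information(col_i, col_j):
    """Mutual information (in bits) between two alignment columns."""
    n = len(msa)
    pi = Counter(s[col_i] for s in msa)
    pj = Counter(s[col_j] for s in msa)
    pij = Counter((s[col_i], s[col_j]) for s in msa)
    return sum(
        (c / n) * log2((c / n) / ((pi[a] / n) * (pj[b] / n)))
        for (a, b), c in pij.items()
    )

# Rank all column pairs: strongly co-varying pairs hint at spatial contacts
# in the folded structure.
scores = {(i, j): mutual_information(i, j)
          for i, j in combinations(range(len(msa[0])), 2)}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

In this toy alignment, columns 0 and 1 mutate together (A↔K, G↔R) and therefore score highest, while independent or conserved columns score zero — the signal that contact-prediction methods exploit at genomic scale.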
Dr. Andreas Herten (FZ Jülich)
GPUs, Graphics Processing Units, offer a large amount of processing power by providing a platform for massively parallel computing. They have the ability to greatly increase the performance of scientific applications on a single workstation computer; and they also power the fastest supercomputers in the world. But leveraging the processing power is not as easy as just running a program on a...
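The programming model behind this can be sketched even without a GPU: a kernel is a function applied independently to every data element. Here is SAXPY, the customary first example of GPU computing, in plain Python (on a real device, each index would map to one of thousands of hardware threads, e.g. via CUDA):

```python
# SAXPY (y = a*x + y) written as an element-wise kernel. On a GPU, the
# kernel would run once per thread with `i` derived from the thread/block
# index; here we simply loop over the indices sequentially.
def saxpy_kernel(i, a, x, y):
    y[i] = a * x[i] + y[i]

a = 2.0
x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 20.0, 30.0, 40.0]

for i in range(len(x)):   # a GPU launches all these "iterations" in parallel
    saxpy_kernel(i, a, x, y)

print(y)  # [12.0, 24.0, 36.0, 48.0]
```

Because every element is updated independently, the loop parallelises trivially — but, as the abstract notes, real applications also demand careful data movement and memory-access patterns before the hardware's potential is realised.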
Ingolf Wittmann (IBM Deutschland)
Cognitive Computing is much talked about nowadays and is already present in the realm of supercomputing. However, limits are set by Moore's Law, so new technologies and compute approaches are required to make cognitive solutions come true. Today's cognitive solutions presage what will be possible in the future: can future computers be smarter than us?
Dr. Eugen Wintersberger (DESY)
HDF5 is on the verge of becoming a standard file format at synchrotron radiation facilities. This talk will give an overview of the most recent features added to HDF5 and of how the rules specified by the NeXus standard can help to improve HDF5's use for data recorded during synchrotron radiation experiments.
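One such NeXus rule is the chain of `default` attributes that tells a generic reader which data to plot: the file root points to an NXentry, the entry to an NXdata group, and NXdata's `signal` attribute names the dataset. A sketch using plain dictionaries in place of an HDF5 file (group and dataset names are invented; in practice one would use a library such as h5py):

```python
# NeXus default-plot resolution, modelled with nested dicts instead of HDF5
# groups. "@name" keys stand in for HDF5 attributes.
nexus_file = {
    "@default": "entry1",
    "entry1": {                      # NX_class = NXentry
        "@default": "data",
        "data": {                    # NX_class = NXdata
            "@signal": "counts",
            "counts": [0, 5, 12, 7, 1],
        },
    },
}

def default_plot_data(f):
    """Follow the NeXus @default/@signal chain to the plottable dataset."""
    entry = f[f["@default"]]
    nxdata = entry[entry["@default"]]
    return nxdata[nxdata["@signal"]]

print(default_plot_data(nexus_file))  # [0, 5, 12, 7, 1]
```

Conventions like this let any NeXus-aware tool open an unfamiliar HDF5 file and immediately find the primary measurement.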
Dr. Liesbeth Vanherpe (École polytechnique fédérale de Lausanne)
The Human Brain Project (HBP) aims to put in place an ICT-based scientific research infrastructure that will allow researchers to improve our understanding of the human brain through data-driven modelling and whole-brain simulation. Advanced computing technologies enable HBP researchers to study models that were unmanageable until recently. This talk focuses on computational and modelling...
Dr. Frank Baetke (Hewlett Packard Enterprise)
The talk will address trends in system architecture for HPC and will include related aspects of Big Data and IoT. A specific focus will be on innovative components like next generation memory interconnects, non-volatile memory and silicon photonics that play a key role in future system designs. HPE's 'The Machine' will be used to bring those components into the context of an actual system...
Dr. Klaus Maier-Hein (Deutsches Krebsforschungszentrum Heidelberg)
Radiologic images uniquely represent the spatial fingerprints of a progressing disease over time. “Radiomics” is the term coined for the emerging endeavor to systematically extract, mine and leverage this rich information in a personalized medicine approach. We establish and study comprehensive imaging phenotypes reflecting multiple time-points and modalities that can be directly linked to other information...
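The extraction step can be illustrated with a few first-order radiomic features computed over a region of interest (the image patch below is invented; real pipelines extract hundreds of shape, intensity and texture features per lesion):

```python
from collections import Counter
from math import log2

# Invented 2D image patch, e.g. voxel intensities inside a tumour ROI.
roi = [
    [3, 3, 4, 5],
    [3, 4, 5, 5],
    [4, 5, 5, 6],
]
voxels = [v for row in roi for v in row]
n = len(voxels)

# First-order intensity statistics: simple examples of radiomic features.
mean = sum(voxels) / n
variance = sum((v - mean) ** 2 for v in voxels) / n
# Intensity entropy: a basic first-order texture measure.
entropy = -sum((c / n) * log2(c / n) for c in Counter(voxels).values())

features = {"mean": mean, "variance": variance, "entropy": entropy}
print(features)
```

Such feature vectors, extracted consistently across time points and modalities, are what get mined and linked to clinical information in a radiomics study.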
Dr. Manuel Giffels (KIT)