Prof.
John Wood
(RDA Council Co-Chair)
8/29/16, 2:50 PM
This presentation will give an overview of the rapid development over the past 15 years of what is often called the “Fourth Paradigm”, leading to Open Science and Open Innovation. A number of examples from research infrastructures will illustrate the current situation and the role that global organisations like the Research Data Alliance play in bringing about the revolution in Data...
Dr
Lena Wiese
(Georg-August-Universität Göttingen)
8/29/16, 5:20 PM
The relational data model - where data are stored in tables and hence structured according to some fixed set of attributes (that is, table columns) - has been a success for several decades. Furthermore, SQL is a standardized and widely used query and management language for relational databases. A transformation of commonly occurring data into the relational table format is however not always...
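A minimal sketch of the relational model described above, using Python's built-in sqlite3 module (the schema and data are illustrative, not from the talk): data live in a table with a fixed set of attributes, and SQL queries operate over those declared columns.

```python
# Minimal sketch of the relational model: a table with a fixed set of
# attributes (columns), queried and aggregated via standard SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE measurement (id INTEGER PRIMARY KEY, sensor TEXT, value REAL)"
)
conn.executemany(
    "INSERT INTO measurement (sensor, value) VALUES (?, ?)",
    [("a", 1.5), ("a", 2.5), ("b", 3.0)],
)
# SQL aggregates over the fixed columns declared in the schema above.
rows = conn.execute(
    "SELECT sensor, AVG(value) FROM measurement GROUP BY sensor ORDER BY sensor"
).fetchall()
print(rows)  # [('a', 2.0), ('b', 3.0)]
```

Data that do not fit this fixed-column shape (nested documents, graphs, sparse attributes) are exactly where the transformation into tables becomes awkward.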
Dr
Jurry De la Mar
(T-Systems)
8/30/16, 9:00 AM
The Helix Nebula initiative continues to expand with research organisations,
providers and services.
The hybrid cloud model deployed by Helix Nebula has grown to become a viable
approach for provisioning ICT services for research communities from both
public and commercial service providers
(http://dx.doi.org/10.5281/zenodo.16001).
The relevance of this approach for all those communities...
Jürgen Krebs
(Hitachi)
8/30/16, 10:40 AM
Hardware and beyond
In a world where new hypes and trends flood CIOs on a daily basis, it is hard to keep up with innovation while staying on a tight budget.
Extract, Transform and Load (ETL) has been seen as the solution for years. We show an approach that starts differently and solves the existing issues without any vendor lock-in.
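For reference, the classic ETL pattern the abstract questions can be sketched in a few lines (all names and data here are illustrative, not a real pipeline): extract raw records from a source, transform them to the target schema, then load them into the destination.

```python
# Hedged sketch of the classic Extract-Transform-Load (ETL) pattern.
def extract():
    # pretend source system: raw records in source-specific units
    return [{"id": 1, "temp_f": 68.0}, {"id": 2, "temp_f": 212.0}]

def transform(records):
    # normalise units to match the target schema before loading
    return [
        {"id": r["id"], "temp_c": round((r["temp_f"] - 32) * 5 / 9, 1)}
        for r in records
    ]

def load(records, target):
    # append into the destination store (here: a plain list)
    target.extend(records)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # [{'id': 1, 'temp_c': 20.0}, {'id': 2, 'temp_c': 100.0}]
```

The vendor lock-in the talk alludes to typically arises when extract, transform and load are bound to one proprietary tool chain rather than kept as independent steps like these.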
Brendan Bouffler
(Amazon.com, Inc.)
8/31/16, 9:00 AM
Every day, HPC clusters help scientists make breakthroughs, such as proving the existence of gravitational waves, screening new compounds for new drugs and designing better headlights for cars. No industry is untouched by HPC, yet owning HPC clusters is out of reach for most organizations due to the upfront hardware and ongoing operational costs. Now with the cloud, not owning an HPC cluster...
Dr
Andrew Lahiff
(Rutherford Appleton Laboratory)
8/31/16, 9:40 AM
Originally developed over 20 years ago as a means of making use of unused computing resources on desktops, today HTCondor plays an important role in providing high throughput computing for CERN’s Large Hadron Collider. For example, it is used as a batch system at an increasing number of sites, used as a grid computing element, used to provision both grid and cloud resources, and used as an...
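As context for HTCondor's batch-system role mentioned above, a job is described to the scheduler in a submit description file; the sketch below is hypothetical (file names and resource figures are illustrative), submitted with `condor_submit`:

```
# hypothetical HTCondor submit description file (paths are illustrative)
executable = analyze.sh
arguments  = run_001.dat
output     = analyze.$(Cluster).$(Process).out
error      = analyze.$(Cluster).$(Process).err
log        = analyze.log
request_cpus   = 1
request_memory = 2GB
queue 10
```

The final `queue 10` line enqueues ten instances of the job, which HTCondor matches to available resources; this is the high-throughput usage pattern the talk describes.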
Prof.
Peter Braesicke
(KIT)
8/31/16, 11:20 AM
The presentation will describe the state-of-the-art in composition-climate modelling of the atmosphere. Which equations do we use, and how are they discretised? Which phenomena can we describe, and what can we learn about the atmosphere by characterising a model's sensitivities? Issues of scalability and data access will be discussed, because comprehensive numerical experiments are...
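As a toy illustration of the discretisation question raised above (my sketch, not the model from the talk): the simplest transport building block in atmospheric models is the 1-D advection equation dq/dt + u dq/dx = 0, discretised here with a first-order upwind scheme on a periodic grid.

```python
# Toy sketch: first-order upwind discretisation of 1-D advection
# dq/dt + u dq/dx = 0 on a periodic grid (illustrative, not the
# actual composition-climate model discussed in the talk).
def upwind_step(q, u, dt, dx):
    # u > 0: information travels from the left, so difference upwind
    c = u * dt / dx  # Courant number; must satisfy c <= 1 for stability
    n = len(q)
    return [q[i] - c * (q[i] - q[i - 1]) for i in range(n)]

# advect a unit spike one full revolution around the periodic domain
n, u, dx = 8, 1.0, 1.0
dt = dx / u  # Courant number exactly 1: the spike moves one cell per step
q = [0.0] * n
q[0] = 1.0
for _ in range(n):
    q = upwind_step(q, u, dt, dx)
print(q)  # the spike returns to cell 0 after n steps
```

With a Courant number below 1 the same scheme smears the spike out (numerical diffusion), which is one concrete way a model's sensitivities to resolution and time step show up.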
Dr
Andreas Herten
(FZ Jülich)
9/1/16, 9:40 AM
GPUs (Graphics Processing Units) offer a large amount of processing power by providing a platform for massively parallel computing. They have the ability to greatly increase the performance of scientific applications on a single workstation computer; and they also power the fastest supercomputers in the world. But leveraging the processing power is not as easy as just running a program on a...
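A conceptual sketch of why "just running a program" is not enough (pure Python, no GPU required; the kernel name and data are illustrative): GPU programming means recasting work as a kernel applied independently to every element, with no cross-iteration dependencies, so that one thread can compute each output element in parallel.

```python
# Conceptual sketch of the GPU programming model: work is expressed as a
# "kernel" that computes one output element per (virtual) thread, with no
# dependency between elements.
def saxpy_kernel(i, a, x, y):
    # one "thread" computes one output element independently
    return a * x[i] + y[i]

a = 2.0
x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 10.0, 10.0, 10.0]
# on a GPU, the runtime would launch one thread per index i in parallel;
# here we emulate that launch with a sequential map over all indices
out = [saxpy_kernel(i, a, x, y) for i in range(len(x))]
print(out)  # [12.0, 14.0, 16.0, 18.0]
```

Restructuring an existing application into this dependency-free, data-parallel form is precisely the part that is "not as easy" as the abstract notes.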
Ingolf Wittmann
(IBM Deutschland)
9/1/16, 10:40 AM
Cognitive Computing is much talked about nowadays, and it is already present in the realm of supercomputing. However, limits are set by Moore's Law, so new technologies and compute approaches are required to make cognitive solutions a reality. Today's cognitive solutions presage what will be possible in the future: can future computers be smarter than us?
Dr
Eugen Wintersberger
(DESY)
9/1/16, 11:20 AM
HDF5 is on the verge of becoming a standard file format at synchrotron radiation facilities. This talk will give an overview of the most recent features added to HDF5 and of how the rules specified by the NeXus standard can help to improve HDF5's use for data recorded during synchrotron radiation experiments.
Dr
Liesbeth Vanherpe
(École polytechnique fédérale de Lausanne)
9/2/16, 9:00 AM
The Human Brain Project (HBP) aims to put in place an ICT-based scientific research infrastructure that will allow researchers to improve our understanding of the human brain through data-driven modelling and whole-brain simulation. Advanced computing technologies enable HBP researchers to study models that were unmanageable until recently.
This talk focuses on computational and modelling...
Dr
Frank Baetke
(Hewlett Packard Enterprise)
9/2/16, 9:40 AM
The talk will address trends in system architecture for HPC and will include related aspects of Big Data and IoT. A specific focus will be on innovative components like next generation memory interconnects, non-volatile memory and silicon photonics that play a key role in future system designs. HPE's 'The Machine' will be used to bring those components into the context of an actual system...
Dr
Klaus Maier-Hein
(Deutsches Krebsforschungszentrum Heidelberg)
9/2/16, 10:40 AM
Radiologic images uniquely represent the spatial fingerprints of a progressing disease over time. “Radiomics” is the term coined for the emerging endeavor to systematically extract, mine and leverage this rich information in a personalized-medicine approach. We establish and study comprehensive imaging phenotypes reflecting multiple time-points and modalities that can be directly linked to other information...
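To make the "systematically extract" step concrete, a radiomics pipeline typically begins by reducing a segmented image region to quantitative features; the sketch below (toy data, standard library only, not the talk's actual method) computes simple first-order intensity statistics over a hypothetical region of interest.

```python
# Illustrative sketch: first-order intensity features over a segmented
# region of interest (ROI), the simplest kind of radiomics feature.
import math

roi = [12, 15, 14, 120, 130, 125, 13, 16]  # hypothetical voxel intensities

def first_order_features(values):
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return {
        "mean": mean,
        "std": math.sqrt(var),      # spread of intensities in the ROI
        "min": min(values),
        "max": max(values),
        "energy": sum(v * v for v in values),
    }

features = first_order_features(roi)
print(features["min"], features["max"])  # 12 130
```

Real pipelines add texture, shape and wavelet features and repeat the extraction per time-point and modality, yielding the imaging phenotypes the abstract describes.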