5. Evolving Infrastructures


Currently, we are facing the digital transformation: the internet is evolving from a communication network into a data network. Like previous transformations, e.g., electrification or the optimization of devices such as transistors and lasers, this process will take decades and involve science and industry, with a concomitant major impact on society. As it turns out, the rise of Quantum Computing (QC) will lead to another paradigm shift in addition to the digital transformation, delayed by only a few years. We therefore also need to discuss the imminent transition of the hardware base from classical to quantum computers. Managing both essentially at the same time makes this an extraordinary challenge and, at the same time, an opportunity. We are working towards evolving infrastructures within the four sections

  • Quantum Research Data Management (Q-RDM)
  • Artificial Intelligence
  • Standards and Quality
  • Computing

Q-RDM

Leadership

  • Karl Jansen (DESY)
  • Kristel Michielsen (FzJ)
  • Giovanna Morigi (U Saarland)

Recent years have witnessed a surge of activities towards the development of quantum technologies. One prominent focus has been the development of quantum computer hardware platforms and the implementation of adequate software, including quantum algorithms and novel methods to efficiently use the available quantum hardware. These activities are being promoted at all levels, from fundamental research at universities and research centers to industry. A very important aspect is the education of young scientists, who will form the next generation of teachers, researchers, and engineers. Quantum technologies and, in particular, quantum computing are strategic priorities of international and national funding agencies. In Germany, the BMBF now takes a central role in actively supporting quantum technologies through funding and by setting up an agenda in close collaboration with the main actors. The German Physical Society has recently founded the new division “Quantum Information”, which brings together the growing number of scientists working on quantum technologies in Germany. The Helmholtz Association has published a roadmap for quantum technologies for the next 10 years (Astakhov et al.).

This surge of activities, with the impressive scientific and technological advances it carries, holds the promise of the so-called “second quantum revolution”, namely the groundbreaking and revolutionary potential that quantum physics offers for, e.g., cybersecurity, communication, precision measurements, and computing. Among several milestones and breakthroughs, we mention here the recent experiment by Google providing a proof of principle of quantum supremacy (Arute et al., 2019), the research activities merging quantum physics with machine learning pursued, among others, by CERN and IBM (Zoufal et al., 2019), and applications in chemistry, biology, condensed-matter and high-energy physics (Acin et al., 2018). In addition, quantum computers may provide an advantage in classical optimization problems, e.g. in drug development, traffic, and logistics, to mention only a few applications (Acin et al., 2018). Impressive benchmark calculations have already been performed, the field is developing rapidly, and with the anticipated growing number of qubits* real-life applications are expected to run on quantum computers very soon. The recently established “JUNIQ Quantum User Facility” at FzJ aims to speed up this development. It offers a uniform Quantum Computing Platform as a Service (QC-PaaS), integrating quantum computers and quantum annealers in the form of quantum-classical hybrid computing systems into the modular HPC environment of the Jülich Supercomputing Centre. This provides a seamless transition of methods developed on simulators to actual quantum computing hardware, streamlining the development process of new methods.
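As an illustration of this simulator-to-hardware transition, the following minimal Python sketch runs the same experiment code against a small local statevector simulator; a remote quantum backend exposing the same interface could later be swapped in without changing the calling code. All class and method names are illustrative assumptions and do not correspond to the actual JUNIQ interface.

    import numpy as np

    class StatevectorSimulator:
        """Local backend: applies single-qubit gates to an n-qubit state."""
        def __init__(self, n_qubits):
            self.n = n_qubits
            self.state = np.zeros(2 ** n_qubits, dtype=complex)
            self.state[0] = 1.0  # start in |0...0>

        def apply_single_qubit_gate(self, gate, qubit):
            # Expose the target qubit as its own tensor axis, contract
            # with the 2x2 gate matrix, and restore the axis order.
            psi = self.state.reshape([2] * self.n)
            psi = np.tensordot(gate, psi, axes=([1], [qubit]))
            self.state = np.moveaxis(psi, 0, qubit).reshape(-1)

        def expectation_z(self, qubit):
            probs = np.abs(self.state.reshape([2] * self.n)) ** 2
            other = tuple(i for i in range(self.n) if i != qubit)
            p = probs.sum(axis=other)  # marginal distribution of the qubit
            return p[0] - p[1]

    def run_experiment(backend):
        """Identical code for a simulator or a (hypothetical) hardware backend."""
        hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        backend.apply_single_qubit_gate(hadamard, qubit=0)
        return backend.expectation_z(qubit=0)

    print(run_experiment(StatevectorSimulator(n_qubits=2)))  # ~0.0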

Within the Section Q-RDM of our Task Area Evolving Infrastructures, we have identified three main areas where our domain Quantum Information & AI will significantly contribute to the consortium.

Storage and management of classical data for quantum technologies. Storage and management of classical data is a common issue shared by theory and experiments in physics, e.g., atomic, molecular, and optical physics, condensed-matter physics, and high-energy physics, including the research groups across these fields actively working in quantum information. This motivates the active participation of these groups in the consortium. The volume of classical simulations of quantum systems is rapidly increasing thanks to the computational power of tensor networks, quantum Monte Carlo, and variational quantum simulations (Bañuls et al., 2020). Such simulations serve to prepare the actual computations on the quantum hardware, validate the software and the code implementation, and provide estimates of the statistics that need to be reached. Already today, these classical simulations are performed on high-performance cluster architectures with multi-core processors, either standard CPUs or GPUs. In the near future, larger, more realistic system sizes will be studied, leading to an increasing demand for supercomputers with a correspondingly increasing amount of data to be stored and managed.
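To make this simulate-then-store workflow concrete, the following sketch exactly diagonalizes a small transverse-field Ising chain with NumPy and writes the result, together with provenance metadata, to an HDF5 file. The file layout and metadata keys are illustrative assumptions, not a consortium standard.

    import numpy as np
    import h5py

    I2 = np.eye(2)
    X = np.array([[0., 1.], [1., 0.]])
    Z = np.array([[1., 0.], [0., -1.]])

    def on_site(op, site, n):
        """Embed a single-site operator into the n-site Hilbert space."""
        full = np.array([[1.]])
        for i in range(n):
            full = np.kron(full, op if i == site else I2)
        return full

    def tfim_hamiltonian(n, j=1.0, g=0.5):
        """H = -J sum_i Z_i Z_{i+1} - g sum_i X_i (open boundary conditions)."""
        h = np.zeros((2 ** n, 2 ** n))
        for i in range(n - 1):
            h -= j * on_site(Z, i, n) @ on_site(Z, i + 1, n)
        for i in range(n):
            h -= g * on_site(X, i, n)
        return h

    n = 8
    energies, states = np.linalg.eigh(tfim_hamiltonian(n))

    # Store the result together with the metadata needed to reproduce it.
    with h5py.File("tfim_groundstate.h5", "w") as f:
        f.create_dataset("ground_state", data=states[:, 0])
        f.attrs["model"] = "transverse-field Ising chain, open boundaries"
        f.attrs["n_sites"] = n
        f.attrs["J"] = 1.0
        f.attrs["g"] = 0.5
        f.attrs["ground_state_energy"] = energies[0]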

Feedback loop between quantum and classical hardware. The remarkable progress in the development of quantum computers has already led to the establishment of platforms at IBM, Rigetti, and others where quantum computations can be performed on dozens of qubits. In a few years the number of qubits will scale up by at least one order of magnitude*. This expected breakthrough poses the problem of the classical workflow between a local machine and the quantum computer. In fact, such data transfers already constitute a bottleneck for the efficient usage of the quantum hardware. This calls for novel solutions and strategies to overcome this barrier. The NFDI4Phys consortium is a most adequate platform for addressing this problem through an open exchange of knowledge and expertise. In this way, NFDI4Phys will contribute to developing novel strategies for dataflow and storage in the quantum era.
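The structure of this feedback loop, and why it stresses the data path, can be seen in the following toy sketch: a classical optimizer repeatedly sends parameters to a quantum device and receives an expectation value back, so every optimization step costs device round-trips. The device is emulated locally here; the names and the finite-difference scheme are illustrative choices, not a specific platform's API.

    import numpy as np

    def device_expectation(theta):
        """Stand-in for a remote quantum device: returns <psi(theta)|Z|psi(theta)>
        for the single-qubit state |psi(theta)> = Ry(theta)|0>, i.e. cos(theta)."""
        return np.cos(theta / 2) ** 2 - np.sin(theta / 2) ** 2

    def minimize_energy(theta=0.3, lr=0.4, steps=50, eps=1e-3):
        round_trips = 0
        for _ in range(steps):
            # Finite-difference gradient: two device calls per optimization step.
            grad = (device_expectation(theta + eps)
                    - device_expectation(theta - eps)) / (2 * eps)
            round_trips += 2
            theta -= lr * grad  # classical update on the local machine
        return theta, device_expectation(theta), round_trips

    theta, energy, calls = minimize_energy()
    print(f"theta = {theta:.3f}, <Z> = {energy:.3f}, device calls = {calls}")
    # Converges towards theta = pi with <Z> = -1, after 100 device round-trips
    # for a single-parameter toy problem; real workloads multiply this cost.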

Storage and management of quantum data. An open and fascinating question is how to store quantum data on quantum platforms. Here, NFDI4Phys will contribute towards solving this outstanding question and perform pioneering work towards the management of quantum data. FDOs (FAIR Digital Objects) are particularly suited for implementing Q-RDM, as their hierarchical and modular architecture allows adjusting only their low-level functionality on a Q-RDM platform while keeping the high-level interfaces to data lakes essentially unchanged.
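This layering argument can be sketched schematically: the high-level record (persistent identifier, type, metadata) stays fixed, while the low-level storage backend is the only part that would be exchanged on a Q-RDM platform. All names below are illustrative; this is not an existing FDO implementation.

    from dataclasses import dataclass, field

    class StorageBackend:
        """Low-level layer: the only part exchanged on a Q-RDM platform."""
        def __init__(self):
            self._store = {}
        def put(self, key, payload: bytes):
            self._store[key] = payload
        def get(self, key) -> bytes:
            return self._store[key]

    @dataclass
    class DigitalObject:
        """High-level layer: persistent identifier, type, and metadata.
        This interface towards data lakes stays essentially unchanged."""
        pid: str
        type_id: str
        metadata: dict = field(default_factory=dict)

        def store(self, backend, payload: bytes):
            backend.put(self.pid, payload)

        def load(self, backend) -> bytes:
            return backend.get(self.pid)

    backend = StorageBackend()
    obj = DigitalObject(pid="hdl:21.12345/example-0001",  # illustrative PID
                        type_id="simulation-result",
                        metadata={"model": "TFIM", "n_sites": 8})
    obj.store(backend, payload=b"...")  # payload elided
    assert obj.load(backend) == b"..."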

*For example, IBM has published a roadmap targeting quantum computers with 127, 433, and 1121 qubits in 2021, 2022, and 2023, respectively.


Artificial Intelligence

Leadership

  • Frank Kirchner (DFKI)
  • Philipp Slusallek (DFKI)

Quantum machine learning. Quantum technologies have ignited research towards quantum machine learning, which ranges from implementing machine learning algorithms on quantum computers to developing novel concepts for quantum neural networks [6]. This research is being actively pursued by physicists in close collaboration with computer scientists, e.g., for handling the huge amounts of data expected from the next generations of the LHC at CERN. Completely novel solutions need to be developed, since a sheer increase in computational power will not be sufficient to solve this problem. It is therefore one of our objectives to develop quantum machine learning further towards practical solutions in the midterm future. These efforts will spin off numerous applications relevant to societal needs and expectations.
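As one concrete quantum-machine-learning primitive, the following sketch computes a quantum kernel K_ij = |<psi(x_i)|psi(x_j)>|^2 from a toy single-qubit angle-encoding feature map, simulated classically with NumPy. The feature map is an illustrative assumption; on hardware the overlaps would come from the quantum device, and the resulting kernel matrix could feed a classical kernel method such as a support vector machine.

    import numpy as np

    def feature_state(x):
        """Toy feature map: encode a scalar x as |psi(x)> = Ry(x)|0>."""
        return np.array([np.cos(x / 2), np.sin(x / 2)])

    def quantum_kernel(xs):
        """Fidelity kernel K_ij = |<psi(x_i)|psi(x_j)>|^2."""
        states = np.array([feature_state(x) for x in xs])
        return (states @ states.T) ** 2  # amplitudes are real here

    xs = np.array([0.1, 0.5, 2.0, 2.4])
    K = quantum_kernel(xs)
    print(np.round(K, 3))
    # K is symmetric with unit diagonal; nearby inputs overlap strongly.
    # A classical classifier, e.g. sklearn's SVC(kernel="precomputed"),
    # could be trained directly on this matrix.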


Standards and Quality

Leadership

  • Piet Schmidt (PTB)
  • Kristel Michielsen (FzJ)
  • Holger Israel (PTB)

Standards and quality for quantum data.


Computing

Leadership

  • Stefan Krieg (FzJ)
  • Karl Jansen (DESY)

The data repositories, metadata services, and analysis tools require hardware infrastructure to be developed and rolled out across the consortium and the whole NFDI. Critically, the repositories should make data available quickly, but also be able to store large numbers of datasets of varying sizes: images or simulation data, annotations, data ensembles, etc. can require storage from the KB to the GB range. An efficient storage hierarchy providing both high-performance and high-capacity data services is thus necessary as the backbone of the envisaged repositories of NFDI4Phys; a schematic sketch of such a tiered access pattern follows the list below. Access and metadata services for the information stored on the storage hierarchy, as they are developed within the consortium, will be implemented on dedicated access nodes, realized either as virtual machines or as actual compute hardware, depending on their requirements. Finally, data analysis tools integrated into the NFDI4Phys data services will require compute resources during development, but also in the production phase. Storage capacity on a state-of-the-art storage hierarchy, the resources for the access and metadata services, as well as the analysis tools are provided by Forschungszentrum Jülich through the Jülich Supercomputing Centre. Specifically, we will have

  • 1 PByte of hard disk storage and
  • 10 million core hours/year

available for our consortium.
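The tiered access pattern behind such a storage hierarchy can be sketched as follows: reads are served from the high-performance tier when possible and are otherwise fetched from the high-capacity tier and promoted. Tier names and the promotion policy are illustrative assumptions, not the actual JSC configuration.

    class TieredStorage:
        """Two-tier store: a small fast tier in front of a large capacity tier."""
        def __init__(self, fast_slots=2):
            self.fast = {}        # high-performance tier (e.g. flash/disk)
            self.capacity = {}    # high-capacity tier (e.g. tape archive)
            self.fast_slots = fast_slots

        def write(self, key, data):
            # New data lands on the capacity tier; hot data is promoted on read.
            self.capacity[key] = data

        def read(self, key):
            if key in self.fast:
                return self.fast[key]          # fast-path hit
            data = self.capacity[key]          # slow-path fetch
            if len(self.fast) >= self.fast_slots:
                self.fast.pop(next(iter(self.fast)))  # naive FIFO eviction
            self.fast[key] = data              # promote to the fast tier
            return data

    store = TieredStorage()
    store.write("run-001/groundstate.h5", b"...")
    store.read("run-001/groundstate.h5")   # fetched from capacity, promoted
    store.read("run-001/groundstate.h5")   # now served from the fast tier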


