5. Evolving Infrastructures


We are currently facing the digital transformation: the internet is evolving from a communication network into a data network. Like previous transformations, e.g., electrification or the development of devices such as transistors and lasers, this process will take decades, involve both science and industry, and have a major impact on society. As it turns out, the rise of Quantum Computing (QC) will lead to another paradigm shift in addition to the digital transformation, with a delay of only a few years. Therefore, we also need to discuss the imminent transition of the hardware base from classical to quantum computers. Managing both transitions essentially at the same time makes this an extraordinary challenge and, simultaneously, an opportunity. We are working towards evolving infrastructures within the four sections:

  • Quantum Research Data Management (Q-RDM)
  • Quantum Artificial Intelligence (Q-AI)
  • Quantum Quality and Standards (QQ)
  • Computing (C)

Quantum Research Data Management (Q-RDM)

Leadership

  • Karl Jansen (DESY)
  • Kristel Michielsen (FzJ)
  • Giovanna Morigi (U Saarland)

Recent years have witnessed a boost of activities towards the development of quantum technologies. One prominent focus has been the development of quantum computer hardware platforms and the implementation of adequate software, including quantum algorithms and novel methods to use the available quantum hardware efficiently. These activities are being promoted at all levels, from fundamental research at universities and research centers to industry. A very important aspect is the education of young scientists, who will form the next generation of teachers, researchers, and engineers. Quantum technologies and, in particular, quantum computing are strategic priorities of international and national funding agencies. In Germany, the BMBF now takes a central role in actively supporting quantum technologies through funding and by setting up an agenda in close collaboration with the main actors. The German Physical Society has recently founded the new division “Quantum Information”, which brings together the growing number of scientists working on quantum technologies in Germany. The Helmholtz Association has published a roadmap for quantum technologies for the next 10 years (Astakhov et al.).

This boost of activities, with the impressive scientific and technological advances it carries, holds the promise of the so-called “second quantum revolution”, namely, the groundbreaking and revolutionary potential that quantum physics offers for, e.g., cybersecurity, communication, precision measurements, and computing. Among several milestones and breakthroughs, we mention here the recent experiment by Google providing a proof of principle of quantum supremacy (Arute et al., 2019), the research activities merging quantum physics with machine learning pursued, amongst others, by CERN and IBM (Zoufal et al., 2019), and applications in chemistry, biology, condensed-matter, and high-energy physics (Acin et al., 2018). In addition, quantum computers can lead to supremacy in classical optimization problems, e.g., in drug development, traffic, and logistics, to mention only a few applications (Acin et al., 2018). Impressive benchmark calculations have already been performed, the field is developing rapidly, and with the anticipated growing number of qubits* it is to be expected that real-life applications will run on quantum computers very soon. The recently established “JUNIQ Quantum User Facility” at FzJ aims to speed up this development. It offers a uniform Quantum Computing Platform as a Service (QC-PaaS), integrating quantum computers and quantum annealers in the form of quantum-classical hybrid computing systems into the modular HPC environment of the Jülich Supercomputing Centre. This provides a seamless transition of methods developed on simulators to actual quantum computing hardware, streamlining the development process of new methods.
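
As an illustration of such a simulator-first workflow, consider the minimal sketch below. The concrete JUNIQ interface is not specified here, so the open-source Qiskit toolkit is assumed as a stand-in; the point is that only the backend object changes when moving from a simulator to hardware.

  # Minimal sketch of a simulator-to-hardware workflow (assumes the open-source
  # Qiskit toolkit as a stand-in; the actual JUNIQ interface may differ).
  from qiskit import QuantumCircuit, transpile
  from qiskit_aer import AerSimulator

  # Prepare a Bell state and measure both qubits.
  circuit = QuantumCircuit(2, 2)
  circuit.h(0)
  circuit.cx(0, 1)
  circuit.measure([0, 1], [0, 1])

  # Develop and validate on a local simulator ...
  backend = AerSimulator()
  job = backend.run(transpile(circuit, backend), shots=1024)
  print(job.result().get_counts())  # e.g. {'00': 517, '11': 507}

  # ... then swap only the backend object for a hardware provider;
  # circuit construction and analysis code remain unchanged.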

Within the section Q-RDM of our Task Area “Evolving Infrastructures”, we have identified three main areas in which our domain “Quantum Information & AI” will contribute significantly to the consortium.

Storage and management of classical data for quantum technologies. Storage and management of classical data is a common issue shared by theory and experiment in physics, e.g., atomic, molecular, optical, condensed-matter, and high-energy physics, including the research groups across these fields actively working in quantum information. This motivates the active participation of these groups in the consortium. The volume of classical simulations of quantum systems is rapidly increasing thanks to the computational power of tensor networks, quantum Monte Carlo, and variational quantum simulations (Bañuls et al., 2020). Such simulations serve to prepare the actual computations on the quantum hardware, to validate the software and the code implementation, and to determine the amount of statistics that needs to be reached. Already today, these classical simulations are performed on high-performance cluster architectures with multi-core processors, which can be standard chips or GPUs. In the near future, larger and more realistic system sizes will be studied, leading to an increasing demand for supercomputers and a correspondingly increasing amount of data to be stored and managed.
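
To give a concrete flavour of the data such simulations produce, the following minimal NumPy sketch (our own illustrative code, not consortium software) compresses a state vector into matrix-product form via successive truncated SVDs; it is tensors like these, accumulated over many runs and parameter sets, that must be stored and managed:

  import numpy as np

  def state_to_mps(psi, n_qubits, chi_max=16):
      """Compress an n-qubit state vector into matrix product form
      by sweeping successive truncated SVDs (a minimal sketch)."""
      tensors = []
      rest = psi.reshape(1, -1)               # virtual bond x remaining physical dims
      for _ in range(n_qubits - 1):
          bond = rest.shape[0]
          rest = rest.reshape(bond * 2, -1)   # split off one physical qubit
          u, s, vh = np.linalg.svd(rest, full_matrices=False)
          keep = min(chi_max, len(s))         # truncate the bond dimension
          tensors.append(u[:, :keep].reshape(bond, 2, keep))
          rest = s[:keep, None] * vh[:keep]
      tensors.append(rest.reshape(rest.shape[0], 2, 1))
      return tensors

  # Example: compress a random 8-qubit state and inspect the tensor shapes.
  psi = np.random.randn(2**8) + 1j * np.random.randn(2**8)
  psi /= np.linalg.norm(psi)
  print([t.shape for t in state_to_mps(psi, 8)])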

Feedback loop between quantum and classical hardware. The remarkable progress in the development of quantum computers has already led to the establishment of platforms at IBM, Rigetti, and others, where quantum computations can be performed on dozens of qubits. In a few years the number of qubits will scale up by at least one order of magnitude*. This expected breakthrough poses the problem of the classical workflow between a local machine and the quantum computer. In fact, such data transfers already constitute a bottleneck for the efficient usage of the quantum hardware. This calls for novel solutions and strategies to overcome this barrier. The NFDI4Phys consortium is a most adequate platform for addressing this problem through an open exchange of knowledge and expertise. In this way, NFDI4Phys will contribute to developing novel strategies for dataflow and storage in the quantum era.
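
The round-trip character of this workflow can be made concrete with the purely illustrative sketch below: a classical optimizer repeatedly ships parameters to the quantum processor and receives measured expectation values back, so every iteration crosses the classical-quantum boundary. The device call is replaced here by a hypothetical noisy simulation of a one-parameter circuit.

  import numpy as np

  def measured_energy(theta, shots=1024):
      """Stand-in for a call to the quantum device: a shot-noise-limited
      estimate of <Z> for a one-parameter single-qubit state."""
      p0 = np.cos(theta / 2) ** 2               # probability of outcome 0
      samples = np.random.binomial(1, 1 - p0, shots)
      return 1 - 2 * samples.mean()             # estimate of <Z> = p0 - p1

  # Classical outer loop: simple finite-difference gradient descent.
  theta, lr, eps = 2.0, 0.4, 0.1
  for step in range(50):
      grad = (measured_energy(theta + eps) - measured_energy(theta - eps)) / (2 * eps)
      theta -= lr * grad                        # classical parameter update
      # Each iteration ships parameters to, and results from, the device:
      # exactly the classical<->quantum dataflow discussed above.
  print(theta, measured_energy(theta))          # drifts towards theta ~ pi (minimal <Z>)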

Storage and management of quantum data. An open and fascinating question is to find solutions for the storage of quantum data on quantum platforms. Here, NFDI4Phys will contribute towards solving this outstanding question and perform pioneering work towards the management of quantum data. FDOs (FAIR Digital Objects) are particularly suited for implementing Q-RDM, as their hierarchical and modular architecture allows adjusting only their low-level functionality on a Q-RDM platform while keeping the high-level interfaces to data lakes essentially unchanged.
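
The following sketch illustrates this modularity with hypothetical Python interfaces (the names are ours, not a reference FDO implementation): the high-level digital-object layer and its interface towards data lakes stay fixed, while only the low-level storage backend is exchanged on a Q-RDM platform.

  from abc import ABC, abstractmethod

  class StorageBackend(ABC):
      """Low-level layer: the only part that changes on a Q-RDM platform."""
      @abstractmethod
      def put(self, key: str, payload: bytes) -> None: ...
      @abstractmethod
      def get(self, key: str) -> bytes: ...

  class ClassicalStore(StorageBackend):
      """One concrete backend; a Q-RDM backend would replace only this class."""
      def __init__(self): self._data = {}
      def put(self, key, payload): self._data[key] = payload
      def get(self, key): return self._data[key]

  class DigitalObject:
      """High-level layer: persistent identifier plus metadata plus bit sequence.
      Its interface towards data lakes is independent of the backend."""
      def __init__(self, pid: str, metadata: dict, backend: StorageBackend):
          self.pid, self.metadata, self._backend = pid, metadata, backend
      def store(self, payload: bytes): self._backend.put(self.pid, payload)
      def retrieve(self) -> bytes: return self._backend.get(self.pid)

  obj = DigitalObject("example/0001", {"type": "mps-tensors"}, ClassicalStore())
  obj.store(b"serialized simulation data")
  print(obj.metadata, obj.retrieve())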

*For example, IBM has published a roadmap with quantum computers with 127, 433, and 1121 qubits in 2021, 2022, and 2023, respectively.


Quantum Artificial Intelligence (Q-AI)

Leadership

  • Karl Jansen (DESY)
  • Philipp Slusallek (DFKI)

Quantum technologies have ignited research towards quantum AI (Q-AI), especially quantum machine learning, which ranges from implementing machine learning algorithms on quantum computers to developing novel concepts for quantum neural networks. Within the section Q-AI of our Task Area “Evolving Infrastructures”, linked to the domain “Quantum Information & AI”, we will contribute to the consortium as follows. We pursue development at DESY, accompanied by joint consultations with DFKI, on practical solutions for Q-AI methods for the huge amount of data expected in the next generations of the LHC (Large Hadron Collider) at CERN. In general, we expect that completely novel solutions will have to be developed for this purpose, since an increase of computational power alone will not be sufficient to solve this problem. These efforts will potentially spin off numerous applications relevant for business, manufacturing, and societal needs and expectations. Further, we will analyze an adaptation of concepts of the Q-RDM section for the execution of selected hybrid quantum-classical Q-AI algorithms with classical and quantum data. This concerns the quantum-classical feedback loop and the storage of data on both the classical and the quantum level.


Quantum Quality and Standards (QQ)

Leadership

  • Piet Schmidt (PTB)
  • Kristel Michielsen (FzJ)
  • Holger Israel (PTB)

Quantum devices, and in particular quantum computers, will generate and operate with quantum data stored in the system. Reading out the result of a quantum sensor, a quantum key, or a quantum computation projects the quantum data onto the classical world and destroys the original quantum state. As a consequence, quantum data cannot be copied (“no-cloning theorem”). However, quantum data can be reconstructed statistically by repeating the quantum measurement or algorithm over and over again under exactly the same conditions. Different techniques have been developed to represent quantum data in the classical world, such as in the form of a density matrix, matrix product states (MPS), etc. The most efficient way of storing this classical representation of quantum data depends on the particular system.
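
The following minimal NumPy sketch illustrates this statistical reconstruction for a single qubit (illustrative state tomography, not a prescribed NFDI4Phys procedure): repeated measurements in the X, Y, and Z bases estimate the Bloch vector, from which the density matrix is rebuilt.

  import numpy as np

  rng = np.random.default_rng(seed=1)
  shots = 10_000

  # True single-qubit state (unknown to the experimenter): Bloch vector r.
  r_true = np.array([0.6, 0.0, 0.8])

  # Repeating the measurement in the X, Y, and Z bases yields +1/-1 outcomes
  # whose mean estimates each Bloch component; the quantum state itself is
  # destroyed at every readout, so statistics are the only handle on it.
  def estimate(component):
      p_plus = (1 + component) / 2
      outcomes = rng.choice([1, -1], size=shots, p=[p_plus, 1 - p_plus])
      return outcomes.mean()

  r_est = np.array([estimate(c) for c in r_true])

  # Reconstruct the density matrix rho = (I + r.sigma)/2 from the estimates.
  I = np.eye(2); X = np.array([[0, 1], [1, 0]])
  Y = np.array([[0, -1j], [1j, 0]]); Z = np.diag([1, -1])
  rho = (I + r_est[0]*X + r_est[1]*Y + r_est[2]*Z) / 2
  print(np.round(rho, 3))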

These special properties set the boundary conditions for future standards and quality assessment for quantum data. Standardization should be hardware-agnostic and needs to include established representations of quantum states, as well as remain open and flexible towards newly developed approaches as the field progresses. The inherent statistical nature of the representation needs to be considered, as well as possible systematic errors in retrieving it, such as state preparation and measurement (SPAM) errors. Properties of the quantum system producing the data, such as coherence times, gate fidelities, etc., constitute metadata and are stored together with the data to enable quality assessment and curation.
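
A combined data-plus-metadata record could then look like the following sketch (all field names are illustrative assumptions, not a proposed standard):

  # Illustrative record combining a measured dataset with device metadata;
  # every field name here is hypothetical, not a proposed NFDI4Phys schema.
  record = {
      "data": {
          "representation": "counts",          # or "density_matrix", "mps", ...
          "counts": {"00": 517, "11": 507},
          "shots": 1024,                       # statistics behind the estimate
      },
      "metadata": {
          "device": "example-transmon-5q",     # hypothetical device name
          "t1_coherence_us": 95.0,             # coherence times ...
          "t2_coherence_us": 110.0,
          "two_qubit_gate_fidelity": 0.991,    # ... and gate fidelities
          "spam_error": 0.02,                  # state prep & measurement error
          "timestamp": "2021-07-01T12:00:00Z",
      },
  }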

Quality assessment and standardization of quantum data within NFDI4Phys will be closely coordinated with other relevant subtasks of Evolving Infrastructures, as well as with the task area Quality Criteria and Standards.


Computing (C)

Leadership

  • Stefan Krieg (FzJ)
  • Karl Jansen (DESY)

The data repositories, metadata services, and analysis tools require hardware infrastructure to be developed and rolled out to the consortium and the whole NFDI. Critically, the repositories should make data available quickly, but also be able to store large numbers of datasets of varying sizes: images or simulation data, annotations, data ensembles, etc. can require storage in the kB to GB range. An efficient storage hierarchy providing both high-performance and high-capacity data services is thus necessary as the backbone of the envisaged repositories of NFDI4Phys. Access and metadata services to the information stored on the storage hierarchy, as they are developed within the consortium, will be implemented on dedicated access nodes, either as virtual machines or as actual compute hardware, depending on their requirements. Finally, data analysis tools integrated into the NFDI4Phys data services will require compute resources during development, but also in the production phase. Storage capacity on a state-of-the-art storage hierarchy, as well as the resources for the access and metadata services and the analysis tools, are provided by Forschungszentrum Jülich through the Jülich Supercomputing Centre. Specifically, we will have

  • 1 PByte of hard disk storage and
  • 10 million core hours/year

available for our consortium.

