Quantum Computing

An Introduction to Quantum Computing, Quantum AI and ML 



1. Quantum Computing - Part I

Reproduced from GitHub: https://github.com/desireevl/awesome-quantum-computing

 

Quantum computing utilises quantum mechanical phenomena such as entanglement and superposition to manipulate qubits and perform computation on a quantum computer. Tools are now available to create and run programs on publicly accessible quantum computers, along with resources for learning about them.

This is a curated list of resources on learning about and developing on quantum computers. The goal is to build a categorised, community-driven collection of up-to-date, high-quality resources.

Sharing, suggestions and contributions are always welcome! Please take a look at the contribution guidelines and quality standard first. Thanks to all contributors, you're awesome and it wouldn't be possible without you!

Contents

For further resources related to Open Source Quantum Software Projects, please check out qosf's repo.

Learning

MOOCs

Development Tools

Blogs

Books

Popular Science

Videos

Community

Podcasts

 

 

2. Quantum Computing - Part II

Reproduced from https://www.fraunhofer.de/en/research/current-research/quantum-technologies/quantum-computing.html

Quantum computing looms large on the horizon

Analysts at Morgan Stanley predict that the market for high-end quantum computers will reach 10 billion dollars by 2025, double what it is now. Alongside IBM and Google, there are also Microsoft, the Chinese Internet giant Alibaba and startups such as Novarion, Rigetti and D-Wave. Yet the various manufacturers rely on different physical principles for the realization of the quantum hardware. Scientists distinguish between universal quantum computers, which can perform arbitrary quantum algorithms, and quantum annealers, which are less complex, but limited to very specific tasks. Researchers at VW have been using a D-Wave quantum annealer since 2017 to better simulate traffic flows. And BMW is investigating whether quantum annealers can help optimize its production robots’ performance.

Universal quantum computers are technically very challenging to build and operate. What sets these computers apart is that their performance doubles with each added qubit, increasing in exponential rather than in linear fashion. In other words, two qubits yield four possible combinations, three qubits eight, and so on. The quantity of qubits matters, but equally important are the quality of qubit entanglement and its coherence time. The latter determines how long the quantum system remains stable enough to compute before noise masks the information. Most universal quantum computers, such as Google’s 72-qubit Bristlecone, only work under special laboratory conditions.

In January 2019, IBM unveiled the IBM Q System One, the world’s first commercially viable quantum computer – meaning that it works outside a lab. A consortium of seven Fraunhofer Institutes in Germany has been tasked with looking into real-world applications for quantum computing as of 2021 in a bid to drive the advance of applied quantum science in the EU. “We want to find out just what kind of applications there are for quantum computing in industry and how to write the necessary algorithms and translate them for specific applications,” explains Hauswirth. The initiative also aims to keep entry barriers low by sharing insights with companies to fast-track the industry’s efforts to build a knowledge base in quantum computing.

There are still high hurdles to clear on the path to scaling up the performance of available quantum computers. The priority now is to find ways of shielding the fragile qubits from ambient influences that interfere with the computing process. For example, qubits have to be cooled to a temperature approaching absolute zero – around minus 273 degrees Celsius, which is colder than outer space. They also require a vacuum and have to be shielded against electromagnetic radiation. Vibrations, as well as parasitic effects of the electromagnetic waves used to manipulate the qubits and read out the information they carry, can also cause problems.

Solving complex problems

What kind of real problems can quantum computers solve? “In a few years from now, quantum computers will provide highly efficient means for prime factorization. That will leave current cryptographic systems vulnerable, which is why major research into post-quantum cryptography is underway,” says Hauswirth. Quantum computers will be able to tackle even more complex problems a few years down the road: “Today’s fintech, for example, has trouble managing billions of cash flows in parallel and in real time within the confines of a very tight regulatory girdle. Sequential processing is still prone to errors, but quantum computers would help get around this.”
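The threat to cryptography comes from Shor's algorithm, whose quantum speedup lies entirely in the period-finding step. A minimal Python sketch of the classical reduction from factoring to order finding (the `find_order` helper here is deliberately brute force – that is exactly the part a quantum computer would accelerate exponentially):

```python
from math import gcd

def find_order(a, n):
    """Find the order r of a mod n: the smallest r with a^r = 1 (mod n).
    A quantum computer performs this step efficiently; here it is brute force."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n, a):
    """Reduce factoring n to order finding, as in Shor's algorithm."""
    g = gcd(a, n)
    if g > 1:          # lucky guess: a already shares a factor with n
        return g, n // g
    r = find_order(a, n)
    if r % 2 == 1:
        return None    # odd order: retry with another base a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None    # trivial square root: retry with another base a
    p = gcd(y - 1, n)
    return p, n // p

print(shor_reduction(15, 7))  # factors 15 using base 7
```

For n = 15 and a = 7, the order is r = 4, giving the factors 3 and 5. Classically, `find_order` takes exponential time in the number of digits of n, which is what keeps RSA safe today.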

Prof. Anita Schöbel is director of the Fraunhofer Institute for Industrial Mathematics ITWM in Kaiserslautern. She and Hauswirth are mainly responsible for quantum computing at Fraunhofer. Pointing to an application in the works at her institute, she says, “We’re working on projects that use stochastic partial differential equations such as the Fokker-Planck equations. These serve to develop lithium ion batteries and wind turbines, calculate granular flows and determine prices in quantitative finance. These equations can be converted into quantum mechanics equations for quantum computers to crunch the numbers, probably much faster.”

Applied quantum computing is clearly taking shape in the real world. Will we all have a quantum home computer or a quantum processor in our smartphones in a few years? “Quantum computers will only ever be able to solve very specific problems, so they won’t replace conventional computers. It’s likely that cloud-based models will prevail – that is, quantum computing as a service (QCaaS). We’ll probably also see hybrids of quantum computing and conventional high-performance computing,” says Hauswirth.

When will the quantum computer arrive?

Three questions for Prof. Manfred Hauswirth, Fraunhofer FOKUS, on the quantum computer initiative with IBM

What is this project all about?

In partnership with IBM, we are going to install Europe’s first commercial quantum computer at a location in Germany. The aim is to develop applied quantum computing solutions for a range of fields and assess their viability. We would like to see companies of all sizes involved in this project.

Why does this matter?

It is early days yet for applied research in quantum computing. We need to define quantum algorithms and then convert them for easy use in applications programming. That requires expertise on the part of industry, so we want to fast-track efforts to build a knowledge base here in Germany. This initiative will also enable us to pursue quantum computing under full data sovereignty according to European law, without being dependent on large Internet corporations from overseas.

When do you expect to see the first results?

A quantum computer is to be installed in Germany in 2021. But even optimistic forecasts suggest it’s going to take another 10 to 20 years before businesses can use quantum computers.

From supercomputer to superinternet

Research teams around the world are working on the most efficient way to couple together multiple supercomputers using quantum information to create a quantum internet. At QuTech in Delft, a number of partners, among them the Fraunhofer Institute for Laser Technology ILT, are currently working on a highly ambitious project. By 2022, they hope to have built the world’s first quantum internet demonstrator in the Netherlands with the aim of achieving lasting entanglement of qubits over long distances. Nodes at four locations will be connected together via fibre-optic cable. This will enable greater computing capacity, as well as completely new applications, such as blind quantum computing, where computations are performed securely, privately and anonymously on quantum computers in the cloud. According to Florian Elsen, coordinator for quantum technology at Fraunhofer ILT in Aachen, the big challenge lies in “transmitting single, fragile qubits through a fibre-optic cable as losslessly as possible. To achieve this, we carry out frequency conversion, meaning that we modify the wavelength of single photons without changing other significant properties.” Once you have a quantum internet, it is not much of a leap to quantum communication.

How does a quantum computer work?  

A conventional computer works with bits; a quantum computer with qubits. Like bits, qubits can have a value of 0 or 1. Unlike bits, they occupy a superposition of overlapping quantum states, so they can also have any combination of the two. A qubit does not take on a definite value until it is measured. Adding one qubit doubles the system’s performance so that 50 qubits, for example, would yield 2 to the power of 50 (2^50) possible combinations. This way, big problems and complex tasks are computed in parallel rather than in linear fashion.
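The counting above can be made concrete with a small state-vector sketch (NumPy, purely illustrative): an n-qubit register is described by 2^n amplitudes, and measurement probabilities are the squared magnitudes of those amplitudes.

```python
import numpy as np

# A single qubit: alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
zero = np.array([1.0, 0.0])                    # the state |0>
plus = H @ zero                                # equal superposition of |0> and |1>
probs = np.abs(plus) ** 2                      # measurement probabilities
print(probs)                                   # [0.5 0.5]

# An n-qubit register needs 2**n amplitudes: the exponential growth in the text.
for n in (2, 3, 50):
    print(n, "qubits ->", 2 ** n, "basis states")
```

The last line shows why even simulating 50 qubits classically already requires on the order of 10^15 amplitudes.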

 

David DiVincenzo’s* five criteria for a quantum computer

1. A scalable physical system with well-characterized qubits
2. The ability to initialize the state of the qubits to a simple fiducial state
3. A "universal" set of quantum gates
4. A qubit-specific measurement capability
5. Long relevant decoherence times
 

*Pioneer of quantum information science and professor of theoretical physics at RWTH Aachen

 

 

3. Quantum Artificial Intelligence (QuAI) - Part I

Reproduced from https://research.aimultiple.com/quantum-ai/

 

Quantum computing and artificial intelligence are both transformational technologies, and artificial intelligence is likely to require quantum computing to achieve significant progress. Although artificial intelligence produces functional applications with classical computers, it is limited by their computational capabilities. Quantum computing can provide a computation boost to artificial intelligence, enabling it to tackle more complex problems and, potentially, artificial general intelligence (AGI).

What is quantum AI?

Quantum AI is the use of quantum computing for the computation of machine learning algorithms. Thanks to the computational advantages of quantum computing, quantum AI can help achieve results that are not possible with classical computers.

What is quantum computing?

Quantum mechanics is a universal model based on different principles than those observed in daily life. A quantum model of data is needed to process data with quantum computing. Hybrid quantum-classical models are also necessary in quantum computing for error correction and correct functioning of the quantum computer.

For more, feel free to read our detailed article on the topic.

Why is it important?

Although AI has made rapid progress over the past decade, it has not yet overcome technological limitations. With the unique features of quantum computing, obstacles to achieve AGI (Artificial General Intelligence) can be eliminated. Quantum computing can be used for the rapid training of machine learning models and to create optimized algorithms. An optimized and stable AI provided by quantum computing can complete years of analysis in a short time and lead to advances in technology. Neuromorphic cognitive models, adaptive machine learning, or reasoning under uncertainty are some fundamental challenges of today’s AI. Quantum AI is one of the most likely solutions for next-generation AI.

How does quantum AI work?

Recently, Google announced TensorFlow Quantum (TFQ), an open-source library for quantum machine learning, in collaboration with the University of Waterloo, X, and Volkswagen. The aim of TFQ is to provide the necessary tools to control and model natural or artificial quantum systems. TFQ is an example of a suite of tools that combines quantum modeling and machine learning techniques.

SOURCE: GOOGLE
  1. Convert quantum data to a quantum dataset: Quantum data can be represented as multi-dimensional arrays of numbers called quantum tensors. TensorFlow processes these tensors to create a dataset for further use.
  2. Choose quantum neural network models: Based on knowledge of the quantum data structure, quantum neural network models are selected. The aim is to perform quantum processing in order to extract the information hidden in an entangled state.
  3. Sample or average: Measuring quantum states extracts classical information in the form of samples from a classical distribution. The values are obtained from the quantum state itself. TFQ provides methods for averaging over several runs involving steps (1) and (2).
  4. Evaluate a classical neural network model: Since the quantum data has now been converted into classical data, deep learning techniques are used to learn the correlations within it.

The other steps of evaluating cost function, gradients, and updating parameters are classical steps of deep learning. These steps make sure that an effective model is created for unsupervised tasks.
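The four TFQ steps above can be mimicked in plain NumPy, with no TensorFlow Quantum dependency (all function names here are illustrative, not the TFQ API): encode a classical value into a one-qubit state, process it with a parameterized rotation, average sampled measurements into a classical number, and hand that number to a classical model.

```python
import numpy as np
rng = np.random.default_rng(0)

def encode(x):
    """Step 1: encode a classical value x into the qubit state cos(x)|0> + sin(x)|1>."""
    return np.array([np.cos(x), np.sin(x)])

def rotate(state, theta):
    """Step 2: a parameterized 'quantum neural network' layer (a RY rotation)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]]) @ state

def expectation_z(state, shots=2000):
    """Step 3: sample measurements (+1 for |0>, -1 for |1>) and average them."""
    p0 = state[0] ** 2
    samples = rng.random(shots) < p0
    return np.where(samples, 1.0, -1.0).mean()

# Step 4: the averaged value becomes an input feature for an ordinary classical model.
feature = expectation_z(rotate(encode(0.3), theta=0.7))
print(round(feature, 2))
```

The sampled `feature` fluctuates around the analytic expectation value; increasing `shots` tightens the estimate, which is exactly why TFQ provides explicit sampling and averaging layers.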

What are the possibilities of applying quantum computing in AI?

Researchers’ near term realistic aim for quantum AI is to create quantum algorithms that perform better than classical algorithms and put them into practice.

What are the critical milestones for quantum AI?

Although quantum AI is an immature technology, improvements in quantum computing are increasing its potential. However, the quantum AI industry still needs to reach critical milestones in order to become a more mature technology.

Reaching these milestones would enable further developments in quantum AI.

 

 

4. Quantum Artificial Intelligence (QuAI) - Part II

Reproduced from https://ti.arc.nasa.gov/tech/dash/groups/quail/

NASA Quantum Artificial Intelligence Laboratory (QuAIL)

QuAIL is the space agency's hub for assessing the potential of quantum computers to impact computational challenges faced by the agency in the decades to come.

NASA’s QuAIL team aims to demonstrate that quantum computing and quantum algorithms may someday dramatically improve the agency’s ability to address difficult optimization and machine learning problems arising in NASA's aeronautics, Earth and space sciences, and space exploration missions.

NASA's QuAIL team has extensive experience utilizing near-term quantum computing hardware to evaluate the potential impact of quantum computing. The team has internationally recognized approaches to the programming and compilation of optimization problems for near-term quantum processors, both gate-model quantum processors and quantum annealers, enabling efficient utilization of the prototype quantum hardware available for experimenting with quantum and quantum-classical hybrid approaches to exact and approximate optimization and sampling. The team also has ongoing research developing quantum computational approaches to challenging combinatorial optimization and sampling problems with relevance to areas such as planning and scheduling, fault diagnosis, and machine learning.

A key component of this work is close collaboration with quantum hardware groups. The team's initial focus was on quantum annealing, since D-Wave quantum annealers were the first quantum computational devices available. As gate-model processors have matured, with devices of tens of qubits now available, the group has extended its research to include substantial gate-model efforts in addition to deepening its quantum annealing research. For more information on our research, please see our Research Overview and Publication pages.

The NASA QuAIL team leads the T&E team for the IARPA QEO (quantum enhanced optimization) program, has formal collaborative agreements with quantum hardware groups at Google and Rigetti, and maintains research collaborations with many other entities at the forefront of quantum computing, as well as a three-way Google-NASA-USRA agreement related to the D-Wave machine hosted at NASA Ames.

The QuAIL group's expertise spans physics, computer science, mathematics, chemistry, and engineering.

What is Quantum Computing?

Quantum computing is based on quantum bits or qubits. Unlike traditional computers, in which bits must have a value of either zero or one, a qubit can represent a zero, a one, or both values simultaneously. Representing information in qubits allows the information to be processed in ways that have no equivalent in classical computing, taking advantage of phenomena such as quantum tunneling and quantum entanglement. As such, quantum computers may theoretically be able to solve certain problems in a few days that would take millions of years on a classical computer.

News and Events

Dr. Eleanor Rieffel Panelist at the CES

January 13, 2021

Dr. Eleanor Rieffel will serve on the panel “Quantum Computing – Making It Real” at the Consumer Electronics Show (CES). Wed January 13, 2021, 2:45PM. Other panelists include Joseph Broz (QED-C) and Katie Pizzolato (IBM), and the panel will be moderated by Michael Bergman (Consumer Technology Association).

Dr. Eleanor Rieffel Selected as a 2020 NASA Ames Associate Fellow

July 17, 2020

Dr. Eleanor Rieffel was awarded the 2020 Ames Associate Fellow for her pioneering work in the field of quantum information processing. Her work significantly advances the state of the art in quantum computing and its application to the NASA mission in aeronautics, space exploration, and earth science.

The Ames Associate Fellow is an honorary designation that acknowledges distinguished scientific research or outstanding engineering of a non-management related nature. Appointment as Ames Associate Fellow is for a two-year term. The winning researchers receive a personal award, a research stipend, a travel grant, and will give a lecture to the center.

NASA Ames and Quantum Supremacy

October 24, 2019

In partnership with Google and the Oak Ridge National Laboratory, our researchers in the Quantum Artificial Intelligence Laboratory (QuAIL) group worked to demonstrate the ability to compute in seconds what would take even the largest and most advanced supercomputers thousands of years to achieve, a milestone known as quantum supremacy. This remarkable achievement is featured on the cover of the Oct. 24, 2019 issue of the science journal Nature.

Using our supercomputing facilities, researchers here at Ames advanced techniques for simulating quantum computations - work that helped set the bar for Google's quantum computer to beat. The achievement of quantum supremacy means that the processing power and control mechanisms now exist for scientists to run their code with confidence and see what happens beyond the limits of what can be done on supercomputers. Experimentation with quantum computing is now possible in a way it never has been before.

This is another example of the great and important work we do here at Ames. The high goals we set, the milestones we achieve, and the hard work and dedication we contribute as a community are what continue to allow us to push the boundaries of exploration to new heights.

For more information about Ames' contribution to quantum supremacy: https://www.nasa.gov/feature/ames/quantum-supremacy

Flexible Quantum Circuit Simulator (qFlex) Framework Open Sourced

October 24, 2019

Flexible Quantum Circuit Simulator (qFlex) implements an efficient tensor-network, CPU-based simulator of large quantum circuits. qFlex computes exact probability amplitudes, a task that proves essential for the verification of quantum hardware, and can also mimic quantum machines by computing amplitudes with low fidelity. qFlex targets quantum circuits in the range of sizes expected for supremacy experiments based on random quantum circuits, in order to verify and benchmark such experiments.

The qFlex framework is licensed under the Apache License, Version 2.0, and is available for download at https://github.com/ngnrsaa/qflex
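qFlex itself contracts tensor networks at scale; the underlying task, computing an exact probability amplitude of a circuit, can be illustrated with a brute-force state-vector sketch (NumPy, purely illustrative, not the qFlex API):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)                 # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)                 # controlled-NOT
I = np.eye(2)

# Two-qubit circuit: H on qubit 0, then CNOT -> a Bell state.
state = np.zeros(4)
state[0] = 1.0                                               # start in |00>
state = CNOT @ np.kron(H, I) @ state

# Exact probability amplitudes: the quantity qFlex computes for large circuits.
for bits, amp in zip(["00", "01", "10", "11"], state):
    print(bits, round(float(amp), 4))
```

Here the full state vector fits in memory; for supremacy-scale circuits it does not, which is why qFlex contracts the circuit as a tensor network to extract individual amplitudes instead.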

NASA Ames hosts AQC-18

June 25-28, 2018

Adiabatic Quantum Computing (AQC) and Quantum Annealing are computational methods that have been proposed to solve combinatorial optimization and sampling problems. Several efforts are now underway to manufacture processors that implement these strategies. The Seventh International Conference on AQC brings together researchers from different communities to explore this computational paradigm. The goal of the conference is to initiate a dialogue on the challenges that must be overcome to realize useful adiabatic quantum computations in existing or near-term hardware. Read More

Quantum Annealer with more than 2000 qubits installed and operational

August 31, 2017

We upgraded the D-Wave quantum annealer hosted here at NASA Ames to a D-Wave 2000Q system. The newly upgraded system, which resides at the NASA Advanced Supercomputing Facility at NASA's Ames Research Center, has 2031 quantum bits (qubits) in its working graph—nearly double the number of qubits compared to the previous processor. It has several system enhancements that enable more control over the adiabatic quantum computing process allowing it to solve larger and more complex optimization problems than were previously possible. 

 

 

5. Quantum Artificial Intelligence (QuAI) - Part III

Reproduced from https://github.com/PennyLaneAI?language=python

 

PennyLaneAI

PennyLane is a cross-platform Python library for differentiable programming of quantum computers. Train a quantum computer the same way as a neural network.


 

 

  

6. Quantum Machine Learning (QuML) - Part I

Reproduced from https://pennylane.ai/qml/whatisqml.html

 

What is Quantum Machine Learning?

Quantum machine learning is a research area that explores the interplay of ideas from quantum computing and machine learning.

For example, we might want to find out whether quantum computers can speed up the time it takes to train or evaluate a machine learning model. On the other hand, we can leverage techniques from machine learning to help us uncover quantum error-correcting codes, estimate the properties of quantum systems, or develop new quantum algorithms.

Quantum computers as AI accelerators


The limits of what machines can learn have always been defined by the computer hardware we run our algorithms on—for example, the success of modern-day deep learning with neural networks is enabled by parallel GPU clusters.

Quantum machine learning extends the pool of hardware for machine learning by an entirely new type of computing device—the quantum computer. Information processing with quantum computers relies on substantially different laws of physics known as quantum theory.

Machine learning on near-term quantum devices


Some research focuses on ideal, universal quantum computers (“fault-tolerant QPUs”) which are still years away. But there is rapidly-growing interest in quantum machine learning on near-term quantum devices.

We can understand these devices as special-purpose hardware like Application-Specific Integrated Circuits (ASICs) and Field-Programmable Gate Arrays (FPGAs), which are more limited in their functionality.

Using quantum computers like neural networks


In the modern viewpoint, quantum computers can be used and trained like neural networks. We can systematically adapt the physical control parameters, such as an electromagnetic field strength or a laser pulse frequency, to solve a problem.

For example, a trained circuit can be used to classify the content of images, by encoding the image into the physical state of the device and taking measurements.

The bigger picture: differentiable programming

But the story is bigger than just using quantum computers to tackle machine learning problems. Quantum circuits are differentiable, and a quantum computer itself can compute the change in control parameters needed to become better at a given task.

Differentiable programming is the very basis of deep learning, implemented in software libraries such as TensorFlow and PyTorch. Differentiable programming is more than deep learning: it is a programming paradigm where the algorithms are not hand-coded, but learned.
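The claim that quantum circuits are differentiable has a concrete recipe behind it, the parameter-shift rule: for many common gates, the exact gradient of an expectation value f(θ) is (f(θ + π/2) − f(θ − π/2))/2, i.e. two extra runs of the same circuit. A NumPy sketch for a single RX rotation measured in Z:

```python
import numpy as np

def expval_z(theta):
    """<Z> after RX(theta) acting on |0>; analytically this equals cos(theta)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    state = np.array([c, -1j * s])                    # RX(theta)|0>
    return float(np.abs(state[0]) ** 2 - np.abs(state[1]) ** 2)

def parameter_shift(f, theta):
    """Exact gradient from two shifted evaluations of the same circuit."""
    return 0.5 * (f(theta + np.pi / 2) - f(theta - np.pi / 2))

theta = 0.4
print(parameter_shift(expval_z, theta))               # matches -sin(theta)
```

Because the rule only requires evaluating the circuit at shifted parameter values, a real quantum device can compute its own gradients, which is what makes the "train it like a neural network" picture work.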


Similarly, the idea of training quantum computers is larger than quantum machine learning. Trainable quantum circuits can be leveraged in other fields like quantum chemistry or quantum optimization. It can help in a variety of applications such as the design of quantum algorithms, the discovery of quantum error correction schemes, and the understanding of physical systems.

PennyLane for quantum differentiable programming

PennyLane is an open-source software framework built around the concept of quantum differentiable programming. It seamlessly integrates classical machine learning libraries with quantum simulators and hardware, giving users the power to train quantum circuits.

To find out more, visit the PennyLane Documentation, or check out the gallery of hands-on quantum machine learning demonstrations.


 

 

 

7. Quantum Machine Learning (QuML) - Part II

Reproduced from https://www.geeksforgeeks.org/the-ultimate-guide-to-quantum-machine-learning-the-next-big-thing/

 

The Ultimate Guide to Quantum Machine Learning – The next Big thing

Innovation in machine learning is far from complete. In fact, things are just about to take a ‘quantum leap’ for the good, when the worlds of quantum physics and machine learning come together to solve even more advanced problems through intelligent computing. That’s right, Heisenberg’s Uncertainty Principle and the famous Schrödinger’s Cat could help develop advanced quantum machine learning systems that are capable of accelerating current machine learning models so that they work even faster, as well as help develop entirely new machine learning models that could do unprecedented things. Although it will be a while before quantum machine learning goes mainstream, tech giants like IBM, Microsoft and NASA are already getting on board with this fascinating new tech.

Quantum Concepts That Influence Machine Learning –

Quantum machine learning is an interdisciplinary approach that combines machine learning and the principles of quantum Physics. To understand this, let’s take a look at some of the basic concepts in quantum physics that are at play here –

Quantum:
In 1900, physicist Max Planck proposed that at the subatomic level, energy is contained in tiny discrete packets called quanta, which behave as both waves and particles, depending on their environment at the time. The basis of quantum theory relies on the observation that at any point in time, these particles can be in any of several states and may change their state.

Qubits:
The classical computing methods we use today work on chips that process all data using bits with two values – 0 and 1. Even the most complex data or algorithm you input gets broken down into these two values. Quantum machine learning, on the other hand, uses the unit ‘qubit’, short for quantum bit. Physically, a qubit can be realized by a quantum system such as the spin of an electron or the polarization of a photon.

Superposition:
These quantum particles or qubits may exist as both 0 and 1 at the same time, a phenomenon known as superposition. Essentially, this means that a particle can exist in multiple quantum states at once; when we try to measure it, however, the superposition is lost and the qubit collapses to a definite state.



Entanglement:
Qubits can interact in such a way that the state of one particle cannot be described independently of the state of the others. Even when the particles are separated by a large distance, their measurement outcomes remain correlated.
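Both superposition and entanglement show up in a tiny two-qubit simulation (NumPy, purely illustrative): a Hadamard gate puts one qubit into superposition, a CNOT entangles it with the second, and sampled joint measurements then yield only the correlated outcomes 00 and 11.

```python
import numpy as np
rng = np.random.default_rng(1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# H on qubit 0, then CNOT: |00> becomes the Bell state (|00> + |11>)/sqrt(2).
state = CNOT @ np.kron(H, np.eye(2)) @ np.array([1.0, 0, 0, 0])
probs = np.abs(state) ** 2

# Sample joint measurements: only "00" and "11" ever occur -> correlated qubits.
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
print({o: int((outcomes == o).sum()) for o in ["00", "01", "10", "11"]})
```

Each qubit alone looks perfectly random (50/50), yet the pair always agrees – the correlation without communication that the text describes.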

So How Does All This Figure in Machine Learning?

Understanding the quantum physics of matter can help develop new special-purpose hardware, or quantum computers, that is superior to what we have right now in terms of how much data it can process per second and the kind of computing it can accomplish. Quantum computers offer the immense computational advantage of being able to classify data in extremely high-dimensional feature spaces, a feat impractical on normal classical computers. Using the above-described principles of superposition and entanglement, these devices pack in an incredible amount of computational power.

If you are already in awe of hardware such as ASICs (application-specific integrated circuits) and FPGAs (field-programmable gate arrays) that facilitate machine learning, prepare to experience performance of a much higher order with quantum machine learning. Quantum chips can be used to run phenomenal algorithms to solve complex problems. While quantum computing proponents make promising advances into arenas such as creating new chemicals and drugs with this technology, machine learning aficionados are looking into a future where complex algorithms can map out brain circuitry, decode genetic makeup, build specialized infrastructure that combines biometrics and IoT devices to enable high-level security systems, and even unlock phenomenal new discoveries about the vast, mysterious universe. Yes, quantum machine learning could facilitate mapping out trillions of neurons firing in our brain at the same time.

Some of the current machine learning processes that can be accelerated by quantum machine learning are –

Linear Algebra:
When it comes to executing linear algebra computations, quantum computers can offer exponential speedups. A quantum gate can, in effect, apply an exponentially large matrix to an equally large vector in a single operation, helping build machine learning models out of quantum algorithms. This significantly brings down both the costs and the times associated with linear algebra computations.

Optimization:
Be it physicists, chemists or data scientists, everyone is trying to find a way to the point of lowest energy in a high-dimensional energy landscape. In the world of adiabatic quantum computing and quantum annealing, optimization is everyone’s priority. Quantum machine learning can have a strong footprint here: optimization also happens to be one of the first tasks physicists attempted in the context of quantum machine learning.
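Quantum annealers are usually fed problems in QUBO form (quadratic unconstrained binary optimization). A classical simulated-annealing sketch of the same energy-landscape search (illustrative only; a quantum annealer explores the landscape via quantum effects such as tunneling rather than thermal moves, and this tiny instance and its Q matrix are made up for the example):

```python
import math
import random
random.seed(0)

# QUBO: minimize E(x) = sum over (i,j) of Q[i,j] * x_i * x_j, with x_i in {0,1}.
Q = {(0, 0): -1, (1, 1): -1, (2, 2): -2, (0, 1): 2, (1, 2): 2, (0, 2): 2}

def energy(x):
    return sum(q * x[i] * x[j] for (i, j), q in Q.items())

def anneal(n_vars=3, steps=5000, temp=2.0, cooling=0.999):
    x = [random.randint(0, 1) for _ in range(n_vars)]
    best, best_e = list(x), energy(x)
    for _ in range(steps):
        i = random.randrange(n_vars)
        old_e = energy(x)
        x[i] ^= 1                      # propose flipping one bit
        delta = energy(x) - old_e
        if delta > 0 and random.random() >= math.exp(-delta / temp):
            x[i] ^= 1                  # reject the uphill move (Metropolis rule)
        if energy(x) < best_e:
            best, best_e = list(x), energy(x)
        temp *= cooling                # cool down: fewer uphill moves over time
    return best, best_e

print(anneal())
```

The off-diagonal penalties make the bits compete, so the lowest-energy assignment picks out a single variable – the same "find the ground state" formulation a D-Wave machine would receive.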

Kernel Evaluation:
Quantum machine learning can be used to perform kernel evaluation by feeding estimates from a quantum computer into a standard kernel method. While the training and inference of the model still have to be done in a standard support vector machine, using special-purpose quantum support vector machines could help accelerate the process. As the feature space expands, kernel functions in classical computing become computationally expensive to estimate. This is where quantum algorithms step in: quantum properties like entanglement and interference help create a massive quantum state space that can hugely improve kernel evaluation.
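The quantum kernel idea can be sketched classically (NumPy, purely illustrative): embed each data point x into a quantum state, and take the kernel entry to be the squared overlap of the two embedded states, which then feeds an ordinary kernel method such as an SVM.

```python
import numpy as np

def feature_state(x):
    """Angle-embed a scalar x into the single-qubit state cos(x/2)|0> + sin(x/2)|1>."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x, y):
    """Kernel entry as the squared state overlap |<phi(x)|phi(y)>|^2."""
    return float(np.abs(feature_state(x) @ feature_state(y)) ** 2)

data = [0.0, 0.5, 3.0]
K = np.array([[quantum_kernel(a, b) for b in data] for a in data])
print(np.round(K, 3))   # Gram matrix ready for a classical SVM
```

On hardware, the overlap would be estimated by repeated measurements of a short circuit; the appeal is that entangling embeddings over many qubits produce kernels that are expensive to compute classically.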

Deep Learning:
Deep learning is one of the most impactful applications of machine learning and artificial intelligence in recent times. Quantum computers could make deep learning a whole lot more profound by solving complex problems that are intractable on classical computers. In an experiment to train a deep Boltzmann machine, researchers from Microsoft used quantum models and found that they could not only train the Boltzmann machine faster but also achieve a much more comprehensive deep learning framework than a classical computer could yield.

Conclusion –
The true potential of quantum machine learning will begin to see fruition a few years from now, but significant progress is already being made in that direction. High-quality quantum machine learning algorithms will enable scientists to develop whole new methods to improve lives and facilitate solutions that are so far only imagined.

 

 

  

8. Quantum Machine Learning (QuML) - Part III

Reproduced by Github https://github.com/krishnakumarsekar/awesome-quantum-machine-learning

 

A curated list of awesome quantum machine learning algorithms, study materials, libraries and software (by language).

Main Architecture

Quantum Kernel

In Depth Physics Comparison

Table of Contents

INTRODUCTION

Why Quantum Machine Learning? Machine learning (ML) may be a recent buzzword, but the underlying work goes back to the 18th century. What is machine learning? In simple words, it is making a computer or application learn by itself. So is it purely a matter of computing fields like computer science and IT? Not really: ML is a common platform mingled into every aspect of life, from agriculture to mechanics, and computing is just the key component that makes ML easy and effective to use. To be clear about who the mother of ML is: without question, it is mathematics. The world's tremendous invention, complex numbers, gave birth to this field. Applying mathematics to real-life problems always yields a solution, and everything from neural networks to the complexity of DNA runs on specific mathematical formulas and theorems. As computing technology grew faster and faster, mathematics entered the field and delivered solutions to the real world via computing. Once a certain level of computing was reached, people became interested in using advanced mathematical ideas such as complex numbers and eigenvalues, and that was the kick-start for fields like artificial neural networks and DNA computing. Now the main question: why is this field booming nowadays? From a business perspective, 8-10 years ago, during ML's kick-start, the big barrier was merging mathematics into the computing field: people skilled in computing had little idea of the mathematics, and research mathematicians had little idea of computing. Education and job opportunities at the time reflected this split, and even when a person tried to study both, the business value of the resulting product was not good. Then top product companies like Google, IBM and Microsoft decided to form teams of a mathematician, a physicist and a computer scientist to come up with new ideas in this field.
The success of these teams produced some wonderful products, and the companies began providing cloud services built on them. That is the stage we are at now. So what's next? Mathematics has reached the level of time-travel concepts, but computing is still running on classical mechanics. The companies understood that computing must change from classical to quantum, and they started working on the big field of quantum computing, which the market has named Quantum Information Science. The kick-start came from Google and IBM, with early work on a quantum computing processor (D-Wave) for building quantum neural networks. The fields of quantum computer science and quantum information science will bring a big change to AI in the next 10 years. Waiting to see that... (Google, IBM). References

BASICS

What is Quantum Mechanics? In a single line: the study of an electron moving outside the atom is classical mechanics; the study of it vibrating inside the atom is quantum mechanics

What is Quantum Computing? A way of executing multiple processes in parallel at the same time using qubits; it reduces computation time and could shrink the processor to roughly nano scale

Quantum Computing vs Classical Computing

Quantum Computing

Atom Structure one line : Electrons orbiting around the nucleus in elliptical paths

atom

Photon Wave one line : Light, normally described as a wave, is transmitted as photons, much as solid particles are built from atoms

Photon wave

Electron Fluctuation or spin one line : When laser light collides with solid particles, the electrons of the atom spin between the orbital layers of the atom

Spin

States one line : Put a point on the spinning electron; if the point is at the top it is state 1, and if it is at the bottom it is state 0

States

SuperPosition two line : During the spin of the electron, the point may lie between the upper and lower positions, so a decision has to be made on whether the point's location counts as 0 or 1. The better option is to analyse it together with other electrons using probability, and this is called superposition

SuperPosition
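To make the 0-or-1 ambiguity concrete, the toy numpy sketch below (a classical simulation, not a quantum device) prepares an equal superposition by applying a Hadamard gate to |0> and reads off the measurement probabilities via the Born rule:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)          # basis state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

psi = H @ ket0              # equal superposition (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2    # Born rule: probability of measuring 0 or 1
print(probs)                # -> [0.5 0.5]
```

Until it is measured, the state carries both amplitudes at once; measurement forces the 0-or-1 decision with the probabilities printed above.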

SuperPosition specific for machine learning(Quantum Walks) one line : Due to computational complexity, quantum computing only considers superposition between a limited number of electrons; to merge more than one set, the quantum walk is the idea used

SuperPosition specific for machine learning

Classical Bits one line : If an electron moves from one atom to another, from ground state to excited state, bit value 1 is used; otherwise bit value 0 is used

Classical Bits

Qubit one line : The superposition of the states of a set of electrons is a qubit

Qubit

Basic Gates in Quantum Computing one line : Just as with classical NOT, OR and AND, basic quantum gates such as NOT (X), Hadamard, SWAP and phase shift can be built

Basic Gates in Quantum Computing
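These gates are just unitary matrices acting on statevectors, which a short numpy sketch (again a classical simulation) can demonstrate: X flips a basis state, and SWAP exchanges two qubits.

```python
import numpy as np

# Single-qubit quantum NOT gate
X = np.array([[0, 1], [1, 0]])
# Two-qubit SWAP gate in the basis |00>, |01>, |10>, |11>
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

ket0 = np.array([1, 0])
ket1 = X @ ket0                      # NOT flips |0> into |1>
state_01 = np.kron(ket0, ket1)       # two-qubit basis state |01>
state_10 = SWAP @ state_01           # SWAP exchanges the qubits -> |10>
print(state_10)                      # -> [0 0 1 0]
```

Applying gates one after another corresponds to multiplying their matrices, which is how larger quantum circuits are composed from these basic building blocks.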

Quantum Diode one line : Quantum diodes use a different idea from a normal diode: a bunch of laser photons triggers the electron to spin, and the quantum magnetic flux captures the information

Diodes in Quantum Computing1 Diodes in Quantum Computing2 Diodes in Quantum Computing3

Quantum Transistors one line : A transistor has a source, a drain and a gate by default; here the source is the photon wave, the drain is the flux, and the gate converts classical to quantum bits

Quantum Transistors1 Quantum Transistors2

Quantum Processor one line : A nano-scale integrated circuit performing the quantum gate operations, surrounded by cooling units to remove the tremendous amount of heat

Quantum Processor1 Quantum Processor2 Quantum Processor3

Quantum Registry QRAM one line : Compared to normal RAM it is ultrafast and very small in size; an address location can be accessed using a qubit superposition value, and for a very large memory a coherent superposition (address of addresses) is used

QRAM1 QRAM2

QUANTUM COMPUTING MACHINE LEARNING BRIDGE

Complex Numbers one line : Wave interference normally has an n-dimensional structure; to fit a polynomial equation to n-th-order curves, the better option is complex numbers

Complex Numbers1 Complex Numbers2 Complex Numbers3

Tensors one line : Vectors have a direction in 2D vector space; on an n-dimensional vector space, directions can be specified with tensors. The best way to find the superposition of the spin space of n electrons is to represent the vectors as tensors and apply tensor calculus

Tensors1 Tensors2 Tensors3 Tensors4
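The key fact behind this is that the joint state of several qubits lives in the tensor product of the individual state spaces, so the dimension grows as 2**n. A minimal numpy sketch using the Kronecker product:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # (|0> + |1>) / sqrt(2)

# The joint state of several qubits is the tensor (Kronecker) product,
# so n qubits need a 2**n-dimensional amplitude vector.
state = ket0
for qubit in [plus, plus]:
    state = np.kron(state, qubit)

print(state.shape)   # -> (8,) : three qubits span 2**3 amplitudes
```

This exponential growth of the state space is exactly why tensor-network techniques, which compress such products, are useful for reducing the complexity of processing qubits.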

Tensors Network one line : Just as multiple vectors can be connected, multiple tensors form a network; solving such a network reduces the complexity of processing qubits

Tensors Network1 Tensors Network2 Tensors Network3

QUANTUM MACHINE LEARNING ALGORITHMS

Quantum K-Nearest Neighbour info : Here the centroid (Euclidean distance) can be detected using the swap test between two states of a qubit; as KNN is regression-like, the loss can be tallied using the average

Quantum K-Means info : Two approaches are possible: 1. use the FFT and inverse FFT to build an oracle and calculate the means of the superposition; 2. generate an adiabatic Hamiltonian and solve it to determine the clusters

Quantum Fuzzy C-Means info : Similar to k-means, FCM also uses the oracle dialect, but instead of means, oracle optimization followed by a rotation gate gives a good result

Quantum Support Vector Machine info : A little different from the above, as here the kernel preparation is classical while the whole training runs in oracles, and the oracle performs the classification; as the SVM is linear, an optimal-error regression (the optimum of the least-squares dual formulation) is needed to improve performance

Quantum Genetic Algorithm info : One of the algorithms best suited to the quantum field; here the chromosomes act as qubit vectors, the crossover step is carried out by an evaluation, and the mutation step by gate rotations

Flow Chart

Quantum Hidden Markov Models info : As an HMM is already state based, the quantum states act as normal states for the Markov chain, and the shifts between states use quantum operations based on a probability distribution

Flow Chart

Quantum state classification with Bayesian methods info : A quantum Bayesian network uses the same states concept with quantum states, but here the classification of states, which makes the training data reusable, is based on the density of the states (interference)

Bayesian Network Sample1
Bayesian Network Sample2
Bayesian Network Sample3

Quantum Ant Colony Optimization info : A good algorithm for processing multi-dimensional equations. ACO is best suited to the travelling salesman problem; QACO is best suited to the travelling salesman problem in three or more dimensions. Here a quantum rotation circuit performs the pheromone update, and qubit-based colonies communicate across the colony in complex space

Ant Colony Optimization 1

Quantum Cellular Automata info : One of the more complex algorithms, with several variants, used specifically for polynomial equations and to design optimal gates for a problem; here the lattice is formed using quantum states, and the time calculation is based on state changes between two qubits. Best suited for nanoelectronics

Quantum Cellular Automata

QUANTUM NEURAL NETWORK

QNN 1

one line : This is really one of the hardest topics. To understand it simply: a normal neural network performs parallel processes, while a QNN performs parallelism of parallel processes. In theory, a combination of various activation functions is possible in a QNN, whereas in a normal NN more than one activation function reduces performance and increases complexity

Quantum perceptrons info : The perceptron (layer) is the basic unit in a neural network. A quantum version of the perceptron must satisfy both linear and non-linear problems; quantum concepts combine the linear (calculus of superposition) and the non-linear (state approximation using probability). To build a perceptron in the quantum world, the non-linearity of the transformation (activation function) must be bounded to a certain limit, which is handled by the phase estimation algorithm

Quantum Perceptron 1
Quantum Perceptron 2
Quantum Perceptron 3 Quantum Perceptron 4 Quantum Perceptron 5

QUANTUM STATISTICAL DATA ANALYSIS

quantumstatistics1 quantumstatistics2 quantumstatistics3 quantumstatistics4 quantumstatistics5 quantumstatistics6

one line : A concept still under research, which can be seen in multiple ways. One way: applying the n-th derivative to a problem is difficult to compute in current classical theory because it is a serialization problem; if instead you parallelize the differentiation, you must estimate the value across all flows via probability, and quantum probability helps achieve this, as the loss in the calculation is very small. The other, comparatively booming way is Quantum Bayesianism, a solution to most uncertainty problems in statistics that combines time and space in highly advanced physical research

QUANTUM PROGRAMMING LANGUAGES , TOOLs and SOFTWARES

All info : All programming languages, software and tools, in alphabetical order

QUANTUM HOT TOPICS

Deep Quantum Learning why and what is deep learning? In one line: if you know deep learning, you can get a good job :) — even a person from a different undergraduate or graduate background who has done a master's specialization in deep learning can work in this big sector :). Practically speaking, machine learning (vector mathematics), deep learning (vector-space/graphics mathematics) and big data are terms created by big companies to make a trend in the market; in science and research there are no such words. Nowadays, if you ask a junior person at one of these big companies what deep learning is, you will get a reply like "doing linear regression with stochastic gradient descent on unsupervised data using a convolutional neural network :)". They know the words and know how to program with them on a bunch of "relative" data, but if you ask them about FCM, SVM, HMM and similar algorithms, they will simply say these are algorithms from the old days that deep learning has replaced :). In reality, they do not know these algorithms from their birth to their current level, or their effectiveness, or how many mathematical theorems in vectors, spaces, tensors and so on were solved to build this "complexity-hiding technology". They have not played with real, non-relative data such as medical images, astronomical images or geological images, where finding relations and features is genuinely complex and looping over n images to do pattern matching is a giant task. What is marketed today as deep learning (multiple hidden artificial neural networks) is not suitable for that. Hence: why quantum deep learning, or deep quantum learning?
In the midst of artificial neural network research, people realised that, at the extreme, only certain mathematical operations are possible with an ANN, and that the aim of an ANN is to achieve parallel execution of many mathematical operations. In artificial intelligence, the word intelligence stands for mathematics: how effectively a problem can be solved depends on the mathematical logic applied to it, and more logic gives more performance (more intelligence). This goal opened the gate for the quantum artificial neural network. By applying the ideas behind deep learning in a quantum-mechanical environment, it becomes possible to apply complex mathematical equations to n non-relational data points, find more features, and improve performance

Quantum Machine Learning vs Deep Learning

It is fun to discuss this. In recent days, most employees of product companies like Google and Microsoft use the phrase deep learning. What actually is deep learning? Is it a new invention? How does one learn it? Is it replacing machine learning? These questions come to the minds of junior research scholars and mid-level employees. The one answer to all of them: deep learning = parallel "for" loops, nothing more. It is an effective way of executing multiple tasks repeatedly and reducing computation cost, but it introduces a big gap between mathematics and computer science. How? Classical algorithms are based on serial processing and depend on the feedback of the first loop, so applying a serial classical algorithm across multiple clusters does not give a good result; some lightweight parallel classical algorithms (deep learning) do the job across multiple clusters, but they are not suitable for complex problems. What is the solution then? As in the title: quantum machine learning. The advantage is that deep learning simply does batch processing over the data, whereas quantum machine learning is designed to do batch processing according to the algorithm. The product companies have realised this and have started migrating to quantum machine learning; executing classical algorithms on quantum concepts gives better results than deep learning algorithms on a classical computer, and the target is to merge both to give truly wonderful results. References

QUANTUM MEETUPS

QUANTUM BASED DEGREES

Plenty of courses exist around the world, and many universities launch new ones day by day. Instead of covering only quantum ML, covering all quantum-related topics gives a better picture, in the order below.

Available Courses

Quantum Mechanics for Science and Engineers

Quantum Physics

Quantum Chemistry

Quantum Computing

Quantum Technology

Quantum Information Science

Quantum Electronics

Quantum Field Theory

Quantum Computer Science

Quantum Artificial Intelligence and Machine Learning

Quantum Mathematics

CONSOLIDATED Quantum Research Papers

Recent Quantum Updates: forums, pages and newsletters

 

 

9. Quantum Machine Learning (QuML) - Part IV

Reproduced by Github https://github.com/artix41/awesome-quantum-ml

 

A list of awesome papers and cool resources in the field of quantum machine learning (machine learning algorithms running on quantum devices). It does not include the use of classical ML algorithms for quantum purposes.

Papers

Reviews

Discrete-variable quantum computing

Theory

Variational circuits

Variational circuits are quantum circuits with variable parameters that can be optimized to compute a given function. They can for instance be used to classify or predict properties of quantum and classical data, sample over complicated probability distributions (as generative models), or solve optimization and simulation problems.
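A minimal numpy sketch of this idea (a classical simulation, not a device run): a one-parameter circuit Ry(theta)|0>, a cost given by the expectation of Z, an exact gradient from the parameter-shift rule, and plain gradient descent on theta.

```python
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def ry(theta):
    """Single-qubit Y-rotation, the variational gate of this toy circuit."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def expectation(theta):
    """Cost <psi(theta)| Z |psi(theta)> with |psi(theta)> = Ry(theta)|0>."""
    psi = ry(theta) @ ket0
    return float(np.real(np.vdot(psi, Z @ psi)))

def gradient(theta):
    """Parameter-shift rule: the gradient from two extra circuit runs."""
    return 0.5 * (expectation(theta + np.pi / 2)
                  - expectation(theta - np.pi / 2))

theta, lr = 0.1, 0.4
for _ in range(100):            # plain gradient descent on the parameter
    theta -= lr * gradient(theta)

print(round(expectation(theta), 3))   # <Z> reaches its minimum -1 at theta = pi
```

The same loop structure carries over to real variational workloads: the circuit evaluation moves to quantum hardware while the parameter update stays classical.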

QRAM-based quantum ML

Tensor Networks

Reinforcement learning

Optimization

Kernel methods and SVM

Quantum circuits that are used to extract features from data or to improve kernel-based ML algorithms in general:

Dequantization of quantum ML

Kingdom of Ewin Tang. Papers showing that a given quantum machine learning algorithm does not lead to any improved performance compared to a classical equivalent (either asymptotically or including constant factors):

Continuous-variable quantum computing

Variational circuits

Kernel methods and SVM

Other awesome lists

 

 

 

10. Quantum Machine Learning (QuML) - Part V

 
Reproduced by https://github.com/tensorflow/quantum


TensorFlow Quantum

TensorFlow Quantum (TFQ) is a Python framework for hybrid quantum-classical machine learning that is primarily focused on modeling quantum data. TFQ is an application framework developed to allow quantum algorithms researchers and machine learning applications researchers to explore computing workflows that leverage Google’s quantum computing offerings, all from within TensorFlow.

Motivation

Quantum computing at Google has hit an exciting milestone with the achievement of Quantum Supremacy. In the wake of this demonstration, Google is now turning its attention to developing and implementing new algorithms to run on its Quantum Computer that have real world applications.

To provide users with the tools they need to program and simulate a quantum computer, Google is working on Cirq. Cirq is designed for quantum computing researchers who are interested in running and designing algorithms that leverage existing (imperfect) quantum computers.

TensorFlow Quantum provides users with the tools they need to interleave quantum algorithms and logic designed in Cirq with the powerful and performant ML tools from TensorFlow. With this connection we hope to unlock new and exciting paths for Quantum Computing research that would not have otherwise been possible.

Installation

See the installation instructions.

Examples

All of our examples can be found here in the form of Python notebook tutorials.

Report issues

Report bugs or feature requests using the TensorFlow Quantum issue tracker.

We also have a Stack Overflow tag for more general TFQ related discussions.

In the meantime check out the install instructions to get the experimental code running!

Contributing

We are eager to collaborate with you! TensorFlow Quantum is still a very young code base; if you have ideas for features that you would like added, feel free to check out our Contributor Guidelines to get started.

References

If you use TensorFlow Quantum in your research, please cite:

TensorFlow Quantum: A Software Framework for Quantum Machine Learning arXiv:2003.02989, 2020.

 

 

11. Quantum Machine Learning (QuML) - Part VI

 

QML: A Python Toolkit for Quantum Machine Learning

QML is a Python2/3-compatible toolkit for representation learning of properties of molecules and solids.

Current list of contributors:

1) Citing QML:

Until the preprint is available from arXiv, please cite this GitHub repository as:

AS Christensen, LA Bratholm, FA Faber, B Huang, A Tkatchenko, KR Muller, OA von Lilienfeld (2017) "QML: A Python Toolkit for Quantum Machine Learning" https://github.com/qmlcode/qml

2) Get help:

Documentation and installation instructions are found at: http://www.qmlcode.org/

3) License:

QML is freely available under the terms of the MIT license.

 

 

12. Quantum Machine Learning (QuML) - Part VII

 

Latest papers with CODE, Reproduced by https://paperswithcode.com/task/quantum-machine-learning/latest