Reproduced from GitHub: https://github.com/desireevl/awesome-quantum-computing
Quantum computing utilises quantum mechanical phenomena such as entanglement and superposition to manipulate qubits and perform computation on a quantum computer. Tools are currently available to create and run programs on publicly usable quantum computers, as well as resources to learn about them.
This is a curated list of up-to-date resources for learning about and developing on quantum computers. The goal is to build a categorised, community-driven collection of high-quality resources.
Sharing, suggestions and contributions are always welcome! Please take a look at the contribution guidelines and quality standard first. Thanks to all contributors, you're awesome and it wouldn't be possible without you!
For further resources related to Open Source Quantum Software Projects, please check out qosf's repo.
Reproduced from https://www.fraunhofer.de/en/research/current-research/quantum-technologies/quantum-computing.html
Analysts at Morgan Stanley predict that the market for high-end quantum computers will reach 10 billion dollars by 2025, double what it is now. Alongside IBM and Google, there is also Microsoft, the Chinese Internet giant Alibaba and startups such as Novarion, Rigetti and D-Wave. Yet the various manufacturers rely on different physical principles for the realization of the quantum hardware. Scientists distinguish between universal quantum computers, which can perform arbitrary quantum algorithms, and quantum annealers, which are less complex, but limited to very specific tasks. Researchers at VW have been using a D-Wave quantum annealer since 2017 to better simulate traffic flows. And BMW is investigating whether quantum annealers can help optimize its production robots’ performance.
Universal quantum computers are technically very challenging to build and operate. What sets these computers apart is that their performance doubles with each added qubit, increasing exponentially rather than linearly: two qubits yield four possible combinations, three qubits eight, and so on. The quantity of qubits matters, but equally important are the quality of qubit entanglement and the coherence time. The latter determines how long the quantum system remains stable enough to compute before noise masks the information. Most universal quantum computers, such as Google’s 72-qubit Bristlecone, only work under special laboratory conditions.
In January 2019, IBM unveiled the IBM Q System One, the world’s first commercially viable quantum computer – meaning that it works outside a lab. A consortium of seven Fraunhofer Institutes in Germany has been tasked to look into real-world applications for quantum computing as of 2021 in a bid to drive the advance of applied quantum science in the EU. “We want to find out just what kind of applications there are for quantum computing in industry and how to write the necessary algorithms and translate them for specific applications,” explains Hauswirth. The initiative also aims to keep entry barriers low by sharing insights with companies to fast-track the industry’s efforts to build a knowledge base in quantum computing.
There are still high hurdles to clear on the path to upscale the performance of available quantum computers. The priority now is to find ways of shielding the fragile quanta from ambient influences that interfere with the computing process. For example, qubits have to be cooled to a temperature approaching absolute zero – around minus 273 degrees Celsius, which is colder than outer space. They also require a vacuum and have to be shielded against electromagnetic radiation. Vibrations and parasitic effects of electromagnetic waves used to manipulate the qubits and read out the information they carry can also cause problems.
Solving complex problems
What kind of real problems can quantum computers solve? “In a few years from now, quantum computers will provide highly efficient means for prime factorization. That will leave current cryptographic systems vulnerable, which is why major research into post-quantum cryptography is underway,” says Hauswirth. Quantum computers will be able to tackle even more complex problems a few years down the road: “Today’s fintech, for example, has trouble managing billions of cash flows in parallel and in real time within the confines of a very tight regulatory girdle. Sequential processing is still prone to errors, but quantum computers would help get around this.”
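The factoring threat Hauswirth describes rests on Shor's algorithm, whose only quantum ingredient is period (order) finding; the rest of the reduction is classical number theory. A minimal illustrative sketch, with the order found by brute force exactly where a quantum computer would provide the exponential speedup (the base a = 7 and modulus 15 are just toy values):

```python
from math import gcd

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (brute force here;
    this is the step Shor's algorithm speeds up via period finding)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n: int, a: int):
    """Recover factors of n from the order of a coprime base a."""
    assert gcd(a, n) == 1
    r = order(a, n)
    if r % 2 == 1:
        return None              # odd order: retry with another base
    y = pow(a, r // 2, n)
    p, q = gcd(y - 1, n), gcd(y + 1, n)
    return (p, q) if p * q == n else None

print(shor_classical_part(15, 7))  # order of 7 mod 15 is 4 -> (3, 5)
```

The brute-force loop takes time exponential in the bit length of n, which is why classical factoring of cryptographic moduli is infeasible while a quantum period-finder would not be.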
Prof. Anita Schöbel is director of the Fraunhofer Institute for Industrial Mathematics ITWM in Kaiserslautern. She and Hauswirth are mainly responsible for quantum computing at Fraunhofer. Pointing to an application in the works at her institute, she says, “We’re working on projects that use stochastic partial differential equations such as the Fokker-Planck equations. These serve to develop lithium ion batteries and wind turbines, calculate granular flows and determine prices in quantitative finance. These equations can be converted into quantum mechanics equations for quantum computers to crunch the numbers, probably much faster.”
Applied quantum computing is clearly taking shape in the real world. Will we all have a quantum home computer or a quantum processor in our smartphones in a few years? “Quantum computers will only ever be able to solve very specific problems, so they won’t replace conventional computers. It’s likely that cloud-based models will prevail – that is, quantum computing as a service (QCaaS). We’ll probably also see hybrids of quantum computing and conventional high-performance computing,” says Hauswirth.
In partnership with IBM, we are going to install Europe’s first commercial quantum computer at a location in Germany. The aim is to develop applied quantum computing solutions for a range of fields and assess their viability. We would like to see companies of all sizes involved in this project.
Why does this matter?
It is early days yet for applied research in quantum computing. We need to define quantum algorithms and then convert them for easy use in applications programming. That requires expertise on the part of industry, so we want to fast-track efforts to build a knowledge base here in Germany. This initiative will also enable us to pursue quantum computing under full data sovereignty according to European law, without being dependent on large Internet corporations from overseas.
When do you expect to see the first results?
A quantum computer is to be installed in Germany in 2021. But even optimistic forecasts suggest it’s going to take another 10 to 20 years before businesses can use quantum computers.
Research teams around the world are working on the most efficient way to couple together multiple quantum computers using quantum information to create a quantum internet. At QuTech in Delft, a number of partners, among them the Fraunhofer Institute for Laser Technology ILT, are currently working on a highly ambitious project. By 2022, they hope to have built the world’s first quantum internet demonstrator in the Netherlands with the aim of achieving lasting entanglement of qubits over long distances. Nodes at four locations will be connected together via fibre-optic cable. This will enable greater computing capacity, as well as completely new applications, such as blind quantum computing, where computations are performed securely, privately and anonymously on quantum computers in the cloud. According to Florian Elsen, coordinator for quantum technology at Fraunhofer ILT in Aachen, the big challenge lies in “transmitting single, fragile qubits through a fibre-optic cable as losslessly as possible. To achieve this, we carry out frequency conversion, meaning that we modify the wavelength of single photons without changing other significant properties.” Once you have a quantum internet, it is not much of a leap to quantum communication.
A conventional computer works with bits; a quantum computer with qubits. Like bits, qubits can have a value of 0 or 1. Unlike bits, they occupy a superposition of overlapping quantum states, so they can also have any combination of the two. A qubit does not take on a definite value until it is measured. Adding one qubit doubles the system’s performance, so that 50 qubits, for example, would yield 2 to the power of 50 (2^50) possible combinations. This way, big problems and complex tasks are computed in parallel rather than in linear fashion.
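The doubling described above can be made concrete with a small state-vector simulation. This NumPy sketch (an illustration, not tied to any particular quantum computer) puts n qubits into an equal superposition and shows that the state holds 2^n amplitudes, one per bit combination:

```python
import numpy as np

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """Apply H to each of n qubits starting from |00...0>."""
    state = np.zeros(2 ** n_qubits)
    state[0] = 1.0                  # the all-zeros basis state
    gate = H
    for _ in range(n_qubits - 1):
        gate = np.kron(gate, H)     # H tensor H tensor ... tensor H
    return gate @ state

state = uniform_superposition(3)
print(len(state))                   # 8 amplitudes = 2**3 combinations
print(np.round(state ** 2, 3))      # each outcome has probability 1/8
```

Note the cost of simulating this classically: the vector length doubles with every qubit, which is exactly why 50 qubits (2^50 amplitudes) strain conventional hardware.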
David DiVincenzo’s* five criteria for a quantum computer
1. A scalable physical system with well-characterized qubits
2. The ability to initialize the state of the qubits to a simple fiducial state
3. A "universal" set of quantum gates
4. A qubit-specific measurement capability
5. Long relevant decoherence times
*Pioneer of quantum information science and professor of theoretical physics at RWTH Aachen
Reproduced from https://research.aimultiple.com/quantum-ai/
Quantum computing and artificial intelligence are both transformational technologies, and artificial intelligence is likely to require quantum computing to achieve significant progress. Although artificial intelligence produces functional applications with classical computers, it is limited by their computational capabilities. Quantum computing can provide a computation boost to artificial intelligence, enabling it to tackle more complex problems and, potentially, AGI.
Quantum AI is the use of quantum computing to run machine learning algorithms. Thanks to the computational advantages of quantum computing, quantum AI can help achieve results that are not possible with classical computers.
Quantum mechanics is a universal model based on different principles than those observed in daily life. A quantum model of data is needed to process data with quantum computing. Hybrid quantum-classical models are also necessary in quantum computing for error correction and correct functioning of the quantum computer.
For more, feel free to read our detailed article on the topic.
Although AI has made rapid progress over the past decade, it has not yet overcome technological limitations. With the unique features of quantum computing, obstacles to achieve AGI (Artificial General Intelligence) can be eliminated. Quantum computing can be used for the rapid training of machine learning models and to create optimized algorithms. An optimized and stable AI provided by quantum computing can complete years of analysis in a short time and lead to advances in technology. Neuromorphic cognitive models, adaptive machine learning, or reasoning under uncertainty are some fundamental challenges of today’s AI. Quantum AI is one of the most likely solutions for next-generation AI.
Recently, Google announced TensorFlow Quantum (TFQ): an open-source library for quantum machine learning, in collaboration with the University of Waterloo, X, and Volkswagen. The aim of TFQ is to provide the necessary tools to control and model natural or artificial quantum systems. TFQ is an example of a suite of tools that combines quantum modeling and machine learning techniques.
The other steps of evaluating the cost function, computing gradients, and updating parameters are classical steps of deep learning. These steps make sure that an effective model is created for unsupervised tasks.
Researchers’ near term realistic aim for quantum AI is to create quantum algorithms that perform better than classical algorithms and put them into practice.
Although quantum AI is an immature technology, there are improvements in quantum computing which increase the potential of quantum AI. However, the quantum AI industry needs critical milestones in order to become a more mature technology. These milestones can be summarized as:
These critical steps would enable further developments in quantum AI.
Reproduced from https://ti.arc.nasa.gov/tech/dash/groups/quail/
QuAIL is the space agency's hub for assessing the potential of quantum computers to impact computational challenges faced by the agency in the decades to come.
NASA’s QuAIL team aims to demonstrate that quantum computing and quantum algorithms may someday dramatically improve the agency’s ability to address difficult optimization and machine learning problems arising in NASA's aeronautics, Earth and space sciences, and space exploration missions.
NASA's QuAIL team has extensive experience utilizing near-term quantum computing hardware to evaluate the potential impact of quantum computing. The team has internationally recognized approaches to the programming and compilation of optimization problems on near-term quantum processors, both gate-model quantum processors and quantum annealers, enabling efficient utilization of the prototype quantum hardware available for experimenting with quantum and quantum-classical hybrid approaches for exact and approximate optimization and sampling. The team has ongoing research developing quantum computational approaches to challenging combinatorial optimization and sampling problems with relevance to areas such as planning and scheduling, fault diagnosis, and machine learning.
A key component of this work is close collaboration with quantum hardware groups. The team's initial focus was on quantum annealing, since D-Wave quantum annealers were the first quantum computational devices available. As gate-model processors have matured, with devices of tens of qubits now available, the group has extended its research to include substantial gate-model efforts in addition to deepening its quantum annealing research. For more information on our research, please see our Research Overview and Publication pages.
The NASA QuAIL team leads the T&E team for the IARPA QEO (quantum enhanced optimization) program, has formal collaborative agreements with quantum hardware groups at Google and Rigetti, and maintains research collaborations with many other entities at the forefront of quantum computing. It is also party to a three-way Google-NASA-USRA agreement related to the D-Wave machine hosted at NASA Ames.
The QuAIL group's expertise spans physics, computer science, mathematics, chemistry, and engineering.
Quantum computing is based on quantum bits or qubits. Unlike traditional computers, in which bits must have a value of either zero or one, a qubit can represent a zero, a one, or both values simultaneously. Representing information in qubits allows the information to be processed in ways that have no equivalent in classical computing, taking advantage of phenomena such as quantum tunneling and quantum entanglement. As such, quantum computers may theoretically be able to solve certain problems in a few days that would take millions of years on a classical computer.
January 13, 2021
Dr. Eleanor Rieffel will serve on the panel “Quantum Computing – Making It Real” at the Consumer Electronics Show (CES). Wed January 13, 2021, 2:45PM. Other panelists include Joseph Broz (QED-C) and Katie Pizzolato (IBM), and the panel will be moderated by Michael Bergman (Consumer Technology Association).
Dr. Eleanor Rieffel Selected as a 2020 NASA Ames Associate Fellow
July 17, 2020
Dr. Eleanor Rieffel was awarded the 2020 Ames Associate Fellow for her pioneering work in the field of quantum information processing. Her work significantly advances the state of the art in quantum computing and its application to the NASA mission in aeronautics, space exploration, and earth science.
The Ames Associate Fellow is an honorary designation that acknowledges distinguished scientific research or outstanding engineering of a non-management related nature. Appointment as Ames Associate Fellow is for a two-year term. The winning researchers receive a personal award, a research stipend, a travel grant, and will give a lecture to the center.
NASA Ames and Quantum Supremacy
October 24, 2019
In partnership with Google and the Oak Ridge National Laboratory, our researchers in the Quantum Artificial Intelligence Laboratory (QuAIL) group worked to demonstrate the ability to compute in seconds what would take even the largest and most advanced supercomputers thousands of years to achieve, a milestone known as quantum supremacy. This remarkable achievement is featured on the cover of the Oct. 24, 2019 issue of the science journal Nature.
Using our supercomputing facilities, researchers here at Ames advanced techniques for simulating quantum computations - work that helped set the bar for Google's quantum computer to beat. The achievement of quantum supremacy means that the processing power and control mechanisms now exist for scientists to run their code with confidence and see what happens beyond the limits of what can be done on supercomputers. Experimentation with quantum computing is now possible in a way it never has been before.
This is another example of the great and important work we do here at Ames. The high goals we set, the milestones we achieve, and the hard work and dedication we contribute as a community are what continue to allow us to push the boundaries of exploration to new heights.
For more information about Ames' contribution to quantum supremacy: https://www.nasa.gov/feature/ames/quantum-supremacy
Flexible Quantum Circuit Simulator (qFlex) Framework Open Sourced
October 24, 2019
Flexible Quantum Circuit Simulator (qFlex) implements an efficient tensor network, CPU-based simulator of large quantum circuits. qFlex computes exact probability amplitudes, a task that proves essential for the verification of quantum hardware, as well as mimics quantum machines by computing amplitudes with low fidelity. qFlex targets quantum circuits in the range of sizes expected for supremacy experiments based on random quantum circuits, in order to verify and benchmark such experiments.
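The kind of exact amplitude computation qFlex performs can be illustrated at toy scale by brute-force state-vector simulation. The NumPy sketch below is a simple analogue only, not qFlex's tensor-network contraction method; the random single-qubit rotations stand in for a "random quantum circuit":

```python
import numpy as np

rng = np.random.default_rng(0)

def ry(theta: float) -> np.ndarray:
    """Single-qubit Y-rotation gate (real-valued)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# A tiny "random circuit" on 3 qubits: one random RY per qubit.
n = 3
gate = ry(rng.uniform(0, 2 * np.pi))
for _ in range(n - 1):
    gate = np.kron(gate, ry(rng.uniform(0, 2 * np.pi)))

state = np.zeros(2 ** n)
state[0] = 1.0                 # start in |000>
state = gate @ state

# Exact probability amplitude of the bitstring |101> --
# the quantity qFlex computes for circuits far too large for this approach.
print(state[0b101])
```

The full state vector needs 2^n entries, so this brute-force route fails long before supremacy-scale circuits; that memory wall is precisely what qFlex's tensor-network contractions are designed to sidestep.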
The qFlex framework is licensed under the Apache License, Version 2.0, and is available for download at https://github.com/ngnrsaa/qflex
NASA Ames hosts AQC-18
June 25-28, 2018
Adiabatic Quantum Computing (AQC) and Quantum Annealing are computational methods that have been proposed to solve combinatorial optimization and sampling problems. Several efforts are now underway to manufacture processors that implement these strategies. The Seventh International Conference on AQC brings together researchers from different communities to explore this computational paradigm. The goal of the conference is to initiate a dialogue on the challenges that must be overcome to realize useful adiabatic quantum computations in existing or near-term hardware. Read More
Quantum Annealer with more than 2000 qubits installed and operational
August 31, 2017
We upgraded the D-Wave quantum annealer hosted here at NASA Ames to a D-Wave 2000Q system. The newly upgraded system, which resides at the NASA Advanced Supercomputing Facility at NASA's Ames Research Center, has 2031 quantum bits (qubits) in its working graph—nearly double the number of qubits compared to the previous processor. It has several system enhancements that enable more control over the adiabatic quantum computing process allowing it to solve larger and more complex optimization problems than were previously possible.
Reproduced from https://github.com/PennyLaneAI?language=python
Reproduced from https://pennylane.ai/qml/whatisqml.html
Quantum machine learning is a research area that explores the interplay of ideas from quantum computing and machine learning.
For example, we might want to find out whether quantum computers can speed up the time it takes to train or evaluate a machine learning model. On the other hand, we can leverage techniques from machine learning to help us uncover quantum error-correcting codes, estimate the properties of quantum systems, or develop new quantum algorithms.
The limits of what machines can learn have always been defined by the computer hardware we run our algorithms on—for example, the success of modern-day deep learning with neural networks is enabled by parallel GPU clusters.
Quantum machine learning extends the pool of hardware for machine learning by an entirely new type of computing device—the quantum computer. Information processing with quantum computers relies on substantially different laws of physics known as quantum theory.
Some research focuses on ideal, universal quantum computers (“fault-tolerant QPUs”) which are still years away. But there is rapidly-growing interest in quantum machine learning on near-term quantum devices.
We can understand these devices as special-purpose hardware like Application-Specific Integrated Circuits (ASICs) and Field-Programmable Gate Arrays (FPGAs), which are more limited in their functionality.
In the modern viewpoint, quantum computers can be used and trained like neural networks. We can systematically adapt the physical control parameters, such as an electromagnetic field strength or a laser pulse frequency, to solve a problem.
For example, a trained circuit can be used to classify the content of images, by encoding the image into the physical state of the device and taking measurements.
But the story is bigger than just using quantum computers to tackle machine learning problems. Quantum circuits are differentiable, and a quantum computer itself can compute the change in control parameters needed to become better at a given task.
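One concrete realization of this idea is the parameter-shift rule: the gradient of a circuit's expectation value is obtained from two extra evaluations of the same circuit at shifted parameter values. A minimal NumPy sketch (simulated classically here; on hardware each expectation value would come from repeated measurements):

```python
import numpy as np

def expval_z(theta: float) -> float:
    """<Z> after applying RX(theta) to |0>; analytically equals cos(theta)."""
    rx = np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                   [-1j * np.sin(theta / 2), np.cos(theta / 2)]])
    psi = rx @ np.array([1.0, 0.0])
    # <Z> = P(measure 0) - P(measure 1)
    return float(np.abs(psi[0]) ** 2 - np.abs(psi[1]) ** 2)

def parameter_shift_grad(theta: float) -> float:
    """Exact gradient from two circuit evaluations at theta +/- pi/2."""
    s = np.pi / 2
    return (expval_z(theta + s) - expval_z(theta - s)) / 2

theta = 0.7
print(parameter_shift_grad(theta))   # matches d/dtheta cos(theta) = -sin(0.7)
```

Unlike finite differences, the shift is large (pi/2) and the result is exact for gates of this form, which is what makes the rule practical on noisy hardware.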
Differentiable programming is the very basis of deep learning, implemented in software libraries such as TensorFlow and PyTorch. Differentiable programming is more than deep learning: it is a programming paradigm where the algorithms are not hand-coded, but learned.
Similarly, the idea of training quantum computers is larger than quantum machine learning. Trainable quantum circuits can be leveraged in other fields like quantum chemistry or quantum optimization. They can help in a variety of applications such as the design of quantum algorithms, the discovery of quantum error correction schemes, and the understanding of physical systems.
PennyLane is an open-source software framework built around the concept of quantum differentiable programming. It seamlessly integrates classical machine learning libraries with quantum simulators and hardware, giving users the power to train quantum circuits.
To find out more, visit the PennyLane Documentation, or check out the gallery of hands-on quantum machine learning demonstrations.
Reproduced from https://www.geeksforgeeks.org/the-ultimate-guide-to-quantum-machine-learning-the-next-big-thing/
The Ultimate Guide to Quantum Machine Learning – The Next Big Thing
Innovation in machine learning is far from complete. In fact, things are just about to take a ‘quantum leap’ for the good, when the world of quantum physics and machine learning come together to solve even more advanced problems through intelligent computing. That’s right, Heisenberg’s Uncertainty Principle and the famous Schrödinger’s Cat could help develop advanced quantum machine learning systems that are capable of accelerating current machine learning models so that they work even faster, as well as help develop entirely new machine learning models that could do unprecedented things. Although it will be a while before quantum machine learning goes mainstream, as of now almost all the tech giants, like IBM, Microsoft and NASA, are already getting on board with this fascinating new tech.
Quantum machine learning is an interdisciplinary approach that combines machine learning and the principles of quantum physics. To understand this, let’s take a look at some of the basic concepts in quantum physics that are at play here –
Quantum:
Physicist Max Planck in 1900 proposed that at the subatomic level, energy is contained in tiny discrete packets called quanta, which behave as both waves and particles, depending on their environment at the time. The basis of quantum theory relies on the observation that at any point in time, these particles could be in any state and may change their state.
Qubits:
The classical computing methods we use today work on chips that process all data using two bit values, 0 and 1. Even the most complex data or algorithm you input gets broken down into these two values. Quantum machine learning, on the other hand, uses the unit ‘qubit’, short for quantum bit. Physically, a qubit can be realized by a quantum system such as the spin of an electron or the polarization of a photon.
Superposition:
These quantum particles or qubits may exist as both 0 and 1 at the same time. This is a phenomenon known as superposition. Essentially, this means that a particle can exist in multiple quantum states; when placed under observation, i.e. when we try to measure it, it undergoes change and its superposition is lost.
Entanglement:
Different qubits can interact in such a way that the state of one particle cannot be described independently of the others. Even when the particles are separated by a large distance, their measurement outcomes remain correlated.
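A two-qubit simulation makes this correlation visible. The NumPy sketch below (a textbook illustration, not tied to any specific device) prepares a Bell state and samples joint measurements, which always agree:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1., 0, 0, 0],                # flips qubit 1 iff qubit 0 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Prepare the Bell state (|00> + |11>) / sqrt(2): H on qubit 0, then CNOT.
state = CNOT @ np.kron(H, I2) @ np.array([1., 0, 0, 0])

# Sample 1000 joint measurements from the outcome distribution |amplitude|^2.
rng = np.random.default_rng(1)
outcomes = rng.choice(4, size=1000, p=state ** 2)

print(sorted({int(o) for o in outcomes}))  # [0, 3]: only 00 and 11 occur
```

The two qubits individually look random (each is 0 or 1 with probability 1/2), yet their results match every single time; this correlation has no classical bit-level analogue.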
Understanding the quantum physics of matter can help develop new special-purpose hardware or quantum computers that are superior to the ones we have right now in terms of how much data they can process per second and the kind of computing they can accomplish. Quantum computers offer the immense computational advantage of being able to classify data in very high-dimensional feature spaces, a feat impractical on classical computers. Using the above described principles of superposition and entanglement, these devices pack in an incredible amount of computational power.
If you are already in awe of hardware such as ASICs (application-specific integrated circuits) and FPGAs (field-programmable gate arrays) to facilitate machine learning, prepare to experience performance of a much higher order with quantum machine learning. Quantum chips can be used to map out phenomenal computer algorithms to solve complex problems. While quantum computing proponents make promising advances into arenas of creating new chemicals and drugs with this technology, machine learning aficionados are looking into a future where complex algorithms can map out the brain circuitry, decode the genetic makeup, build a specialized infrastructure that combines biometrics and IoT devices to enable high-level security, and even unlock some phenomenal new discoveries about the vast mysterious universe. Yes, quantum machine learning could facilitate mapping out trillions of neurons firing in our brain at the same time.
Some of the current machine learning processes that can be accelerated by quantum machine learning are –
Linear Algebra:
When it comes to executing linear algebra computations, quantum computers can offer an exponential speedup. A quantum gate can, in effect, apply an exponentially large matrix to an equally large state vector in a single operation, helping build machine learning models out of quantum algorithms. This significantly brings down the costs as well as the time associated with linear algebra computations.
Optimization:
Be it physicists, chemists or data scientists, everyone is trying to find a way to the point of lowest energy in a high-dimensional energy landscape. In the world of adiabatic quantum computing and quantum annealing, optimization is everyone’s priority. Quantum machine learning can have a strong footprint in optimization, which also happens to be one of the first tasks physicists attempted in the context of quantum machine learning.
Kernel Evaluation:
Quantum machine learning can be used to perform kernel evaluation by feeding estimates from a quantum computer into a standard kernel method. While the training and inference of the model still happen in a standard support vector machine, using special-purpose quantum hardware to estimate the kernel could help accelerate the process. As the feature space expands, kernel functions in classical computing become computationally expensive to estimate. This is where quantum algorithms step in: quantum properties like entanglement and interference help create a massive quantum state space that can hugely improve kernel evaluation.
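A toy version of this pipeline: encode each data point into a quantum state with a simple angle-encoding feature map (an illustrative choice, not the only one), and fill the kernel matrix with state overlaps. Here the overlap is computed exactly with NumPy; a quantum computer would estimate it by repeated measurement:

```python
import numpy as np

def feature_map(x: float) -> np.ndarray:
    """Angle-encode a scalar into a single-qubit state RY(x)|0>."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x: float, y: float) -> float:
    """Kernel entry as the state overlap |<phi(x)|phi(y)>|^2."""
    return float(np.abs(feature_map(x) @ feature_map(y)) ** 2)

xs = [0.1, 1.2, 2.5]
gram = np.array([[quantum_kernel(a, b) for b in xs] for a in xs])
print(np.round(gram, 3))   # symmetric Gram matrix, ones on the diagonal
```

A classical SVM can then consume such a Gram matrix directly (e.g. scikit-learn's SVC with kernel="precomputed"), keeping training and inference classical exactly as the text describes.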
Deep Learning:
Deep learning is one of the most impactful applications of machine learning and artificial intelligence in recent times. Quantum computers could make deep learning a whole lot more profound by solving complex problems that are intractable on classical computers. In an experiment to train a deep Boltzmann machine, researchers from Microsoft used quantum models and found that they could not only train the Boltzmann machine faster but also achieve a much more comprehensive deep learning framework than a classical computer could ever yield.
Conclusion –
The true potential of quantum machine learning will begin to see fruition a few years from now, but significant progress is already being made in that direction. High-quality quantum machine learning algorithms will enable scientists to develop whole new methods to improve lives and facilitate solutions that are so far only imagined.
Reproduced from GitHub: https://github.com/krishnakumarsekar/awesome-quantum-machine-learning
A curated list of awesome quantum machine learning algorithms, study materials, libraries and software (by language).
Reproduced from GitHub: https://github.com/artix41/awesome-quantum-ml
A list of awesome papers and cool resources in the field of quantum machine learning (machine learning algorithms running on quantum devices). It does not include the use of classical ML algorithms for quantum purpose.
Variational circuits are quantum circuits with variable parameters that can be optimized to compute a given function. They can for instance be used to classify or predict properties of quantum and classical data, sample over complicated probability distributions (as generative models), or solve optimization and simulation problems.
Quantum circuits that are used to extract features from data or to improve kernel-based ML algorithms in general:
Kingdom of Ewin Tang: papers showing that a given quantum machine learning algorithm does not lead to any improved performance compared to a classical equivalent (either asymptotically or including constant factors):
Reproduced from https://github.com/tensorflow/quantum
TensorFlow Quantum (TFQ) is a Python framework for hybrid quantum-classical machine learning that is primarily focused on modeling quantum data. TFQ is an application framework developed to allow quantum algorithms researchers and machine learning applications researchers to explore computing workflows that leverage Google’s quantum computing offerings, all from within TensorFlow.
Quantum computing at Google has hit an exciting milestone with the achievement of Quantum Supremacy. In the wake of this demonstration, Google is now turning its attention to developing and implementing new algorithms to run on its Quantum Computer that have real world applications.
To provide users with the tools they need to program and simulate a quantum computer, Google is working on Cirq. Cirq is designed for quantum computing researchers who are interested in running and designing algorithms that leverage existing (imperfect) quantum computers.
TensorFlow Quantum provides users with the tools they need to interleave quantum algorithms and logic designed in Cirq with the powerful and performant ML tools from TensorFlow. With this connection we hope to unlock new and exciting paths for Quantum Computing research that would not have otherwise been possible.
See the installation instructions.
All of our examples can be found here in the form of Python notebook tutorials.
Report bugs or feature requests using the TensorFlow Quantum issue tracker.
We also have a Stack Overflow tag for more general TFQ related discussions.
In the meantime check out the install instructions to get the experimental code running!
We are eager to collaborate with you! TensorFlow Quantum is still a very young code base; if you have ideas for features that you would like added, feel free to check out our Contributor Guidelines to get started.
If you use TensorFlow Quantum in your research, please cite:
TensorFlow Quantum: A Software Framework for Quantum Machine Learning arXiv:2003.02989, 2020.
QML is a Python2/3-compatible toolkit for representation learning of properties of molecules and solids.
Until the preprint is available from arXiv, please cite this GitHub repository as:
AS Christensen, LA Bratholm, FA Faber, B Huang, A Tkatchenko, KR Muller, OA von Lilienfeld (2017) "QML: A Python Toolkit for Quantum Machine Learning" https://github.com/qmlcode/qml
Documentation and installation instructions are found at: http://www.qmlcode.org/
QML is freely available under the terms of the MIT license.
Latest papers with code. Reproduced from https://paperswithcode.com/task/quantum-machine-learning/latest