Quantum Machine Learning
The Dynex platform integrates neuromorphic computing with popular machine learning frameworks such as TensorFlow and PyTorch, enabling hybrid models and advanced learning techniques such as transfer and federated learning. Examples include the Quantum-Support-Vector-Machine (QSVM) and mode-assisted unsupervised learning of restricted Boltzmann machines (MA-QRBM), which demonstrate how neuromorphic dynamics can be applied to problems in optimization, sampling, and machine learning. The platform also supports Quantum-Boltzmann-Machines (QBMs), generative models that use quantum annealing to sample from probability distributions. These integrations are backed by the scientific references cited with each example below, demonstrating the potential of neuromorphic computing to enhance machine learning and optimization tasks.
Quantum Self-Attention Transformer
The Quantum Self-Attention Transformer applies the principles of quantum computing to tasks typically handled by classical transformers, particularly in natural language processing (NLP) and large language models (LLMs). Classical transformers rely heavily on the self-attention mechanism, which allows the model to weigh the importance of different words in a sentence when making predictions or generating new text.
The quantum self-attention transformer circuit is designed to process word embeddings derived from sentences and generate new sentences based on quantum operations. The circuit begins by embedding the binary representation of input vectors into quantum states, followed by multiple layers of rotation and controlled gates to capture complex relationships between the inputs. After applying the Quantum Fourier Transform (QFT) and Grover's operator, the circuit uses a combination of Hadamard, T, and rotation gates to further process the information. The final output is a set of expectation values, which are processed with softmax to generate attention-weighted outputs. These outputs are then used to generate a new sentence by combining the embeddings with word vectors similar to the quantum-generated outputs, ensuring a coherent and contextually relevant sentence.
This circuit essentially uses quantum computation to perform the role of attention in a transformer model, which is crucial for tasks like natural language processing, where understanding the importance of different words in a sentence is key to generating meaningful text. The quantum approach aims to leverage the potential speedups and parallelism inherent in quantum computing to perform these tasks more efficiently.
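As a rough, self-contained illustration of this flow, the sketch below builds a toy version of such a circuit in Python, using Qiskit as a stand-in simulator: scalar token embeddings are angle-encoded as single-qubit rotations, mixed with controlled rotations and a Quantum Fourier Transform, post-processed with Hadamard, T and rotation gates, and the per-qubit Z expectation values are passed through a softmax to produce attention weights. Grover's operator and the sentence-generation step are omitted, and all gate choices, angles and function names here are illustrative assumptions rather than the actual Dynex circuit.

```python
import numpy as np
from qiskit import QuantumCircuit
from qiskit.circuit.library import QFT
from qiskit.quantum_info import Pauli, Statevector

def quantum_attention_weights(embeddings):
    """Toy attention weights: one qubit per token, rotation-encoded (illustrative only)."""
    n = len(embeddings)
    qc = QuantumCircuit(n)
    # Angle-encode each scalar token embedding as an RY rotation.
    for i, x in enumerate(embeddings):
        qc.ry(float(x), i)
    # Entangling layer of controlled rotations to mix token information.
    for i in range(n - 1):
        qc.cry(np.pi / 4, i, i + 1)
    # Quantum Fourier Transform followed by Hadamard / T / rotation gates.
    qc.compose(QFT(n), range(n), inplace=True)
    for i in range(n):
        qc.h(i)
        qc.t(i)
        qc.rz(np.pi / 8, i)
    # Read out <Z> on every qubit and softmax the values into attention weights.
    sv = Statevector.from_instruction(qc)
    z = np.array([sv.expectation_value(Pauli("Z"), [i]).real for i in range(n)])
    return np.exp(z) / np.exp(z).sum()

print(quantum_attention_weights([0.3, 1.1, 2.0, 0.7]))
```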
Quantum Natural Language Processing (QNLP)
Harnessing the power of quantum computing, quantum natural language processing algorithms offer distinct advantages in language processing tasks. Quantum algorithms can process large amounts of data simultaneously, enabling faster and more efficient language analysis. By leveraging quantum superposition and entanglement, the algorithm can explore multiple linguistic features in parallel, leading to more accurate and nuanced language understanding. Additionally, quantum computing's ability to handle complex, high-dimensional linguistic structures allows deeper semantic meaning to be extracted from text data. With these advancements, the Dynex QNLP algorithm aims to enable more sophisticated text analysis and comprehension.
- YouTube: "Introducing Quantum Natural Language Processing on Dynex", a video showcasing an end-to-end process of collecting data from websites, training the QNLP model on Dynex, and communicating with the resulting ChatGPT-style bot in real time
Quantum Transformer Algorithm on Dynex (QTRANSFORM)
Transformers are a type of deep learning model that have revolutionized the field of artificial intelligence, particularly in natural language processing tasks. They excel at handling sequential data and understanding context over long sequences, enabling advancements in machine translation, text generation, and more. The discovery of a quantum transformer algorithm by Dynex represents a significant leap forward, combining the power of transformers with the unparalleled computational capabilities of quantum computing. This hybrid approach promises even faster processing speeds and enhanced performance, making it possible to tackle complex AI problems more efficiently than ever before. By leveraging quantum principles, such as superposition and entanglement, our quantum transformer algorithm can process vast amounts of data simultaneously, leading to more accurate and nuanced AI models, thereby pushing the boundaries of what AI can achieve.
Google TensorFlow
The Dynex Neuromorphic TensorFlow layer can be used in any neural network model. Welcome to hybrid models and to neuromorphic, transfer, and federated learning with TensorFlow.
- Example: Quantum-Restricted-Boltzmann-Machine (QRBM for Tensorflow) | Scientific background: Mode-assisted unsupervised learning of restricted Boltzmann machines, Communications Physics, volume 3, Article number: 105 (2020)
- Example: Quantum-Support-Vector-Machine (TensorFlow) on Dynex | Scientific background: Rounds, Max and Phil Goddard. “Optimal feature selection in credit scoring and classification using a quantum annealer.” (2017)
PyTorch
The Dynex Neuromorphic Torch layer can be used in any neural network model. Welcome to hybrid models and to neuromorphic, transfer, and federated learning with PyTorch; a minimal sketch of this hybrid pattern follows the examples below.
- Example: Quantum-Boltzmann-Machine (PyTorch) on Dynex | Scientific background: Dixit V, Selvarajan R, Alam MA, Humble TS and Kais S (2021) Training Restricted Boltzmann Machines With a D-Wave Quantum Annealer. Front. Phys. 9:589626. doi: 10.3389/fphy.2021.589626; Sleeman, Jennifer, John E. Dorband and Milton Halem. “A Hybrid Quantum enabled RBM Advantage: Convolutional Autoencoders For Quantum Image Compression and Generative Learning.” Defense + Commercial Sensing (2020)
- Example: Quantum-Support-Vector-Machine (PyTorch) on Dynex | Scientific background: Rounds, Max and Phil Goddard. “Optimal feature selection in credit scoring and classification using a quantum annealer.” (2017)
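The snippet below is a minimal sketch of the hybrid/transfer-learning pattern the examples above follow; it is not the Dynex Torch layer API. An RBM whose parameters are assumed to come from an annealing-based pre-training step (random placeholders named W_rbm and b_rbm here, purely for illustration) serves as a frozen feature extractor in front of a classical, gradient-trained PyTorch head.

```python
import torch
import torch.nn as nn

# Hypothetical illustration only: W_rbm / b_rbm stand in for RBM parameters
# obtained from an annealing-based pre-training step (random placeholders here
# so the sketch runs on its own); the names are not Dynex SDK API.
nv, nh, n_classes = 16, 8, 2
W_rbm = 0.1 * torch.randn(nv, nh)
b_rbm = torch.zeros(nh)

class RBMFeatureLayer(nn.Module):
    """Frozen feature extractor: hidden-unit activation probabilities of a
    pre-trained RBM, feeding a classical classifier head (transfer learning)."""
    def __init__(self, W, b):
        super().__init__()
        self.W = nn.Parameter(W, requires_grad=False)
        self.b = nn.Parameter(b, requires_grad=False)

    def forward(self, v):
        return torch.sigmoid(v @ self.W + self.b)

model = nn.Sequential(
    RBMFeatureLayer(W_rbm, b_rbm),   # pre-trained, frozen feature layer
    nn.Linear(nh, n_classes),        # classical, gradient-trained head
)

# One ordinary PyTorch training step on top of the frozen layer.
x = torch.bernoulli(torch.full((32, nv), 0.5))
target = torch.randint(0, n_classes, (32,))
opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=1e-2)
loss = nn.CrossEntropyLoss()(model(x), target)
opt.zero_grad()
loss.backward()
opt.step()
print("loss:", loss.item())
```

In the referenced examples, the frozen layer's parameters would come from the QRBM/QBM training notebooks rather than from random initialisation as in this sketch.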
Mode-Assisted QRBM
The integration of neuromorphic computing into the Dynex platform signifies a transformative step in computational technology, particularly in machine learning and optimization. The platform leverages the unique attributes of neuromorphic dynamics, utilising neuromorphic annealing, a technique that departs from conventional computing methods, to address intricate problems in discrete optimization, sampling, and machine learning.
- Example: Mode-assisted unsupervised learning of restricted Boltzmann machines (MA-QRBM for Pytorch) | Scientific background: Mode-assisted unsupervised learning of restricted Boltzmann machines, Communications Physics, volume 3, Article number: 105 (2020); Advancements in Unsupervised Learning: Mode-Assisted Quantum Restricted Boltzmann Machines Leveraging Neuromorphic Computing on the Dynex Platform; Adam Neumann, Dynex Developers; International Journal of Bioinformatics Intelligent Computing. 2024; Volume 3(1):91-103, ISSN 2816-8089; Quantum Frontiers on Dynex: Elevating Deep Restricted Boltzmann Machines with Quantum Mode-Assisted Training; Adam Neumann, Dynex Developers; 116660843, Academia.edu; 2024
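Conceptually, mode-assisted training replaces the Gibbs-sampled negative phase of contrastive divergence on selected updates with the mode (lowest-energy state) of the RBM's joint energy E(v, h) = -a·v - b·h - v·W·h, which is itself a QUBO and can therefore be handed to a QUBO solver. The sketch below shows one simplified mode-assisted update using dimod, with an exhaustive classical solver standing in for the neuromorphic sampler; the mixing schedule between mode updates and ordinary contrastive-divergence updates described in the paper is omitted, and all sizes and learning rates are illustrative assumptions.

```python
import numpy as np
import dimod

# Toy RBM; the joint energy E(v, h) = -a.v - b.h - v.W.h is itself a QUBO,
# so its ground state (the mode) can be requested from a QUBO solver.
rng = np.random.default_rng(0)
nv, nh = 6, 4
W = 0.1 * rng.standard_normal((nv, nh))
a = np.zeros(nv)
b = np.zeros(nh)

def rbm_mode(W, a, b):
    bqm = dimod.BinaryQuadraticModel("BINARY")
    for i in range(len(a)):
        bqm.add_variable(f"v{i}", -a[i])
    for j in range(len(b)):
        bqm.add_variable(f"h{j}", -b[j])
    for i in range(len(a)):
        for j in range(len(b)):
            bqm.add_interaction(f"v{i}", f"h{j}", -W[i, j])
    # Exhaustive classical solver as a stand-in; on Dynex the BQM would be
    # submitted to the neuromorphic sampler instead.
    best = dimod.ExactSolver().sample(bqm).first.sample
    v = np.array([best[f"v{i}"] for i in range(len(a))], dtype=float)
    h = np.array([best[f"h{j}"] for j in range(len(b))], dtype=float)
    return v, h

# One simplified mode-assisted update on a single binary training vector:
# the mode replaces the Gibbs-sampled negative phase of contrastive divergence.
v_data = rng.integers(0, 2, nv).astype(float)
h_data = 1.0 / (1.0 + np.exp(-(b + v_data @ W)))   # positive phase
v_mode, h_mode = rbm_mode(W, a, b)                 # negative phase from the mode
lr = 0.05
W += lr * (np.outer(v_data, h_data) - np.outer(v_mode, h_mode))
a += lr * (v_data - v_mode)
b += lr * (h_data - h_mode)
```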
Quantum-RBM
QBMs are quantum analogues of classical Boltzmann Machines, which are generative models used for unsupervised learning. QBMs employ quantum annealing to sample from a probability distribution and learn patterns and structures in the data.
- Example: Quantum-Boltzmann-Machine Implementation on Dynex | Scientific background: Dixit V, Selvarajan R, Alam MA, Humble TS and Kais S (2021) Training Restricted Boltzmann Machines With a D-Wave Quantum Annealer. Front. Phys. 9:589626. doi: 10.3389/fphy.2021.589626; Sleeman, Jennifer, John E. Dorband and Milton Halem. “A Hybrid Quantum enabled RBM Advantage: Convolutional Autoencoders For Quantum Image Compression and Generative Learning.” Defense + Commercial Sensing (2020)
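To make the sampling step concrete, the sketch below writes a toy RBM's joint energy as a dimod binary quadratic model and computes the Boltzmann-weighted model expectation <v_i h_j> needed for the negative phase of RBM training. The toy model is small enough to enumerate exactly with dimod's ExactSolver; an annealing-based sampler would instead return a finite set of low-energy samples whose empirical statistics approximate the same expectation. Unit temperature and all parameter scales are illustrative assumptions.

```python
import numpy as np
import dimod

# Toy RBM with energy E(v, h) = -a.v - b.h - v.W.h, written as a binary
# quadratic model over the joint (v, h) configuration.
rng = np.random.default_rng(2)
nv, nh = 4, 3
W = 0.2 * rng.standard_normal((nv, nh))
a = 0.1 * rng.standard_normal(nv)
b = 0.1 * rng.standard_normal(nh)

bqm = dimod.BinaryQuadraticModel("BINARY")
for i in range(nv):
    bqm.add_variable(f"v{i}", -a[i])
for j in range(nh):
    bqm.add_variable(f"h{j}", -b[j])
for i in range(nv):
    for j in range(nh):
        bqm.add_interaction(f"v{i}", f"h{j}", -W[i, j])

# Small enough to enumerate every joint state; an annealing-based sampler
# would instead return low-energy samples whose frequencies approximate the
# same Boltzmann weights (temperature 1 assumed here).
ss = dimod.ExactSolver().sample(bqm)
labels = list(ss.variables)
vi = [labels.index(f"v{i}") for i in range(nv)]
hj = [labels.index(f"h{j}") for j in range(nh)]
samples = ss.record.sample.astype(float)   # rows aligned with ss.record.energy
logw = -ss.record.energy
p = np.exp(logw - logw.max())
p /= p.sum()

# Model expectation <v_i h_j> used in the negative phase of RBM training.
vh_model = np.einsum("s,si,sj->ij", p, samples[:, vi], samples[:, hj])
print(vh_model)
```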
Quantum-SVM
QSVM is a quantum-inspired algorithm that classifies data using a quantum kernel function. It leverages the concepts of quantum superposition and quantum feature mapping to potentially provide computational advantages over classical SVM algorithms in certain scenarios.
- Example: Quantum-Support-Vector-Machine Implementation on Dynex | Scientific background: Rounds, Max and Phil Goddard. “Optimal feature selection in credit scoring and classification using a quantum annealer.” (2017)
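One established way to train such a classifier on an annealing platform is to binary-encode the SVM dual coefficients and minimise the resulting QUBO, as done in several annealing-based SVM studies; the sketch below shows that formulation on a toy data set in Python with dimod, using an exhaustive classical solver as a stand-in for the Dynex sampler. The RBF kernel, bit depth, and penalty weight xi are illustrative assumptions, not parameters taken from the referenced paper, and this is not necessarily the formulation used in the Dynex example.

```python
import numpy as np
import dimod

# Tiny toy data set: two blobs in 2-D with labels y in {-1, +1}.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.0, 0.3, (3, 2)), rng.normal(+1.0, 0.3, (3, 2))])
y = np.array([-1, -1, -1, +1, +1, +1], dtype=float)
N = len(y)

gamma = 1.0
K = np.exp(-gamma * np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))  # RBF kernel

# Encode each dual coefficient alpha_n = sum_k 2^k * a_{n,k} with `bits` binary
# variables and minimise
#   1/2 sum_nm alpha_n alpha_m y_n y_m K_nm - sum_n alpha_n
#   + xi * (sum_n alpha_n y_n)^2        (penalty for the equality constraint)
bits, xi = 2, 5.0
w = np.array([2.0 ** k for k in range(bits)])
P = N * bits

Q = {}
for n in range(N):
    for m in range(N):
        for k in range(bits):
            for l in range(bits):
                p, q = n * bits + k, m * bits + l
                Q[(p, q)] = Q.get((p, q), 0.0) + w[k] * w[l] * y[n] * y[m] * (0.5 * K[n, m] + xi)
for p in range(P):
    Q[(p, p)] = Q.get((p, p), 0.0) - w[p % bits]

# Exhaustive classical solver as a stand-in for submitting the QUBO to Dynex.
bqm = dimod.BinaryQuadraticModel.from_qubo(Q)
best = dimod.ExactSolver().sample(bqm).first.sample
alpha = np.array([sum(w[k] * best[n * bits + k] for k in range(bits)) for n in range(N)])

# Recover the bias from the support vectors and check the training predictions.
sv = alpha > 0
bias = np.mean(y[sv] - (alpha * y) @ K[:, sv]) if sv.any() else 0.0
pred = np.sign((alpha * y) @ K + bias)
print("alpha:", alpha, " training accuracy:", np.mean(pred == y))
```

Increasing the number of encoding bits refines the admissible values of the dual coefficients at the cost of more binary variables in the QUBO.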