Chronological feed of everything captured from Maria Schuld.
youtube / mariaschuld / 4d ago
Experts in quantum machine learning (QML) discuss the field's current state, emphasizing the need for critical evaluation beyond "speedups." Key areas of focus include the potential for QML algorithms to process quantum data directly, the importance of rigorous theoretical foundations, and the necessity of interdisciplinary collaboration. The discussion highlights the uncertainty surrounding near-term commercial applications, instead pointing to scientific discovery and tool development as more attainable goals.
quantum-machine-learning, quantum-computing, qiskit-summer-school, scientific-research, theoretical-computer-science, career-development, quantum-algorithms
“Profound uncertainty currently exists regarding the actual effectiveness and application of quantum machine learning, even for classical machine learning, demonstrating a significant knowledge gap.”
paper / mariaschuld / 17d ago
Spectral methods, which manipulate the Fourier spectrum of ML models for learning and regularization, align naturally with quantum computing capabilities like the Quantum Fourier Transform. Representing generative models as quantum states enables efficient spectrum manipulation unavailable classically. These methods underpin ML successes, including spectral bias in deep learning, SVM Fourier regularization, and CNN Fourier filters, suggesting quantum approaches could yield more direct, resource-efficient model design.
quantum-machine-learning, spectral-methods, quantum-fourier-transform, spectral-bias, quantum-advantage, machine-learning-theory
“Quantum Fourier Transform enables manipulation of the Fourier spectrum of a quantum state representing a generative ML model using quantum routines.”
paper / mariaschuld / 19d ago
Quantum computers exploit super-exponential speedup in the Quantum Fourier Transform (QFT) over the symmetric group to encode exact non-Abelian harmonic analysis models for permutation-structured data. These models, intractable classically, capture permutation correlations via group Fourier spectra, powering Markov chains with diffusion and Bayesian conditioning steps. The approach targets applications like multi-object tracking and recommendation systems, marking an initial step toward practical non-Abelian QFT utility.
quantum-computing, quantum-fourier-transform, permutations, probabilistic-modeling, machine-learning, non-abelian-harmonic-analysis, spectral-methods
“Quantum computers provide a super-exponential speedup for Fourier transform over the symmetric group.”
paper / mariaschuld / 26d ago
The paper extends the hidden cut algorithm, which detects fully unentangled qubit partitions using Shor's hidden subgroup method, to identify weakly entangled registers through approximate symmetries. It establishes a rigorous link between the hidden cut algorithm's output distribution and a reward function quantifying cut quality. Measurements on copies of the input state yield samples that reveal weak-entanglement patterns, broadening hidden subgroup applications beyond cryptography.
quantum-algorithms, hidden-subgroup, entanglement-detection, quantum-heuristics, shor-algorithm, quantum-physics
“The hidden cut algorithm detects partitions into fully unentangled qubit registers using a Shor-type quantum algorithm for hidden subgroup problems.”
paper / mariaschuld / Jan 20
Group Fourier analysis decomposes quantum states into irreducible representations (irreps) of a symmetry group to assess resourcefulness in compact Lie-group quantum resource theories (QRTs). The family of Stratonovich-Weyl quantum phase space (QPS) representations, parameterized by the Cahill-Glauber parameter s, functions as a tunable group Fourier filter: s=-1 emphasizes low-dimensional irreps dominant in free states, s=0 preserves the spectrum, and s=1 highlights high-dimensional irreps indicative of resources. QPS spectra are fully characterized by norms of free-state Fourier components, with an s-duality linking free and Haar-random resourceful state spectra.
quantum-phase-space, group-fourier-analysis, quantum-resource-theory, stratonovich-weyl, lie-groups, quantum-filtering
“Changing the Cahill-Glauber parameter s in Stratonovich-Weyl QPS representations implements a continuously tunable group Fourier filter for quantum resources.”
paper / mariaschuld / Aug 30
The paper reinterprets the Hidden Subgroup Problem (HSP) as a machine learning task by replacing quantum oracles with classical training data. It leverages the Quantum Fourier Transform (QFT) from HSP solutions to identify invariant subspaces where hidden symmetries are exposed. An inference principle is proposed that compares data to these subspaces via quantum state overlaps, aiming to develop quantum heuristics that exploit symmetries for generalization from finite samples.
quantum-fourier-transform, hidden-subgroup-problem, quantum-machine-learning, quantum-algorithms, inference-invariance, quantum-heuristics
“The standard quantum algorithm for the Hidden Subgroup Problem uses the Quantum Fourier Transform to expose an invariant subspace manifesting the hidden symmetry.”
paper / mariaschuld / Mar 11
A large-scale benchmark using PennyLane tested 12 quantum machine learning models on 160 datasets from 6 binary classification tasks. Out-of-the-box classical ML models consistently outperformed quantum classifiers. Removing entanglement from quantum models yielded equal or better performance, questioning the necessity of quantum advantages in these small-scale settings.
quantum-machine-learning, benchmarking, quantum-advantage, pennylane, classical-vs-quantum, entanglement, arxiv-paper
“Out-of-the-box classical machine learning models outperform quantum classifiers overall.”
youtube / mariaschuld / Jan 23 / failed
paper / mariaschuld / Dec 6
Quantum models leverage non-commuting observables to capture order effects in data, such as varying human survey responses to question order swaps. A generative model of sequential learnable measurements adapts to tasks by reordering observables in a multi-task setup. Simulations on psychology-inspired datasets show increased non-commutativity learning with stronger order effects and improved generalization to unseen orders.
quantum-machine-learning, inductive-bias, non-commutativity, quantum-cognition, order-effects, multi-task-learning
“Non-commutativity of quantum observables can model order effects like changes in human survey responses when swapping question order.”
youtube / mariaschuld / Oct 16
Current QML research is overly reliant on physically-motivated variational circuits and small-scale benchmarks that may mask a lack of true quantum advantage. Evidence suggests that many 'quantum' successes are actually driven by classical feature engineering or separable circuits, necessitating a shift toward model design based on first principles—specifically leveraging quantum interference for structured problem solving (e.g., Hidden Subgroup Problems) rather than mimicking deep learning architectures.
quantum-machine-learning, quantum-computing, algorithm-design, benchmarking, research-strategy, quantum-advantage
“Existing Quantum Machine Learning (QML) benchmarks often suffer from a 'positivity bias' and lack scalability insights.”
youtube / mariaschuld / May 31
Current quantum machine learning research exhibits problematic patterns: superficial quantum adaptations of classical ML models, overemphasis on exponential speedups in unsuitable contexts, and a positivity bias in reporting benchmarks on small datasets. A new research agenda is needed to focus on the unique statistical properties of quantum systems, develop models with intrinsic quantum intelligence, and adopt rigorous, reproducible methodologies that scale to relevant problem sizes.
quantum-machine-learning, quantum-computing, ml-research-strategy, quantum-algorithms, deep-learning-critique, research-methodology
“The immediate goal of quantum machine learning should not be to demonstrate practical quantum advantage as a business model.”
github_readme / mariaschuld / Apr 4
Speaker landscapes generate vector embeddings for speakers by prefixing their names as 'agent_' tokens in text corpora and training word embeddings with Gensim (stored as KeyedVectors). The process involves cleaning raw JSON-lines data (lowercasing, punctuation removal, n-gram formation), training 250D embeddings, extracting the prefixed speaker vectors, and dimensionality reduction for visualization; a minimal sketch follows below. Analysis notebooks enable plotting landscapes, projecting onto semantic axes (e.g., anger words), and computing metadata like tweet lengths or emoji counts.
speaker-landscapes, word-embeddings, user-embeddings, nlp-techniques, gensim, data-preprocessing, landscape-analysis
“Speaker vectors are extracted from word embeddings where speaker names are prefixed with 'agent_'”
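A minimal sketch of the speaker-landscape pipeline summarized above, assuming a Gensim Word2Vec backend; the file name, JSON field names, and hyperparameters are illustrative placeholders, not taken from the repository:

```python
import json
import re

from gensim.models import Word2Vec

sentences = []
with open("posts.jsonl") as f:                                    # hypothetical JSON-lines corpus
    for line in f:
        record = json.loads(line)                                 # assumed fields: "speaker", "text"
        text = re.sub(r"[^\w\s]", " ", record["text"].lower())    # lowercase, strip punctuation
        tokens = ["agent_" + record["speaker"]] + text.split()    # prefix the speaker name as a token
        sentences.append(tokens)

# Train 250-dimensional embeddings; speaker vectors live alongside ordinary word vectors.
model = Word2Vec(sentences, vector_size=250, window=5, min_count=1, workers=4)

# Extract the 'agent_'-prefixed vectors for landscape plotting / dimensionality reduction.
speaker_vectors = {w: model.wv[w] for w in model.wv.index_to_key if w.startswith("agent_")}
```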
paper / mariaschuld / Mar 22
Researchers analyzed Reddit parenting subreddits (r/Daddit, r/Mommit, r/Parenting) using word embeddings augmented with usernames to quantify audience and gender effects on language. Fathers emphasize educational and family advice, mothers focus on medical care, sleep, potty training, and food, while mixed-gender r/Parenting covers broader topics; both genders share more about child appearances and events in single-gender forums. High self-monitors adapt more strongly to subreddit norms, especially in mixed settings, demonstrating individual differences in audience conformity.
word-embeddings, audience-effects, gender-differences, parenting-subreddits, social-network-analysis, individual-differences, nlp-analysis
“Fathers in r/Daddit focus more on advising others on educational and family matters.”
youtube / mariaschuld / Mar 2 / failed
paper / mariaschuld / Feb 2
Quantum contextuality provides a nonclassical inductive bias that enables quantum models to encode linearly conserved quantities in label spaces via operational equivalence. Contextual models are more expressive than noncontextual ones, as demonstrated in a toy problem of learning zero-sum game payoffs. Quantum models leveraging geometric techniques outperform classical surrogates on this task, suggesting contextuality as a source of quantum ML advantage.
quantum-machine-learning, contextuality, inductive-bias, quantum-advantage, arxiv-paper
“Contextual model classes encoding quantum contextuality inductive bias are generally more expressive than noncontextual counterparts”
paper / mariaschuld / Sep 12
Quantum machine learning models exhibit benign overfitting, generalizing well despite overfitting noisy training data, analogous to classical deep networks. The paper derives this behavior for classical interpolating Fourier-features models in noisy signal regression and maps it to quantum models, linking quantum circuit structures like data encoding to overparameterization. This enables quantum models to capture locally spiky interpolations of noise, facilitating generalization.
quantum-machine-learning, benign-overfitting, generalization, overparameterization, quantum-circuits, fourier-features
“Quantum machine learning models display benign overfitting by generalizing well while overfitting training data.”
paper / mariaschuld / Mar 2
Classical ML algorithms excel in practice but resist theoretical analysis, while quantum computing lacks scalable benchmarks and relies heavily on theory for relevance assessment. Current tools fail to reliably evaluate quantum computers' practical utility for ML tasks. The field should shift away from the dominant quantum-advantage narrative toward alternative research perspectives that avoid framing quantum ML as needing to outperform classical methods.
quantum-machine-learning, quantum-advantage, quantum-computing, machine-learning, arxiv-paper, quantum-physics
“Machine learning is listed among the most promising applications for quantum computing”
paper / mariaschuld / Feb 27
The framework introduces differentiable quantum transforms as metaprograms that manipulate quantum circuits while preserving differentiability. Implemented in PennyLane, these transforms are themselves differentiable and optimizable, enhancing tasks like gradient computation, circuit compilation, and error mitigation. This enables automated optimization of quantum program transformations to reduce resource demands across quantum computing applications.
quantum-computing, differentiable-transforms, pennylane, quantum-programming, gradient-computation, circuit-compilation, error-mitigation
“Differentiable quantum transforms are metaprograms that manipulate quantum programs while preserving differentiability”
youtube / mariaschuld / Feb 24
The pursuit of "quantum advantage" as the primary driver for quantum machine learning (QML) development may be misapplied. Current quantum computers (NISQ devices) operate under different paradigms than theoretical fault-tolerant quantum computers, making traditional complexity theory an unsuitable metric for evaluating practical QML applications. Over-reliance on quantum advantage could hinder relevant research and the development of truly useful QML applications.
quantum-machine-learning, quantum-advantage-critique, quantum-computing-applications, nisq-era, scientific-communication
“The concept of "quantum advantage" (e.g., polynomial or exponential speedup) may not be the optimal framework for evaluating quantum applications, particularly in the context of emerging quantum technologies.”
paper / mariaschuld / Nov 18
PennyLane introduces a differentiable Hartree-Fock solver for computing exact gradients of molecular Hamiltonians with respect to nuclear coordinates and basis set parameters. It provides specialized quantum chemistry operations like excitation gates as Givens rotations, circuit templates, and sparse matrix simulators for efficient Hamiltonian representation. These tools support differentiable variational algorithms for ground/excited-state energies and derivatives, enabling joint optimization of circuit parameters, nuclear geometries, and basis sets. The library uniquely combines quantum computing, chemistry, and ML for end-to-end differentiable quantum chemistry workflows.
quantum-chemistry, pennylane, differentiable-programming, hartree-fock, quantum-algorithms, quantum-computing, computational-chemistry
“PennyLane features a differentiable Hartree-Fock solver that computes exact gradients of molecular Hamiltonians w.r.t. nuclear coordinates and basis set parameters.”
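A minimal VQE-style sketch in the spirit of this paper, assuming PennyLane's qml.qchem module with its default backend; the molecule, geometry, and excitation choice are illustrative, and the full differentiable Hartree-Fock workflow (nuclear-coordinate and basis-set gradients) is richer than shown here:

```python
import pennylane as qml
from pennylane import numpy as np

# H2 in a minimal basis; coordinates in Bohr (illustrative geometry).
symbols = ["H", "H"]
coordinates = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 1.4])
H, n_qubits = qml.qchem.molecular_hamiltonian(symbols, coordinates)

dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def energy(theta):
    qml.BasisState(np.array([1, 1, 0, 0]), wires=range(n_qubits))  # Hartree-Fock reference state
    qml.DoubleExcitation(theta, wires=[0, 1, 2, 3])                # excitation gate (Givens rotation)
    return qml.expval(H)

theta = np.array(0.0, requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.4)
for _ in range(50):
    theta = opt.step(energy, theta)
print(energy(theta))
```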
youtube / mariaschuld / Oct 19
Quantum machine learning (QML) currently faces significant challenges, particularly the "barren plateau" problem, where gradients vanish in high-dimensional parameter spaces. This issue stems from the use of overly generic quantum models and unoptimized building blocks, such as Pauli rotations in variational circuits, without sufficient consideration for their suitability for machine learning tasks. Resolving these challenges requires a shift towards problem-aware circuit designs and potentially new theoretical frameworks that address the unique aspects of quantum information processing.
quantum-machine-learning, barren-plateaus, variational-quantum-algorithms, quantum-neural-networks, quantum-algorithms, generative-models, quantum-sensing
“The barren plateau problem in quantum machine learning largely arises from using generic quantum models and unoptimized building blocks.”
youtube / mariaschuld / Apr 29
Quantum neural networks (VQCs) are variational quantum circuits that map data to quantum states via nonlinear embeddings followed by linear operations, mathematically equivalent to kernel methods rather than to the composable linear-nonlinear chains of classical neural networks. Training cost scales quadratically with the number of parameters, since parameter-shift rules require two circuit evaluations per parameter for each gradient and the circuit itself grows with the parameter count, versus the near-linear scaling of classical automatic differentiation, rendering VQCs inefficient for large models. Circuit expressivity does not equate to model expressivity, ansatzes lack principled design, quantum advantages are ill-posed without precise learning-theoretic framing, and small-scale simulations on toy datasets provide limited insights.
quantum-machine-learning, variational-quantum-circuits, quantum-neural-networks, quantum-advantage, barren-plateaus, expressivity, kernel-methods
“Quantum neural networks are kernel methods, not analogous to classical neural networks”
youtube / mariaschuld / Apr 13
Maria Schuld entered quantum machine learning serendipitously during her PhD after failing to secure jobs in political science, leveraging freedom from low expectations to explore quantum walks in neural networks amid the field's nascent emergence. The field shifted from quantum computing researchers pursuing asymptotic speedups on classical ML tasks to practical implementations on noisy near-term devices, attracting industry players like Google and IBM, growing the community to thousands. She advises newcomers to start with accessible software frameworks like PennyLane rather than physics PhDs, emphasizing ML/data science backgrounds to inject fresh perspectives, while cautioning against overhyped quantum advantages and advocating creative, non-speedup-focused approaches.
quantum-machine-learning, career-path, field-growth, quantum-computing, machine-learning, research-community, personal-journey
“Quantum machine learning field emerged around 2011-2012, born from quantum computing researchers adapting classical algorithms for quantum speedups.”
youtube / mariaschuld / Apr 12
Variational quantum circuits function as machine learning models by encoding classical data into quantum states and performing trainable measurements, enabling composability and differentiability for seamless integration into PyTorch/TensorFlow pipelines via frameworks like PennyLane. This approach mirrors deep learning's non-convex gradient-based optimization while mathematically aligning with 1990s kernel methods, where data is mapped to high-dimensional feature spaces for convex optimization. Quantum advantage emerges in linear-time matrix inversion for kernel optimization, sidestepping classical quadratic scaling in big data regimes.
quantum-machine-learning, variational-circuits, pennylane, kernel-methods, quantum-computing, deep-learning-hybrid
“Variational quantum circuits are composable and differentiable, allowing integration into deep learning pipelines like PyTorch”
youtube / mariaschuld / Mar 11
Quantum machine learning replaces classical models in ML pipelines with parameterized quantum circuits, which are trained via parameter-shift rules for gradient computation and integrate seamlessly with frameworks like PyTorch and PennyLane. These circuits encode classical data into high-dimensional quantum feature spaces via feature maps, followed by trainable measurements. Mathematically, they reduce to kernel methods akin to support vector machines, enabling convex optimization in low-dimensional space despite exponential feature dimensions.
quantum-machine-learning, quantum-computing, kernel-methods, quantum-circuits, parameter-shift-rules, support-vector-machines
“Quantum computations can be trained like neural networks using standard deep learning pipelines”
youtube / mariaschuld / Mar 8
PennyLane integrates quantum circuits into automatic differentiation frameworks like JAX, PyTorch, and TensorFlow by providing gradients via parameter-shift rules, enabling gradient-based optimization of parameterized quantum circuits. This supports near-term noisy quantum devices through variational hybrid quantum-classical algorithms such as VQE for quantum chemistry, QAOA for optimization, and quantum machine learning models with data embeddings. Key challenges include barren plateaus in large circuits, emphasizing the importance of embedding strategies and understanding quantum kernels as state overlaps in Hilbert space.
pennylane, quantum-differentiable-programming, quantum-machine-learning, automatic-differentiation, near-term-quantum-computing, vqe, qaoa
“Differentiable programming uses automatic differentiation to compute gradients for optimizing parameters in high-dimensional spaces via gradient descent.”
paper / mariaschuld / Jan 26
Supervised quantum machine learning models, often mislabeled as quantum neural networks, are mathematically kernel methods that operate in high-dimensional Hilbert spaces accessible only through measurement-derived inner products. These models can be reframed as support vector machines with kernels computing distances between data-encoded quantum states. Kernel-based training outperforms or matches variational circuit training, emphasizing data encoding into quantum states as the key to any quantum advantage over classical ML.
quantum-machine-learning, kernel-methods, supervised-learning, quantum-circuits, support-vector-machines, arxiv-paper
“Supervised quantum machine learning models are mathematically kernel methods”
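A minimal sketch of the "quantum model as kernel method" viewpoint from this paper, assuming PennyLane and scikit-learn; the embedding, data, and labels are toy placeholders rather than the paper's setup:

```python
import pennylane as qml
from pennylane import numpy as np
from sklearn.svm import SVC

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def overlap(x1, x2):
    # k(x1, x2) = |<phi(x2)|phi(x1)>|^2, computed by applying the embedding and its adjoint
    qml.AngleEmbedding(x1, wires=range(n_qubits))
    qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

def kernel(x1, x2):
    return overlap(x1, x2)[0]            # probability of the all-zeros outcome

X = np.array([[0.1, 0.2], [0.4, 0.9], [0.8, 0.3], [0.9, 0.7]])
y = np.array([0, 0, 1, 1])

gram = np.array([[kernel(a, b) for b in X] for a in X])
clf = SVC(kernel="precomputed").fit(gram, y)    # classical SVM trained on the quantum kernel
```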
paper / mariaschuld / Aug 19
Parametrized quantum circuits for supervised learning act as partial Fourier series approximators, with accessible frequencies dictated by data encoding gates. Repeating simple encoding gates expands the frequency spectrum, enabling richer approximations. Models realizing all Fourier coefficients become universal function approximators if the spectrum is sufficiently rich.
quantum-machine-learning, variational-quantum-circuits, data-encoding, expressive-power, fourier-series, function-approximation, arxiv-paper
“Quantum models can be expressed as partial Fourier series in the data inputs.”
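A minimal single-qubit sketch of the Fourier-series picture, assuming PennyLane; repeating the Pauli-rotation encoding twice makes the model a degree-2 partial Fourier series in x, with the trainable rotations setting the coefficients. Gate choices are illustrative:

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def model(weights, x):
    qml.RY(weights[0], wires=0)
    qml.RZ(x, wires=0)          # encoding gate: contributes frequencies {-1, 0, 1}
    qml.RY(weights[1], wires=0)
    qml.RZ(x, wires=0)          # repeated encoding: spectrum grows to {-2, ..., 2}
    qml.RY(weights[2], wires=0)
    return qml.expval(qml.PauliZ(0))

weights = np.random.uniform(0, 2 * np.pi, 3)
xs = np.linspace(-np.pi, np.pi, 50)
ys = [model(weights, x) for x in xs]    # f(x) = sum_{n=-2}^{2} c_n exp(i n x), coefficients set by weights
```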
youtube / mariaschuld / Jun 19
Quantum machine learning (QML) intersects quantum computing and ML, distinguishing the four combinations of classical or quantum data processed on classical or quantum devices. Near-term QML treats quantum circuits as trainable parameterized models, emphasizing data encoding into quantum states as the critical step for generalization, akin to kernel methods mapping to high-dimensional feature spaces. PennyLane enables hybrid quantum-classical pipelines with automatic differentiation for optimization, highlighting software's role in accessible experimentation despite noisy hardware limitations.
quantum-machine-learning, pennylane, quantum-education, qml-theory, quantum-software, near-term-quantum, generalization
“The primary goal of machine learning, including QML, is generalization—performing well on unseen data—not provable speedups.”
youtube / mariaschuld / Jun 9
Quantum machine learning with variational circuits encodes classical data into exponentially large Hilbert spaces as a feature map, making this step 95% of the challenge since subsequent operations are linear transformations followed by quadratic measurement nonlinearity. Optimal encodings cluster same-class data points closely while separating classes, enabling simple linear models or measurements like Helstrom (optimal but not near-term) or fidelity-based alternatives to achieve high generalization. Hybrid classical-quantum pipelines, enabled by frameworks like PennyLane, learn these embeddings effectively even on small qubit counts, prioritizing generalization over classical speedups.
quantum-machine-learning, data-encoding, feature-maps, variational-circuits, quantum-kernels, pennylane, quantum-algorithms
“Data encoding into quantum states constitutes 95% of quantum machine learning effort”
paper / mariaschuld / May 18
Quantum machine learning (QML) is being explored for high energy physics (HEP) problems using noisy intermediate-scale quantum (NISQ) devices. Classical ML has long been applied in HEP, mainly via supervised classification at the analysis stage. This review covers initial QML ideas for HEP and anticipates broader future implementations.
quantum-machine-learning, high-energy-physics, quantum-computing, machine-learning, arxiv-paper, quant-ph, hep-ph
“Machine learning has been used in high energy physics for a long time, primarily at the analysis level with supervised classification.”
paper / mariaschuld / Jan 29
Quantum machine learning leverages superposition to encode parameter sets for parallel computation of classifier ensembles, analogous to classical methods. The accuracy-weighted quantum ensemble implementation is fully dequantizable, implying no inherent quantum advantage. However, the general framework encompasses the Deutsch-Jozsa algorithm, preserving potential for quantum speedups.
quantum-machine-learning, quantum-classifiers, quantum-ensembles, quantum-superposition, deutsch-jozsa, quantum-speedup, arxiv-paper
“Quantum ensembles store sets of classifier parameters in superposition for parallel evaluation”
paper / mariaschuld / Jan 10
Quantum classifiers consist of a feature map embedding classical data into Hilbert space followed by a trainable measurement. The proposed quantum metric learning trains the embedding to maximally separate classes under a specific metric (l1/trace or l2/Hilbert-Schmidt), making the optimal measurement analytically known: Helstrom for l1/trace distance, overlap for l2. This eliminates measurement training, conserving resources on near-term quantum devices and providing an analytic framework for quantum ML.
quantum-machine-learning, quantum-embeddings, quantum-feature-maps, metric-learning, quantum-classifiers, hilbert-space, helstrom-measurement
“Quantum classifiers comprise a quantum feature map encoding classical inputs into quantum states and a subsequent quantum measurement interpreted as the model output.”
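Standard definitions of the two metrics mentioned in this entry (textbook form, not the paper's notation), with ρ and σ the data-averaged states of the two classes:

```latex
D_{\mathrm{tr}}(\rho,\sigma) = \tfrac{1}{2}\,\mathrm{Tr}\,|\rho-\sigma|
\qquad \text{(optimal two-outcome measurement: Helstrom)}

D_{\mathrm{HS}}(\rho,\sigma) = \mathrm{Tr}\!\left[(\rho-\sigma)^{2}\right]
 = \mathrm{Tr}\,\rho^{2} + \mathrm{Tr}\,\sigma^{2} - 2\,\mathrm{Tr}\,\rho\sigma
\qquad \text{(estimable from state overlaps)}
```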
youtube / mariaschuld / Dec 28
Kernel methods in ML define similarity measures between data points via positive semi-definite functions, equivalent to inner products in a high-dimensional Hilbert feature space, enabling nonlinear classification through the kernel trick in algorithms like SVMs and Gaussian processes. Quantum theory mirrors this kernel structure due to shared Hilbert space foundations, allowing quantum feature maps to generate expressive kernels for hybrid quantum-classical ML. The equivalence enlarges the feature space so that data a linear model cannot separate in the original space can become linearly separable.
quantum-machine-learning, kernel-methods, quantum-kernels, support-vector-machines, gaussian-processes, feature-maps, hybrid-algorithms
“A kernel function k(x, x') is defined on input set X if its Gram matrix is positive semi-definite for any finite inputs and complex coefficients.”
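The positive semi-definiteness condition from the quote, together with the quantum feature-map kernel it licenses (standard textbook form, added for reference):

```latex
\sum_{i,j} c_i^{*}\, c_j\, k(x_i, x_j) \;\ge\; 0
\quad \text{for all finite } \{x_i\} \subset X \text{ and } \{c_i\} \subset \mathbb{C},
\qquad
k(x, x') = \left|\langle \phi(x) \mid \phi(x') \rangle\right|^{2}.
```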
paper / mariaschuld / Dec 17
The paper extends transfer learning to hybrid classical-quantum neural networks by pre-training classical networks on high-dimensional data like images and augmenting them with final variational quantum circuits. This leverages classical preprocessing to embed informative features into quantum processors, ideal for noisy intermediate-scale quantum (NISQ) devices. Proof-of-concept demonstrations include image recognition and quantum state classification, experimentally validated on IBM and Rigetti quantum computers using PennyLane.
transfer-learning, hybrid-quantum-networks, quantum-machine-learning, variational-quantum-circuits, image-recognition, pennylane, arxiv-paper
“Transfer learning can be applied to hybrid classical-quantum neural networks by augmenting pre-trained classical networks with variational quantum circuits.”
paper / mariaschuld / Dec 16
Gaussian Boson Sampling (GBS) serves as a near-term photonic quantum computing platform with algorithms for graph problems, point processes, and molecular vibronic spectra. The Strawberry Fields library introduces a new applications layer that allows users to design and implement GBS algorithms using minimal code. This software acts as both an introduction with examples and a review of state-of-the-art GBS applications.
quantum-computing, photonic-quantum, gaussian-boson-sampling, quantum-software, strawberry-fields, near-term-quantum, quantum-algorithms
“Gaussian Boson Sampling (GBS) is a near-term platform for photonic quantum computing.”
paper / mariaschuld / Oct 9
Gaussian boson sampling (GBS) photon-number probabilities for graph G define coefficients of a new displaced GBS polynomial. This polynomial exhibits a duality with the matching polynomial of the prism graph G □ P₂, the Cartesian product of G with a weighted edge. The duality enables novel classical simulation methods for GBS and underpins recent coarse-grained quantum feature maps.
gaussian-boson-sampling, graph-polynomials, quantum-computation, graph-duality, boson-sampling, quantum-physics, combinatorics
“The displaced GBS polynomial coefficients are the coarse-grained photon-number probabilities of an undirected graph G encoded in a GBS device.”
paper / mariaschuld / Oct 2
Quantum hardware estimation of expectation values in parameterized quantum circuits induces stochastic gradient descent (SGD) for hybrid quantum-classical optimizers like VQE, QAOA, and quantum classifiers. Using as few as k=1 measurement shots per expectation value yields algorithms with provable convergence guarantees. For gradients as linear combinations of expectations (e.g., Hamiltonian terms, parameter-shift rules, dataset sums), doubly stochastic variants via term sampling further reduce measurements while maintaining rigorous convergence. Numerical benchmarks demonstrate state-of-the-art performance with drastically fewer circuit executions.
quantum-optimization, stochastic-gradient-descent, hybrid-quantum-classical, vqe, qaoa, quantum-machine-learning, near-term-quantum
“Estimating expectation values with k measurement outcomes in hybrid quantum-classical optimization results in SGD with rigorously understood convergence properties for any k, including k=1.”
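A minimal sketch of the single-shot idea, assuming PennyLane: each expectation value is estimated from k=1 measurement shots, and parameter-shift gradients built from these noisy estimates still drive a stochastic gradient descent loop. The circuit and observable are toy placeholders:

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2, shots=1)    # one shot per expectation estimate

@qml.qnode(dev, diff_method="parameter-shift")
def cost(params):
    qml.RY(params[0], wires=0)
    qml.CNOT(wires=[0, 1])
    qml.RY(params[1], wires=1)
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

params = np.array([0.1, 0.2], requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.1)
for _ in range(200):
    params = opt.step(cost, params)                    # stochastic gradient step from single-shot estimates
```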
paper / mariaschuld / May 29
Gaussian Boson Samplers (GBS), proposed for near-term quantum advantage demonstrations, are leveraged to construct a graph kernel via feature maps derived from sampling outputs. This kernel measures graph similarity by linking GBS probability distributions to subgraph matching numbers, enabling graph isomorphism testing and competitive performance on benchmark datasets against classical kernels. The approach frames kernels as quantum hardware-efficient feature mappings, opening applications for Noisy Intermediate-Scale Quantum devices in graph analysis.
gaussian-boson-sampling, graph-kernel, quantum-machine-learning, quantum-computing, graph-isomorphism, quantum-hardware
“Gaussian Boson Samplers can decide whether two graphs are isomorphic”
paper / mariaschuld / Mar 25
Machine learning algorithms, informed by physical principles like statistical physics, enhance understanding of ML methods while enabling applications across particle physics, cosmology, quantum many-body systems, quantum computing, and materials science. The review highlights bidirectional exchanges, including physics-motivated conceptual advances in ML and domain-specific ML successes tackling unique challenges. It also covers novel hardware architectures designed to accelerate ML computations.
machine-learning, physical-sciences, statistical-physics, quantum-physics, particle-physics, cosmology, quantum-computing
“Statistical physics provides insights to understand machine learning methods”
paper / mariaschuld / Nov 27
Gradients of expectation values in parametrized quantum circuits can be estimated using nearly identical hardware architectures to the original circuit. For many cases, a single gate parameter shift and two circuit runs suffice for each gradient component. The approach generalizes to continuous-variable systems and uses ancilla conditioning for broader scenarios, enabling optimization in hybrid quantum-classical algorithms.
quantum-computing, analytic-gradients, variational-algorithms, quantum-circuits, continuous-variables, quantum-optimization, arxiv-paper
“Gradients of quantum measurement expectation values can be estimated using the same or almost the same architecture as the original circuit”
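A minimal sketch of the parameter-shift recipe for a single Pauli rotation, assuming PennyLane; two evaluations of the unmodified circuit at shifted parameter values give the exact gradient:

```python
import numpy as np
import pennylane as qml

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def f(theta):
    qml.RY(theta, wires=0)
    return qml.expval(qml.PauliZ(0))        # f(theta) = cos(theta)

theta, s = 0.3, np.pi / 2
grad = 0.5 * (f(theta + s) - f(theta - s))  # same circuit, run twice with a shifted parameter
print(grad, -np.sin(theta))                 # analytic check: d/dtheta cos(theta) = -sin(theta)
```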
paper / mariaschuld / Nov 12
PennyLane is a Python 3 framework for differentiable programming of quantum computers, supporting qubit and continuous-variable devices. It computes gradients of variational quantum circuits compatibly with classical backpropagation, extending automatic differentiation to hybrid quantum-classical models. Plugins integrate with quantum hardware like Xanadu Cloud, Amazon Braket, and IBM Quantum, plus ML libraries including TensorFlow, PyTorch, JAX, and Autograd.
pennylane, quantum-software, automatic-differentiation, hybrid-quantum-classical, quantum-machine-learning, variational-circuits, quantum-computing-framework
“PennyLane supports both qubit and continuous-variable quantum computing paradigms”
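A minimal hybrid sketch, assuming PennyLane with its Torch interface: the QNode behaves like any differentiable PyTorch operation, so gradients flow through it with loss.backward(). The circuit and target value are toy placeholders:

```python
import torch
import pennylane as qml

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev, interface="torch")
def circuit(weights, x):
    qml.RX(x, wires=0)                      # data encoding
    qml.RY(weights[0], wires=0)
    qml.CNOT(wires=[0, 1])
    qml.RY(weights[1], wires=1)
    return qml.expval(qml.PauliZ(1))

weights = torch.tensor([0.1, 0.2], requires_grad=True)
x = torch.tensor(0.5)

loss = (circuit(weights, x) - 1.0) ** 2     # classical loss on the quantum output
loss.backward()                             # gradients flow back through the quantum circuit
print(weights.grad)
```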
paper / mariaschuld / Jun 18
The paper presents variational quantum circuits in the continuous-variable (CV) architecture as universal neural networks, using layered Gaussian gates for affine transformations and non-Gaussian gates for nonlinear activations. These CV-QNNs encode highly nonlinear transformations while remaining unitary, and can embed classical networks including convolutional, recurrent, and residual variants. Experiments with Strawberry Fields demonstrate applications like fraud detection classification, Tetris image generation, and hybrid classical-quantum autoencoders.
quantum-neural-networks, continuous-variable-quantum, variational-quantum-circuits, quantum-machine-learning, cv-quantum-computation, hybrid-quantum-classical
“CV quantum neural networks use Gaussian gates for affine transformations and non-Gaussian gates for nonlinear activation functions.”
paper / mariaschuld / Apr 2
The paper proposes a variational quantum classifier that encodes input features into quantum state amplitudes, processed by a shallow circuit of parameterized single- and two-qubit gates followed by a single-qubit measurement. The architecture scales the number of learnable parameters poly-logarithmically with the input dimension, enabling deployment on limited-qubit, error-prone devices. Quantum-classical training uses analytical gradient estimation via circuit perturbations; simulations show strong performance on classical benchmarks with fewer parameters than alternatives, plus noise resilience via quantum dropout. A minimal sketch follows below.
quantum-classifiers, variational-circuits, quantum-machine-learning, supervised-learning, quantum-algorithms, error-robust-quantum
“The quantum classifier uses a circuit with poly-logarithmic number of learnable parameters in the input dimension.”
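A minimal sketch in the spirit of the classifier described above, assuming PennyLane's templates; amplitude encoding packs 2^n features into n qubits, so the trainable part grows only poly-logarithmically with the input dimension. The specific templates and sizes are illustrative, not the paper's circuit:

```python
import pennylane as qml
from pennylane import numpy as np

n_qubits = 2                                    # 2 qubits hold 2**2 = 4 input features
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def classifier(weights, x):
    qml.AmplitudeEmbedding(x, wires=range(n_qubits), normalize=True)   # features -> state amplitudes
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))       # shallow trainable circuit
    return qml.expval(qml.PauliZ(0))                                   # single-qubit readout

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = np.random.random(size=shape)

x = np.array([0.1, 0.4, 0.2, 0.7])
print(classifier(weights, x))
```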
paper / mariaschuld / Mar 19
Quantum computing parallels kernel methods by enabling efficient computation in high-dimensional Hilbert spaces via nonlinear feature maps that encode classical data into quantum states. Two quantum ML approaches emerge: (1) quantum estimation of intractable kernel inner products for classical algorithms like SVMs; (2) variational quantum circuits as linear classifiers directly in feature Hilbert space. Demonstrated with continuous-variable squeezing feature maps on 2D benchmark data.
quantum-machine-learning, kernel-methods, feature-hilbert-spaces, quantum-computing, variational-circuits, continuous-variable-systems
“Encoding classical inputs into quantum states acts as a nonlinear feature map to Hilbert space”
paper / mariaschuld / Apr 7
Quantum ensembles of quantum classifiers form by preparing a superposition state that encodes multiple classifiers, allowing parallel evaluation on a quantum computer followed by a single-qubit measurement for the collective decision. This approach supports exponentially large ensembles without individual training, akin to Bayesian averaging. An example weights classifiers by training performance, yielding novel quantum and classical machine learning results.
quantum-machine-learning, quantum-classifiers, quantum-ensembles, arxiv-paper, machine-learning, quantum-algorithms
“Quantum ensembles of quantum classifiers are created via a state preparation routine.”
paper / mariaschuld / Mar 31
Researchers implement a distance-based classifier using a minimal quantum circuit consisting of state preparation, a single Hadamard gate, and two single-qubit measurements. The circuit computes distances between data points in quantum parallel, bypassing complex subroutines like Hamiltonian simulation. Numerical simulations and IBM Quantum Experience demonstrations show strong performance on simple benchmark tasks.
quantum-machine-learning, distance-classifier, quantum-interference, quantum-circuit, ibm-quantum, pattern-recognition, quantum-algorithms
“The quantum classifier circuit comprises only state preparation, one Hadamard gate, and two single-qubit measurements.”
paper / mariaschuld / Dec 6
Quantum gradient descent and Newton's method optimize unit-norm constrained polynomials using quantum phase estimation, adapted quantum PCA, and quantum matrix operations. These algorithms scale polylogarithmically with solution vector dimension but exponentially with iteration count. They offer advantages for high-dimensional problems requiring few iterations, relevant to machine learning optimization.
quantum-optimization, gradient-descent, newtons-method, quantum-algorithms, polynomial-optimization, quantum-machine-learning
“Quantum versions of gradient descent and Newton's method are developed for polynomial optimization under unit norm constraint.”
paper / mariaschuld / Jan 28
The algorithm performs least-squares linear regression on quantum computers, prioritizing prediction of outputs for new inputs over parameter readout. It handles non-sparse data via low-rank approximations, reducing condition number dependence. Runtime is logarithmic in input dimension when data is quantum-encoded, with results accessible via one qubit measurement for further processing.
quantum-machine-learning, linear-regression, quantum-algorithms, quantum-computing, least-squares, arxiv-paper
“The algorithm predicts outputs for new inputs using linear regression without reading out fit parameters.”
paper / mariaschuld / Dec 11
This paper introduces a quantum machine learning approach to pattern classification by adapting Trugenberger's quantum Hamming distance measurement method. The algorithm aims to exploit quantum computing's known advantages over classical methods for specific tasks within machine learning. It demonstrates potential benefits using handwritten digit recognition from the MNIST dataset as a benchmark.
quantum-machine-learning, quantum-computing, pattern-classification, quantum-physics, machine-learning, arxiv-paper, hamming-distance
“Quantum computing outperforms classical computing for certain computational tasks”