Hartmut Neven

Chronological feed of everything captured from Hartmut Neven.

Mapping Image Recognition to QUBO for Adiabatic Quantum Computing

Image matching, an NP-hard combinatorial problem at the heart of image recognition, is reformulated as a Quadratic Unconstrained Binary Optimization (QUBO) problem. This mapping enables its solution on D-Wave's superconducting adiabatic quantum computing processors. Quantum adiabatic algorithms offer a novel approach to NP-hard optimization problems in AI that resist classical heuristics.
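A QUBO instance is just a matrix Q with objective E(x) = xᵀQx over binary vectors x; the adiabatic hardware searches for the minimizing x. As a minimal sketch, here is a brute-force solver on a hypothetical two-variable instance (the matrix values are illustrative, not taken from the paper):

```python
import itertools

def solve_qubo(Q):
    """Exhaustively minimize x^T Q x over binary vectors x (tiny n only)."""
    n = len(Q)
    best_x, best_e = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        e = sum(Q[i][j] * bits[i] * bits[j] for i in range(n) for j in range(n))
        if e < best_e:
            best_x, best_e = bits, e
    return best_x, best_e

# Toy matching problem: two candidate feature matches (x0, x1), each with a
# reward on the diagonal, plus an off-diagonal penalty if both are selected.
Q = [[-2.0, 3.0],
     [0.0, -1.0]]
x, e = solve_qubo(Q)  # picks x0 alone: energy -2.0
```

On hardware, the same Q would be embedded into the qubit couplings and the ground state read out, rather than enumerated classically.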

Quantum Adiabatic Optimization Outperforms AdaBoost in Binary Classification

The paper reformulates binary classification as a thresholded superposition of weak classifiers, optimizing their binary weights via discrete optimization amenable to adiabatic quantum computing (AQC). The bit precision required for the weights scales only logarithmically with the ratio of training examples to weak classifiers, enabling a compact QUBO formulation. Heuristic solvers already yield classifiers surpassing AdaBoost on benchmarks, with bit-constrained models showing lower generalization error; quantum processors promise further gains on this NP-hard problem.
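A toy version of the formulation, with hypothetical weak-classifier votes: the strong classifier is the sign of a weighted vote, and training searches over binary weights with an L0-style penalty. Exhaustive search stands in here for the quantum optimizer:

```python
import itertools

def strong_classify(weights, weak_outputs):
    """Thresholded superposition: sign of the weighted vote of weak classifiers."""
    s = sum(w * h for w, h in zip(weights, weak_outputs))
    return 1 if s >= 0 else -1

def train_binary_weights(weak_votes, labels, lam=0.5):
    """Pick binary weights minimizing training error + lam * (weights used)."""
    n = len(weak_votes[0])
    best_w, best_cost = None, float("inf")
    for w in itertools.product([0, 1], repeat=n):
        errors = sum(strong_classify(w, votes) != y
                     for votes, y in zip(weak_votes, labels))
        cost = errors + lam * sum(w)  # L0-style regularization
        if cost < best_cost:
            best_w, best_cost = w, cost
    return best_w

# Three weak classifiers voting (+1/-1) on four training examples.
weak_votes = [(+1, +1, -1), (+1, -1, +1), (-1, +1, +1), (-1, -1, -1)]
labels = [+1, +1, -1, -1]
w = train_binary_weights(weak_votes, labels)  # first weak classifier suffices
```

The L0 penalty is what favors the sparse single-classifier solution over equally accurate but heavier combinations.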

Quantum Adiabatic Optimization Scales Classifier Training Beyond AdaBoost via Iterative Subset Selection

The paper proposes training large-scale binary classifiers by iteratively selecting subsets of weak classifiers through discrete global optimization, amenable to quantum adiabatic computing (AQC), and concatenating them into a strong thresholded sum classifier. Heuristic solvers for this optimization already outperform AdaBoost, with L0-norm regularization providing theoretical superiority over empirical loss minimization in boosting. Quantum Monte Carlo simulations indicate AQC efficiently handles generic training problems.

Google Sycamore Processor: Historic Quantum Computing Milestone Deposited at Deutsches Museum

Google has donated its Sycamore quantum processor, which achieved quantum supremacy in 2019, to the Deutsches Museum. This event marks a significant moment for both quantum computing and the museum's collection of technological firsts. The processor represents a foundational step towards large-scale, error-corrected quantum computers, demonstrating the potential for exponential acceleration in specific computational tasks.

Cryptographic Leashes for Quantum Computer Verification and Control

Cryptographic leashes enable classical verifiers to monitor and control quantum computers, addressing challenges posed by quantum coherence and exponential power. This framework uses cryptography to provide scalable proofs of quantumness, certifiable randomness, and secure delegation of quantum computation. The core primitive is a qubit certification protocol based on trapdoor claw-free functions, demonstrating a method for classically verifying quantum operations without compromising privacy.
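The LWE-based trapdoor claw-free functions used in such protocols are involved, but the classic illustrative example of a claw-free pair is Rabin's squaring function, where finding a claw is equivalent to factoring. A toy, insecure sketch (tiny primes chosen purely for demonstration):

```python
from math import gcd

# Toy stand-in for a trapdoor claw-free function: f(x) = x^2 mod N.
# Finding a "claw" (x != ±y mod N with f(x) == f(y)) is as hard as factoring N,
# while the trapdoor holder (who knows p, q) can compute square roots.
p, q = 1009, 1013          # secret trapdoor (far too small for real use)
N = p * q

def f(x):
    return (x * x) % N

def claw_reveals_factor(x, y):
    """A claw x, y yields a nontrivial factor of N."""
    assert f(x) == f(y) and x % N != y % N and x % N != (N - y) % N
    return gcd(x - y, N)

# Brute-force a claw for demonstration (only feasible at this toy size).
x = 123456
y = next(c for c in range(1, N) if f(c) == f(x) and c != x and c != N - x)
factor = claw_reveals_factor(x, y)  # recovers p or q
```

This is why such functions work as "leashes": producing both preimages at once certifies a capability the verifier believes no efficient classical party has.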

Surface Codes: A Foundation for Scalable Quantum Error Correction

Hartmut Neven discusses the theoretical underpinnings and practical realizations of surface codes, focusing on their role in mitigating quantum computing errors. He details the toric code, error detection mechanisms, and advanced techniques like maximum likelihood decoding and lattice surgery necessary for building fault-tolerant quantum computers. The presentation highlights current challenges and future directions in implementing these complex error correction schemes.
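The error-detection mechanism can be illustrated in one dimension: a classical repetition code measures parities of neighboring bits, and the pattern of fired checks localizes an error. This is the syndrome-decoding pattern that surface-code stabilizers generalize to two dimensions. A minimal sketch (the toy decoder handles only single bit-flips):

```python
def measure_syndrome(data_bits):
    """Parity checks between neighboring bits (1D analogue of stabilizers)."""
    return [data_bits[i] ^ data_bits[i + 1] for i in range(len(data_bits) - 1)]

def decode(syndrome):
    """Toy decoder: fired parity checks localize a single bit-flip."""
    fired = [i for i, s in enumerate(syndrome) if s]
    if not fired:
        return None                              # no error detected
    if len(fired) == 2 and fired[1] == fired[0] + 1:
        return fired[1]                          # flip between the two checks
    return 0 if fired[0] == 0 else fired[0] + 1  # error at a boundary bit

bits = [0, 0, 0, 0, 0]   # logical 0 encoded as five repeated bits
bits[2] ^= 1             # inject one bit-flip error
flip = decode(measure_syndrome(bits))
bits[flip] ^= 1          # correction restores the codeword
```

Surface codes run both X- and Z-type checks on a 2D lattice, and decoders like maximum likelihood or minimum-weight matching replace this toy lookup.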

Google Quantum AI Launches Improved Open-Source Quantum Development Tools

Google Quantum AI has released Cirq 1.0, a refactored quantum programming framework with improved stability and features, alongside a Quantum Virtual Machine (QVM) offering free, unlimited access to simulated NISQ hardware. These tools aim to democratize access to quantum computing development, foster the growth of the quantum workforce, and enable rapid prototyping and algorithm development for near-term quantum computers.
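In Cirq itself, a Bell circuit is three operations (cirq.H, cirq.CNOT, cirq.measure) handed to a simulator. To keep this sketch dependency-free, here is the ideal statevector computation such a simulator performs for that circuit, hand-rolled with the standard library:

```python
import math

def apply_gate(state, gate, target, n):
    """Apply a 2x2 single-qubit gate to qubit `target` of an n-qubit state."""
    new = [0j] * len(state)
    shift = n - 1 - target
    for idx, amp in enumerate(state):
        bit = (idx >> shift) & 1
        for out_bit in (0, 1):
            new[idx ^ ((bit ^ out_bit) << shift)] += gate[out_bit][bit] * amp
    return new

def apply_cnot(state, control, target, n):
    """Flip `target` on basis states where `control` is 1."""
    new = [0j] * len(state)
    for idx, amp in enumerate(state):
        if (idx >> (n - 1 - control)) & 1:
            idx ^= 1 << (n - 1 - target)
        new[idx] += amp
    return new

H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = [1 + 0j, 0j, 0j, 0j]          # |00>
state = apply_gate(state, H, 0, 2)    # H on qubit 0
state = apply_cnot(state, 0, 1, 2)    # CNOT 0 -> 1
probs = [abs(a) ** 2 for a in state]  # Bell state: only |00> and |11> remain
```

The QVM layers realistic NISQ noise and device topology on top of exactly this kind of ideal simulation.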

Experimental Probes of Quantum Gravity and Spacetime through Entanglement

The presentation explores the profound connection between quantum entanglement and spacetime, suggesting that the universe's fundamental structure, including gravity, may emerge from quantum information. It highlights how quantum computers can serve as experimental platforms to study quantum gravity phenomena, particularly through the simulation of traversable wormholes and the exploration of quantum criticality. The speaker draws parallels between condensed matter physics and high-energy physics, proposing that insights from one can inform the other in understanding complex quantum systems.

Engineering Superconducting Artificial Atoms for Quantum Computation

Superconducting qubits are engineered artificial atoms utilizing Josephson junctions to introduce the non-linearity required to isolate a computational two-level system from a harmonic oscillator's equidistant spectrum. Optimization involves balancing the non-linearity ratio and quality factor to maximize gate speed while minimizing noise-induced decoherence and spurious level transitions. Current state-of-the-art designs (transmons and fluxonium) rival natural atoms in coherence, but scaling to higher fidelities likely necessitates novel materials or vacuum-integrated fabrication processes.

The Critical Role of Noise Modeling in Quantum Error Correction Simulations

Accurate simulation of quantum error correction experiments necessitates detailed physical noise models, as simplified models fail to capture critical discrepancies in measurement statistics and logical error probabilities. While incorporating such detail is computationally intensive, particularly for larger systems, approximations like Pauli twirling can bridge the gap between expensive full quantum simulations and efficient classical predictions. This approach, validated against experimental data for current device sizes and error regimes, is crucial for designing and scaling fault-tolerant quantum computers.
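Pauli twirling itself is simple to state: conjugate each use of the channel by a random Pauli and average, which projects an arbitrary noise channel onto a Pauli channel that classical simulators handle efficiently. A single-qubit sketch for amplitude damping (γ = 0.2 is an arbitrary illustrative value):

```python
import math

I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Y = [[0, -1j], [1j, 0]]
Z = [[1, 0], [0, -1]]
PAULIS = [I, X, Y, Z]

def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dag(A):
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

gamma = 0.2
K = [[[1, 0], [0, math.sqrt(1 - gamma)]],    # amplitude-damping Kraus ops
     [[0, math.sqrt(gamma)], [0, 0]]]

def channel(rho):
    out = [[0j, 0j], [0j, 0j]]
    for k in K:
        t = mul(mul(k, rho), dag(k))
        out = [[out[i][j] + t[i][j] for j in range(2)] for i in range(2)]
    return out

def twirled(rho):
    """Average P E(P rho P) P over the Pauli group (P is its own inverse)."""
    out = [[0j, 0j], [0j, 0j]]
    for P in PAULIS:
        t = mul(mul(P, channel(mul(mul(P, rho), dag(P)))), dag(P))
        out = [[out[i][j] + t[i][j] / 4 for j in range(2)] for i in range(2)]
    return out

# Pauli transfer matrix R[i][j] = Tr(P_i * E(P_j)) / 2.
ptm = [[trace(mul(Pi, twirled(Pj))).real / 2 for Pj in PAULIS] for Pi in PAULIS]
```

The off-diagonal transfer-matrix entries of the twirled channel vanish, i.e. it acts as a Pauli channel, which is what makes the approximate model classically tractable.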

Quantum Walks Reveal Exponential Speedups for Hierarchical Graphs

This work explores quantum walk algorithms on a class of hierarchical graphs, demonstrating potential for super-polynomial or even exponential speedups over classical algorithms. The key insight lies in how quantum walks leverage the graph's symmetry, effectively navigating a lower-dimensional subspace. The success of these quantum algorithms is linked to the concept of a localization transition from many-body physics, offering a framework to predict when quantum advantages emerge.

The Interdisciplinary Talent Model for Quantum AI Development

Google Quantum AI employs a diverse workforce where technical versatility and 'grit' are prioritized over rigid academic credentials. Success in this high-uncertainty environment depends on the ability to blend specialized domain knowledge (e.g., microwave engineering, semiconductor manufacturing) with a growth mindset and strong collaborative communication skills.

Overcoming Calibration Challenges in Large-Scale Quantum Processors

Calibrating large-scale quantum processors presents significant challenges in achieving robust, rapid, and stable performance. Key issues include managing individual variations across thousands of components, optimizing for uniform performance, and counteracting parameter drift over time. Addressing these requires advanced automation, parallelization of calibration procedures, sophisticated optimization algorithms like the "snake optimizer," and metric-based recalibration strategies.
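The flavor of such a sweep-based calibration can be sketched as follows; the cost terms (a preferred idle frequency plus a penalty for frequency collisions with the previously calibrated neighbor) are hypothetical stand-ins for real calibration metrics, not Google's actual snake optimizer:

```python
TARGET = 6.0                                      # GHz, preferred idle frequency
CANDIDATES = [5.8 + 0.05 * k for k in range(9)]   # 5.8 .. 6.2 GHz grid

def local_cost(freq, fixed_neighbors):
    cost = (freq - TARGET) ** 2                   # stay near the sweet spot
    for nb in fixed_neighbors:
        cost += 0.5 / (abs(freq - nb) + 0.01)     # avoid frequency collisions
    return cost

def snake_calibrate(n_qubits):
    """Fix parameters one qubit at a time, each against settled neighbors."""
    freqs = []
    for q in range(n_qubits):
        fixed = freqs[-1:]                        # already-calibrated neighbor
        freqs.append(min(CANDIDATES, key=lambda f: local_cost(f, fixed)))
    return freqs

freqs = snake_calibrate(6)   # neighbors end up well separated in frequency
```

The sequential, locally-consistent sweep is what lets such procedures parallelize and scale to thousands of components without jointly optimizing everything at once.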

Bell’s Inequality as a Criterion for Quantum Advantage in Machine Learning

This research explores how quantum phenomena contribute to machine learning advantages by using Bell’s inequality as a criterion to identify non-classical behavior in quantum machine learning models. Inspired by quantum cognition, a framework is proposed where parameterized observables learn to leverage quantum effects. The study demonstrates that models trained on "quantum" data exhibit Bell inequality violations, offering a quantifiable way to assess quantum advantage.
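The criterion itself is easy to evaluate: for the singlet state, the correlator of spin measurements at angles α and β is −cos(α − β), so the CHSH combination reaches |S| = 2√2, above the classical bound of 2. A quick check with the standard library:

```python
import math

def kron(A, B):
    """Kronecker product of two 2x2 matrices."""
    return [[A[i // 2][j // 2] * B[i % 2][j % 2] for j in range(4)]
            for i in range(4)]

def observable(theta):
    """Spin measurement at angle theta: cos(theta) Z + sin(theta) X."""
    return [[math.cos(theta), math.sin(theta)],
            [math.sin(theta), -math.cos(theta)]]

def correlator(alpha, beta):
    psi = [0.0, 1 / math.sqrt(2), -1 / math.sqrt(2), 0.0]   # singlet state
    M = kron(observable(alpha), observable(beta))
    Mpsi = [sum(M[i][j] * psi[j] for j in range(4)) for i in range(4)]
    return sum(psi[i] * Mpsi[i] for i in range(4))

a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
S = (correlator(a, b) - correlator(a, b2)
     + correlator(a2, b) + correlator(a2, b2))   # |S| = 2*sqrt(2) > 2
```

A trained model whose measured correlators push |S| past 2 is, by this criterion, exploiting genuinely non-classical behavior.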

Evolving Quantum Software: From Abstractions to 1.0 Stability

The quantum computing field is in a "sweet spot" for software development, requiring close interaction with hardware and flexible abstractions. Google's Cirq framework, initially designed around a "moment" abstraction, evolved to incorporate "circuit operations" to address hardware realities, particularly disparate gate and measurement times. This culminated in Cirq reaching version 1.0, signifying API stability for the NISQ era and demonstrating the iterative nature of quantum software engineering, where user and hardware needs drive continuous refinement and new tool creation.

Practical Demonstration of Quantum Error Correction Achieves Breakthrough

Google researchers have successfully demonstrated working quantum error correction in practice, moving beyond theoretical understanding. This breakthrough involved overcoming significant engineering challenges with qubit control and error rates. The experiment validates a critical step towards building larger, more reliable quantum computers capable of addressing complex problems.

Macroscopic Quantum Systems Pave Way for Quantum Computing

Early experiments in the mid-1980s demonstrated that quantum mechanics applied at a macroscopic scale, challenging prior assumptions. This foundational work, particularly the creation of superconducting artificial atoms, directly led to the architecture of modern quantum computers like Google's. This journey from fundamental research to industrial application highlights the potential of quantum computing for real-world problems insoluble by classical supercomputers.

Reduced Quantum Resources Threaten Cryptocurrency Elliptic Curve Cryptography

A new whitepaper reveals that future quantum computers may break elliptic curve cryptography (ECC) with significantly fewer qubits and gates than previously estimated. This poses a threat to cryptocurrencies and other systems reliant on ECC. The research advocates for a transition to post-quantum cryptography (PQC) and responsible disclosure practices to mitigate these vulnerabilities, emphasizing the urgency of implementing PQC solutions.

Quantum Computing Threatens Current Cryptocurrencies

Quantum computers capable of executing Shor's algorithm pose an imminent threat to elliptic curve cryptography, the foundation of modern blockchain security. This vulnerability allows for "on-spend" attacks on cryptocurrency transactions, particularly impacting fast-clock quantum architectures. The issue extends beyond technical solutions, necessitating public policy and a rapid transition to Post-Quantum Cryptography (PQC).

Google Quantum AI Expands to Neutral Atom Computing for Enhanced Quantum Advantage

Google Quantum AI is diversifying its quantum computing strategy by integrating neutral atom quantum computing alongside its existing superconducting qubit research. This dual approach aims to accelerate progress toward commercially viable quantum computers by leveraging the complementary strengths of both modalities: superconducting qubits excel in circuit depth, while neutral atoms offer higher qubit counts and flexible connectivity. This expansion is supported by new leadership in neutral atom hardware development and collaboration with academic and industry partners.

Preparing for the Quantum Threat to Current Cryptography

Quantum computers are nearing the capability to break existing public-key cryptography, posing a significant threat to digital security. Malicious actors are already collecting encrypted data for "store now, decrypt later" attacks. The industry is responding with Post-Quantum Cryptography (PQC) standards, but widespread adoption and governmental support are crucial to avoid a security crisis.

Observation of a Quantum Glass State in 2D Systems at Finite Temperatures

This paper presents evidence for a two-dimensional quantum glass state at finite temperatures within an interacting spin model. Using a superconducting qubit array, researchers observed an intermediate non-ergodic regime characterized by glass-like properties: broadly distributed physical observables and effectively frozen degrees of freedom. This work demonstrates a transition out of the ergodic phase in 2D systems, advancing the understanding of quantum many-body systems with disorder.

Disorder-Induced Superfluidity Observed in Qutrit System

This research experimentally demonstrates disorder-induced superfluidity on a superconducting processor utilizing qutrit readout and control. The study identifies regions of non-vanishing condensate fraction through spatially-resolved two-point correlator measurements and observes a linearly-dispersing phonon mode within the superfluid, even with induced disorder. These findings challenge the conventional understanding that disorder primarily suppresses long-range coherence.

Magic State Cultivation Achieves High Fidelity on Superconducting Processors

Fault-tolerant quantum computing necessitates universal gate sets, with non-Clifford gates posing substantial resource costs. Magic state cultivation presents an efficient alternative to demanding distillation protocols. This work experimentally demonstrates magic state cultivation on a superconducting quantum processor, integrating code-switching and fault-tolerant measurement to achieve a high-fidelity magic state with reduced error.

Advancements in Quantum Computing and its Potential Impact on Neuroscience

Google Quantum AI is making significant progress in developing fault-tolerant quantum computers, demonstrating practical quantum error correction and quantum algorithms that exceed classical computational capabilities. This progress is driven by advancements in hardware, software, and a long-term roadmap towards building a useful quantum computer for currently unsolvable problems. The potential societal implications extend to breaking current cryptographic standards and offering new tools for scientific discovery, with speculative implications for understanding consciousness through quantum mechanics.

AlphaQubit 2: A Neural Quantum Error Decoder Enabling Practical Fault-Tolerant Computing

AlphaQubit 2, a novel neural-network decoder, achieves superior logical error rates for both surface and color codes under realistic noise conditions. This innovation significantly surpasses the speed of existing high-accuracy decoders for the color code, enabling real-time decoding below 1 microsecond per cycle on commercial hardware. These advancements provide a viable pathway toward scalable, high-accuracy, and real-time neural decoding essential for fault-tolerant quantum computation.

Navigating the Quantum Landscape: Optimism, Challenges, and the Future of Computing

Hartmut Neven, often dubbed Google's "chief optimist," discusses the historical and ongoing challenges as well as the advancements in quantum computing. He highlights the distinction between quantum supremacy and verifiable quantum advantage, providing insights into the motivations behind Google's research and its implications for both academic and industrial landscapes. The discussion also touches upon the speculative but intriguing connection between quantum mechanics and consciousness.

Measurement Contextuality Drives Quantum-Classical Separation in Bounded-Resource Tasks

Quantum phenomena enable capabilities beyond classical computing. This work demonstrates a quantum-classical separation using measurement contextuality on a superconducting qubit processor. The experiments achieve success probabilities exceeding classical limits in tasks such as the magic square game, the N-player GHZ game, and a 2D hidden linear function problem, suggesting contextuality-based algorithms as benchmarks for quantum processors.

Reinforcement Learning for Autonomous Quantum Error Correction

Quantum computers are susceptible to environmental drift, necessitating frequent recalibration. This work proposes a novel framework where quantum error correction (QEC) not only corrects logical states but also acts as a learning signal for a reinforcement learning (RL) agent. This RL agent continuously adjusts physical control parameters, enabling real-time stabilization and calibration during computation, thereby eliminating the need for periodic interruptions.
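A toy version of the closed loop: treat the observed syndrome rate as a negative reward and let a trivial hill-climbing agent track a drifting optimum without ever pausing the computation. All dynamics and constants here are hypothetical; the paper's agent and noise model are far richer:

```python
import random

random.seed(0)

def syndrome_rate(control, optimal):
    """Detected-error probability grows as the control drifts off optimum."""
    return min(1.0, 0.01 + (control - optimal) ** 2)

control, optimal, step = 0.0, 0.0, 0.05
history = []
for cycle in range(2000):
    optimal += random.gauss(0, 0.002)     # slow environmental drift
    # Probe both perturbations; keep the one with the lower error rate.
    down = syndrome_rate(control - step, optimal)
    up = syndrome_rate(control + step, optimal)
    control += step if up < down else -step
    history.append(syndrome_rate(control, optimal))

avg_rate = sum(history[-500:]) / 500      # stays near the 0.01 noise floor
```

The point of the framework is exactly this: the QEC syndrome stream that is already being measured doubles as the reward, so no dedicated calibration interruptions are needed.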

Google Quantum AI Advances with Verifiable Quantum Echoes Algorithm for Molecular Structure Prediction

Google Quantum AI has developed the Quantum Echoes algorithm, a verifiable method for predicting the behavior of quantum systems. This algorithm, implemented on their quantum processors, can learn the Hamiltonian of molecules, offering a new approach to molecular structure determination. This development moves quantum computing beyond benchmark problems towards real-world applications, with initial proof-of-concept experiments successfully demonstrating its utility in chemistry.

Verifiable Quantum Advantage Achieved with Quantum Echoes on Willow Chip

Google has demonstrated the first-ever verifiable quantum advantage using the "Quantum Echoes" algorithm on its Willow quantum chip. This breakthrough allows for the computation of molecular structures 13,000 times faster than classical supercomputers and provides a method for scalable verification of quantum computations. This advancement paves the way for real-world applications in fields like medicine and materials science by providing unprecedented precision in understanding natural systems.

Google Achieves Verifiable Quantum Advantage with "Quantum Echoes" Algorithm

Google Quantum AI has developed a novel "quantum echoes" algorithm run on their Willow chip, demonstrating verifiable quantum advantage. This algorithm, which probes interactions within quantum systems, executed 13,000 times faster than the best classical supercomputer algorithm. The breakthrough offers a path toward practical quantum computing applications, including enhanced molecular structure analysis for drug and material design.

Google Acquires Atlantic Quantum to Accelerate Superconducting Quantum Computing

Google Quantum AI has acquired Atlantic Quantum, an MIT-founded startup specializing in highly integrated quantum computing hardware. This acquisition aims to accelerate Google's roadmap toward a large-scale, error-corrected quantum computer by integrating Atlantic Quantum's modular chip stack, which combines qubits and superconducting control electronics within the cold stage, enhancing the scalability of Google's superconducting qubit hardware.

Quantum Computers Achieve Generative Advantage in Learning Complex Distributions

Researchers have introduced novel generative quantum models that are classically intractable yet efficiently trainable, overcoming previous hurdles in demonstrating generative quantum advantage. These models mitigate issues like barren plateaus and proliferating local minima, enabling quantum computers to learn and generate distributions beyond classical capabilities. Experimental validation on a 68-qubit superconducting processor confirms their efficacy in learning complex probability distributions and accelerating physical simulations.

Google Quantum AI Joins DARPA QBI to Benchmark Quantum Computing Progress

Google Quantum AI has been selected for the DARPA Quantum Benchmarking Initiative (QBI). This program will rigorously assess quantum computing approaches to determine their potential for achieving utility-scale, fault-tolerant quantum computing by 2033. Google's participation aims to validate their quantum computing approach and accelerate the development of solutions for complex, currently unsolvable problems across various fields.

Second-Order Out-of-Time-Order Correlators (OTOCs) Enable Quantum Advantage in Ergodic Dynamics

This paper demonstrates that second-order out-of-time-order correlators (OTOCs) can characterize ergodic dynamics in quantum many-body systems. Unlike traditional quantum observables, OTOCs remain sensitive to underlying dynamics at long timescales by leveraging repeated time-reversal protocols. The observed constructive interference mechanism endows OTOCs with high classical simulation complexity, preventing their calculation by even the fastest classical algorithms. This complexity positions OTOC measurements as a viable path towards practical quantum advantage.
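For reference, the standard (first-order) OTOC behind these protocols, for operators $W$, $V$ and Heisenberg evolution $W(t) = U^\dagger(t)\, W\, U(t)$, is

```latex
F(t) = \langle W^\dagger(t)\, V^\dagger\, W(t)\, V \rangle
```

The forward-and-backward evolution hidden in $W(t)$ is the time-reversal protocol referred to above; the second-order variants studied here insert an additional such loop, which is what keeps the signal sensitive to the dynamics at long times.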

Google Quantum AI achieves quantum error correction below threshold and introduces new Willow processor

Google Quantum AI has developed the Willow processor, a 105-qubit chip with significantly improved coherence times and low error rates, enabling advancements in quantum error correction. They have demonstrated quantum error correction below the surface code threshold and achieved a double exponential speedup in random circuit sampling, pushing towards useful quantum applications.

Expanding Surface Code Viability via Time-Dynamic Hardware Implementations

This research demonstrates three time-dynamic variants of the surface code that optimize hardware constraints: a hexagonal lattice reducing connectivity requirements, a 'walking' architecture for error mitigation via role-swapping, and an iSWAP-based implementation. Experimental results confirm that these dynamic approaches maintain fault-tolerant scaling, showing measurable error suppression when increasing code distance from 3 to 5.

Color Code Advances Fault-Tolerant Quantum Computing on Superconducting Processors

This work demonstrates a comprehensive implementation of the color code on a superconducting quantum processor. It achieves logical error suppression as the code distance scales and performs logical operations, including transversal Clifford gates and magic state injection. The findings suggest the color code is a promising alternative to the surface code for fault-tolerant quantum computation.

Willow Quantum Chip Achieves Exponential Error Reduction and Beyond-Classical Performance

Google's new Willow quantum chip demonstrates exponential error reduction as qubit count increases, a significant breakthrough in quantum error correction. It also performed a benchmark computation in under five minutes that would take classical supercomputers 10 septillion years, showcasing a substantial performance leap in quantum computing capabilities. These achievements represent major steps towards building useful, large-scale quantum computers for real-world applications.

Observation of Disorder-Free Localization in (2+1)D Lattice Gauge Theory

This paper reports the observation of localization in a (2+1)D lattice gauge theory without disorder, utilizing a quantum processor. The key innovation is the use of quantum circuits initialized in superpositions over all disorder configurations, interpretable as superpositions over gauge sectors. This approach enables a polynomial speedup in sampling disorder configurations, addressing a significant challenge in many-body localization studies.

Experimental Visualization of String Dynamics and Confinement in (2+1)D Lattice Gauge Theories

This research experimentally investigates string dynamics in a (2+1)D Z2 lattice gauge theory using superconducting qubits. The study identifies a transition from deconfined to confined charge dynamics as electric field coupling increases, revealing distinct string fluctuation regimes within the confined phase. The implementation offers novel techniques for studying emergent excitations and string dynamics in LGTs.

Overcoming Engineering Hurdles to Scale Quantum Computing

Quantum computing leverages qubits capable of superposition to achieve exponential processing power beyond classical computers. Google is developing full-stack quantum computers by overcoming significant engineering challenges in qubit fabrication, cryogenic packaging, and precise control systems. This involves designing specialized quantum processors, ultracold dilution refrigerators, and complex algorithms to harness quantum mechanics for problem-solving.

Experimental Quantum Error Correction Beyond Break-Even with Surface Codes

This work demonstrates the successful implementation of surface code quantum error correction (QEC) below the critical threshold, achieving logical error rate suppression and extending logical qubit lifetime beyond that of individual physical qubits. The integration of real-time decoding with low latency further validates the practicality of these QEC techniques for future fault-tolerant quantum computation. The findings suggest that current hardware performance, if scaled, is sufficient to meet the operational demands of large-scale quantum algorithms.
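The "below threshold" criterion has a standard quantitative form: with physical error rate $p$ below the threshold $p_{\mathrm{th}}$, the logical error rate per cycle falls exponentially in the code distance $d$, and the error-suppression factor $\Lambda$ is the ratio between successive odd distances:

```latex
\varepsilon_d \propto \left( \frac{p}{p_{\mathrm{th}}} \right)^{(d+1)/2},
\qquad
\Lambda = \frac{\varepsilon_d}{\varepsilon_{d+2}}
```

Measuring $\Lambda > 1$ as the code grows is the operational meaning of operating below threshold, and larger $\Lambda$ is what makes scaling to large algorithms affordable.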
