Preprint, 2023
DOI: 10.2139/ssrn.4353577

Efficient and Fail-Safe Collisionless Quantum Boltzmann Method

Cited by 5 publications (2 citation statements)
References 10 publications
“…To leverage quantum computing for multiphysics, quantum LBM (QLBM) has been recently developed for the advection-diffusion and the incompressible Navier-Stokes equations [8,10]. Other approaches can be found in [14][15][16][17][18]. In contrast to macroscale methods, the LBM replaces fluid density with probability distributions of fictive particles.…”
Section: Quantum Lattice Boltzmann Methods (LBM)
Citation type: mentioning, confidence: 99%
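
The excerpt above notes that the LBM replaces the fluid density with probability distributions of fictive particles. As a rough illustration of that idea (a classical sketch, not the quantum circuits of the cited QLBM papers), a minimal D1Q3 stream-and-collide loop in NumPy could look like this, with the density recovered as the zeroth moment of the distributions:

```python
import numpy as np

# Minimal classical D1Q3 lattice Boltzmann sketch (illustrative only, not the
# cited papers' quantum algorithm): fictive-particle distributions f_i replace
# the density field, which is recovered as their zeroth moment.
nx, tau = 64, 0.8                      # lattice sites, BGK relaxation time
c = np.array([0, 1, -1])               # D1Q3 lattice velocities
w = np.array([4/6, 1/6, 1/6])          # lattice weights

rho = 1.0 + 0.1 * np.exp(-((np.arange(nx) - nx / 2) ** 2) / 20.0)
f = w[:, None] * rho                   # initialize at equilibrium (zero velocity)

for _ in range(100):
    rho = f.sum(axis=0)                # density = zeroth moment of distributions
    feq = w[:, None] * rho             # equilibrium distributions (pure diffusion)
    f += (feq - f) / tau               # BGK collision step
    for i in range(3):                 # streaming step: shift along velocity c_i
        f[i] = np.roll(f[i], c[i])

print(rho.max(), rho.sum())            # the peak spreads, total mass is conserved
```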
“…The hybrid scheme is advantageous because it allows data to be encoded both in qubits and amplitudes; the 'amplitudes' represent the information we are interested in, while the 'qubits' encode the N features. This dual approach makes Qsample encoding particularly useful in probabilistic QML models and quantum Boltzmann machines (QBMs) [89], [90]. In these applications, state preparation for a given probability distribution operates in the same manner, where a qubit-efficient algorithm is polynomial in the input, whereas an amplitude-efficient quantum algorithm is exponential in the input dimension N [83].…”
Section: Computational Basis States and Amplitude Encoding
Citation type: mentioning, confidence: 99%
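
The excerpt above contrasts qubit-efficient and amplitude-efficient state preparation: with amplitude (Qsample) encoding, a distribution over N = 2^n outcomes is stored in the amplitudes of only n qubits. A minimal NumPy sketch of that encoding (an assumed illustration, not the construction of references [83], [89], [90]) is:

```python
import numpy as np

# Qsample / amplitude-encoding sketch (assumed construction, not the cited
# papers' circuits): a length-N probability distribution is stored in the
# amplitudes of a log2(N)-qubit statevector, so n qubits hold N = 2**n features.
rng = np.random.default_rng(0)
n_qubits = 3
N = 2 ** n_qubits

p = rng.random(N)
p /= p.sum()                           # classical distribution over N outcomes

state = np.sqrt(p)                     # amplitude encoding: |psi> = sum_i sqrt(p_i)|i>
assert np.isclose(np.vdot(state, state).real, 1.0)

# Born rule: sampling the computational basis reproduces the encoded distribution.
samples = rng.choice(N, size=100_000, p=np.abs(state) ** 2)
est = np.bincount(samples, minlength=N) / samples.size
print(np.round(p, 3))
print(np.round(est, 3))
```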