2022
DOI: 10.1007/978-3-030-96033-9_4
Quantum Attention Based Language Model for Answer Selection

Abstract: Chain-of-Thought and Program-Aided Language Models represent two distinct reasoning methods, each with its own strengths and weaknesses. We demonstrate that it is possible to combine the best of both worlds by using different models for different problems, employing a large language model (LLM) to perform model selection. Through a theoretical analysis, we discover that the performance improvement is determined by the differences between the combined methods and the success rate of choosing the correct model. …
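The abstract sketches a selection scheme: generate a solution with each reasoning method, then let an LLM pick between the candidates. A minimal illustration of that loop follows, where `llm` is a hypothetical prompt-in/text-out helper and the prompt templates are assumptions for demonstration, not the paper's actual implementation:

```python
# Sketch of LLM-based model selection between Chain-of-Thought (CoT)
# and Program-Aided (PAL) reasoning. `llm` is a hypothetical
# chat-completion callable (prompt string in, completion string out).

def solve_with_cot(llm, question: str) -> str:
    """CoT: ask for step-by-step natural-language reasoning."""
    return llm(f"Solve step by step, then state the final answer.\n\n{question}")

def solve_with_pal(llm, question: str) -> str:
    """PAL: ask for a program whose output is the answer."""
    return llm(f"Write a Python program that prints the answer.\n\n{question}")

def select_answer(llm, question: str) -> str:
    """Generate both candidate solutions, then let the LLM choose one."""
    cot = solve_with_cot(llm, question)
    pal = solve_with_pal(llm, question)
    choice = llm(
        "Two candidate solutions to the same problem follow. "
        "Reply with (A) or (B) for the one more likely to be correct.\n\n"
        f"Problem: {question}\n\n(A) {cot}\n\n(B) {pal}"
    )
    return cot if "(A)" in choice else pal
```

Under such a scheme, the gain over either single method depends on how differently the two methods fail and on how often the selector chooses correctly, which is the trade-off the abstract's theoretical analysis refers to.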

Cited by 2 publications (4 citation statements) | References 27 publications
“…Then they implemented Grover's search algorithm to find the correct answer. Zhao et al. (2022) [34] struck a balance between a model's performance and its interpretability by proposing a quantum attention-based language model, in which a density matrix is used in the quantum attention mechanism.…”
Section: Question Classification
confidence: 99%
“…Ref. [26] interpreted the quantum attention mechanism as a density matrix, by which more powerful sentence representations can be constructed. Unfortunately, the above two approaches only borrow certain physical concepts from quantum mechanics without providing specific quantum circuits.…”
Section: Introduction
confidence: 99%
“…In contrast to Refs. [25,26,40], QSAN can potentially be fully deployed and realized on quantum devices with fewer measurements, and it yields a beneficial byproduct called QBSASM. However, the essential motivation for proposing QSAN is to explore whether young quantum computers can exhibit quantum-characteristic attention and can describe the distribution of outputs in a quantum language, not to replace SAM or to beat all the schemes in the Ref.…”
Section: Introduction
confidence: 99%