2023
DOI: 10.21522/tijar.2014.10.02.art001

Light RAT-SQL: A RAT-SQL with More Abstraction and Less Embedding of Pre-existing Relations

Abstract: RAT-SQL is among the popular frameworks used in Text-To-SQL challenges for jointly encoding database relations and questions in a way that improves the semantic parser. In this work, we propose a light version of RAT-SQL in which we dramatically reduce the number of preexisting relations from 55 to 7 (Light RAT-SQL-7) while preserving the same parsing accuracy. To ensure the effectiveness of our approach, we trained a Light RAT-SQL-2 (with 2 embeddings) to show that there is a statistically signific…
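The core idea, collapsing many fine-grained preexisting relation types onto a few abstract classes so that the relation-embedding table shrinks, can be sketched as follows. This is a minimal illustration assuming a PyTorch-style encoder; the relation names and grouping below are hypothetical, not the paper's actual 55-to-7 assignment.

```python
import torch
import torch.nn as nn

# Illustrative grouping: many fine-grained schema-linking relation types
# collapse onto a handful of abstract classes, so the embedding table
# holds one row per class instead of one row per raw relation type.
# (Hypothetical names and assignment, for illustration only.)
ABSTRACT_CLASS = {
    "question-column-exact-match":   0,
    "question-column-partial-match": 0,
    "question-table-exact-match":    0,
    "column-belongs-to-table":       1,
    "table-contains-column":         1,
    "foreign-key-forward":           2,
    "foreign-key-backward":          2,
    # ... remaining fine-grained types map onto classes 3..6
}

class LightRelationEmbedding(nn.Module):
    """Relation embeddings keyed by abstract class instead of raw type."""

    def __init__(self, num_classes: int = 7, dim: int = 64):
        super().__init__()
        self.emb = nn.Embedding(num_classes, dim)

    def forward(self, relation_ids: torch.Tensor) -> torch.Tensor:
        # relation_ids: (n, n) matrix of abstract-class ids between
        # question tokens and schema items; returns (n, n, dim).
        return self.emb(relation_ids)
```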

Cited by 1 publication (2 citation statements)
References 41 publications
“…In this work, our proposed method improves the encoder side, especially the existing RAT-SQL model [8,7]. Please refer to [33,34,62] for a thorough description of the decoder side.…”
Section: Problem Definition (Text-To-SQL)
Mentioning confidence: 99%
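For context, the encoder side in question is RAT-SQL's relation-aware self-attention, in which learned relation embeddings bias both the attention logits and the aggregated values. Below is a minimal single-head sketch assuming a PyTorch implementation; the module name and dimensions are illustrative, not the authors' code.

```python
import math
import torch
import torch.nn as nn

class RelationAwareAttention(nn.Module):
    """Single-head relation-aware self-attention in the RAT-SQL style:
    relation embeddings r^K, r^V enter the attention logits and the
    weighted sum of values. Illustrative sketch, not the paper's code."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.rel_k = nn.Embedding(num_relations, dim)
        self.rel_v = nn.Embedding(num_relations, dim)
        self.scale = math.sqrt(dim)

    def forward(self, x: torch.Tensor, rel: torch.Tensor) -> torch.Tensor:
        # x: (n, dim) encodings of question tokens and schema items
        # rel: (n, n) relation ids between every pair of items
        q, k, v = self.q(x), self.k(x), self.v(x)
        rk, rv = self.rel_k(rel), self.rel_v(rel)   # (n, n, dim)
        # logits[i, j] = q_i . (k_j + r_ij^K) / sqrt(dim)
        logits = (q.unsqueeze(1) * (k.unsqueeze(0) + rk)).sum(-1) / self.scale
        attn = logits.softmax(dim=-1)               # (n, n)
        # z_i = sum_j attn[i, j] * (v_j + r_ij^V)
        return (attn.unsqueeze(-1) * (v.unsqueeze(0) + rv)).sum(dim=1)
```

Shrinking the relation vocabulary, as Light RAT-SQL does, reduces only the `num_relations` rows of the two embedding tables; the attention computation itself is unchanged.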
“…In Text-To-SQL [1][2][3][4][5][6][7][8][9], Light RAT-SQL [8] shows how to reduce the number of preexisting relations in the RAT-SQL framework [7] while preserving the exact match accuracy, without any enhancement from pre-trained large language models (LLMs) [10][11][12][13][14][15]. The limitation of this method is that it is not suitable for scenarios with a large number of pre-existing relations.…”
Section: Introduction
Mentioning confidence: 99%