2023
DOI: 10.1016/j.trb.2023.102783
Combining discrete choice models and neural networks through embeddings: Formulation, interpretability and performance

Cited by 8 publications (2 citation statements)
References 29 publications
“…Embedding techniques can be employed to represent categorical data as continuous-valued vectors in a lower-dimensional space, effectively mitigating memory overhead, especially when dealing with a large number of categorical features [76]. Traditional algorithms require numerical inputs, leading to the utilization of encoding methods to transform categorical values into numerical ones [33].…”
Section: A. Embedding Techniques (mentioning)
confidence: 99%
“…Another advanced method is the embeddings technique, which is used to represent categorical data as continuous-valued vectors in a lower-dimensional space. These continuous-valued vectors are called embeddings [76]. Additionally, this method enables models to comprehend the connections, parallels, and associations between various categories that the one-hot encoding method does not.…”
Section: Introduction (mentioning)
confidence: 99%
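The citation statements above describe embeddings as dense, lower-dimensional representations of categorical features, in contrast to sparse one-hot vectors. A minimal sketch of the idea, assuming a randomly initialised NumPy lookup table (all names and sizes here are illustrative, not from the paper; in practice the table is trained jointly with the rest of the model):

```python
import numpy as np

rng = np.random.default_rng(0)

n_categories = 10_000   # vocabulary size V of the categorical feature
embed_dim = 8           # embedding dimension d, with d << V

# Embedding table: one row per category. Here random; normally learned.
embedding_table = rng.normal(size=(n_categories, embed_dim))

def one_hot(idx, n):
    """Classic one-hot encoding: a sparse n-dimensional indicator vector."""
    v = np.zeros(n)
    v[idx] = 1.0
    return v

def embed(idx):
    """Embedding lookup: a dense d-dimensional vector (a table row)."""
    return embedding_table[idx]

# A lookup is mathematically the one-hot vector times the table,
# but stores and moves only d values per observation instead of V.
idx = 42
assert np.allclose(one_hot(idx, n_categories) @ embedding_table, embed(idx))

# Unlike one-hot vectors, which are all mutually orthogonal, embedding
# rows can encode graded similarity between categories, e.g. via cosine.
def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
```

This illustrates both points made in the quotes: the memory saving (d values per category rather than V) and the ability to express relationships between categories that one-hot encoding cannot.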