2022
DOI: 10.1007/978-3-031-20500-2_34
PHN: Parallel Heterogeneous Network with Soft Gating for CTR Prediction

Cited by 3 publications (2 citation statements)
References 18 publications
“…Later, DeepFM proposed replacing the wide part of Wide&Deep with FM; DCN proposed the Cross Network to learn higher-order feature interactions; DCN-V2 replaced the vector parameters with matrix parameters to increase the expressive capability of the network; xDeepFM [6] proposed the CIN structure, a vector-wise interaction approach that constructs more interpretable higher-order feature interactions. AutoInt [7] employs multi-head self-attention layers to represent feature interactions; FINT [9] designs a Field-aware Interaction Layer, which effectively extracts field-aware information and is also a vector-wise interaction method. PHN [3] combines three parallel towers (FFN, Cross layer, Field Interaction layer) to improve the expressive ability of the CTR prediction model.…”
Section: Feature Interaction in CTR
Mentioning confidence: 99%
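The cross-layer distinction mentioned in the statement above (DCN's per-layer vector weight versus DCN-V2's full weight matrix) can be illustrated with a minimal PyTorch sketch. The class name, dimensions, and stacking pattern below are illustrative assumptions, not code from the cited implementations.

```python
import torch
import torch.nn as nn


class CrossLayer(nn.Module):
    """One cross layer: x_{l+1} = x_0 * f(x_l) + x_l, with f vector- or matrix-parameterized."""

    def __init__(self, dim: int, matrix_param: bool = False):
        super().__init__()
        # DCN uses a vector weight per layer; DCN-V2 replaces it with a full matrix.
        self.matrix_param = matrix_param
        if matrix_param:
            self.w = nn.Linear(dim, dim, bias=True)     # DCN-V2 style: W x_l + b
        else:
            self.w = nn.Parameter(torch.randn(dim, 1))  # DCN style: scalar x_l^T w
            self.b = nn.Parameter(torch.zeros(dim))

    def forward(self, x0: torch.Tensor, xl: torch.Tensor) -> torch.Tensor:
        if self.matrix_param:
            return x0 * self.w(xl) + xl                 # element-wise product with x_0
        return x0 * (xl @ self.w) + self.b + xl         # (B,d)*(B,1) broadcasts over dims


# Toy usage: three stacked cross layers over a 16-dimensional embedding concatenation.
x0 = torch.randn(8, 16)
layers = [CrossLayer(16, matrix_param=True) for _ in range(3)]
x = x0
for layer in layers:
    x = layer(x0, x)
print(x.shape)  # torch.Size([8, 16])
```

The vector form keeps each layer cheap (O(d) parameters) but limits expressiveness to scaled copies of x_0; the matrix form trades O(d^2) parameters for richer element-wise recombination, which is the capability increase the citing paper refers to.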
“…• We use different feature interaction modules in PHN [9] and discuss the private and public components in the shared bottom layer for the task-specific and shared information of the tasks. We use the SSG module to strengthen the selection and combination of raw features for the different feature-crossing modules.…”
Section: Introduction
Mentioning confidence: 99%
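A rough sketch of the parallel-towers-with-soft-gating idea described in the statement above, in the same PyTorch style. The tower internals, the `SoftGate` class, and the final fusion are simplified assumptions for illustration, not the published PHN/SSG implementation.

```python
import torch
import torch.nn as nn


class SoftGate(nn.Module):
    """Learnable soft weights over parallel tower outputs (simplified SSG-style gate)."""

    def __init__(self, n_towers: int, dim: int):
        super().__init__()
        self.gate = nn.Linear(dim, n_towers)

    def forward(self, x, tower_outs):
        w = torch.softmax(self.gate(x), dim=-1)        # (B, n_towers) gating weights
        stacked = torch.stack(tower_outs, dim=1)       # (B, n_towers, d)
        return (w.unsqueeze(-1) * stacked).sum(dim=1)  # weighted sum over towers


class ParallelHeterogeneousNet(nn.Module):
    """Three parallel towers (FFN / cross-style / field-interaction-style) fused by a soft gate."""

    def __init__(self, dim: int):
        super().__init__()
        self.ffn = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.cross = nn.Linear(dim, dim)   # stand-in for a cross tower
        self.field = nn.Linear(dim, dim)   # stand-in for a field-interaction tower
        self.gate = SoftGate(3, dim)
        self.out = nn.Linear(dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        towers = [self.ffn(x), x * self.cross(x), self.field(x)]
        fused = self.gate(x, towers)
        return torch.sigmoid(self.out(fused))          # CTR probability


# Toy usage on a batch of 4 concatenated feature embeddings.
model = ParallelHeterogeneousNet(dim=16)
print(model(torch.randn(4, 16)).shape)  # torch.Size([4, 1])
```

The point of the gate is that each heterogeneous tower sees the same raw features but contributes to the final prediction with an input-dependent weight, which is the "selection and combination of raw features" role the citing paper attributes to the SSG module.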