We thank the editors for this opportunity and the discussants Kennedy, Balakrishnan and Wasserman (2020) (abbreviated as KBW in the sequel) for their insightful commentaries on our paper (Liu, Mukherjee and Robins, 2020) (abbreviated as LMR in the sequel).
A BRIEF INTRODUCTION TO HIGHER ORDER INFLUENCE FUNCTIONS

We would like to start our rejoinder by responding to the philosophical comments in Section 6 of KBW's discussion before turning to their more technical comments. In Section 6, KBW divide statistical procedures into structure-driven and methods-driven, while acknowledging that the boundary between these two categories is blurry. For example, even for the poster child of methods-driven tools, deep neural networks, one common research direction is to prove some form of optimality or robustness under assumptions often quantified by smoothness, sparsity or other related complexity measures such as metric entropy (Schmidt-Hieber, 2020; Hayakawa and Suzuki, 2020; Barron and Klusowski, 2018).

The discussants then state that higher order influence function (HOIF) based methods are 'structure-driven' because 'they typically rely on carefully constructed series estimates' and achieve 'better performance over appropriate Hölder spaces potentially at the expense of being more structure driven.' This statement misunderstands the motivation and goals of HOIF estimation. Our goal has always been to make HOIF fully methods-driven. However,