“…A recurring question in deep one-class classification is how to meaningfully regularize against a feature map collapse φ_ω ≡ c. Without regularization, minimum volume or maximum margin objectives, such as (16), (20), or (22), could be trivially solved with a constant mapping [137], [333]. Possible solutions include adding a reconstruction term or architectural constraints [137], [327], freezing the embedding [136], [139], [140], [142], [334], inversely penalizing the embedding variance [335], using true [144], [336], auxiliary [139], [233], [332], [337], or artificial [337] negative examples in training, pseudolabeling [152], [153], [155], [335], or integrating some manifold assumption [333].…”
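To make the collapse issue concrete, below is a minimal sketch of a Deep-SVDD-style minimum volume objective in PyTorch. It is not taken from the source and does not reproduce objectives (16), (20), or (22); the names (`Encoder`, `svdd_loss`, the center `c`) are hypothetical. It illustrates why an unconstrained network can drive the loss to zero with a constant mapping, and shows two of the countermeasures named above: an architectural constraint (bias-free layers) and fixing the center from an initial forward pass rather than optimizing it.

```python
# Minimal sketch (assumptions, not the authors' implementation): a
# Deep-SVDD-style one-class objective that pulls embeddings toward a
# fixed center c. If phi_omega were free to become a constant map equal
# to c, the loss would be trivially zero; bias-free layers and a fixed
# (non-learned) center are two simple ways to rule this out.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Bias-free MLP encoder: with bias terms, the network could output
    the constant c for every input and collapse the objective."""
    def __init__(self, in_dim=784, rep_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128, bias=False), nn.ReLU(),
            nn.Linear(128, rep_dim, bias=False),
        )

    def forward(self, x):
        return self.net(x)

def svdd_loss(phi_x, c):
    # Minimum volume objective: mean squared distance of embeddings to c.
    return ((phi_x - c) ** 2).sum(dim=1).mean()

phi = Encoder()
x_train = torch.randn(256, 784)          # stand-in for unlabeled training data
with torch.no_grad():
    c = phi(x_train).mean(dim=0)          # fix c once; it is not a parameter

opt = torch.optim.Adam(phi.parameters(), lr=1e-4)
for x in x_train.split(64):               # toy training loop
    opt.zero_grad()
    loss = svdd_loss(phi(x), c)
    loss.backward()
    opt.step()
```

The other remedies listed above replace or complement these constraints, for example by keeping a pretrained embedding frozen, adding a reconstruction term, or introducing (true, auxiliary, or artificial) negative examples so that a constant map is no longer optimal.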