Incremental one-class learning using regularized null-space training for industrial defect detection

Title: Incremental one-class learning using regularized null-space training for industrial defect detection
Publication Type: Conference Paper
Year of Publication: 2023
Authors: Hermann, M., Umlauf, G., Goldlücke, B., & Franz, M. O.
Conference Name: 16th International Conference on Machine Vision (ICMV)
Date Published: 11/2023
Abstract

One-class incremental learning is a special case of class-incremental learning, where only a single novel class is incrementally added to an existing classifier instead of multiple classes.
This case is relevant in industrial defect detection scenarios, where novel defects usually appear during operation.
In this scenario, already deployed classifiers must be updated incrementally with only a few examples of the novel class.
In addition, the base classifier often must not be altered due to approval and warranty restrictions.
While simple finetuning often gives the best performance across old and new classes, it comes with the drawback of potentially losing performance on the base classes (catastrophic forgetting).
Simple prototype approaches work without changing existing weights and perform very well when the classes are well separated, but fail dramatically when they are not.
In theory, null-space training (NSCL) should retain the base classifier entirely, as parameter updates are restricted to the null space of the network with respect to existing classes.
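The null-space constraint can be illustrated with a minimal sketch on a single linear layer: an update direction is projected onto the null space of the old classes' activation matrix, so the layer's outputs on old-class data are provably unchanged. All shapes, data, and the learning rate below are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

# Toy null-space constrained update for one linear layer.
rng = np.random.default_rng(0)
X_old = rng.normal(size=(5, 8))   # assumed: activations of old-class samples
W = rng.normal(size=(8, 3))       # assumed: layer weights to be updated

# Null space of X_old: right-singular vectors with (near-)zero singular values.
_, s, Vt = np.linalg.svd(X_old, full_matrices=True)
rank = int(np.sum(s > 1e-10))
N = Vt[rank:].T                   # basis of the null space, shape (8, 8 - rank)

grad = rng.normal(size=W.shape)   # some gradient from the new-class loss
proj_grad = N @ (N.T @ grad)      # project the update onto the null space

W_new = W - 0.1 * proj_grad       # X_old @ W_new equals X_old @ W,
                                  # so old-class outputs are preserved
```

Because `X_old @ N = 0`, any update built from columns of `N` cannot change the responses on the old data, which is the sense in which NSCL "retains" the base classifier.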
However, as we show, this technique promotes overfitting in the case of one-class incremental learning.
In our experiments, we found that unconstrained weight growth in null space is the underlying issue, leading us to propose a regularization term (R-NSCL) that penalizes the magnitude of amplification.
The regularization term is added to the standard classification loss and stabilizes null-space training in the one-class scenario by counteracting overfitting.
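A minimal sketch of such a regularized objective, assuming a simple squared-magnitude penalty on the weight change relative to the frozen base weights (the penalty form, function name, and weighting `lam` are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def regularized_loss(cls_loss, W, W_base, lam=0.01):
    """Classification loss plus a penalty on the update magnitude.

    cls_loss: scalar classification loss on the novel class
    W:        current weights, W_base: frozen base-classifier weights
    lam:      regularization strength (assumed value)
    """
    delta = W - W_base
    return cls_loss + lam * np.sum(delta ** 2)
```

The penalty vanishes when the weights stay at their base values and grows with the magnitude of the null-space update, which is how it counteracts unconstrained weight growth.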
We test the method's capabilities on two industrial datasets, namely AITEX and MVTec, and compare the performance to state-of-the-art algorithms for class-incremental learning.