Publication

Adaptive online incremental learning for evolving data streams

Abstract: Recent years have witnessed growing interest in online incremental learning. However, three major challenges remain in this area. The first is concept drift: the probability distribution of the streaming data changes as new data arrive. The second is catastrophic forgetting: previously learned knowledge is lost when new knowledge is learned. The last, which is often ignored, is the learning of latent representations; only a good latent representation can improve the prediction accuracy of the model. Our research builds on this observation and attempts to overcome these difficulties. To this end, we propose Adaptive Online Incremental Learning for evolving data streams (AOIL). We use an auto-encoder with a memory module: on the one hand, it extracts the latent features of the input; on the other hand, based on the reconstruction loss of the auto-encoder with the memory module, it detects the presence of concept drift and triggers the update mechanism, adjusting the model parameters in time. In addition, we divide the features derived from the activations of the hidden layers into two parts, which are used to extract the common and private features respectively. In this way, the model learns the private features of newly arriving instances without forgetting what was learned in the past (the shared features), which reduces the occurrence of catastrophic forgetting. At the same time, we use the self-attention mechanism to effectively fuse the extracted features into a fused feature vector, which further improves latent representation learning. Moreover, to further improve the robustness of the algorithm, we add a de-noising auto-encoder to the original framework. Finally, we conduct extensive experiments on different datasets and show that the proposed AOIL achieves promising results and outperforms other state-of-the-art methods. (C) 2021 Elsevier B.V. All rights reserved.
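The abstract's core loop, detecting concept drift from the reconstruction loss of an auto-encoder and triggering an update when the loss spikes, can be illustrated with a minimal sketch. This is not the paper's implementation: the tiny tied-weight linear auto-encoder, the running-statistics threshold `k`, and all names here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

class DriftDetectingAE:
    """Toy online auto-encoder that flags concept drift when the
    reconstruction loss jumps above its running mean by k std devs."""

    def __init__(self, dim, hidden, lr=0.01, k=3.0):
        self.W = rng.normal(0.0, 0.1, (hidden, dim))  # tied encoder/decoder weights
        self.lr = lr
        self.k = k                                    # drift threshold (std units)
        self.mean, self.var, self.n = 0.0, 1.0, 0     # running loss statistics

    def loss(self, x):
        z = self.W @ x          # encode
        x_hat = self.W.T @ z    # decode with tied weights
        return float(np.mean((x - x_hat) ** 2))

    def step(self, x):
        l = self.loss(x)
        # drift check: loss far above its running mean signals a distribution change
        drift = bool(self.n > 10 and l > self.mean + self.k * np.sqrt(self.var))
        if drift:
            # "update mechanism": reset statistics so the model re-adapts quickly
            self.mean, self.var, self.n = l, 1.0, 1
        else:
            # Welford-style running mean/variance update of the loss
            self.n += 1
            d = l - self.mean
            self.mean += d / self.n
            self.var += (d * (l - self.mean) - self.var) / self.n
        # one SGD step on the reconstruction loss (always keep adapting)
        z = self.W @ x
        err = self.W.T @ z - x
        grad = np.outer(z, err) + self.W @ np.outer(err, x)
        self.W -= self.lr * (2.0 / x.size) * grad
        return drift
```

On a stationary stream the loss drifts slowly downward and stays below the running mean, so no drift is flagged; a sudden change in input scale or distribution produces a loss spike that trips the threshold.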

Keywords: Adaptive online incremental learning; Auto-encoder with memory module; Concept drift; Catastrophic forgetting; Latent representation; Self-attention mechanism

ISSN: 1568-4946

Volume/Issue/Pages: Volume 105

Publication date: 2021-07-01

Journal ranking (CAS partition for SCI journals): Tier 2

Indexed in: SCIE (Science Citation Index Expanded), EI (Engineering Index)

Journal: APPLIED SOFT COMPUTING

Corresponding author: 张思思

First authors: 刘建伟, 左信

Paper type: Journal article

Citation: 张思思, 刘建伟, 左信, Adaptive online incremental learning for evolving data streams, APPLIED SOFT COMPUTING, 2021, Volume 105

Title: Adaptive online incremental learning for evolving data streams