Optimizing ADWIN for Steady Streams

Date
2022
Publisher
Association for Computing Machinery (ACM)
Abstract
With ever-growing data generation rates and stringent constraints on the latency of analyzing such data, stream analytics is gaining prominence. Learning from data streams, also known as online machine learning, is no exception. However, online machine learning poses challenges across all aspects of the learning process, from algorithm design to evaluation methodology. One of these challenges is the ability of a learning system to adapt to changes in the data distribution, known as concept drift, in order to maintain prediction accuracy. Over time, several drift detection approaches have been proposed. A prominent one is adaptive windowing (ADWIN), which can detect changes in the distribution of feature data without explicit feedback on the correctness of predictions. Several ADWIN variants have been proposed to enhance its runtime performance with respect to throughput and latency. However, the drift detection accuracy of these variants was not compared with the original algorithm, and there is no study of the memory consumption of either the variants or the original algorithm. Additionally, earlier evaluations used synthetic datasets with a considerable number of drifts, not covering all drift types or steady streams, i.e., streams with no drifts at all or only negligible drifts. The contribution of this paper is two-fold. First, we compare the original Adaptive Window (ADWIN) and its variants, Serial, HalfCut, and Optimistic, in terms of drift detection accuracy, detection speed, and memory consumption, represented by the internal window size. We compare them on synthetic datasets covering different types of concept drift, namely incremental, gradual, abrupt, and steady, as well as on two real-life datasets whose drifts are unknown. Second, we present ADWIN++, which uses an adaptive bucket dropping technique to control the window size. We evaluate our technique on the same datasets and on new datasets with fewer drifts. Experiments show that our approach saves about 80% of memory consumption. Moreover, it takes less time to detect concept drift and maintains the drift detection accuracy.
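For orientation, the sketch below illustrates the adaptive-windowing idea that ADWIN is built on: keep a window of recent values, compare the means of every old/recent split, and shrink the window from the old end when the means differ by more than a Hoeffding-style threshold. This is a simplified illustration under assumptions, not the bucket-based exponential-histogram implementation whose memory the paper's adaptive bucket dropping targets; the class name SimpleAdaptiveWindow and the delta value are chosen only for the example.

```python
import math
from collections import deque

class SimpleAdaptiveWindow:
    """Simplified adaptive-window drift detector (illustrative only)."""

    def __init__(self, delta=0.002):
        self.delta = delta          # confidence parameter for the cut test
        self.window = deque()       # recent values, oldest on the left

    def update(self, value):
        """Add a value; return True if a distribution change was detected."""
        self.window.append(value)
        drift = False
        changed = True
        while changed and len(self.window) >= 2:
            changed = False
            n = len(self.window)
            total = sum(self.window)
            left_sum, left_n = 0.0, 0
            # Check every split of the window into an "old" and a "recent" part.
            for v in self.window:
                left_sum += v
                left_n += 1
                right_n = n - left_n
                if right_n == 0:
                    break
                mu_left = left_sum / left_n
                mu_right = (total - left_sum) / right_n
                m = 1.0 / (1.0 / left_n + 1.0 / right_n)  # harmonic mean of sizes
                eps_cut = math.sqrt((1.0 / (2.0 * m)) * math.log(4.0 / self.delta))
                if abs(mu_left - mu_right) > eps_cut:
                    # Drop the old part and re-check the shrunk window.
                    for _ in range(left_n):
                        self.window.popleft()
                    drift = True
                    changed = True
                    break
        return drift

# Example: an abrupt drift halfway through a synthetic stream.
detector = SimpleAdaptiveWindow()
stream = [0.0] * 500 + [1.0] * 500
for t, x in enumerate(stream):
    if detector.update(x):
        print(f"drift detected at item {t}, window size now {len(detector.window)}")
```

Dropping the prefix on detection is what keeps the window focused on the current concept; the bucket-based ADWIN achieves the same effect while storing only a logarithmic number of summary buckets, which is the memory footprint the paper studies.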
Keywords
Online machine learning, Concept drifts, ADWIN, Steady streams
Citation
Moharram, H., Awad, A. and El-Kafrawy, P.M. (2022) “Optimizing ADWIN for steady streams,” in Proceedings of the 37th ACM/SIGAPP Symposium on Applied Computing.