Browsing by Subject "Soft gradient boosting decision tree (sGBDT)"
Now showing 1 - 2 of 2
Item Open Access
End-to-end hybrid architectures for effective sequential data prediction (2023-08) Aydın, Mustafa Enes
We investigate nonlinear prediction in an online setting and introduce two hybrid models that, via end-to-end architectures, mitigate the need for hand-designed features and the manual model selection issues of conventional nonlinear prediction/regression methods. First, we use an enhanced recurrent neural network (LSTM) to extract features from sequential signals while preserving the state information, i.e., the history, and soft gradient boosted decision trees (sGBDT) to produce the final output. The connection is end-to-end, and we jointly optimize the whole architecture using stochastic gradient descent. Second, we again use recursive structures (LSTM) for automatic feature extraction from raw data, but pair them with a traditional linear time series model (SARIMAX) to handle the intricacies of sequential data, e.g., seasonality. The unification of the models is again joint: it is through a single state space, and we optimize the entire architecture using particle filtering. The proposed frameworks are generic, so one can use other recurrent architectures, e.g., GRUs, and other differentiable machine learning algorithms, as well as time series models that have state space representations, in lieu of the specific models presented. We demonstrate the learning behavior of the models on synthetic data, and the significant performance improvements over the conventional methods and the disjoint counterparts on various real life datasets, with which we also show the generic nature of the frameworks. Furthermore, we openly share the source code of the proposed methods to facilitate further research.

Item Open Access
A hybrid framework for sequential data prediction with end-to-end optimization (Elsevier, 2022-08-08) Aydin, M.E.; Kozat, Süleyman S.
We investigate nonlinear prediction in an online setting and introduce a hybrid model that, via an end-to-end architecture, mitigates the need for hand-designed features and the manual model selection issues of conventional nonlinear prediction/regression methods. In particular, we use recursive structures to extract features from sequential signals while preserving the state information, i.e., the history, and boosted decision trees to produce the final output. The connection is end-to-end, and we jointly optimize the whole architecture using stochastic gradient descent, for which we also provide the backward pass update equations. Specifically, we employ a recurrent neural network (LSTM) for adaptive feature extraction from sequential data and a gradient boosting machinery (soft GBDT) for effective supervised regression. Our framework is generic, so one can use other deep learning architectures for feature extraction (such as RNNs and GRUs) and other machine learning algorithms for decision making, as long as they are differentiable. We demonstrate the learning behavior of our algorithm on synthetic data and the significant performance improvements over the conventional methods on various real life datasets. Furthermore, we openly share the source code of the proposed method to facilitate further research. © 2022 Elsevier Inc.
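Both abstracts hinge on the fact that a soft decision tree is differentiable, so it can be trained jointly with an upstream network by stochastic gradient descent. Below is a minimal, hypothetical sketch of that property (not the authors' implementation): a depth-1 soft tree whose sigmoid gate routes each sample to both leaves with complementary probabilities, trained with plain SGD on a toy regression task. All names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 2))              # toy input features
y = np.where(X[:, 0] > 0.0, 1.0, -1.0)     # target: sign of feature 0

w = rng.normal(scale=0.1, size=2)          # gate weights
b = 0.0                                    # gate bias
leaf = rng.normal(scale=0.1, size=2)       # leaf values [left, right]
lr = 0.2

def forward(X):
    g = sigmoid(X @ w + b)                 # probability of routing right
    return (1.0 - g) * leaf[0] + g * leaf[1], g

pred0, _ = forward(X)
loss0 = np.mean((pred0 - y) ** 2)          # squared loss before training

for _ in range(500):
    pred, g = forward(X)
    err = pred - y
    # Analytic gradients of the squared loss w.r.t. all parameters:
    # the soft routing makes every term differentiable, which is what
    # allows joint end-to-end optimization with a feature extractor.
    grad_leaf0 = np.mean(err * (1.0 - g))
    grad_leaf1 = np.mean(err * g)
    dgate = err * (leaf[1] - leaf[0]) * g * (1.0 - g)
    grad_w = X.T @ dgate / len(X)
    grad_b = np.mean(dgate)
    leaf -= lr * np.array([grad_leaf0, grad_leaf1])
    w -= lr * grad_w
    b -= lr * grad_b

pred1, _ = forward(X)
loss1 = np.mean((pred1 - y) ** 2)          # squared loss after training
print(loss0, loss1)
```

A real sGBDT stacks many such trees and boosts them; here a single stump suffices to show that gradients flow through the gate, so the same update could also propagate back into an LSTM feeding it features.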
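The thesis's second model places the LSTM and SARIMAX components in a single state space and optimizes the whole via particle filtering. As a self-contained, hypothetical illustration of that machinery (not the thesis code), the sketch below runs a bootstrap particle filter on the simplest state-space model, a latent random walk observed in noise; the noise levels and particle count are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 100, 1000                   # time steps, number of particles
q, r = 0.1, 0.5                    # process / observation noise std devs

# Simulate a ground-truth random-walk trajectory and noisy observations:
#   x_t = x_{t-1} + N(0, q^2),   y_t = x_t + N(0, r^2)
x_true = np.cumsum(rng.normal(scale=q, size=T))
y_obs = x_true + rng.normal(scale=r, size=T)

particles = rng.normal(scale=1.0, size=N)
estimates = np.empty(T)
for t in range(T):
    # Propagate: sample each particle from the transition model.
    particles = particles + rng.normal(scale=q, size=N)
    # Weight: Gaussian likelihood of the current observation.
    wts = np.exp(-0.5 * ((y_obs[t] - particles) / r) ** 2)
    wts /= wts.sum()
    estimates[t] = wts @ particles          # posterior-mean state estimate
    # Resample (multinomial) to avoid weight degeneracy.
    particles = particles[rng.choice(N, size=N, p=wts)]

rmse_filter = np.sqrt(np.mean((estimates - x_true) ** 2))
rmse_raw = np.sqrt(np.mean((y_obs - x_true) ** 2))
print(rmse_filter, rmse_raw)
```

Because the propagate/weight/resample loop only needs samples from the transition model and pointwise likelihoods, the same scheme accommodates nonlinear components such as an LSTM inside the state equation, which is what makes it a natural fit for the hybrid LSTM–SARIMAX state space.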