Browsing by Subject "Nonlinear models"
Now showing 1 - 3 of 3
Item Open Access
A coupled model for healthy and cancerous cells dynamics in Acute Myeloid Leukemia (IFAC, 2014)
Avila, J. L.; Bonnet, C.; Özbay, Hitay; Clairambault, J.; Niculescu, S. I.; Hirsch, P.; Delhommeau, F.
In this paper we propose a coupled model for healthy and cancerous cell dynamics in Acute Myeloid Leukemia. The PDE-based model is transformed into a nonlinear distributed delay system. For an equilibrium point of interest, necessary and sufficient conditions for local asymptotic stability are given. Simulation examples illustrate the results.

Item Open Access
Gradient boosting with moving-average terms for nonlinear sequential regression (Institute of Electrical and Electronics Engineers, 2023-08-23)
İlhan, Emirhan; Turalı, Mehmet Yiğit; Kozat, Suleyman Serdar
We investigate sequential nonlinear regression and introduce a novel gradient boosting algorithm that exploits the residuals, i.e., the prediction errors, as additional features, inspired by the well-known linear auto-regressive moving-average (ARMA) models. Our algorithm uses the state information from earlier time steps contained in the residuals to improve performance in a nonlinear sequential regression/prediction framework. It captures changes across previous time steps through residual terms passed between boosting steps by jointly optimizing the model parameters and the feature vectors. For this optimization, we define an iterative procedure that alternates between two steps: the first evaluates the optimal base learner parameters for fixed residual values, and the second updates the residuals given the new parameters. We show on both artificial and well-known real-life competition datasets that our method significantly outperforms the state of the art.

Item Open Access
Nonlinear regression via incremental decision trees (Elsevier, 2019)
Vanlı, N.; Sayın, M.; Neyshabouri, Mohammadreza Mohaghegh; Özkan, H.; Kozat, Süleyman S.
We study sequential nonlinear regression and introduce an online algorithm that elegantly mitigates, via an adaptively incremental hierarchical structure, the convergence and undertraining issues of conventional nonlinear regression methods. In particular, we present a piecewise linear (or nonlinear) regression algorithm that partitions the regressor space and learns a linear model in each region, then combines these models. Unlike conventional approaches, our algorithm effectively learns the optimal regressor-space partition with the desired complexity in a completely sequential and data-driven manner. The algorithm sequentially and asymptotically achieves the performance of the optimal twice-differentiable regression function for any data sequence without any statistical assumptions, and can be implemented with a computational complexity that is only logarithmic in the length of the data. In our experiments, we demonstrate significant gains over state-of-the-art techniques on well-known benchmark real datasets.
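The first item above transforms a PDE model into a nonlinear distributed delay system. The sketch below shows one way such a system can be simulated, by discretizing the delay integral and stepping with an explicit Euler scheme; the two-population equations, Hill-type feedback, kernel, and parameter values are illustrative placeholders of my own, not the model of Avila et al. (2014).

```python
# Minimal sketch: simulating a generic two-population nonlinear distributed
# delay system (NOT the Avila et al. model; all equations/parameters assumed).
import numpy as np

dt = 0.01                     # Euler integration step
tau = 2.0                     # memory horizon of the distributed delay
T = 50.0                      # simulated time span
n_hist = int(tau / dt)        # number of stored past values per population

# Hill-type feedback and an exponential delay kernel (illustrative choices)
beta = lambda x: 1.0 / (1.0 + x ** 2)
ages = np.linspace(tau, 0.0, n_hist)            # oldest stored sample first
kernel = np.exp(-ages)
kernel /= kernel.sum() * dt                     # normalize so the integral is 1

def delayed(history):
    """Approximate the distributed delay integral over the stored history."""
    return dt * np.dot(kernel, history)

hist_h = np.full(n_hist, 1.0)    # constant initial history, "healthy" compartment
hist_c = np.full(n_hist, 0.1)    # constant initial history, "cancerous" compartment
k_h, k_c, coupling = 0.5, 0.3, 0.2              # placeholder rate constants

for _ in range(int(T / dt)):
    h, c = hist_h[-1], hist_c[-1]
    dh_del, dc_del = delayed(hist_h), delayed(hist_c)
    dh = -k_h * h + beta(dh_del + coupling * dc_del) * dh_del
    dc = -k_c * c + beta(dc_del) * dc_del
    hist_h = np.append(hist_h[1:], h + dt * dh)  # slide the history window
    hist_c = np.append(hist_c[1:], c + dt * dc)

print(f"final healthy = {hist_h[-1]:.4f}, cancerous = {hist_c[-1]:.4f}")
```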
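The second item above feeds prediction residuals back into the boosting stages as moving-average-style features. The sketch below illustrates that idea under simplifying assumptions (a toy sine series, scikit-learn decision-tree base learners, and a single forward pass rather than the paper's alternating optimization); it is not the authors' implementation.

```python
# Minimal sketch: gradient boosting where each stage sees an extra feature
# holding the previous time step's residual (ARMA-inspired). Simplified; not
# the exact algorithm of İlhan, Turalı and Kozat (2023).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy sequence: each feature vector holds the last `window` observations
series = np.sin(0.1 * np.arange(500)) + 0.1 * rng.standard_normal(500)
window = 5
X = np.stack([series[t - window:t] for t in range(window, len(series))])
y = series[window:]

n_stages, lr = 20, 0.1
pred = np.zeros_like(y)
residual_feat = np.zeros((len(y), 1))   # moving-average-style residual feature
models = []

for _ in range(n_stages):
    target = y - pred                              # negative gradient of squared loss
    X_aug = np.hstack([X, residual_feat])          # augment features with residual term
    tree = DecisionTreeRegressor(max_depth=3).fit(X_aug, target)
    pred += lr * tree.predict(X_aug)
    models.append(tree)
    # Each sample's extra feature becomes the residual at the previous time
    # step under the current ensemble, mimicking a moving-average term.
    res = y - pred
    residual_feat = np.concatenate(([0.0], res[:-1]))[:, None]

print("train MSE:", float(np.mean((y - pred) ** 2)))
```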
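The third item above combines region-wise linear models over a partition of the regressor space, learned sequentially. The sketch below shows only the region-wise part, with a fixed 1-D partition and simple LMS updates; learning the partition itself through an adaptively incremental tree, which is the paper's actual contribution, is not reproduced here.

```python
# Minimal sketch: piecewise linear sequential regression with a FIXED partition
# and per-region LMS updates (a simplification of the incremental-tree method).
import numpy as np

rng = np.random.default_rng(1)

boundaries = np.array([-0.5, 0.0, 0.5])   # assumed split points of the 1-D regressor space
n_regions = len(boundaries) + 1
w = np.zeros((n_regions, 2))              # per-region weights for the features [x, 1]
mu = 0.05                                 # LMS step size

def region(x):
    """Index of the region the sample falls into."""
    return int(np.searchsorted(boundaries, x))

sq_err = 0.0
n_samples = 2000
for _ in range(n_samples):
    x = rng.uniform(-1.0, 1.0)
    y = np.sin(np.pi * x) + 0.05 * rng.standard_normal()   # nonlinear target
    r = region(x)
    phi = np.array([x, 1.0])
    y_hat = w[r] @ phi                     # predict with the active region's linear model
    e = y - y_hat
    sq_err += e ** 2
    w[r] += mu * e * phi                   # online update of that region's model only

print("average squared prediction error:", sq_err / n_samples)
```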