Author: Mirza, Ali H.
Date accessioned: 2019-02-21
Date available: 2019-02-21
Date issued: 2018-05
Handle: http://hdl.handle.net/11693/50222
Date of Conference: 2-5 May 2018
Conference name: 26th Signal Processing and Communications Applications Conference (SIU) 2018
Abstract: In this paper, we propose a boosted regression algorithm in an online framework. We form a linear combination of the outputs estimated by the weak learners and weigh each estimated output differently by introducing ensemble coefficients. We then update the ensemble weight coefficients using both additive and multiplicative updates, along with stochastic gradient updates of the regression weight coefficients. We make the proposed algorithm robust by introducing two critical factors: a significance factor and a penalty factor. These two factors play a crucial role in the gradient updates of the regression weight coefficients and in improving the regression performance. The proposed algorithm is guaranteed to converge, with a regret bound that decays exponentially in the number of weak learners. We then demonstrate the performance of the proposed algorithm on both synthetic and real-life data sets.
Language: English
Keywords: Boosted regression; Boosting; Ensemble learning; Multiplicative updates; Regression
Title: Online boosting algorithm for regression with additive and multiplicative updates
Type: Conference Paper
DOI: 10.1109/SIU.2018.8404455
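
Since the record above only summarizes the method at a high level, the following is a minimal sketch of an online boosted regressor with additive and multiplicative ensemble-weight updates, assuming linear weak learners, squared loss, and illustrative forms for the significance and penalty factors. Names such as `OnlineBoostedRegressor`, `lambda_sig`, and `lambda_pen` are assumptions for illustration, not the paper's notation or exact update rules.

import numpy as np


class OnlineBoostedRegressor:
    """Sketch of online boosted regression with additive and multiplicative
    updates of the ensemble coefficients (assumed realisation, not the
    paper's exact algorithm)."""

    def __init__(self, n_learners, n_features, lr_w=0.01, lr_add=0.01,
                 lr_mult=0.01, lambda_sig=1.0, lambda_pen=0.1):
        self.m = n_learners
        # Regression weights of each linear weak learner (one row per learner).
        self.W = np.zeros((n_learners, n_features))
        # Ensemble coefficients combining the weak-learner outputs.
        self.alpha = np.full(n_learners, 1.0 / n_learners)
        self.lr_w, self.lr_add, self.lr_mult = lr_w, lr_add, lr_mult
        self.lambda_sig = lambda_sig  # "significance" factor (assumed form)
        self.lambda_pen = lambda_pen  # "penalty" factor (assumed form)

    def predict(self, x):
        y_hat_k = self.W @ x               # each weak learner's estimate
        return self.alpha @ y_hat_k, y_hat_k

    def update(self, x, y):
        y_hat, y_hat_k = self.predict(x)
        err = y_hat - y                    # squared-loss residual of the ensemble

        # Stochastic gradient update of each learner's regression weights,
        # scaled by its ensemble coefficient via the significance factor and
        # shrunk by the penalty factor (an assumed realisation of the two factors).
        for k in range(self.m):
            grad = self.lambda_sig * self.alpha[k] * err * x + self.lambda_pen * self.W[k]
            self.W[k] -= self.lr_w * grad

        # Additive (gradient) update of the ensemble coefficients ...
        self.alpha -= self.lr_add * err * y_hat_k
        # ... combined with a multiplicative (exponentiated-gradient) update,
        # followed by renormalisation onto the simplex.
        self.alpha *= np.exp(-self.lr_mult * err * y_hat_k)
        self.alpha = np.maximum(self.alpha, 1e-12)
        self.alpha /= self.alpha.sum()
        return y_hat


# Usage on a toy data stream: one (x, y) pair processed at a time.
rng = np.random.default_rng(0)
model = OnlineBoostedRegressor(n_learners=5, n_features=3)
true_w = np.array([0.5, -1.0, 2.0])
for _ in range(1000):
    x = rng.normal(size=3)
    y = true_w @ x + 0.1 * rng.normal()
    model.update(x, y)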