Browsing by Subject "Multiplicative updates"
Now showing 1 - 3 of 3
Item Open Access
Adaptive mixture methods based on Bregman divergences (Elsevier, 2013)
Donmez, M. A.; Inan, H. A.; Kozat, S. S.
We investigate adaptive mixture methods that linearly combine the outputs of m constituent filters running in parallel to model a desired signal. We use Bregman divergences and obtain certain multiplicative updates to train the linear combination weights, either under an affine constraint or without any constraints. We use the unnormalized relative entropy and the relative entropy to define two different Bregman divergences, which produce an unnormalized exponentiated gradient update and a normalized exponentiated gradient update on the mixture weights, respectively (a minimal sketch of the normalized update appears after this listing). We then carry out the mean and mean-square transient analysis of these adaptive algorithms when they are used to combine the outputs of m constituent filters. We illustrate the accuracy of our results and demonstrate the effectiveness of these updates for sparse mixture systems.

Item Open Access
Online boosting algorithm for regression with additive and multiplicative updates (IEEE, 2018-05)
Mirza, Ali H.
In this paper, we propose a boosted regression algorithm in an online framework. The final estimate is a linear combination of the weak learners' estimated outputs, with each output weighted by an ensemble coefficient. We update the ensemble weight coefficients using both additive and multiplicative updates, alongside stochastic gradient updates of the regression weight coefficients (see the second sketch after this listing). We make the proposed algorithm robust by introducing two critical factors: a significance factor and a penalty factor. These two factors play a crucial role in the gradient updates of the regression weight coefficients and in improving regression performance. The proposed algorithm is guaranteed to converge, with a regret bound that decays exponentially in the number of weak learners. We demonstrate the performance of the proposed algorithm on both synthetic and real-life data sets.

Item Open Access
Variants of combinations of additive and multiplicative updates for GRU neural networks (IEEE, 2018)
Mirza, Ali H.
In this paper, we formulate several variants of mixed additive and multiplicative updates, derived from the stochastic gradient descent (SGD) and exponentiated gradient (EG) algorithms, respectively. We apply these updates to gated recurrent unit (GRU) networks and derive the gradient-based updates for the network parameters. We propose four update variants for the GRU network: mean, minimum, even-odd, and balanced (the mean variant is sketched after this listing). Through an extensive set of experiments, we demonstrate that these variants outperform plain SGD and EG updates. Overall, the GRU-Mean update achieves the lowest cumulative and steady-state error. We also run the same set of experiments on long short-term memory (LSTM) networks.
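The normalized exponentiated gradient update named in the first item can be illustrated with a short sketch. This is a minimal reading of the standard EG update for a linear mixture under squared error; the learning rate mu and all variable names are assumptions, not values taken from the paper.

```python
import numpy as np

def eg_mixture_update(w, outputs, d, mu=0.1):
    """Normalized exponentiated gradient (EG) step on mixture weights.

    w       : current combination weights over the m constituent filters
              (nonnegative, summing to 1)
    outputs : the m constituent filter outputs at the current time step
    d       : desired signal sample
    mu      : learning rate (assumed value)
    """
    e = d - w @ outputs                   # instantaneous estimation error
    w_new = w * np.exp(mu * e * outputs)  # multiplicative (EG) update
    return w_new / w_new.sum()            # renormalize onto the simplex
```

Dropping the final normalization gives the unnormalized exponentiated gradient variant that the abstract derives from the unnormalized relative entropy.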
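For the second item, a hedged sketch of one online step that combines an additive and a multiplicative update of the ensemble coefficients. The equal averaging of the two updates and the step sizes are assumptions; the paper's significance and penalty factors are not specified in the abstract and are omitted here.

```python
import numpy as np

def boosted_ensemble_step(c, preds, d, mu_add=0.01, mu_mult=0.01):
    """Combined additive + multiplicative update of ensemble coefficients.

    c     : ensemble coefficients weighting the m weak learners
    preds : predictions of the m weak learners for the current sample
    d     : desired output
    Step sizes and the equal mixing of the two updates are assumptions.
    """
    e = d - c @ preds                        # ensemble prediction error
    c_add = c + mu_add * e * preds           # additive (SGD-style) update
    c_mult = c * np.exp(mu_mult * e * preds) # multiplicative (EG-style) update
    return 0.5 * (c_add + c_mult)            # simple average of the two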
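For the third item, a sketch of the "mean" variant applied to a single GRU parameter matrix: the parameter is moved by the average of an SGD step and an EG step. The elementwise exponential form of the EG step (which keeps positive parameters positive) and the step sizes are assumptions, not the paper's exact formulation.

```python
import numpy as np

def gru_mean_update(W, grad, eta_sgd=0.01, eta_eg=0.01):
    """'Mean' variant: average of an SGD step and an EG step on a GRU weight.

    W    : a GRU parameter matrix (e.g., an update-gate weight)
    grad : gradient of the loss with respect to W
    Step sizes are assumed values.
    """
    W_sgd = W - eta_sgd * grad        # additive (SGD) step
    W_eg = W * np.exp(-eta_eg * grad) # multiplicative (EG) step
    return 0.5 * (W_sgd + W_eg)       # mean of the two candidate updates
```

The minimum, even-odd, and balanced variants named in the abstract would select between or interleave the two candidate updates rather than average them; their exact rules are not given in the abstract.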