Author: Aslan, Fatih
Dates: 2021-09-22; 2021-09; 2021-09-16
URI: http://hdl.handle.net/11693/76529
Description: Cataloged from PDF version of article. Includes bibliographical references (leaves 56-60).
Abstract: We investigate the sequential modeling problem and introduce a novel gating mechanism into temporal convolutional network architectures. In particular, we propose the Gated Temporal Convolutional Network (GTCN) architecture with elaborately tailored gating mechanisms. In our implementation, we alter the way in which the gradients flow and avoid the vanishing or exploding gradient and dead ReLU problems. The proposed GTCN architecture can also model irregularly sampled sequences. In our experiments, we show that the basic GTCN architecture is superior to generic TCN architectures on various benchmark tasks requiring the modeling of long-term dependencies and irregular sampling intervals. Moreover, we achieve state-of-the-art results on the permuted sequential MNIST and sequential CIFAR10 benchmarks with the basic structure.
Physical description: xiii, 60 leaves : illustrations, charts ; 30 cm.
Language: English
Rights: info:eu-repo/semantics/openAccess
Keywords: Sequential learning; Temporal convolutional networks
Title: Novel gating mechanisms for temporal convolutional networks
Title (Turkish): Zamansal evrişimli sinirsel ağlar için özgün bir geçit mekanizması
Type: Thesis
ID: B154698
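
The record above describes a gated temporal convolutional architecture but does not specify the gating design. Below is a minimal illustrative sketch, in PyTorch, of a generic gated, causal, dilated 1-D convolution block with a residual connection; the tanh/sigmoid gate and all names here are assumptions for illustration, not the thesis's actual GTCN mechanism.

    # Sketch only: generic gated temporal convolution block (assumed design,
    # not the thesis's GTCN). Gate style follows the common tanh * sigmoid form.
    import torch
    import torch.nn as nn


    class GatedTemporalConvBlock(nn.Module):
        def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1):
            super().__init__()
            # Amount of left padding that keeps the convolution causal
            # (no dependence on future time steps).
            self.pad = (kernel_size - 1) * dilation
            self.conv_filter = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
            self.conv_gate = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x has shape (batch, channels, time).
            x_padded = nn.functional.pad(x, (self.pad, 0))
            # Gated activation: a filter branch modulated by a sigmoid gate.
            h = torch.tanh(self.conv_filter(x_padded)) * torch.sigmoid(self.conv_gate(x_padded))
            # Residual connection keeps a direct gradient path through the block,
            # which is one common way to ease vanishing-gradient issues.
            return x + h


    if __name__ == "__main__":
        block = GatedTemporalConvBlock(channels=16, kernel_size=3, dilation=2)
        y = block(torch.randn(4, 16, 100))  # (batch=4, channels=16, time=100)
        print(y.shape)  # torch.Size([4, 16, 100])

In such blocks the gate output lies in (0, 1) and scales the filter branch elementwise, while the residual path passes the input through unchanged; stacking blocks with growing dilation enlarges the receptive field for long-term dependencies. How the thesis handles irregular sampling intervals is not described in this record, so it is not sketched here.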