Browsing by Subject "Sampling (Statistics)"
Now showing 1 - 3 of 3
Item Open Access
A dynamic importance sampling method for quick simulation of rare events (1993)
Erdoğan, Alper

Simulation of low-probability events may take an extremely long time, since such events occur very rarely. Various variance reduction methods are used to speed up simulation in such cases. In this thesis, a new variance reduction technique is proposed, based on expressing the desired probability as a product of larger probabilities and estimating each factor in the product recursively. It turns out that the resulting estimator, when feasible, uses an importance sampling distribution at each step to constrain the samples to a sequence of larger sets that shrink gradually towards the rare set. Moreover, the importance samples used in each step are obtained automatically from the outcomes of the experiments in the previous steps. The method is applied to the estimation of the overflow probability in a network of queues, and remarkable speed-ups over standard simulation are obtained. (A minimal sketch of this product-of-probabilities idea appears after the listing.)

Item Open Access
Evaluation of the Goldfeld-Quandt test and alternatives (1994)
Tomak, Kerem

In this study, the widely used Goldfeld-Quandt test for heteroskedasticity in the linear regression model is evaluated. We reduce the dimension of the data space that is needed for the computation of the tests. We then compare the performances of the Likelihood Ratio and Goldfeld-Quandt tests by using a stringency measure. The problem of the analytically intractable distribution function in the case of the Likelihood Ratio test is overcome by employing Monte Carlo methods. It is observed that the Likelihood Ratio test performs better than the Goldfeld-Quandt test in most cases. (A sketch of the classic Goldfeld-Quandt test appears after the listing.)

Item Open Access
Finite representation of finite energy signals (2011)
Gülcü, Talha Cihad

In this thesis, we study how to encode finite-energy signals with finitely many bits. Since such an encoding is bound to be lossy, there is an inevitable reconstruction error in the recovery of the original signal, which we also analyze. In our work, we not only verify the intuition that finite energy implies a finite number of degrees of freedom, but also optimize the reconstruction parameters to obtain the minimum possible reconstruction error for a given number of bits, and to achieve a given reconstruction error with the minimum number of bits. This optimization leads to a bits-versus-reconstruction-error curve consisting of the best achievable points, reminiscent of the rate-distortion curve in information theory. However, the rate-distortion theorem is not concerned with sampling, whereas we must take sampling into account in order to reduce the finite-energy signal to finitely many variables to be quantized. Therefore, we first propose a finite-sample representation scheme and examine its optimality. Then, after representing the signal of interest by a finite number of samples at the expense of a certain error, we discuss several quantization methods for these samples and compare their performances. (A toy sampling-and-quantization experiment appears after the listing.)
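
The recursive product-of-probabilities scheme in the first item is closely related to multilevel splitting for rare-event simulation. The sketch below is not the thesis's method, only an illustration of the general idea under assumptions chosen for the demo (a negative-drift Gaussian random walk, four evenly spaced levels, 1000 samples per stage): each stage estimates one conditional probability and restarts from the successful outcomes of the previous stage, echoing how the abstract says the importance samples arise automatically.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_to_level(x, level):
    """Run a negative-drift Gaussian walk from state x until it either
    reaches `level` (success) or falls back to 0 (failure)."""
    while True:
        if x >= level:
            return True, x
        x += rng.normal(-0.5, 1.0)   # negative drift makes overflow rare
        if x <= 0.0:
            return False, None

def splitting_estimate(levels, n_per_stage=1000):
    """Estimate P(walk from 0 crosses levels[-1] before returning to 0)
    as a product of stage-wise conditional probabilities."""
    states = [0.0] * n_per_stage
    p_hat = 1.0
    for level in levels:
        survivors = []
        for x in states:
            ok, y = run_to_level(x, level)
            if ok:
                survivors.append(y)
        if not survivors:
            return 0.0               # every path died; estimate collapses
        p_hat *= len(survivors) / len(states)
        # Next stage restarts from this stage's successful entrance states:
        # the "importance samples" come from the previous stage's outcomes.
        idx = rng.integers(0, len(survivors), size=n_per_stage)
        states = [survivors[i] for i in idx]
    return p_hat

# Crossing level 8 before returning to 0 is rare under the negative drift;
# splitting reaches it through four much more probable stages.
print(splitting_estimate([2.0, 4.0, 6.0, 8.0]))
```

Standard simulation would need on the order of the reciprocal of the target probability many runs to see even one overflow; here each stage only has to estimate a moderate conditional probability, which is the source of the speed-up the abstract reports.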
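
For the second item, here is a minimal sketch of the classic Goldfeld-Quandt test, not the thesis's reduced-dimension variant or its stringency-based comparison. The 20% central-drop fraction and the simulated heteroskedastic data are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def goldfeld_quandt(y, X, drop_frac=0.2):
    """Goldfeld-Quandt F test for heteroskedasticity. Rows of X and y
    are assumed sorted by the variable suspected of driving the error
    variance; the central drop_frac of observations is omitted and the
    residual variances of the two outer blocks are compared."""
    n, k = X.shape
    d = int(n * drop_frac)          # central observations to omit
    n1 = (n - d) // 2               # size of each outer block

    def rss(sl):
        beta, *_ = np.linalg.lstsq(X[sl], y[sl], rcond=None)
        r = y[sl] - X[sl] @ beta
        return r @ r

    f = (rss(slice(n - n1, n)) / (n1 - k)) / (rss(slice(0, n1)) / (n1 - k))
    p = stats.f.sf(f, n1 - k, n1 - k)   # one-sided: variance increasing
    return f, p

# Simulated regression whose noise scale grows with x, so the test
# should reject homoskedasticity with a small p-value.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 10.0, 200))
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1 + 0.2 * x)
print(goldfeld_quandt(y, X))
```

The test statistic has an exact F distribution under normal errors, which is what makes it popular; the Likelihood Ratio alternative studied in the thesis lacks a tractable distribution, hence the Monte Carlo methods the abstract mentions.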
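
For the third item, a toy version of the bits-versus-reconstruction-error trade-off: sample a finite-energy signal, quantize the samples uniformly, and record the discrete L2 error at each bit budget. The Gaussian pulse, truncation window, sample count, and mid-cell uniform quantizer are assumptions for illustration, not the thesis's optimized scheme.

```python
import numpy as np

def uniform_quantize(samples, bits):
    """Uniform scalar quantizer with 2**bits levels over the sample range,
    reconstructing each sample at the midpoint of its cell."""
    lo, hi = samples.min(), samples.max()
    step = (hi - lo) / 2 ** bits
    idx = np.clip(np.floor((samples - lo) / step), 0, 2 ** bits - 1)
    return lo + (idx + 0.5) * step

# Finite-energy test signal: a Gaussian pulse, truncated to a finite window.
t = np.linspace(-5.0, 5.0, 257)
dt = t[1] - t[0]
x = np.exp(-t ** 2)

# Total bits vs reconstruction error: the toy analogue of the curve of
# best achievable points described in the abstract.
for bits in (2, 4, 6, 8):
    xq = uniform_quantize(x, bits)
    err = np.sqrt(np.sum((x - xq) ** 2) * dt)   # discrete L2 error
    print(f"{bits * len(t):5d} bits -> L2 error {err:.2e}")
```

Note that the sampling step already commits to a fixed truncation and grid before any bits are spent, which is exactly the extra degree of freedom, absent from the rate-distortion setting, that the thesis optimizes over.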