Institute of Electrical and Electronics Engineers (IEEE)
Abstract
This paper examines the problem of estimating the parameters of a bandlimited signal from samples corrupted by random jitter (timing noise) and additive, independent identically distributed (i.i.d.) Gaussian noise, where the signal lies in the span of a finite basis. For this classical estimation problem, the Cramér-Rao lower bound (CRB) is computed, and an Expectation-Maximization (EM) algorithm approximating the maximum likelihood (ML) estimator is developed. Simulations are performed to study the convergence properties of the EM algorithm and to compare its performance against both the CRB and a basic linear estimator. These simulations demonstrate that by postprocessing the jittered samples with the proposed EM algorithm, greater jitter can be tolerated, potentially reducing on-chip ADC power consumption substantially.
First author draft
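The sketch below illustrates the kind of observation model the abstract describes, samples of a signal lying in the span of a finite basis, corrupted by random timing jitter and i.i.d. Gaussian noise, together with a jitter-ignoring linear least-squares baseline of the sort the paper compares against. It is not the paper's implementation; the sinc basis, Gaussian jitter distribution, and all dimensions and noise levels are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 8          # number of basis coefficients (finite-dimensional signal model) -- assumed
N = 64         # number of samples -- assumed
sigma_z = 0.1  # jitter standard deviation, in sample periods -- assumed
sigma_w = 0.05 # additive Gaussian noise standard deviation -- assumed

c_true = rng.standard_normal(K)          # true parameters to estimate
t = np.arange(N) / (N / K)               # nominal (intended) sampling times
z = sigma_z * rng.standard_normal(N)     # random jitter, unknown to the estimator
w = sigma_w * rng.standard_normal(N)     # additive i.i.d. Gaussian noise

def basis_matrix(times, K):
    """Evaluate K sinc basis functions at the given sampling times (assumed basis)."""
    k = np.arange(K)
    return np.sinc(times[:, None] - k[None, :])

# Jittered, noisy samples: y_n = sum_k c_k * phi_k(t_n + z_n) + w_n
y = basis_matrix(t + z, K) @ c_true + w

# Basic linear estimator: least squares using the nominal sampling times,
# i.e. it ignores the jitter entirely (the baseline the EM algorithm is compared to).
Phi_nominal = basis_matrix(t, K)
c_ls, *_ = np.linalg.lstsq(Phi_nominal, y, rcond=None)

print("relative error of jitter-ignoring LS estimate:",
      np.linalg.norm(c_ls - c_true) / np.linalg.norm(c_true))
```

An EM-based estimator of the kind the paper develops would treat the jitter realizations z as latent variables and iterate between estimating their posterior given the current coefficient estimate and re-estimating the coefficients; the details of that iteration are specific to the paper and are not reproduced here.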