Search results: records 1-15 of 21 found for "theoretical statistics entropy".
The Cross-Entropy Method for Estimation
cross-entropy estimation; rare events; importance sampling; adaptive Monte Carlo; zero-variance distribution
2015/7/6
This chapter describes how difficult statistical estimation problems can often be solved efficiently by means of the cross-entropy (CE) method. The CE method can be viewed as an adaptive importance sa...
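As a hedged illustration of this adaptive importance-sampling view (not the chapter's implementation): the Python sketch below estimates a rare-event probability P(X > gamma) for an exponential model with the cross-entropy method, updating the proposal mean from elite samples; the threshold `gamma`, elite quantile `rho`, and sample size `n` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def ce_rare_event(gamma=20.0, u=1.0, n=10_000, rho=0.1, iters=50):
    """Estimate P(X > gamma) for X ~ Exp(mean=u) via the cross-entropy method."""
    v = u  # proposal mean, updated adaptively
    for _ in range(iters):
        x = rng.exponential(v, size=n)                 # sample from current proposal
        level = min(gamma, np.quantile(x, 1 - rho))    # elite threshold for this round
        elite = x >= level
        w = (v / u) * np.exp(-x / u + x / v)           # likelihood ratio vs. nominal Exp(mean=u)
        v = np.sum(w[elite] * x[elite]) / np.sum(w[elite])  # CE update (closed form for Exp)
        if level >= gamma:
            break
    x = rng.exponential(v, size=n)                     # final importance-sampling estimate
    w = (v / u) * np.exp(-x / u + x / v)
    return float(np.mean(w * (x >= gamma)))

print(ce_rare_event())     # importance-sampling estimate of P(X > 20)
print(np.exp(-20.0))       # exact value, for comparison
```

With these toy settings the proposal mean drifts toward the rare-event region over a few iterations, which is the usual behaviour of the CE scheme on this kind of problem.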
In this note the relation between the range-renewal speed and entropy for i.i.d. models is discussed.
For the evaluation of information flow in bivariate time series, information measures have been employed, such as the transfer entropy (TE), the symbolic transfer entropy (STE), defined similarly to T...
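As background, a minimal plug-in sketch of the transfer entropy TE(Y → X) for binned series is given below; it is not the symbolic (STE) variant nor the estimator used in this record, and the bin count, one-step lag, and toy series are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in estimate of TE(Y -> X) in nats for 1D series x, y with lag 1."""
    # discretize both series into equal-width bins
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    triples = list(zip(xd[1:], xd[:-1], yd[:-1]))   # (x_{t+1}, x_t, y_t)
    n = len(triples)
    p_xyz = Counter(triples)
    p_xz = Counter((a, b) for a, b, _ in triples)   # (x_{t+1}, x_t)
    p_z = Counter(b for _, b, _ in triples)         # (x_t,)
    p_zy = Counter((b, c) for _, b, c in triples)   # (x_t, y_t)
    te = 0.0
    for (a, b, c), k in p_xyz.items():
        cond_full = k / p_zy[(b, c)]                # p(x_{t+1} | x_t, y_t)
        cond_marg = p_xz[(a, b)] / p_z[b]           # p(x_{t+1} | x_t)
        te += (k / n) * np.log(cond_full / cond_marg)
    return te

# toy usage: y drives x with a one-step lag, so TE(y -> x) should exceed TE(x -> y)
rng = np.random.default_rng(1)
y = rng.standard_normal(5000)
x = 0.8 * np.roll(y, 1) + 0.2 * rng.standard_normal(5000)
print(transfer_entropy(x, y), transfer_entropy(y, x))
```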
Testing Exponentiality Based on Rényi Entropy With Progressively Type-II Censored Data
Rényi Entropy; hazard function; Monte Carlo simulation; order statistics; Type-II progressively censored data
2013/4/28
We express the joint Rényi entropy of progressively censored order statistics in terms of an incomplete integral of the hazard function, and provide a simple estimate of the joint Rényi entropy of...
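For reference, the Rényi entropy these records refer to is the standard order-α generalization of the Shannon entropy; the display below gives the usual definition (background only, not the paper's censored-data representation):

```latex
H_\alpha(f) \;=\; \frac{1}{1-\alpha}\,\log \int f^{\alpha}(x)\,dx,
\qquad \alpha > 0,\ \alpha \neq 1,
\qquad
\lim_{\alpha \to 1} H_\alpha(f) \;=\; -\int f(x)\,\log f(x)\,dx .
```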
Refinement revisited with connections to Bayes error, conditional entropy and calibrated classifiers
Refinement Score; Probability Elicitation; Calibrated Classifier; Bayes Error Bound; Conditional Entropy; Proper Loss
2013/4/27
The concept of refinement from probability elicitation is considered for proper scoring rules. Taking directions from the axioms of probability, refinement is further clarified using a Hilbert space i...
Statistical estimation of quadratic Rényi entropy for a stationary m-dependent sequence
Entropy estimation; quadratic Rényi entropy; stationary m-dependent sequence; inter-point distances; U-statistics
2013/4/27
The Rényi entropy is a generalization of the Shannon entropy and is widely used in mathematical statistics and applied sciences for quantifying the uncertainty in a probability distribution. We cons...
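A hedged sketch of the general idea, an ε-ball U-statistic over inter-point distances estimating the integral of f², is shown below; it is not the paper's estimator nor its m-dependent asymptotics, and the bandwidth `eps` and the toy sample are illustrative.

```python
import numpy as np
from scipy.special import gamma

def quadratic_renyi_entropy(x, eps=0.1):
    """Estimate h_2 = -log( integral of f^2 ) from a sample x of shape (n, d).

    Uses the U-statistic: fraction of inter-point distances below eps,
    normalized by the volume of the eps-ball.
    """
    n, d = x.shape
    dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    close = np.sum(dists[np.triu_indices(n, k=1)] <= eps)      # pairs within eps
    ball_volume = np.pi ** (d / 2) / gamma(d / 2 + 1) * eps ** d
    q_hat = close / (n * (n - 1) / 2) / ball_volume            # estimate of int f^2
    return -np.log(q_hat)

# toy check: standard normal in 1D, where int f^2 = 1/(2*sqrt(pi)), so h_2 = log(2*sqrt(pi))
rng = np.random.default_rng(2)
sample = rng.standard_normal((2000, 1))
print(quadratic_renyi_entropy(sample), np.log(2 * np.sqrt(np.pi)))
```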
Variance estimation and asymptotic confidence bands for the mean estimator of sampled functional data with high entropy unequal probability sampling designs
covariance function; finite population; Hájek approximation; Horvitz-Thompson estimator; Kullback-Leibler divergence; rejective sampling; unequal probability sampling without replacement
2012/11/23
For fixed-size sampling designs with high entropy it is well known that the variance of the Horvitz-Thompson estimator can be approximated by the Hájek formula. The interest of this asymptotic varia...
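For background, the Hájek approximation referred to here is commonly stated as follows (a standard form, given here as context rather than quoted from the paper):

```latex
\operatorname{Var}\!\bigl(\hat{Y}_{HT}\bigr)
  \;\approx\; \sum_{i \in U} \pi_i (1-\pi_i)\left(\frac{y_i}{\pi_i} - B\right)^{2},
\qquad
B \;=\; \frac{\sum_{j \in U} \pi_j (1-\pi_j)\, y_j/\pi_j}{\sum_{j \in U} \pi_j (1-\pi_j)},
```

where the π_i are the first-order inclusion probabilities and Ŷ_HT is the Horvitz-Thompson estimator of the population total.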
Estimation of entropy-type integral functionals
U-statistics; estimation of divergence; density power divergence; asymptotic normality; entropy estimation; Rényi entropy
2012/11/22
Integrated powers of densities of one or two multidimensional random variables appear in a variety of problems in mathematical statistics, information theory, and computer science. We study U-statist...
Estimation of Rényi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs
Rényi Entropy; Mutual Information; Generalized Nearest-Neighbor Graphs
2010/3/11
In this paper we consider simple and computationally efficient nonparametric estimators of Rényi entropy and mutual information based on an i.i.d. sample drawn from an unknown, absolutely con...
Asymptotic optimality of the cross-entropy method for Markov chain problems
Asymptotic optimality; cross-entropy method; Markov chain problems
2010/3/11
The correspondence between the cross-entropy method and the zero-variance approximation to simulate a rare event problem in Markov chains is shown. This leads to a sufficient condition that the cross...
Analytical continuation of imaginary axis data using maximum entropy
Analytical continuation; imaginary axis data; maximum entropy
2010/3/9
We study the maximum entropy (MaxEnt) approach for analytical continuation of spectral data from imaginary times to real frequencies. The total error is divided into a statistical error, due to the no...
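As background, one commonly used MaxEnt formulation (not necessarily the exact one in this record) selects the spectral function A(ω) by balancing the misfit to the imaginary-time data against an entropic prior relative to a default model m(ω):

```latex
A^{*} \;=\; \arg\max_{A \ge 0}\; \Bigl[\, \alpha\, S[A] \;-\; \tfrac{1}{2}\chi^{2}[A] \,\Bigr],
\qquad
S[A] \;=\; \int d\omega\, \Bigl[ A(\omega) - m(\omega) - A(\omega)\,\ln\frac{A(\omega)}{m(\omega)} \Bigr],
```

where χ² measures the misfit of the data reconstructed from A through the continuation kernel, and α weights the entropy against the fit.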
Sharp Bounds on the Entropy of the Poisson Law and Related Quantities
Sharp Bounds; Entropy; Poisson Law; Related Quantities
2010/3/9
One of the difficulties in calculating the capacity of certain Poisson channels is that H(λ), the entropy of the Poisson distribution with mean λ, is not available in a simple form. In this work we...
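A quick numerical illustration, not taken from the record: the Poisson entropy can be computed by direct summation of −p_k log p_k and compared with the leading term ½ log(2πeλ) of its large-λ behaviour; the tail truncation point below is an illustrative choice.

```python
import numpy as np
from scipy.stats import poisson

def poisson_entropy(lam, tol=1e-15):
    """Entropy (in nats) of Poisson(lam) by direct summation of -p_k log p_k."""
    # sum far enough into the tail that the remaining mass is negligible
    k_max = int(lam + 20 * np.sqrt(lam) + 20)
    p = poisson.pmf(np.arange(k_max + 1), lam)
    p = p[p > tol]
    return -np.sum(p * np.log(p))

for lam in (1.0, 10.0, 100.0):
    approx = 0.5 * np.log(2 * np.pi * np.e * lam)   # leading large-lambda term
    print(lam, poisson_entropy(lam), approx)
```

For small means the two values differ noticeably, and the gap shrinks as λ grows, which is the point of seeking sharp bounds on H(λ).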
TSALLIS' ENTROPY BOUNDS FOR GENERALIZED ORDER STATISTICS
Generalized order statistics; order statistics; progressive type II censored order statistics
2009/9/21
We present sharp bounds for expectations of generalized order statistics with random indices expressed in terms of Tsallis' entropy. The bounds are attainable and provide new characterizations of s...
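For context, the standard definition of the Tsallis entropy of order q (stated as background, not quoted from the record) is

```latex
S_q(f) \;=\; \frac{1}{q-1}\left(1 - \int f^{\,q}(x)\,dx\right),
\qquad q \neq 1,
\qquad
\lim_{q \to 1} S_q(f) \;=\; -\int f(x)\,\log f(x)\,dx ,
```

so the Shannon entropy is recovered in the limit q → 1.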
METRIC ENTROPY AND THE SMALL DEVIATION PROBLEM FOR STABLE PROCESSES
Small deviation; lower tail probability; Gaussian processes; stable processes; metric entropy
2009/9/18
The famous connection between metric entropy and small deviation probabilities of Gaussian processes was discovered by Kuelbs and Li in [6] and completed by Li and Linde in [9]. The question whethe...
Entropy Estimate for k-Monotone Functions via Small Ball Probability of Integrated Brownian Motions
Metric entropy; class of distribution functions
2009/3/19
Metric entropy of the class of probability distribution functions on [0,1] with a k-monotone density is studied through its connection with the small ball probability of k-times integrated Brownian mo...