Now showing items 1-10 of 15
Testing exponentiality based on characterizations of the exponential distribution
In this paper, we first present two characterizations of the exponential distribution and then introduce three exact goodness-of-fit tests for exponentiality. By simulation, the powers of the proposed tests under various alternatives ...
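For orientation, one classical characterization that exponentiality tests of this kind often exploit (not necessarily one of the two presented in that paper) is the memoryless property: a nonnegative random variable X is exponentially distributed if and only if

    \Pr(X > s + t \mid X > s) = \Pr(X > t) \quad \text{for all } s, t \ge 0,

equivalently, the survival function factorizes as \bar{F}(s + t) = \bar{F}(s)\,\bar{F}(t).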
Kullback-Leibler Information in View of an Extended Version of k-Records
This paper introduces an extended version of k-records. Kullback-Leibler (K-L) information between two generalized distributions arising from k-records is derived; subsequently, it is shown that K-L information ...
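For reference, the Kullback-Leibler information between two densities f and g, the quantity this abstract extends to k-records, is

    D(f \,\|\, g) = \int f(x) \log \frac{f(x)}{g(x)} \, dx,

which is nonnegative and vanishes exactly when f = g almost everywhere.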
A note on signature based expressions for the entropy of mixed r-out-of-n systems
We provide an expression for the Shannon entropy of mixed r-out-of-n systems when the lifetimes of the components are independent and identically distributed. The expression gives the system’s entropy in terms of the system ...
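As background, a standard fact of signature theory (not necessarily the paper's exact expression): a mixed r-out-of-n system with signature vector s = (s_1, ..., s_n) and i.i.d. component lifetimes has lifetime density

    f_T(t) = \sum_{i=1}^{n} s_i \, f_{i:n}(t),

a mixture of the order-statistic densities f_{i:n}, so its Shannon entropy is H(T) = -\int f_T(t) \log f_T(t)\,dt.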
Entropy and information (divergence) measures
... Various applications in statistics, and properties of these measures, are among our aims here. The link between maximum likelihood, maximum entropy, and Kullback-Leibler information is important for the discussion in this note...
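A one-line sketch of that link: for data x_1, ..., x_n drawn from a density f and a model density f_\theta, the average log-likelihood satisfies

    \frac{1}{n} \sum_{i=1}^{n} \log f_\theta(x_i) \;\to\; \mathbb{E}_f[\log f_\theta(X)] = -D(f \,\|\, f_\theta) - H(f),

so maximizing the likelihood asymptotically minimizes the Kullback-Leibler information D(f \| f_\theta), the entropy H(f) being constant in \theta.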
Goodness-of-Fit Test with Adaptive Type-II Progressively Censored Data
... a goodness-of-fit test statistic based on the Kullback-Leibler information for the exponential distribution. Finally, using Monte Carlo simulations, the power of the test is estimated against several alternatives under adaptive...
Testing Exponentiality Based on Record Values
We introduce a goodness-of-fit test for exponentiality based on record values. The critical points and powers for some alternatives are obtained by simulation.
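The simulation recipe behind statements like this is standard and easy to sketch. The following Python fragment is an illustrative stand-in, not any of these papers' actual statistics: it uses a Vasicek-type spacing estimate of entropy to build a KL-flavored exponentiality statistic, calibrates the critical point under the null, and estimates power under an alternative.

    import numpy as np

    rng = np.random.default_rng(0)

    def test_statistic(x):
        # Illustrative KL-flavored statistic: estimated K-L information
        # between the sample and the fitted exponential law. Each paper
        # in this listing uses its own statistic; this is only a stand-in.
        x = np.sort(np.asarray(x))
        n = len(x)
        m = max(1, int(np.sqrt(n)))                   # Vasicek window size
        hi = x[np.minimum(np.arange(n) + m, n - 1)]
        lo = x[np.maximum(np.arange(n) - m, 0)]
        h = np.mean(np.log(n * (hi - lo) / (2 * m)))  # entropy estimate
        # Exp(lambda) fitted by lambda_hat = 1/mean has entropy
        # 1 + log(mean), so the statistic is ~0 under H0, > 0 otherwise.
        return 1.0 + np.log(x.mean()) - h

    def critical_point(n, alpha=0.05, reps=10_000):
        # (1 - alpha) quantile of the statistic under the exponential null.
        stats = [test_statistic(rng.exponential(size=n)) for _ in range(reps)]
        return np.quantile(stats, 1 - alpha)

    def power(sample_alternative, n, crit, reps=10_000):
        # Rejection rate when the data come from the alternative.
        return np.mean([test_statistic(sample_alternative(k=n)) > crit
                        for _ in range(reps)])

    n = 20
    crit = critical_point(n)
    print("power vs Weibull(1.5):",
          power(lambda k: rng.weibull(1.5, size=k), n, crit))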
Testing exponentiality based on Kullback-Leibler information with progressively Type-II censored data
We express the joint entropy of progressively censored order statistics in terms of an incomplete integral of the hazard function, and provide a simple estimate of the joint entropy of ...
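For reference, the hazard function appearing here is \lambda(t) = f(t) / (1 - F(t)), with cumulative hazard \Lambda(t) = \int_0^t \lambda(u)\,du = -\log(1 - F(t)). For the exponential distribution the hazard is constant, which is what makes hazard-based entropy representations convenient for testing exponentiality.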
Information Measures via Copula Functions
... such as Kullback-Leibler information, J-divergence, Hellinger distance, alpha-divergence, and so on. Properties and results concerning distances between probability distributions are derived via copula functions. Some results...
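One representative identity of this kind, quoted only for context: the mutual information of a pair (X, Y) with copula density c depends on the joint law only through that copula,

    I(X; Y) = \int_0^1 \int_0^1 c(u, v) \log c(u, v) \, du \, dv = \mathbb{E}[\log c(U, V)],

i.e., mutual information equals the negative entropy of the copula.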
Difference and similarity between differential entropy and discrete entropy
Shannon's discovery of the fundamental laws of data compression and transmission marks the birth of Information Theory. Classically, Shannon entropy was formalized over discrete probability distributions. This ...
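The two notions being contrasted: for a discrete distribution p, H(p) = -\sum_i p_i \log p_i, while for a density f the differential entropy is h(f) = -\int f(x) \log f(x)\,dx. Unlike H, h can be negative (e.g. the uniform density on an interval of length less than 1 has h = \log(\text{length}) < 0) and is not invariant under changes of variable, two of the differences comparisons like this typically examine.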
A View on Extension of Utility-Based on Links with Information Measures
In this paper, we review the utility-based generalizations of the Shannon entropy and Kullback-Leibler information measures, namely the U-entropy and the U-relative entropy introduced by Friedman et al. (2007). Then we derive some relations...