Showing 1-10 of 52
On Entropy of a Pareto distribution in the presence of outliers
The aim of this paper is to obtain the amount of information contained in the Pareto distribution in the presence of outliers. To this end, Shannon entropy, "-entropy, Fisher information and Kullback-Leibler ...
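For context on the quantities this abstract lists: the Shannon (differential) entropy of an outlier-free Pareto(α, x_m) distribution has the closed form H = ln(x_m/α) + 1/α + 1. The sketch below (function names and the Monte Carlo check are illustrative, not from the paper) verifies that formula numerically by inverse-CDF sampling:

```python
import math
import random

def pareto_entropy_closed_form(alpha, xm):
    # Shannon differential entropy of Pareto(alpha, xm):
    # H = ln(xm / alpha) + 1/alpha + 1
    return math.log(xm / alpha) + 1.0 / alpha + 1.0

def pareto_entropy_monte_carlo(alpha, xm, n=200_000, seed=0):
    # Estimate H = -E[ln f(X)] by sampling X = xm / U^(1/alpha),
    # the inverse-CDF transform for the Pareto survival function (xm/x)^alpha.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = 1.0 - rng.random()  # u in (0, 1], avoids division by zero
        x = xm / u ** (1.0 / alpha)
        # Pareto log-density: ln f(x) = ln(alpha) + alpha*ln(xm) - (alpha+1)*ln(x)
        log_pdf = math.log(alpha) + alpha * math.log(xm) - (alpha + 1.0) * math.log(x)
        total -= log_pdf
    return total / n
```

With α = 2 and x_m = 1 the closed form gives ln(1/2) + 3/2 ≈ 0.807, and the Monte Carlo estimate agrees to a few decimal places.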
Exponentiality test based on the progressive type II censoring via cumulative entropy
In this paper, we use cumulative residual Kullback-Leibler information (CRKL) and cumulative Kullback-Leibler information (CKL) to construct two goodness-of-fit test statistics for testing exponentiality with progressively ...
Where Does Minimum Error Entropy Outperform Minimum Mean Square Error? A New and Closer Look
The past decade has seen rapid application of Information Theoretic Learning (ITL) criteria in robust signal processing and machine learning problems. Generally, the ITL literature observes that, under non-Gaussian ...
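The MEE criterion this abstract compares against MMSE is usually stated, in the ITL literature, through the Parzen-window estimate of Rényi's quadratic entropy: MEE maximizes the "information potential" of the errors, while MMSE minimizes their mean square. A minimal sketch of both costs, assuming a Gaussian kernel (this is a generic ITL formulation, not code from the paper):

```python
import math

def mse(errors):
    # Minimum mean square error criterion: average squared error
    return sum(e * e for e in errors) / len(errors)

def information_potential(errors, sigma=1.0):
    # Parzen-window estimate of the quadratic information potential:
    # V(e) = (1/N^2) * sum_i sum_j G_{sigma*sqrt(2)}(e_i - e_j),
    # where G_s is a Gaussian kernel with bandwidth s.
    # MEE maximizes V, i.e. minimizes Renyi's quadratic entropy -log V.
    n = len(errors)
    s = sigma * math.sqrt(2.0)
    norm = 1.0 / (s * math.sqrt(2.0 * math.pi))
    total = 0.0
    for ei in errors:
        for ej in errors:
            total += norm * math.exp(-((ei - ej) ** 2) / (2.0 * s * s))
    return total / (n * n)
```

The information potential depends only on pairwise error differences, which is one reason MEE behaves differently from MSE under heavy-tailed, non-Gaussian noise.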
An unsupervised learning approach by novel damage indices in structural health monitoring for damage localization and quantification
… is to propose an innovative residual-based feature extraction approach based on AutoRegressive modeling and a novel statistical distance method named Partition-based Kullback–Leibler Divergence for damage detection and localization by using randomly high...
Data-driven damage diagnosis under environmental and operational variability by novel statistical pattern recognition methods
… is to propose an innovative residual-based feature extraction approach based on AutoRegressive modeling and a novel statistical distance method named Partition-based Kullback–Leibler Divergence for damage detection and localization by using randomly...
Testing Goodness-of-Fit for Exponential Distribution Based on Cumulative Residual Entropy
Testing exponentiality has long been an interesting issue in statistical inference. In this article, we introduce a new measure of distance between two distributions that is similar to Kullback–Leibler divergence, but using...
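A useful fact behind cumulative-residual-entropy exponentiality tests like this one: for an Exp(λ) variable, the cumulative residual entropy −∫ S(x) ln S(x) dx equals the mean 1/λ. The sketch below exploits that identity with an illustrative statistic of my own (not the paper's actual test statistic):

```python
import math

def empirical_cre(sample):
    # Empirical cumulative residual entropy:
    # CRE = -integral of S(x)*ln(S(x)) dx, with S the empirical
    # survival function, which is piecewise constant between order statistics.
    xs = sorted(sample)
    n = len(xs)
    total = 0.0
    for i in range(1, n):
        s = 1.0 - i / n  # survival probability on [x_(i), x_(i+1))
        if s > 0.0:
            total -= (xs[i] - xs[i - 1]) * s * math.log(s)
    return total

def exponentiality_statistic(sample):
    # Under Exp(lambda), CRE equals the mean (1/lambda), so the normalized
    # gap |CRE_hat / mean - 1| is near zero for exponential data and grows
    # under departures from exponentiality (illustrative check only).
    mean = sum(sample) / len(sample)
    return abs(empirical_cre(sample) / mean - 1.0)
```

For exponential data the statistic is close to zero; for, say, uniform data on (0, 1), CRE ≈ 1/4 while the mean is 1/2, so the statistic sits near 0.5.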
Some Properties of Lin Wong Divergence on the Past Lifetime Data
Measures of statistical divergence are used to assess mutual similarities between distributions of multiple variables through a variety of methodologies including Shannon entropy and Csiszár divergence. Modified measures ...
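The Csiszár divergence mentioned here is a family: D_f(p‖q) = Σ q_i f(p_i/q_i) for a convex f with f(1) = 0, and Kullback–Leibler divergence is the member with f(t) = t ln t. A generic sketch of this construction for discrete distributions (not code from the paper):

```python
import math

def csiszar_divergence(p, q, f):
    # Csiszar f-divergence between discrete distributions p and q:
    # D_f(p || q) = sum_i q_i * f(p_i / q_i),
    # assuming q_i > 0 wherever p_i > 0.
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

def kl_divergence(p, q):
    # Kullback-Leibler divergence as the f-divergence with f(t) = t*ln(t)
    return csiszar_divergence(p, q, lambda t: t * math.log(t) if t > 0 else 0.0)
```

Other choices of f yield other familiar measures, e.g. f(t) = (t − 1)² gives the chi-squared divergence.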
On Conditional Applications of Matrix Variate Normal Distribution
… The presented t-type family is an extension of the work of Dickey [8]. A Bayes estimator for the column covariance matrix Σ of the MVND is derived under the Kullback–Leibler divergence loss (KLDL). Further, an application of the proposed result is given in the Bayesian...