Statistics - log-likelihood function (cross-entropy)

Thomas Bayes

About

The “log-likelihood function” measures how probable the observed data are under a model. For a binary outcome <MATH>X_i</MATH> with predicted probability <MATH>\Pr[1|B_1,B_2,\dots,B_k]</MATH> given the predictors <MATH>B_1,\dots,B_k</MATH>, the log-likelihood over <MATH>N</MATH> independent observations is:

<MATH> \sum_{i=1}^{N}\left[ (1-X_i)\log\left(1-\Pr[1|B_1,B_2,\dots,B_k]\right) + X_i\log\left(\Pr[1|B_1,B_2,\dots,B_k]\right) \right] </MATH>

The “log-likelihood function” (up to a sign) is also referred to as the cross-entropy: minimizing the cross-entropy is equivalent to maximizing the log-likelihood.
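The sum above can be computed directly. A minimal sketch in Python (the function name and the sample outcomes/probabilities are illustrative, not from the original page):

```python
import math

def log_likelihood(x, p):
    """Binary log-likelihood of observed outcomes x (each 0 or 1)
    given predicted probabilities p = Pr[1 | B_1, ..., B_k]."""
    return sum(xi * math.log(pi) + (1 - xi) * math.log(1 - pi)
               for xi, pi in zip(x, p))

# Illustrative data: four observations and their model probabilities.
x = [1, 0, 1, 1]
p = [0.9, 0.2, 0.8, 0.7]

ll = log_likelihood(x, p)          # always <= 0; closer to 0 is a better fit
cross_entropy = -ll / len(x)       # average negative log-likelihood
```

Note that each term uses <MATH>\log(p_i)</MATH> when <MATH>X_i=1</MATH> and <MATH>\log(1-p_i)</MATH> when <MATH>X_i=0</MATH>, exactly mirroring the two terms of the formula.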







