Statistics Learning - Multi-variable logistic regression


About

A logistic regression with multiple explanatory variables and a two-class outcome.

Generalized linear model

<MATH> Pr(Y = 1|X) = p(X) = \frac{e^{B_0 + B_1 X_1 + \dots + B_i X_i}}{1 + e^{B_0 + B_1 X_1 + \dots + B_i X_i}} </MATH>

Equivalently, applying the logit (log-odds) transformation recovers the linear form: <MATH> \log \left( \frac{p(X)}{1 - p(X)} \right) = B_0 + B_1 X_1 + \dots + B_i X_i </MATH>
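
The two expressions above are the same relation written in two ways. A minimal sketch in R (with hypothetical coefficient values) showing that the exponential form and R's built-in inverse logit, plogis, agree:

beta <- c(-1.5, 0.8, 0.3)               # hypothetical B0, B1, B2
x    <- c(1.0, 2.0)                     # one observation: X1, X2
eta  <- beta[1] + sum(beta[-1] * x)     # linear predictor B0 + B1*X1 + B2*X2
exp(eta) / (1 + exp(eta))               # p(X) from the formula above
plogis(eta)                             # same value via R's inverse logit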

R

myLogisticRegressionModel <- glm(targetVariable ~ ., data = myDataFrame, family = binomial)
summary(myLogisticRegressionModel)
  • The tilde (~) means "is modeled as".
  • The dot (.) means all the other variables in the data frame are used as predictors.
  • family = binomial tells glm to fit a logistic regression model.
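
A runnable sketch of the same call on a built-in data set (mtcars, using the binary am column as the target; the variable names are illustrative, not from the original text):

myDataFrame <- mtcars[, c("am", "wt", "hp")]        # am (0/1) is the outcome
myLogisticRegressionModel <- glm(am ~ ., data = myDataFrame, family = binomial)
summary(myLogisticRegressionModel)
head(predict(myLogisticRegressionModel, type = "response"))   # fitted Pr(Y = 1 | X)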

Interpretation

We're not too interested in the intercept.

It is difficult to interpret regression coefficients in a multiple regression model, because the correlations between the variables can affect the signs.
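
From the logit form above, exponentiating both sides gives the usual odds reading: a one-unit increase in X_j multiplies the odds by e^{B_j}, holding the other variables fixed (which is exactly what correlated variables make hard to do):

<MATH> \frac{p(X)}{1 - p(X)} = e^{B_0 + B_1 X_1 + \dots + B_i X_i} </MATH>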

When we have correlated variables, they act as surrogates for each other, which can affect (see the sketch after this list):

  • the sign of the coefficient
  • the p-value (significant or not)
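
A simulated sketch of this surrogate effect (made-up data, not from the original text): x2 is strongly correlated with x1 and has a negative true coefficient, but fitted alone its estimated coefficient comes out positive.

set.seed(1)
n  <- 1000
x1 <- rnorm(n)
x2 <- 0.8 * x1 + rnorm(n, sd = 0.5)                 # x2 correlated with x1
y  <- rbinom(n, 1, plogis(1.5 * x1 - 0.7 * x2))     # true effect of x2 is negative
coef(glm(y ~ x2, family = binomial))                # alone: x2 picks up x1's positive effect, sign flips
coef(glm(y ~ x1 + x2, family = binomial))           # together: x2 recovers its negative sign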
