LINEAR REGRESSION
As seen below, the function for linear regression is lm(), and you provide the formula:
fit <- lm(disease ~ BP + weight, data=mydata)
summary(fit)
Call:
lm(formula = disease ~ BP + weight, data = mydata)

Residuals:
     Min       1Q   Median       3Q      Max
-1.12061 -0.29472 -0.04044  0.24307  0.95800

Coefficients:
              Estimate Std. Error t value Pr(>|t|)
(Intercept) -0.0603775  0.1313063  -0.460    0.647
BP          -0.0037905  0.0020234  -1.873    0.064 .
weight       0.0071755  0.0008032   8.933 2.69e-14 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 0.3617 on 97 degrees of freedom
Multiple R-squared: 0.4915,  Adjusted R-squared: 0.481
F-statistic: 46.88 on 2 and 97 DF,  p-value: 5.69e-15
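The fitted coefficients above can be used for prediction. A minimal sketch (assuming a data frame named mydata with columns disease, BP, and weight, as in the call above; the new patient's values are hypothetical):

```r
# Fit the linear model as above (mydata is assumed to exist)
fit <- lm(disease ~ BP + weight, data = mydata)

# Predicted outcome for a hypothetical new patient
newpatient <- data.frame(BP = 120, weight = 180)
predict(fit, newdata = newpatient)

# The same prediction computed by hand from the reported coefficients:
# intercept + b_BP * BP + b_weight * weight
-0.0603775 + (-0.0037905 * 120) + (0.0071755 * 180)
```

Note that weight is the strong predictor here (p = 2.69e-14), while BP only approaches significance (p = 0.064).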
LOGISTIC REGRESSION
As seen below, the function for logistic regression is glm(), and you provide the formula along with family = "binomial". Calling summary() on the fitted object produces the output shown.
logistic <- glm(disease ~ weight + BP, family = "binomial", data = mydata)
summary(logistic)
Deviance Residuals:
    Min       1Q   Median       3Q      Max
-3.2966  -0.6991   0.0969   0.4862   2.1822

Coefficients:
             Estimate Std. Error z value Pr(>|z|)
(Intercept) -4.771331   1.360626  -3.507 0.000454 ***
weight       0.055456   0.012031   4.609 4.04e-06 ***
BP          -0.007826   0.017222  -0.454 0.649510
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 138.269  on 99  degrees of freedom
Residual deviance:  77.706  on 97  degrees of freedom
AIC: 83.706

Number of Fisher Scoring iterations: 6