From e3e1bd81da989845368d30ebf2c993518c0bffd2 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Metehan=20G=C3=9CNG=C3=96R?= <102655648+gungorMetehan@users.noreply.github.com>
Date: Fri, 12 Sep 2025 09:41:31 +0300
Subject: [PATCH] Update inf-model-logistic.qmd

LaTeX parentheses
---
 inf-model-logistic.qmd | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/inf-model-logistic.qmd b/inf-model-logistic.qmd
index 845218d4..d1e5dc36 100644
--- a/inf-model-logistic.qmd
+++ b/inf-model-logistic.qmd
@@ -67,7 +67,7 @@ email_variables |>
 
 Before looking at the hypothesis tests associated with the coefficients (turns out they are very similar to those in linear regression!), it is valuable to understand the technical conditions that underlie the inference applied to the logistic regression model.
 Generally, as you've seen in the logistic regression modeling examples, it is imperative that the response variable is binary.
-Additionally, the key technical condition for logistic regression has to do with the relationship between the predictor variables $(x_i$ values) and the probability the outcome will be a success.
+Additionally, the key technical condition for logistic regression has to do with the relationship between the predictor variables ($x_i$ values) and the probability the outcome will be a success.
 It turns out, the relationship is a specific functional form called a logit function, where ${\rm logit}(p) = \log_e(\frac{p}{1-p}).$
 The function may feel complicated, and memorizing the formula of the logit is not necessary for understanding logistic regression.
 What you do need to remember is that the probability of the outcome being a success is a function of a linear combination of the explanatory variables.
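
The hunk above describes the logit informally. As a quick illustration (not part of the patch, and using made-up coefficient and predictor values), the following R sketch shows how ${\rm logit}(p)$ and its inverse connect a linear combination of explanatory variables to a probability of success:

```r
# Illustrative only; b0, b1, b2, x1, x2 are hypothetical values, not from the book.
logit     <- function(p)   log(p / (1 - p))   # logit(p) = log_e(p / (1 - p))
inv_logit <- function(eta) 1 / (1 + exp(-eta)) # inverse logit (logistic function)

# A hypothetical linear combination of two predictors x1 and x2.
b0 <- -1.5; b1 <- 0.8; b2 <- -0.3
x1 <- 2; x2 <- 1
eta <- b0 + b1 * x1 + b2 * x2   # linear predictor
p   <- inv_logit(eta)           # implied probability the outcome is a success

c(eta = eta, p = p, check = logit(p))  # logit(p) recovers the linear predictor
```

Here `inv_logit()` is just the standard logistic function; in practice `glm(..., family = binomial)` applies this link automatically when fitting a logistic regression.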