Another beauty. (The logistic regression thing isn’t that big a deal, though—the logistic function only makes a difference at the extremes, and the fact that the RR is very close to one means it’s right in the middle.)
Good point. And logistic regression coefficients are hard to interpret, so maybe logistic regression would be a poor choice in this case.
Credit should go to Andrew Gelman, who also points out (in his book with Jennifer Hill on hierarchical modeling) that the logistic regression coefficients do have a straightforward interpretation, at least when the probabilities are not too close to the extremes. (I’d have to look it up.)
I don’t have Gelman’s book, but: logistic regression says p = 1 / (1 + exp(-z)), where z is a linear combination of 1 and the independent variables. But z is just the “log odds”, log(p/(1-p)); you can think of the coefficient of 1 as the log prior odds and the other coefficients as the amount of evidence you get for X over not-X per unit change in each independent variable.
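To make that concrete, here is a minimal numerical sketch in Python with NumPy; the coefficients b0 and b1 are made up purely for illustration, not taken from anything above. The coefficient of 1 is the log odds when the predictor is zero, and each unit change in the predictor shifts the log odds by its coefficient, which multiplies the odds by exp(coefficient).

import numpy as np

# Purely illustrative coefficients (not from any fitted model):
# b0 is the coefficient of 1 (the intercept), b1 the coefficient of x.
b0, b1 = -1.0, 0.5

def p_of(x):
    # Logistic model: p = 1 / (1 + exp(-z)) with z = b0 + b1*x.
    z = b0 + b1 * x
    return 1.0 / (1.0 + np.exp(-z))

for x in [0.0, 1.0, 2.0]:
    p = p_of(x)
    log_odds = np.log(p / (1.0 - p))   # recovers z = b0 + b1*x exactly
    print(f"x={x:.0f}  p={p:.3f}  log-odds={log_odds:.3f}  odds={p / (1.0 - p):.3f}")

# At x = 0 the log odds equal b0 (the "log prior odds"); each unit increase
# in x adds b1 to the log odds, i.e. multiplies the odds by exp(b1).
print("odds ratio per unit change in x:", np.exp(b1))

With these made-up numbers, x = 0 gives log odds of exactly b0 = -1, and each additional unit of x multiplies the odds by exp(0.5) ≈ 1.65, which is the sense in which each coefficient measures evidence per unit change.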